dataset_tasks.py source code

Project: kge-server    Author: vfrico    Language: Python
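
# Excerpted Falcon resource method from dataset_tasks.py; the surrounding
# module provides the json, falcon, async_tasks and data_access names used below.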
def on_post(self, req, resp, dataset_id, dataset_dto, gen_triples_param):
        """Generates a task to insert triples on dataset. Async petition.

        Reads parameters, such as the SPARQL graph pattern, from the request body:

        {"generate_triples":
            {
                "graph_pattern": "<SPARQL Query (Where part)>",
                "levels": 2,
                "batch_size": 30000   # Optional
            }
        }

        :param id dataset_id: The dataset to insert triples into
        :param DTO dataset_dto: The Dataset DTO from dataset_id (from hook)
        :param dict gen_triples_param: Params to call generate_triples function
                                       (from hook)
        """
        try:
            batch_size = gen_triples_param.pop("batch_size")
        except KeyError:
            batch_size = None

        # Launch async task
        task = async_tasks.generate_dataset_from_sparql.delay(
            dataset_id, gen_triples_param.pop("graph_pattern"),
            int(gen_triples_param.pop("levels")), batch_size=batch_size)

        # Create a new task
        task_dao = data_access.TaskDAO()
        task_obj, err = task_dao.add_task_by_uuid(task.id)
        if task_obj is None:
            raise falcon.HTTPNotFound(description=str(err))
        task_obj["next"] = "/datasets/" + dataset_id
        task_dao.update_task(task_obj)

        # Store the task into DatasetDTO
        dataset_dao = data_access.DatasetDAO()
        dataset_dao.set_task(dataset_id, task_obj['id'])

        msg = "Task {} created successfuly".format(task_obj['id'])
        textbody = {"status": 202, "message": msg, "task": task_dao.task}
        resp.location = "/tasks/" + str(task_obj['id'])
        resp.body = json.dumps(textbody)
        resp.content_type = 'application/json'
        resp.status = falcon.HTTP_202
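
For orientation, below is a minimal client-side sketch of how this handler might be exercised. The route /datasets/{dataset_id}/generate_triples, the host/port, and the dataset id 42 are assumptions for illustration only; what comes from the code above is the request body shape, the 202 status, the Location header, and the JSON response keys.

# Hypothetical client call; route, host and dataset id are assumptions.
import requests

payload = {
    "generate_triples": {
        "graph_pattern": "?s ?p ?o",   # WHERE part of a SPARQL query
        "levels": 2,
        "batch_size": 30000,           # optional; omitted -> handler uses None
    }
}

resp = requests.post(
    "http://localhost:6789/datasets/42/generate_triples",  # assumed route/host
    json=payload,
)

# A 202 Accepted response points at the newly created task resource.
print(resp.status_code)               # expected: 202
print(resp.headers.get("Location"))   # e.g. /tasks/<task id>
print(resp.json()["message"])         # "Task <task id> created successfully"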