Posted to commits@dolphinscheduler.apache.org by GitBox <gi...@apache.org> on 2021/12/07 08:04:39 UTC

[GitHub] [dolphinscheduler] zhongjiajie commented on a change in pull request #7214: [DS-7016][feat] Auto create workflow while import sql script with specific hint

zhongjiajie commented on a change in pull request #7214:
URL: https://github.com/apache/dolphinscheduler/pull/7214#discussion_r763719855



##########
File path: dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/controller/ProcessDefinitionController.java
##########
@@ -696,7 +696,12 @@ public Result queryAllProcessDefinitionByProjectCode(@ApiIgnore @RequestAttribut
     public Result importProcessDefinition(@ApiIgnore @RequestAttribute(value = Constants.SESSION_USER) User loginUser,
                                           @ApiParam(name = "projectCode", value = "PROJECT_CODE", required = true) @PathVariable long projectCode,
                                           @RequestParam("file") MultipartFile file) {
-        Map<String, Object> result = processDefinitionService.importProcessDefinition(loginUser, projectCode, file);
+        Map<String, Object> result;
+        if ("application/zip".equals(file.getContentType())) {

Review comment:
       Maybe we should add another button in the web UI to do this? You could add a separate function to handle it. If you know the frontend code, you could add the button yourself; if not, we would add it in another PR.
   
   But reusing the old function `importProcessDefinition` is not a good idea, and neither is hard-coding the magic string `application/zip`.
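   A minimal sketch of what the comment suggests (class and method names here are illustrative, not from the PR): pull the magic string into a named constant and put the dispatch decision behind its own method rather than branching inside `importProcessDefinition`.

```java
// Hypothetical helper: a named constant plus a dedicated predicate,
// instead of comparing against a hard-coded "application/zip" inline.
public class ImportContentType {

    public static final String CONTENT_TYPE_ZIP = "application/zip";

    /** Returns true when the upload should take the SQL-zip import path. */
    public static boolean isSqlZipImport(String contentType) {
        return CONTENT_TYPE_ZIP.equals(contentType);
    }
}
```

   The controller could then route zip uploads to a separate service method (or, as the comment prefers, a separate endpoint triggered by its own button in the web UI).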

##########
File path: dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/service/impl/ProcessDefinitionServiceImpl.java
##########
@@ -865,6 +882,149 @@ public DagDataSchedule exportProcessDagData(ProcessDefinition processDefinition)
         return result;
     }
 
+    @Override
+    public Map<String, Object> importSqlProcessDefinition(User loginUser, long projectCode, MultipartFile file) {
+        Map<String, Object> result = new HashMap<>();
+        String processDefinitionName = file.getOriginalFilename() == null ? file.getName() : file.getOriginalFilename();
+        int index = processDefinitionName.lastIndexOf(".");
+        if (index > 0) {
+            processDefinitionName = processDefinitionName.substring(0, index);
+        }
+        // build process definition
+        Date now = new Date();
+        ProcessDefinition processDefinition = new ProcessDefinition();
+        processDefinition.setName(processDefinitionName);
+        processDefinition.setCreateTime(now);
+        processDefinition.setUpdateTime(now);
+        processDefinition.setFlag(Flag.YES);
+        processDefinition.setTenantId(-1);
+        processDefinition.setGlobalParamList(Collections.emptyList());
+
+        DagDataSchedule dagDataSchedule = new DagDataSchedule();
+        dagDataSchedule.setProcessDefinition(processDefinition);
+        List<TaskDefinition> taskDefinitionList = new ArrayList<>();
+        dagDataSchedule.setTaskDefinitionList(taskDefinitionList);
+        List<ProcessTaskRelation> processTaskRelationList = new ArrayList<>();
+        dagDataSchedule.setProcessTaskRelationList(processTaskRelationList);
+
+        // In most cases, there will be only one data source
+        Map<String, DataSource> dataSourceCache = new HashMap<>(1);
+        Map<String, Long> taskNameToCode = new HashMap<>(16);
+        Map<String, List<String>> taskNameToUpstream = new HashMap<>(16);
+        try (ZipInputStream zIn = new ZipInputStream(file.getInputStream());
+             BufferedReader bufferedReader = new BufferedReader(new InputStreamReader(zIn))) {
+            ZipEntry entry;
+            while ((entry = zIn.getNextEntry()) != null) {

Review comment:
       @lenboo @CalvinKirs @caishunfeng Do you have time to take a look at this code?
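   For reviewers, a standalone sketch of the zip-iteration pattern in the diff above: walk the entries of an in-memory zip and read each file's text. This version copies each entry's bytes directly from the `ZipInputStream` (relying on `read()` returning -1 at the end of the current entry) rather than sharing one `BufferedReader` across entries as the diff does; the class name is illustrative.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;

public class ZipSqlReader {

    /** Reads every file entry of an in-memory zip into an entry-name -> text map. */
    public static Map<String, String> readEntries(byte[] zipBytes) throws IOException {
        Map<String, String> contents = new LinkedHashMap<>();
        try (ZipInputStream zIn = new ZipInputStream(new ByteArrayInputStream(zipBytes))) {
            ZipEntry entry;
            while ((entry = zIn.getNextEntry()) != null) {
                if (entry.isDirectory()) {
                    continue; // skip directory entries such as "import_sql/"
                }
                // ZipInputStream.read() returns -1 at the end of the current
                // entry, so a plain copy loop reads exactly one file.
                ByteArrayOutputStream buf = new ByteArrayOutputStream();
                byte[] chunk = new byte[4096];
                int n;
                while ((n = zIn.read(chunk)) != -1) {
                    buf.write(chunk, 0, n);
                }
                contents.put(entry.getName(), buf.toString("UTF-8"));
            }
        }
        return contents;
    }
}
```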

##########
File path: dolphinscheduler-api/src/test/java/org/apache/dolphinscheduler/api/service/ProcessDefinitionServiceTest.java
##########
@@ -651,6 +666,47 @@ public void testBatchExportProcessDefinitionByCodes() {
         Assert.assertNotNull(processDefinitionService.exportProcessDagData(processDefinition));
     }
 
+    @Test
+    public void testImportSqlProcessDefinition() throws Exception {
+        int userId = 10;
+        User loginUser = Mockito.mock(User.class);
+        Mockito.when(loginUser.getId()).thenReturn(userId);
+        
+        ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
+        ZipOutputStream outputStream = new ZipOutputStream(byteArrayOutputStream);
+        outputStream.putNextEntry(new ZipEntry("import_sql/"));

Review comment:
       Nice unit test 👍 BTW

##########
File path: dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/service/impl/ProcessDefinitionServiceImpl.java
##########
@@ -865,6 +882,149 @@ public DagDataSchedule exportProcessDagData(ProcessDefinition processDefinition)
         return result;
     }
 
+    @Override
+    public Map<String, Object> importSqlProcessDefinition(User loginUser, long projectCode, MultipartFile file) {
+        Map<String, Object> result = new HashMap<>();
+        String processDefinitionName = file.getOriginalFilename() == null ? file.getName() : file.getOriginalFilename();
+        int index = processDefinitionName.lastIndexOf(".");
+        if (index > 0) {
+            processDefinitionName = processDefinitionName.substring(0, index);
+        }
+        // build process definition
+        Date now = new Date();
+        ProcessDefinition processDefinition = new ProcessDefinition();
+        processDefinition.setName(processDefinitionName);
+        processDefinition.setCreateTime(now);
+        processDefinition.setUpdateTime(now);
+        processDefinition.setFlag(Flag.YES);
+        processDefinition.setTenantId(-1);
+        processDefinition.setGlobalParamList(Collections.emptyList());
+
+        DagDataSchedule dagDataSchedule = new DagDataSchedule();
+        dagDataSchedule.setProcessDefinition(processDefinition);
+        List<TaskDefinition> taskDefinitionList = new ArrayList<>();
+        dagDataSchedule.setTaskDefinitionList(taskDefinitionList);
+        List<ProcessTaskRelation> processTaskRelationList = new ArrayList<>();
+        dagDataSchedule.setProcessTaskRelationList(processTaskRelationList);
+
+        // In most cases, there will be only one data source

Review comment:
       I think we should support multiple data sources, according to the user's SQL hint.
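   One way to do that, sketched under an assumed hint syntax (the `-- datasource: <name>` comment form below is illustrative, not the PR's actual grammar): parse the data source name per SQL file, so each task can resolve its own entry in the `dataSourceCache` instead of assuming a single source.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class DataSourceHint {

    // Matches a leading comment line such as "-- datasource: mysql_prod".
    private static final Pattern HINT = Pattern.compile("--\\s*datasource:\\s*(\\S+)");

    /** Returns the hinted data source name, or null when the line has no hint. */
    public static String parse(String line) {
        Matcher m = HINT.matcher(line);
        return m.find() ? m.group(1) : null;
    }
}
```

   With a per-file hint like this, the single-entry cache assumption goes away: the cache can simply be keyed by whatever name each file's hint resolves to.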

##########
File path: dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/service/impl/ProcessDefinitionServiceImpl.java
##########
@@ -865,6 +882,149 @@ public DagDataSchedule exportProcessDagData(ProcessDefinition processDefinition)
         return result;
     }
 
+    @Override
+    public Map<String, Object> importSqlProcessDefinition(User loginUser, long projectCode, MultipartFile file) {
+        Map<String, Object> result = new HashMap<>();
+        String processDefinitionName = file.getOriginalFilename() == null ? file.getName() : file.getOriginalFilename();
+        int index = processDefinitionName.lastIndexOf(".");
+        if (index > 0) {
+            processDefinitionName = processDefinitionName.substring(0, index);
+        }
+        // build process definition
+        Date now = new Date();
+        ProcessDefinition processDefinition = new ProcessDefinition();
+        processDefinition.setName(processDefinitionName);
+        processDefinition.setCreateTime(now);
+        processDefinition.setUpdateTime(now);
+        processDefinition.setFlag(Flag.YES);
+        processDefinition.setTenantId(-1);

Review comment:
       I think we should use the `loginUser` tenant ID, WDYT @lenboo 




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@dolphinscheduler.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org