Posted to commits@dolphinscheduler.apache.org by "majingxuan123 (via GitHub)" <gi...@apache.org> on 2024/01/20 10:04:47 UTC
[I] when using a datax task, this error occurs: line 14: --jvm=-Xms1G -Xmx2G: command not found [dolphinscheduler]
majingxuan123 opened a new issue, #15515:
URL: https://github.com/apache/dolphinscheduler/issues/15515
### Search before asking
- [X] I had searched in the [issues](https://github.com/apache/dolphinscheduler/issues?q=is%3Aissue) and found no similar issues.
### What happened
From the log file, it seems it does not work; maybe it is just an environment issue?
But the guide on the website may also be wrong?
![image](https://github.com/apache/dolphinscheduler/assets/49549982/153cfaae-1d3d-462d-9645-ce125daeb206)
It shows:
export DATAX_HOME=/usr/xxx/xx/datax/bin/datax.py
export PATH=$PATH:$DATAX_HOME/bin
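If `DATAX_HOME` points at the `datax.py` script itself, then `$DATAX_HOME/bin` cannot be a real directory. A sketch of a layout where appending `$DATAX_HOME/bin` to `PATH` would actually work (the `/usr/xxx/xx` placeholder is kept from the guide; this is my guess at the intent, not a verified correction):

```shell
# Assumption: DATAX_HOME should point at the DataX install directory,
# not at the datax.py launcher script, so that $DATAX_HOME/bin exists.
export DATAX_HOME=/usr/xxx/xx/datax
export PATH=$PATH:$DATAX_HOME/bin
```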
### **When I run a DataX task, it fails quickly and the log shows this:**
```
[INFO] 2024-01-20 17:20:47.536 +0800 - Executing shell command : sudo -u hadoop -i /tmp/dolphinscheduler/exec/process/hadoop/12305939127488/12326923960544_6/44/64/44_64.sh
[INFO] 2024-01-20 17:20:47.538 +0800 - process start, process id is: 11883
[INFO] 2024-01-20 17:20:48.539 +0800 - ->
/tmp/dolphinscheduler/exec/process/hadoop/12305939127488/12326923960544_6/44/64/44_64.sh: line 14: --jvm=-Xms1G -Xmx2G: command not found
[INFO] 2024-01-20 17:20:48.540 +0800 - process has exited. execute path:/tmp/dolphinscheduler/exec/process/hadoop/12305939127488/12326923960544_6/44/64, processId:11883 ,exitStatusCode:127 ,processWaitForStatus:true ,processExitValue:127
```
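The failing line 14 is the `${PYTHON_LAUNCHER} ${DATAX_LAUNCHER} --jvm=...` command in the generated shell file. If both launcher variables expand to nothing, bash takes the `--jvm` argument itself as the command name and exits with status 127. A minimal sketch of that failure mode (assuming neither variable is defined):

```shell
# Assumption: neither PYTHON_LAUNCHER nor DATAX_LAUNCHER is defined, so both
# expand to empty strings and bash executes the --jvm argument as a command.
unset PYTHON_LAUNCHER DATAX_LAUNCHER
${PYTHON_LAUNCHER} ${DATAX_LAUNCHER} --jvm="-Xms1G -Xmx1G" job.json
# prints "--jvm=-Xms1G -Xmx1G: command not found" and sets $? to 127
echo "exit code: $?"
```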
Here is the full log:
```
[LOG-PATH]: /usr/local/app/dolphinscheduler/worker-server/logs/20240120/12326923960544/8/46/67.log, [HOST]: Host(ip=192.168.10.12, port=1234)
[INFO] 2024-01-20 17:46:35.422 +0800 - ***********************************************************************************************
[INFO] 2024-01-20 17:46:35.426 +0800 - ********************************* Initialize task context ***********************************
[INFO] 2024-01-20 17:46:35.426 +0800 - ***********************************************************************************************
[INFO] 2024-01-20 17:46:35.426 +0800 - Begin to initialize task
[INFO] 2024-01-20 17:46:35.426 +0800 - Set task startTime: 1705743995426
[INFO] 2024-01-20 17:46:35.426 +0800 - Set task appId: 46_67
[INFO] 2024-01-20 17:46:35.427 +0800 - End initialize task {
"taskInstanceId" : 67,
"taskName" : "倒入当日门户数据",
"firstSubmitTime" : 1705743995357,
"startTime" : 1705743995426,
"taskType" : "DATAX",
"workflowInstanceHost" : "192.168.10.11:5678",
"host" : "192.168.10.12:1234",
"logPath" : "/usr/local/app/dolphinscheduler/worker-server/logs/20240120/12326923960544/8/46/67.log",
"processId" : 0,
"processDefineCode" : 12326923960544,
"processDefineVersion" : 8,
"processInstanceId" : 46,
"scheduleTime" : 0,
"executorId" : 1,
"cmdTypeIfComplement" : 0,
"tenantCode" : "hadoop",
"processDefineId" : 0,
"projectId" : 0,
"projectCode" : 12305939127488,
"taskParams" : "{\"localParams\":[],\"resourceList\":[],\"customConfig\":1,\"json\":\"{\\n\\t\\\"job\\\": {\\n\\t\\t\\\"content\\\": [{\\n\\t\\t\\t\\\"reader\\\": {\\n\\t\\t\\t\\t\\\"name\\\": \\\"mysqlreader\\\",\\n\\t\\t\\t\\t\\\"parameter\\\": {\\n\\t\\t\\t\\t\\t\\\"connection\\\": [{\\n\\t\\t\\t\\t\\t\\t\\\"jdbcUrl\\\": [\\n\\t\\t\\t\\t\\t\\t\\t\\\"jdbc:mysql://192.168.88.20:4000/wisdom_portal?useUnicode=true&zeroDateTimeBehavior=convertToNull&characterEncoding=UTF8&autoReconnect=true&useSSL=false&&allowLoadLocalInfile=false&autoDeserialize=false&allowLocalInfile=false&allowUrlInLocalInfile=false\\\"\\n\\t\\t\\t\\t\\t\\t],\\n\\t\\t\\t\\t\\t\\t\\\"querySql\\\": [\\n\\t\\t\\t\\t\\t\\t\\t\\\"select 'db' as logger_prefix,log_type,log_title,login_ip,client_ip,user_agent,device_id,referer,origin,request_uri,method,request_method,params,exception,begin_time,end_time, t_logger.time_cost, t_logger.user_id, t_logger.username, t_logger.success, t_logger.result, t_logger.client_id from
t_logger where date_format(begin_time, \\\\\\\"%Y%m%d\\\\\\\") = \\\\\\\"20240120\\\\\\\";\\\"\\n\\t\\t\\t\\t\\t\\t]\\n\\t\\t\\t\\t\\t}],\\n\\t\\t\\t\\t\\t\\\"password\\\": \\\"Root@123\\\",\\n\\t\\t\\t\\t\\t\\\"username\\\": \\\"root\\\"\\n\\t\\t\\t\\t}\\n\\t\\t\\t},\\n\\t\\t\\t\\\"writer\\\": {\\n\\t\\t\\t\\t\\\"name\\\": \\\"hdfswriter\\\",\\n\\t\\t\\t\\t\\\"parameter\\\": {\\n\\t\\t\\t\\t\\t\\\"defaultFS\\\": \\\"hdfs://node1:8020\\\",\\n\\t\\t\\t\\t\\t\\\"fileType\\\": \\\"text\\\",\\n\\t\\t\\t\\t\\t\\\"path\\\": \\\"/ods/mysql/20240120\\\",\\n\\t\\t\\t\\t\\t\\\"fileName\\\": \\\"logger_prefix-20240120\\\",\\n\\t\\t\\t\\t\\t\\\"column\\\": [{\\n\\t\\t\\t\\t\\t\\t\\t\\\"name\\\": \\\"logger_prefix\\\",\\n\\t\\t\\t\\t\\t\\t\\t\\\"type\\\": \\\"string\\\"\\n\\t\\t\\t\\t\\t\\t},\\n\\t\\t\\t\\t\\t\\t{\\n\\t\\t\\t\\t\\t\\t\\t\\\"name\\\": \\\"logType\\\",\\n\\t\\t\\t\\t\\t\\t\\t\\\"type\\\": \\\"string\\\"\\n\\t\\t\\t\\t\\t\\t},\\n\\t\\t\\t\\t\\t\\t{\\n\\t\\t\\t\\t\\t\\t\\t\\\"name\\
\": \\\"logTitle\\\",\\n\\t\\t\\t\\t\\t\\t\\t\\\"type\\\": \\\"string\\\"\\n\\t\\t\\t\\t\\t\\t},\\n\\t\\t\\t\\t\\t\\t{\\n\\t\\t\\t\\t\\t\\t\\t\\\"name\\\": \\\"loginIp\\\",\\n\\t\\t\\t\\t\\t\\t\\t\\\"type\\\": \\\"String\\\"\\n\\t\\t\\t\\t\\t\\t},\\n\\t\\t\\t\\t\\t\\t{\\n\\t\\t\\t\\t\\t\\t\\t\\\"name\\\": \\\"clientIp\\\",\\n\\t\\t\\t\\t\\t\\t\\t\\\"type\\\": \\\"string\\\"\\n\\t\\t\\t\\t\\t\\t},\\n\\t\\t\\t\\t\\t\\t{\\n\\t\\t\\t\\t\\t\\t\\t\\\"name\\\": \\\"userAgent\\\",\\n\\t\\t\\t\\t\\t\\t\\t\\\"type\\\": \\\"string\\\"\\n\\t\\t\\t\\t\\t\\t},\\n\\t\\t\\t\\t\\t\\t{\\n\\t\\t\\t\\t\\t\\t\\t\\\"name\\\": \\\"deviceId\\\",\\n\\t\\t\\t\\t\\t\\t\\t\\\"type\\\": \\\"string\\\"\\n\\t\\t\\t\\t\\t\\t},\\n\\t\\t\\t\\t\\t\\t{\\n\\t\\t\\t\\t\\t\\t\\t\\\"name\\\": \\\"referer\\\",\\n\\t\\t\\t\\t\\t\\t\\t\\\"type\\\": \\\"string\\\"\\n\\t\\t\\t\\t\\t\\t},\\n\\t\\t\\t\\t\\t\\t{\\n\\t\\t\\t\\t\\t\\t\\t\\\"name\\\": \\\"origin\\\",\\n\\t\\t\\t\\t\\t\\t\\t\\\"type\\\": \\\"string\\\"\\n\\t\\t\\t\\t
\\t\\t},\\n\\t\\t\\t\\t\\t\\t{\\n\\t\\t\\t\\t\\t\\t\\t\\\"name\\\": \\\"requestUri\\\",\\n\\t\\t\\t\\t\\t\\t\\t\\\"type\\\": \\\"string\\\"\\n\\t\\t\\t\\t\\t\\t},\\n\\t\\t\\t\\t\\t\\t{\\n\\t\\t\\t\\t\\t\\t\\t\\\"name\\\": \\\"method\\\",\\n\\t\\t\\t\\t\\t\\t\\t\\\"type\\\": \\\"string\\\"\\n\\t\\t\\t\\t\\t\\t},\\n\\t\\t\\t\\t\\t\\t{\\n\\t\\t\\t\\t\\t\\t\\t\\\"name\\\": \\\"requestMethod\\\",\\n\\t\\t\\t\\t\\t\\t\\t\\\"type\\\": \\\"string\\\"\\n\\t\\t\\t\\t\\t\\t},\\n\\t\\t\\t\\t\\t\\t{\\n\\t\\t\\t\\t\\t\\t\\t\\\"name\\\": \\\"params\\\",\\n\\t\\t\\t\\t\\t\\t\\t\\\"type\\\": \\\"string\\\"\\n\\t\\t\\t\\t\\t\\t},\\n\\t\\t\\t\\t\\t\\t{\\n\\t\\t\\t\\t\\t\\t\\t\\\"name\\\": \\\"exception\\\",\\n\\t\\t\\t\\t\\t\\t\\t\\\"type\\\": \\\"string\\\"\\n\\t\\t\\t\\t\\t\\t},\\n\\t\\t\\t\\t\\t\\t{\\n\\t\\t\\t\\t\\t\\t\\t\\\"name\\\": \\\"beginTime\\\",\\n\\t\\t\\t\\t\\t\\t\\t\\\"type\\\": \\\"string\\\"\\n\\t\\t\\t\\t\\t\\t},\\n\\t\\t\\t\\t\\t\\t{\\n\\t\\t\\t\\t\\t\\t\\t\\\"name\\\": \\\"endTime\
\\",\\n\\t\\t\\t\\t\\t\\t\\t\\\"type\\\": \\\"string\\\"\\n\\t\\t\\t\\t\\t\\t},\\n\\t\\t\\t\\t\\t\\t{\\n\\t\\t\\t\\t\\t\\t\\t\\\"name\\\": \\\"timeCost\\\",\\n\\t\\t\\t\\t\\t\\t\\t\\\"type\\\": \\\"string\\\"\\n\\t\\t\\t\\t\\t\\t},\\n\\t\\t\\t\\t\\t\\t{\\n\\t\\t\\t\\t\\t\\t\\t\\\"name\\\": \\\"userId\\\",\\n\\t\\t\\t\\t\\t\\t\\t\\\"type\\\": \\\"string\\\"\\n\\t\\t\\t\\t\\t\\t},\\n\\t\\t\\t\\t\\t\\t{\\n\\t\\t\\t\\t\\t\\t\\t\\\"name\\\": \\\"username\\\",\\n\\t\\t\\t\\t\\t\\t\\t\\\"type\\\": \\\"string\\\"\\n\\t\\t\\t\\t\\t\\t},\\n\\t\\t\\t\\t\\t\\t{\\n\\t\\t\\t\\t\\t\\t\\t\\\"name\\\": \\\"success\\\",\\n\\t\\t\\t\\t\\t\\t\\t\\\"type\\\": \\\"string\\\"\\n\\t\\t\\t\\t\\t\\t},\\n\\t\\t\\t\\t\\t\\t{\\n\\t\\t\\t\\t\\t\\t\\t\\\"name\\\": \\\"result\\\",\\n\\t\\t\\t\\t\\t\\t\\t\\\"type\\\": \\\"string\\\"\\n\\t\\t\\t\\t\\t\\t},\\n\\t\\t\\t\\t\\t\\t{\\n\\t\\t\\t\\t\\t\\t\\t\\\"name\\\": \\\"clientId\\\",\\n\\t\\t\\t\\t\\t\\t\\t\\\"type\\\": \\\"string\\\"\\n\\t\\t\\t\\t\\t\\t}\\n\\t\\t\\t
\\t\\t],\\n\\t\\t\\t\\t\\t\\\"writeMode\\\": \\\"truncate\\\",\\n\\t\\t\\t\\t\\t\\\"fieldDelimiter\\\": \\\"\\\\u0001\\\",\\n\\t\\t\\t\\t\\t\\\"compress\\\": \\\"GZIP\\\"\\n\\t\\t\\t\\t}\\n\\t\\t\\t}\\n\\t\\t}],\\n\\t\\t\\\"setting\\\": {\\n\\t\\t\\t\\\"errorLimit\\\": {\\n\\t\\t\\t\\t\\\"percentage\\\": 0.02,\\n\\t\\t\\t\\t\\\"record\\\": 0\\n\\t\\t\\t},\\n\\t\\t\\t\\\"speed\\\": {\\n\\t\\t\\t\\t\\\"channel\\\": 1\\n\\t\\t\\t}\\n\\t\\t}\\n\\t}\\n}\",\"xms\":1,\"xmx\":1}",
"environmentConfig" : "export JAVA_HOME=/usr/local/app/java/jdk1.8.0_391\nexport HADOOP_HOME=/usr/local/app/hadoop/hadoop-3.3.6\nexport HIVE_HOME=/usr/local/app/hive/hive\nexport ZOOKEEPER_HOME=/usr/local/app/zookeeper/zookeeper-3.5.7\nexport FLUME_HOME=/usr/local/app/flume/flume-1.9.0\nexport KAFKA_HOME=/usr/local/app/kafka_2.13-3.6.1\nexport DATAX_HOME=/usr/local/app/datax\nexport SPARK_HOME=/usr/local/app/spark-3.4.2-bin-hadoop3\nexport PYTHON_HOME=/usr/local/app/python3\nexport PATH=$ZOOKEEPER_HOME/bin:$DATAX_HOME/bin:$SPARK_HOME/bin:$FLINK_HOME/bin:$HIVE_HOME/bin:$HADOOP_HOME/bin:$PYTHON_HOME/bin:$JAVA_HOME/bin:$KAFKA_HOME/bin:$FLUME_HOME/bin:$PATH",
"prepareParamsMap" : {
"system.task.definition.name" : {
"prop" : "system.task.definition.name",
"direct" : "IN",
"type" : "VARCHAR",
"value" : "倒入当日门户数据"
},
"system.project.name" : {
"prop" : "system.project.name",
"direct" : "IN",
"type" : "VARCHAR",
"value" : null
},
"system.project.code" : {
"prop" : "system.project.code",
"direct" : "IN",
"type" : "VARCHAR",
"value" : "12305939127488"
},
"system.workflow.instance.id" : {
"prop" : "system.workflow.instance.id",
"direct" : "IN",
"type" : "VARCHAR",
"value" : "46"
},
"system.biz.curdate" : {
"prop" : "system.biz.curdate",
"direct" : "IN",
"type" : "VARCHAR",
"value" : "20240120"
},
"system.biz.date" : {
"prop" : "system.biz.date",
"direct" : "IN",
"type" : "VARCHAR",
"value" : "20240119"
},
"system.task.instance.id" : {
"prop" : "system.task.instance.id",
"direct" : "IN",
"type" : "VARCHAR",
"value" : "67"
},
"system.workflow.definition.name" : {
"prop" : "system.workflow.definition.name",
"direct" : "IN",
"type" : "VARCHAR",
"value" : "倒入门户数据-当日"
},
"system.task.definition.code" : {
"prop" : "system.task.definition.code",
"direct" : "IN",
"type" : "VARCHAR",
"value" : "12326919305313"
},
"system.workflow.definition.code" : {
"prop" : "system.workflow.definition.code",
"direct" : "IN",
"type" : "VARCHAR",
"value" : "12326923960544"
},
"system.datetime" : {
"prop" : "system.datetime",
"direct" : "IN",
"type" : "VARCHAR",
"value" : "20240120174635"
}
},
"taskAppId" : "46_67",
"taskTimeout" : 2147483647,
"workerGroup" : "node3",
"delayTime" : 0,
"currentExecutionStatus" : "SUBMITTED_SUCCESS",
"resourceParametersHelper" : {
"resourceMap" : { }
},
"endTime" : 0,
"resources" : { },
"dryRun" : 0,
"paramsMap" : { },
"cpuQuota" : -1,
"memoryMax" : -1,
"testFlag" : 0,
"logBufferEnable" : false,
"dispatchFailTimes" : 0
}
[INFO] 2024-01-20 17:46:35.429 +0800 - ***********************************************************************************************
[INFO] 2024-01-20 17:46:35.429 +0800 - ********************************* Load task instance plugin *********************************
[INFO] 2024-01-20 17:46:35.429 +0800 - ***********************************************************************************************
[INFO] 2024-01-20 17:46:35.431 +0800 - Send task status RUNNING_EXECUTION master: 192.168.10.12:1234
[INFO] 2024-01-20 17:46:35.431 +0800 - TenantCode: hadoop check successfully
[INFO] 2024-01-20 17:46:35.432 +0800 - WorkflowInstanceExecDir: /tmp/dolphinscheduler/exec/process/hadoop/12305939127488/12326923960544_8/46/67 check successfully
[INFO] 2024-01-20 17:46:35.432 +0800 - Download resources: {} successfully
[INFO] 2024-01-20 17:46:35.433 +0800 - Download upstream files: [] successfully
[INFO] 2024-01-20 17:46:35.433 +0800 - Task plugin instance: DATAX create successfully
[INFO] 2024-01-20 17:46:35.434 +0800 - Initialize datax task params {
"localParams" : [ ],
"varPool" : null,
"customConfig" : 1,
"json" : "{\n\t\"job\": {\n\t\t\"content\": [{\n\t\t\t\"reader\": {\n\t\t\t\t\"name\": \"mysqlreader\",\n\t\t\t\t\"parameter\": {\n\t\t\t\t\t\"connection\": [{\n\t\t\t\t\t\t\"jdbcUrl\": [\n\t\t\t\t\t\t\t\"jdbc:mysql://192.168.88.20:4000/wisdom_portal?useUnicode=true&zeroDateTimeBehavior=convertToNull&characterEncoding=UTF8&autoReconnect=true&useSSL=false&&allowLoadLocalInfile=false&autoDeserialize=false&allowLocalInfile=false&allowUrlInLocalInfile=false\"\n\t\t\t\t\t\t],\n\t\t\t\t\t\t\"querySql\": [\n\t\t\t\t\t\t\t\"select 'db' as logger_prefix,log_type,log_title,login_ip,client_ip,user_agent,device_id,referer,origin,request_uri,method,request_method,params,exception,begin_time,end_time, t_logger.time_cost, t_logger.user_id, t_logger.username, t_logger.success, t_logger.result, t_logger.client_id from t_logger where date_format(begin_time, \\\"%Y%m%d\\\") = \\\"20240120\\\";\"\n\t\t\t\t\t\t]\n\t\t\t\t\t}],\n\t\t\t\t\t\"password\": \"Root@123\",\n\t\t\t\t\t\"username\": \"root\"\
n\t\t\t\t}\n\t\t\t},\n\t\t\t\"writer\": {\n\t\t\t\t\"name\": \"hdfswriter\",\n\t\t\t\t\"parameter\": {\n\t\t\t\t\t\"defaultFS\": \"hdfs://node1:8020\",\n\t\t\t\t\t\"fileType\": \"text\",\n\t\t\t\t\t\"path\": \"/ods/mysql/20240120\",\n\t\t\t\t\t\"fileName\": \"logger_prefix-20240120\",\n\t\t\t\t\t\"column\": [{\n\t\t\t\t\t\t\t\"name\": \"logger_prefix\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"logType\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"logTitle\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"loginIp\",\n\t\t\t\t\t\t\t\"type\": \"String\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"clientIp\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"userAgent\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"deviceId\",\n\t\t\
t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"referer\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"origin\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"requestUri\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"method\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"requestMethod\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"params\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"exception\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"beginTime\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"endTime\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\
t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"timeCost\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"userId\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"username\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"success\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"result\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"clientId\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t}\n\t\t\t\t\t],\n\t\t\t\t\t\"writeMode\": \"truncate\",\n\t\t\t\t\t\"fieldDelimiter\": \"\\u0001\",\n\t\t\t\t\t\"compress\": \"GZIP\"\n\t\t\t\t}\n\t\t\t}\n\t\t}],\n\t\t\"setting\": {\n\t\t\t\"errorLimit\": {\n\t\t\t\t\"percentage\": 0.02,\n\t\t\t\t\"record\": 0\n\t\t\t},\n\t\t\t\"speed\": {\n\t\t\t\t\"channel\": 1\n\t\t\t}\n\t\t}\n\t}\n}",
"dsType" : null,
"dataSource" : 0,
"dtType" : null,
"dataTarget" : 0,
"sql" : null,
"targetTable" : null,
"preStatements" : null,
"postStatements" : null,
"jobSpeedByte" : 0,
"jobSpeedRecord" : 0,
"xms" : 1,
"xmx" : 1,
"resourceList" : [ ]
}
[INFO] 2024-01-20 17:46:35.435 +0800 - Success initialized task plugin instance successfully
[INFO] 2024-01-20 17:46:35.435 +0800 - Set taskVarPool: null successfully
[INFO] 2024-01-20 17:46:35.435 +0800 - ***********************************************************************************************
[INFO] 2024-01-20 17:46:35.435 +0800 - ********************************* Execute task instance *************************************
[INFO] 2024-01-20 17:46:35.435 +0800 - ***********************************************************************************************
[INFO] 2024-01-20 17:46:35.437 +0800 - Final Shell file is :
#!/bin/bash
BASEDIR=$(cd `dirname $0`; pwd)
cd $BASEDIR
export JAVA_HOME=/usr/local/app/java/jdk1.8.0_391
export HADOOP_HOME=/usr/local/app/hadoop/hadoop-3.3.6
export HIVE_HOME=/usr/local/app/hive/hive
export ZOOKEEPER_HOME=/usr/local/app/zookeeper/zookeeper-3.5.7
export FLUME_HOME=/usr/local/app/flume/flume-1.9.0
export KAFKA_HOME=/usr/local/app/kafka_2.13-3.6.1
export DATAX_HOME=/usr/local/app/datax
export SPARK_HOME=/usr/local/app/spark-3.4.2-bin-hadoop3
export PYTHON_HOME=/usr/local/app/python3
export PATH=$ZOOKEEPER_HOME/bin:$DATAX_HOME/bin:$SPARK_HOME/bin:$FLINK_HOME/bin:$HIVE_HOME/bin:$HADOOP_HOME/bin:$PYTHON_HOME/bin:$JAVA_HOME/bin:$KAFKA_HOME/bin:$FLUME_HOME/bin:$PATH
${PYTHON_LAUNCHER} ${DATAX_LAUNCHER} --jvm="-Xms1G -Xmx1G" -p "-Dsystem.task.definition.name='倒入当日门户数据' -Dsystem.project.name='null' -Dsystem.project.code='12305939127488' -Dsystem.workflow.instance.id='46' -Dsystem.biz.curdate='20240120' -Dsystem.biz.date='20240119' -Dsystem.task.instance.id='67' -Dsystem.workflow.definition.name='倒入门户数据-当日' -Dsystem.task.definition.code='12326919305313' -Dsystem.workflow.definition.code='12326923960544' -Dsystem.datetime='20240120174635'" /tmp/dolphinscheduler/exec/process/hadoop/12305939127488/12326923960544_8/46/67/46_67_job.json
[INFO] 2024-01-20 17:46:35.438 +0800 - Executing shell command : sudo -u hadoop -i /tmp/dolphinscheduler/exec/process/hadoop/12305939127488/12326923960544_8/46/67/46_67.sh
[INFO] 2024-01-20 17:46:35.440 +0800 - process start, process id is: 13094
[INFO] 2024-01-20 17:46:36.442 +0800 - ->
/tmp/dolphinscheduler/exec/process/hadoop/12305939127488/12326923960544_8/46/67/46_67.sh: line 14: --jvm=-Xms1G -Xmx1G: command not found
[INFO] 2024-01-20 17:46:36.444 +0800 - process has exited. execute path:/tmp/dolphinscheduler/exec/process/hadoop/12305939127488/12326923960544_8/46/67, processId:13094 ,exitStatusCode:127 ,processWaitForStatus:true ,processExitValue:127
[INFO] 2024-01-20 17:46:36.445 +0800 - ***********************************************************************************************
[INFO] 2024-01-20 17:46:36.445 +0800 - ********************************* Finalize task instance ************************************
[INFO] 2024-01-20 17:46:36.446 +0800 - ***********************************************************************************************
[INFO] 2024-01-20 17:46:36.447 +0800 - Upload output files: [] successfully
[INFO] 2024-01-20 17:46:36.448 +0800 - Send task execute status: FAILURE to master : 192.168.10.12:1234
[INFO] 2024-01-20 17:46:36.448 +0800 - Remove the current task execute context from worker cache
[INFO] 2024-01-20 17:46:36.448 +0800 - The current execute mode isn't develop mode, will clear the task execute file: /tmp/dolphinscheduler/exec/process/hadoop/12305939127488/12326923960544_8/46/67
[INFO] 2024-01-20 17:46:36.449 +0800 - Success clear the task execute file: /tmp/dolphinscheduler/exec/process/hadoop/12305939127488/12326923960544_8/46/67
[INFO] 2024-01-20 17:46:36.449 +0800 - FINALIZE_SESSION
```
### What you expected to happen
Maybe a python3 problem? Or an environment problem?
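If the launcher variables are indeed empty, defining them in dolphinscheduler_env.sh might fix it. A sketch under the assumption that the paths match my install locations (both values are guesses, not verified):

```shell
# Hypothetical additions to dolphinscheduler_env.sh (paths are assumptions):
export PYTHON_LAUNCHER=/usr/local/app/python3/bin/python3
export DATAX_LAUNCHER=/usr/local/app/datax/bin/datax.py
```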
### How to reproduce
Here is my dolphinscheduler_env.sh:
![image](https://github.com/apache/dolphinscheduler/assets/49549982/4d96892d-8cf5-49e1-9554-2e2c1c6d18f4)
Here is my workflow task:
![image](https://github.com/apache/dolphinscheduler/assets/49549982/b75df715-4930-44d4-86d1-f5fe87a8a898)
![image](https://github.com/apache/dolphinscheduler/assets/49549982/c68f885d-5471-464f-a8e6-305a17e0b934)
I have already tested the job.json directly with the datax command, and it works.
### Anything else
_No response_
### Version
3.2.x
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://www.apache.org/foundation/policies/conduct)
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: commits-unsubscribe@dolphinscheduler.apache.org.
For queries about this service, please contact Infrastructure at:
users@infra.apache.org
Re: [I] when using a datax task, this error occurs: line 14: --jvm=-Xms1G -Xmx2G: command not found [dolphinscheduler]
Posted by "lanss315425 (via GitHub)" <gi...@apache.org>.
lanss315425 commented on issue #15515:
URL: https://github.com/apache/dolphinscheduler/issues/15515#issuecomment-2084818264
@majingxuan123 How did you solve this in the end?
Re: [I] when using a datax task, this error occurs: line 14: --jvm=-Xms1G -Xmx2G: command not found [dolphinscheduler]
Posted by "github-actions[bot] (via GitHub)" <gi...@apache.org>.
github-actions[bot] commented on issue #15515:
URL: https://github.com/apache/dolphinscheduler/issues/15515#issuecomment-1902054568
n\t\t\t\t}\n\t\t\t},\n\t\t\t\"writer\": {\n\t\t\t\t\"name\": \"hdfswriter\",\n\t\t\t\t\"parameter\": {\n\t\t\t\t\t\"defaultFS\": \"hdfs://node1:8020\",\n\t\t\t\t\t\"fileType\": \"text\",\n\t\t\t\t\t\"path\": \"/ods/mysql/20240120\",\n\t\t\t\t\t\"fileName\": \"logger_prefix-20240120\",\n\t\t\t\t\t\"column\": [{\n\t\t\t\t\t\t\t\"name\": \"logger_prefix\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"logType\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"logTitle\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"loginIp\",\n\t\t\t\t\t\t\t\"type\": \"String\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"clientIp\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"userAgent\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"deviceId\",\n\t\t\
t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"referer\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"origin\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"requestUri\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"method\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"requestMethod\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"params\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"exception\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"beginTime\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"endTime\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\
t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"timeCost\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"userId\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"username\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"success\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"result\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\"name\": \"clientId\",\n\t\t\t\t\t\t\t\"type\": \"string\"\n\t\t\t\t\t\t}\n\t\t\t\t\t],\n\t\t\t\t\t\"writeMode\": \"truncate\",\n\t\t\t\t\t\"fieldDelimiter\": \"\\u0001\",\n\t\t\t\t\t\"compress\": \"GZIP\"\n\t\t\t\t}\n\t\t\t}\n\t\t}],\n\t\t\"setting\": {\n\t\t\t\"errorLimit\": {\n\t\t\t\t\"percentage\": 0.02,\n\t\t\t\t\"record\": 0\n\t\t\t},\n\t\t\t\"speed\": {\n\t\t\t\t\"channel\": 1\n\t\t\t}\n\t\t}\n\t}\n}",
"dsType" : null,
"dataSource" : 0,
"dtType" : null,
"dataTarget" : 0,
"sql" : null,
"targetTable" : null,
"preStatements" : null,
"postStatements" : null,
"jobSpeedByte" : 0,
"jobSpeedRecord" : 0,
"xms" : 1,
"xmx" : 1,
"resourceList" : [ ]
}
[INFO] 2024-01-20 17:46:35.435 +0800 - Success initialized task plugin instance successfully
[INFO] 2024-01-20 17:46:35.435 +0800 - Set taskVarPool: null successfully
[INFO] 2024-01-20 17:46:35.435 +0800 - ***********************************************************************************************
[INFO] 2024-01-20 17:46:35.435 +0800 - ********************************* Execute task instance *************************************
[INFO] 2024-01-20 17:46:35.435 +0800 - ***********************************************************************************************
[INFO] 2024-01-20 17:46:35.437 +0800 - Final Shell file is :
#!/bin/bash
BASEDIR=$(cd `dirname $0`; pwd)
cd $BASEDIR
export JAVA_HOME=/usr/local/app/java/jdk1.8.0_391
export HADOOP_HOME=/usr/local/app/hadoop/hadoop-3.3.6
export HIVE_HOME=/usr/local/app/hive/hive
export ZOOKEEPER_HOME=/usr/local/app/zookeeper/zookeeper-3.5.7
export FLUME_HOME=/usr/local/app/flume/flume-1.9.0
export KAFKA_HOME=/usr/local/app/kafka_2.13-3.6.1
export DATAX_HOME=/usr/local/app/datax
export SPARK_HOME=/usr/local/app/spark-3.4.2-bin-hadoop3
export PYTHON_HOME=/usr/local/app/python3
export PATH=$ZOOKEEPER_HOME/bin:$DATAX_HOME/bin:$SPARK_HOME/bin:$FLINK_HOME/bin:$HIVE_HOME/bin:$HADOOP_HOME/bin:$PYTHON_HOME/bin:$JAVA_HOME/bin:$KAFKA_HOME/bin:$FLUME_HOME/bin:$PATH
${PYTHON_LAUNCHER} ${DATAX_LAUNCHER} --jvm="-Xms1G -Xmx1G" -p "-Dsystem.task.definition.name='倒入当日门户数据' -Dsystem.project.name='null' -Dsystem.project.code='12305939127488' -Dsystem.workflow.instance.id='46' -Dsystem.biz.curdate='20240120' -Dsystem.biz.date='20240119' -Dsystem.task.instance.id='67' -Dsystem.workflow.definition.name='倒入门户数据-当日' -Dsystem.task.definition.code='12326919305313' -Dsystem.workflow.definition.code='12326923960544' -Dsystem.datetime='20240120174635'" /tmp/dolphinscheduler/exec/process/hadoop/12305939127488/12326923960544_8/46/67/46_67_job.json
[INFO] 2024-01-20 17:46:35.438 +0800 - Executing shell command : sudo -u hadoop -i /tmp/dolphinscheduler/exec/process/hadoop/12305939127488/12326923960544_8/46/67/46_67.sh
[INFO] 2024-01-20 17:46:35.440 +0800 - process start, process id is: 13094
[INFO] 2024-01-20 17:46:36.442 +0800 - ->
/tmp/dolphinscheduler/exec/process/hadoop/12305939127488/12326923960544_8/46/67/46_67.sh: line 14: --jvm=-Xms1G -Xmx1G: command not found
[INFO] 2024-01-20 17:46:36.444 +0800 - process has exited. execute path:/tmp/dolphinscheduler/exec/process/hadoop/12305939127488/12326923960544_8/46/67, processId:13094 ,exitStatusCode:127 ,processWaitForStatus:true ,processExitValue:127
[INFO] 2024-01-20 17:46:36.445 +0800 - ***********************************************************************************************
[INFO] 2024-01-20 17:46:36.445 +0800 - ********************************* Finalize task instance ************************************
[INFO] 2024-01-20 17:46:36.446 +0800 - ***********************************************************************************************
[INFO] 2024-01-20 17:46:36.447 +0800 - Upload output files: [] successfully
[INFO] 2024-01-20 17:46:36.448 +0800 - Send task execute status: FAILURE to master : 192.168.10.12:1234
[INFO] 2024-01-20 17:46:36.448 +0800 - Remove the current task execute context from worker cache
[INFO] 2024-01-20 17:46:36.448 +0800 - The current execute mode isn't develop mode, will clear the task execute file: /tmp/dolphinscheduler/exec/process/hadoop/12305939127488/12326923960544_8/46/67
[INFO] 2024-01-20 17:46:36.449 +0800 - Success clear the task execute file: /tmp/dolphinscheduler/exec/process/hadoop/12305939127488/12326923960544_8/46/67
[INFO] 2024-01-20 17:46:36.449 +0800 - FINALIZE_SESSION
```
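Looking at line 14 of the generated shell file, it seems `${PYTHON_LAUNCHER}` and `${DATAX_LAUNCHER}` expand to nothing, so bash treats the `--jvm=...` argument itself as the command to run. A minimal sketch of that behavior (the `job.json` path is a placeholder, not my real file):

```shell
#!/bin/bash
# Sketch: when PYTHON_LAUNCHER and DATAX_LAUNCHER are unset or empty, the
# generated command line degenerates and bash executes the first remaining
# word ("--jvm=-Xms1G -Xmx1G") as if it were a program, failing with 127.
unset PYTHON_LAUNCHER DATAX_LAUNCHER
${PYTHON_LAUNCHER} ${DATAX_LAUNCHER} --jvm="-Xms1G -Xmx1G" job.json
# stderr: --jvm=-Xms1G -Xmx1G: command not found
echo "exit status: $?"   # prints: exit status: 127
```

This matches the log exactly: `command not found` on the `--jvm` word, exit status 127.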
### What you expected to happen
Maybe a Python 3 problem, or an environment configuration issue?
### How to reproduce
Here is my dolphinscheduler_env.sh:
![image](https://github.com/apache/dolphinscheduler/assets/49549982/4d96892d-8cf5-49e1-9554-2e2c1c6d18f4)
Here is my workflow task definition:
![image](https://github.com/apache/dolphinscheduler/assets/49549982/b75df715-4930-44d4-86d1-f5fe87a8a898)
![image](https://github.com/apache/dolphinscheduler/assets/49549982/c68f885d-5471-464f-a8e6-305a17e0b934)
I have already tested the job.json directly with the DataX command, and it works.
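For anyone hitting the same exit code 127: in my 3.2.x setup the DataX task appears to depend on `PYTHON_LAUNCHER` and `DATAX_LAUNCHER` being exported in `dolphinscheduler_env.sh`, not just `DATAX_HOME`. The paths below are from my environment and are illustrative only; adjust them to your install:

```shell
# Assumption: the 3.2.x DataX task plugin substitutes these two variables
# into the generated shell file; if they are empty, line 14 degenerates to
# "--jvm=... : command not found". Paths here are examples from my setup.
export PYTHON_LAUNCHER=/usr/local/app/python3/bin/python3
export DATAX_LAUNCHER=/usr/local/app/datax/bin/datax.py
```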
### Anything else
_No response_
### Version
3.2.x
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://www.apache.org/foundation/policies/conduct)
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: commits-unsubscribe@dolphinscheduler.apache.org
For queries about this service, please contact Infrastructure at:
users@infra.apache.org
Re: [I] When using a DataX task, the run fails with: line 14: --jvm=-Xms1G -Xmx2G: command not found [dolphinscheduler]
Posted by "majingxuan123 (via GitHub)" <gi...@apache.org>.
majingxuan123 closed issue #15515: When using a DataX task, the run fails with: line 14: --jvm=-Xms1G -Xmx2G: command not found
URL: https://github.com/apache/dolphinscheduler/issues/15515
Re: [I] When using a DataX task, the run fails with: line 14: --jvm=-Xms1G -Xmx2G: command not found [dolphinscheduler]
Posted by "majingxuan123 (via GitHub)" <gi...@apache.org>.
majingxuan123 commented on issue #15515:
URL: https://github.com/apache/dolphinscheduler/issues/15515#issuecomment-2087970172
The machine did not have enough memory; increase the memory.
------------------ Original email ------------------
From: ***@***.***;
Sent: Tuesday, April 30, 2024, 5:27 PM
To: ***@***.***;
Cc: ***@***.***; ***@***.***;
Subject: Re: [apache/dolphinscheduler] when use datax task occer line 14: --jvm=-Xms1G -Xmx2G: command not found (Issue #15515)
@majingxuan123 How did you solve this in the end?