Posted to commits@dolphinscheduler.apache.org by GitBox <gi...@apache.org> on 2020/06/02 10:35:27 UTC

[GitHub] [incubator-dolphinscheduler] hei-wei opened a new issue #2877: java.lang.RuntimeException: core-site.xml not found

hei-wei opened a new issue #2877:
URL: https://github.com/apache/incubator-dolphinscheduler/issues/2877


   *For better global communication, please give priority to using an English description, thx!*
   
   **Describe the question**
   A clear and concise description of what the question is.
   
   
   **Which version of DolphinScheduler:**
    - [1.1.0-preview]
   
   **Additional context**
   Add any other context about the problem here.
   
   **Requirement or improvement**
   - Please describe your requirements or improvement suggestions.
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-dolphinscheduler] hei-wei removed a comment on issue #2877: java.lang.RuntimeException: core-site.xml not found

Posted by GitBox <gi...@apache.org>.
hei-wei removed a comment on issue #2877:
URL: https://github.com/apache/incubator-dolphinscheduler/issues/2877#issuecomment-637457310


   Hive table columns hive_columns_list:['INSTU_CDE', 'APPL_SEQ', 'APPL_CDE', 'APPLY_DT', 'BCH_CDE', 'FORM', 'CUST_ID', 'ID_TYP', 'ID_TYP_OTH', 'ID_NO', 'CUST_NAME', 'MSS_IND', 'REJ_IND', 'LOAN_TYP', 'TYP_SEQ', 'TYP_VER', 'LOAN_PROM', 'PRO_PUR_AMT', 'FST_PCT', 'FST_PAY', 'LOAN_CCY', 'PURPOSE', 'OTHER_PURPOSE', 'APPLY_AMT', 'APPRV_AMT', 'APPLY_TNR', 'APPRV_TNR', 'LOAN_FREQ', 'MTH_AMT', 'DSR_RATIO', 'COLL_RATIO', 'WF_APPR_STS', 'APP_ORIGIN', 'DOC_CHANNEL', 'SCORE', 'SCORE_GRADE', 'COMPL_RESULT', 'SIGN_DT', 'CONT_NO', 'COException in thread "main" java.lang.RuntimeException: core-site.xml not found
   [INFO] 2020-06-02 18:40:40.519  - [taskAppId=TASK-8-181-221]:[106] -  -> 	at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2577)
   		at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2503)
   		at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2409)
   		at org.apache.hadoop.conf.Configuration.set(Configuration.java:1144)
   		at org.apache.hadoop.conf.Configuration.set(Configuration.java:1116)
   		at org.apache.hadoop.conf.Configuration.setBoolean(Configuration.java:1454)
   		at org.apache.hadoop.util.GenericOptionsParser.processGeneralOptions(GenericOptionsParser.java:319)
   		at org.apache.hadoop.util.GenericOptionsParser.parseGeneralOptions(GenericOptionsParser.java:485)
   		at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:170)
   		at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:153)
   		at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:64)
   		at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
   		at org.apache.hadoop.fs.FsShell.main(FsShell.java:372)
   	NT_AMT', 'MTD_CDE', 'RATE_TYP', 'MTD_MODE', 'REPC_OPT', 'BASIC_INT_RAT', 'PRICE_INT_RAT', 'SUPER_COOPR', 'COOPR_CDE', 'COOPR_NAME', 'COOPR_ZONE', 'COOPR_TEL', 'COOPR_SUB', 'SALER_NAME', 'SALER_MOBILE', 'OPERATOR_CDE', 'OPERATOR_NAME', 'OPERATOR_TEL', 'CRT_USR', 'CRT_DT', 'LAST_CHG_DT', 'LAST_CHG_USR', 'TYP_GRP', 'GUTR_OPT', 'CRT_BCH', 'LOAN_OPT', 'IN_STS', 'OUT_STS', 'VEH_CHASSIS', 'APPLY_TNR_TYP', 'APPRV_TNR_TYP', 'CRT_BCH_IND', 'RATE_RULE', 'DUE_DAY_OPT', 'DUE_DAY', 'APP_IN_ADVICE', 'APPLY_FST_PAY', 'APPLY_FST_PCT', 'AUTO_WF_IND', 'ACCT_OPT', 'ADVICE_CDE', 'IS_UPLOAD_CONT', 'IS_UPLOAD_LOAN', 'RISK_FLAG', 'MARKETING_CHANNEL', 'CODE_WORD', 'FLOAT_MODE', 'RISK_TIMES', 'RISK_ADVICE', 'NEW_CUST', 'RISK_AMT', 'GOODS_NUMB_POOL', 'GOODS_PROVINCE', 'GOODS_CITY', 'GOODS_AREA', 'GOODS_ADDR', 'SALER_REP_ID', 'SALER_REP_NAME', 'SALER_REP_MOBILE', 'REFEREE_NAME', 'REFEREE_NUIT', 'REFEREE_MOBILE', 'LEARN_ZY_MODE', 'OTHER_BANK_INSTU_DESC', 'OTHER_DESC', 'ADVICE_AMT', 'INCOME_AMT', 'SYS_FLAG', 'COOPR_CONT_NO', 'PAY_KIND', 'PAY_DT', 'QUICK_PAY_FLAG', 'PAY_TERM', 'SERNO', 'LMT_INIT_AMT', 'AUTO_MTD_CDE', 'AUTO_TNR_TYP', 'AUTO_TNR', 'EXCEP_IF', 'EXCEP_REASON', 'SIGN_MODEL', 'CANCLE_RESON', 'IS_OFTEN_CNO', 'OFTEN_CARD_NO', 'IS_UPLOAD_BILL', 'RISK_PRE_LEVEL', 'AUTO_RATE_MODE', 'AUTO_FIXED_OD_IND', 'AUTO_OD_INT_RATE', 'AUTO_INT_RAT', 'PLOAN_TNR_TYP', 'PLOAN_TNR', 'ADMIT_SERNO', 'REMARKS', 'APPLY_TYPE', 'CONSUME_SUM', 'APPLICANT_MOBILE', 'CONSUME_CURR_SUM', 'COOPPF_APPL_CDE', 'IS_REAL_APPLY', 'SPECIAL_CONT_FLAG', 'RISK_COEFFICIENT', 'CASH_AMT', 'SUP_CREDIT_CARD', 'IS_SIGLE_REPLAN', 'LOAN_EFFECT_POINT', 'IS_UPLOAD_DOC', 'CREDIT_CHECK_SIGN', 'CREDIT_CHECK_TIME', 'SERVER_NODE', 'IS_WHITE_LIST', 'WHITE_TYPE', 'IS_ADVICE_FREEZE', 'FREEZE_FLAG_TYP', 'FREEZ_REASON_CUST', 'CUST_TYPE', 'MARKET_CHANNEL', 'PACKAGE_CHANNEL', 'ANNUAL_INCOME_CAL', 'CUST_LEVEL', 'INDENT_TYPE', 'WL_COMPANY', 'CREDIT_RESULT', 'AC_INDENT_TYPE', 'ACCOUNT_MANAGER_LOGIN', 'PBOC_FLAG', 'THIRD_APPROVAL_FLAG', 
'THIRD_APPROVAL_RESULT', 'THIRD_APPROVAL_DATE', 'GZ_PRICE_INT_RAT', 'REFEREE_NO', 'RISK_TERM_SPCL_PROC_FLAG', 'ADVICE_AMORT_TERM', 'LPR_RATE_TYPE', 'LPR_ISSUE_MAPRICE', 'LPR_FLOAT_RATE', 'LPR_ISSUE_DATE', 'PRODUCT_CHANNEL', 'SYS_CHANNEL', 'MANAGER_BRANCH'] len:179
   	
   	The source table LC_APPL and the target table SDATA.S001_LC_APPL have the same number of columns and matching column names.
   	/BIG_DATA/EDW/ZYXF_EDW/PYTHON_APP/log/20200602/source_to_sdata_S001_LC_APPL_20230825_0.log
   	 [2020-06-02 18:40:35] [INFO] [source_to_sdata.py S001 LC_APPL 20230825 0 null null null] 
   	The source table LC_APPL and the target table SDATA.S001_LC_APPL have the same number of columns and matching column names.
   	Command to check for and delete leftover temporary data files before execution:  hadoop fs -rm -r /user/taskctl/CMIS.LC_APPL
   	Checking for and deleting leftover temporary data files before execution
   	/BIG_DATA/EDW/ZYXF_EDW/PYTHON_APP/log/20200602/source_to_sdata_S001_LC_APPL_20230825_0.log
   	 [2020-06-02 18:40:35] [INFO] [source_to_sdata.py S001 LC_APPL 20230825 0 null null null] 
   	Command to check for and delete leftover temporary data files before execution: hadoop fs -rm -r /user/taskctl/CMIS.LC_APPL
   	 sqoop import --connect "jdbc:oracle:thin:@10.94.30.50:1521:cmis" --username="cmis" --password="cmis" --outdir \/BIG_DATA/EDW/ZYXF_EDW/PYTHON_APP/tmp/.sqoop/java/ --table "CMIS.LC_APPL" --columns "INSTU_CDE,APPL_SEQ,APPL_CDE,APPLY_DT,BCH_CDE,FORM,CUST_ID,ID_TYP,ID_TYP_OTH,ID_NO,CUST_NAME,MSS_IND,REJ_IND,LOAN_TYP,TYP_SEQ,TYP_VER,LOAN_PROM,PRO_PUR_AMT,FST_PCT,FST_PAY,LOAN_CCY,PURPOSE,OTHER_PURPOSE,APPLY_AMT,APPRV_AMT,APPLY_TNR,APPRV_TNR,LOAN_FREQ,MTH_AMT,DSR_RATIO,COLL_RATIO,WF_APPR_STS,APP_ORIGIN,DOC_CHANNEL,SCORE,SCORE_GRADE,COMPL_RESULT,SIGN_DT,CONT_NO,CONT_AMT,MTD_CDE,RATE_TYP,MTD_MODE,REPC_OPT,BASIC_INT_RAT,PRICE_INT_RAT,SUPER_COOPR,COOPR_CDE,COOPR_NAME,COOPR_ZONE,COOPR_TEL,COOPR_SUB,SALER_NAME,SALER_MOBILE,OPERATOR_CDE,OPERATOR_NAME,OPERATOR_TEL,CRT_USR,CRT_DT,LAST_CHG_DT,LAST_CHG_USR,TYP_GRP,GUTR_OPT,CRT_BCH,LOAN_OPT,IN_STS,OUT_STS,VEH_CHASSIS,APPLY_TNR_TYP,APPRV_TNR_TYP,CRT_BCH_IND,RATE_RULE,DUE_DAY_OPT,DUE_DAY,APP_IN_ADVICE,APPLY_FST_PAY,APPLY_FST_PCT,AUTO_WF_IND,ACCT_OPT,ADVICE_CDE,IS_UPLOAD_CONT,IS_UPLOAD_LOAN,RISK_FLAG,MARKETING_CHANNEL,CODE_WORD,FLOAT_MODE,RISK_TIMES,RISK_ADVICE,NEW_CUST,RISK_AMT,GOODS_NUMB_POOL,GOODS_PROVINCE,GOODS_CITY,GOODS_AREA,GOODS_ADDR,SALER_REP_ID,SALER_REP_NAME,SALER_REP_MOBILE,REFEREE_NAME,REFEREE_NUIT,REFEREE_MOBILE,LEARN_ZY_MODE,OTHER_BANK_INSTU_DESC,OTHER_DESC,ADVICE_AMT,INCOME_AMT,SYS_FLAG,COOPR_CONT_NO,PAY_KIND,PAY_DT,QUICK_PAY_FLAG,PAY_TERM,SERNO,LMT_INIT_AMT,AUTO_MTD_CDE,AUTO_TNR_TYP,AUTO_TNR,EXCEP_IF,EXCEP_REASON,SIGN_MODEL,CANCLE_RESON,IS_OFTEN_CNO,OFTEN_CARD_NO,IS_UPLOAD_BILL,RISK_PRE_LEVEL,AUTO_RATE_MODE,AUTO_FIXED_OD_IND,AUTO_OD_INT_RATE,AUTO_INT_RAT,PLOAN_TNR_TYP,PLOAN_TNR,ADMIT_SERNO,REMARKS,APPLY_TYPE,CONSUME_SUM,APPLICANT_MOBILE,CONSUME_CURR_SUM,COOPPF_APPL_CDE,IS_REAL_APPLY,SPECIAL_CONT_FLAG,RISK_COEFFICIENT,CASH_AMT,SUP_CREDIT_CARD,IS_SIGLE_REPLAN,LOAN_EFFECT_POINT,IS_UPLOAD_DOC,CREDIT_CHECK_SIGN,CREDIT_CHECK_TIME,SERVER_NODE,IS_WHITE_LIST,WHITE_TYPE,IS_ADVICE_FREEZE,FREEZE_FLAG_TYP,FREEZ_REASON_CUST,CUST_T
YPE,MARKET_CHANNEL,PACKAGE_CHANNEL,ANNUAL_INCOME_CAL,CUST_LEVEL,INDENT_TYPE,WL_COMPANY,CREDIT_RESULT,AC_INDENT_TYPE,ACCOUNT_MANAGER_LOGIN,PBOC_FLAG,THIRD_APPROVAL_FLAG,THIRD_APPROVAL_RESULT,THIRD_APPROVAL_DATE,GZ_PRICE_INT_RAT,REFEREE_NO,RISK_TERM_SPCL_PROC_FLAG,ADVICE_AMORT_TERM,LPR_RATE_TYPE,LPR_ISSUE_MAPRICE,LPR_FLOAT_RATE,LPR_ISSUE_DATE,PRODUCT_CHANNEL,SYS_CHANNEL,MANAGER_BRANCH" -m 1 --hive-import --hive-overwrite --hive-table "SDATA.S001_LC_APPL" --hive-partition-key "dt" --hive-partition-value "20230825" --hive-drop-import-delims  --fields-terminated-by "^" --lines-terminated-by "\n" --null-string "\\\N" --null-non-string "\\\N"
   	/BIG_DATA/EDW/ZYXF_EDW/PYTHON_APP/log/20200602/source_to_sdata_S001_LC_APPL_20230825_0.log
   	 [2020-06-02 18:40:35] [INFO] [source_to_sdata.py S001 LC_APPL 20230825 0 null null null] 
   	Command to run the full-volume data extraction: sqoop import --connect "jdbc:oracle:thin:@10.94.30.50:1521:cmis" --username="cmis" --password="cmis" --outdir \/BIG_DATA/EDW/ZYXF_EDW/PYTHON_APP/tmp/.sqoop/java/ --table "CMIS.LC_APPL" --columns "INSTU_CDE,APPL_SEQ,APPL_CDE,APPLY_DT,BCH_CDE,FORM,CUST_ID,ID_TYP,ID_TYP_OTH,ID_NO,CUST_NAME,MSS_IND,REJ_IND,LOAN_TYP,TYP_SEQ,TYP_VER,LOAN_PROM,PRO_PUR_AMT,FST_PCT,FST_PAY,LOAN_CCY,PURPOSE,OTHER_PURPOSE,APPLY_AMT,APPRV_AMT,APPLY_TNR,APPRV_TNR,LOAN_FREQ,MTH_AMT,DSR_RATIO,COLL_RATIO,WF_APPR_STS,APP_ORIGIN,DOC_CHANNEL,SCORE,SCORE_GRADE,COMPL_RESULT,SIGN_DT,CONT_NO,CONT_AMT,MTD_CDE,RATE_TYP,MTD_MODE,REPC_OPT,BASIC_INT_RAT,PRICE_INT_RAT,SUPER_COOPR,COOPR_CDE,COOPR_NAME,COOPR_ZONE,COOPR_TEL,COOPR_SUB,SALER_NAME,SALER_MOBILE,OPERATOR_CDE,OPERATOR_NAME,OPERATOR_TEL,CRT_USR,CRT_DT,LAST_CHG_DT,LAST_CHG_USR,TYP_GRP,GUTR_OPT,CRT_BCH,LOAN_OPT,IN_STS,OUT_STS,VEH_CHASSIS,APPLY_TNR_TYP,APPRV_TNR_TYP,CRT_BCH_IND,RATE_RULE,DUE_DAY_OPT,DUE_DAY,APP_IN_ADVICE,APPLY_FST_PAY,APPLY_FST_PCT,AUTO_WF_IND,ACCT_OPT,ADVICE_CDE,IS_UPLOAD_CONT,IS_UPLOAD_LOAN,RISK_FLAG,MARKETING_CHANNEL,CODE_WORD,FLOAT_MODE,RISK_TIMES,RISK_ADVICE,NEW_CUST,RISK_AMT,GOODS_NUMB_POOL,GOODS_PROVINCE,GOODS_CITY,GOODS_AREA,GOODS_ADDR,SALER_REP_ID,SALER_REP_NAME,SALER_REP_MOBILE,REFEREE_NAME,REFEREE_NUIT,REFEREE_MOBILE,LEARN_ZY_MODE,OTHER_BANK_INSTU_DESC,OTHER_DESC,ADVICE_AMT,INCOME_AMT,SYS_FLAG,COOPR_CONT_NO,PAY_KIND,PAY_DT,QUICK_PAY_FLAG,PAY_TERM,SERNO,LMT_INIT_AMT,AUTO_MTD_CDE,AUTO_TNR_TYP,AUTO_TNR,EXCEP_IF,EXCEP_REASON,SIGN_MODEL,CANCLE_RESON,IS_OFTEN_CNO,OFTEN_CARD_NO,IS_UPLOAD_BILL,RISK_PRE_LEVEL,AUTO_RATE_MODE,AUTO_FIXED_OD_IND,AUTO_OD_INT_RATE,AUTO_INT_RAT,PLOAN_TNR_TYP,PLOAN_TNR,ADMIT_SERNO,REMARKS,APPLY_TYPE,CONSUME_SUM,APPLICANT_MOBILE,CONSUME_CURR_SUM,COOPPF_APPL_CDE,IS_REAL_APPLY,SPECIAL_CONT_FLAG,RISK_COEFFICIENT,CASH_AMT,SUP_CREDIT_CARD,IS_SIGLE_REPLAN,LOAN_EFFECT_POINT,IS_UPLOAD_DOC,CREDIT_CHECK_SIGN,CREDIT_CHECK_TIME,SERVER_NODE,IS_WHITE_LIST,WHITE_TYPE,IS_ADVICE_FREEZE,FREEZE_FLAG_TYP,FREEZ_REASON_
CUST,CUST_TYPE,MARKET_CHANNEL,PACKAGE_CHANNEL,ANNUAL_INCOME_CAL,CUST_LEVEL,INDENT_TYPE,WL_COMPANY,CREDIT_RESULT,AC_INDENT_TYPE,ACCOUNT_MANAGER_LOGIN,PBOC_FLAG,THIRD_APPROVAL_FLAG,THIRD_APPROVAL_RESULT,THIRD_APPROVAL_DATE,GZ_PRICE_INT_RAT,REFEREE_NO,RISK_TERM_SPCL_PROC_FLAG,ADVICE_AMORT_TERM,LPR_RATE_TYPE,LPR_ISSUE_MAPRICE,LPR_FLOAT_RATE,LPR_ISWarning: /opt/cloudera/parcels/CDH-5.11.0-1.cdh5.11.0.p0.34/bin/../lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
   [INFO] 2020-06-02 18:40:41.924  - [taskAppId=TASK-8-181-221]:[106] -  -> Please set $ACCUMULO_HOME to the root of your Accumulo installation.
   	20/06/02 18:40:41 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.11.0
   [INFO] 2020-06-02 18:40:47.819  - [taskAppId=TASK-8-181-221]:[106] -  -> 20/06/02 18:40:41 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
   	20/06/02 18:40:42 INFO oracle.OraOopManagerFactory: Data Connector for Oracle and Hadoop is disabled.
   	20/06/02 18:40:42 INFO manager.SqlManager: Using default fetchSize of 1000
   	20/06/02 18:40:42 INFO tool.CodeGenTool: Beginning code generation
   	20/06/02 18:40:47 INFO manager.OracleManager: Time zone has been set to GMT
   [INFO] 2020-06-02 18:40:50.180  - [taskAppId=TASK-8-181-221]:[106] -  -> 20/06/02 18:40:47 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM CMIS.LC_APPL t WHERE 1=0
   	20/06/02 18:40:47 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /opt/cloudera/parcels/CDH-5.11.0-1.cdh5.11.0.p0.34
   	Note: /tmp/sqoop-taskctl/compile/9d598051260fa086176de9cf3c454c60/CMIS_LC_APPL.java uses or overrides a deprecated API.
   [INFO] 2020-06-02 18:40:52.307  - [taskAppId=TASK-8-181-221]:[106] -  -> Note: Recompile with -Xlint:deprecation for details.
   	20/06/02 18:40:50 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-taskctl/compile/9d598051260fa086176de9cf3c454c60/CMIS.LC_APPL.jar
   	20/06/02 18:40:50 INFO manager.OracleManager: Time zone has been set to GMT
   	20/06/02 18:40:50 INFO mapreduce.ImportJobBase: Beginning import of CMIS.LC_APPL
   	20/06/02 18:40:50 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
   	20/06/02 18:40:50 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
   	20/06/02 18:40:51 INFO client.RMProxy: Connecting to ResourceManager at vm15211/10.94.152.11:8032
   	20/06/02 18:40:52 INFO db.DBInputFormat: Using read commited transaction isolation
   [INFO] 2020-06-02 18:40:57.730  - [taskAppId=TASK-8-181-221]:[106] -  -> 20/06/02 18:40:52 INFO mapreduce.JobSubmitter: number of splits:1
   	20/06/02 18:40:52 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1587388830783_1324
   	20/06/02 18:40:52 INFO impl.YarnClientImpl: Submitted application application_1587388830783_1324
   	20/06/02 18:40:52 INFO mapreduce.Job: The url to track the job: http://vm15211:8088/proxy/application_1587388830783_1324/
   	20/06/02 18:40:52 INFO mapreduce.Job: Running job: job_1587388830783_1324
   	20/06/02 18:40:57 INFO mapreduce.Job: Job job_1587388830783_1324 running in uber mode : false
   [INFO] 2020-06-02 18:41:10.802  - [taskAppId=TASK-8-181-221]:[106] -  -> 20/06/02 18:40:57 INFO mapreduce.Job:  map 0% reduce 0%
   	20/06/02 18:41:10 INFO mapreduce.Job:  map 100% reduce 0%
   [INFO] 2020-06-02 18:41:11.810  - [taskAppId=TASK-8-181-221]:[106] -  -> 20/06/02 18:41:11 INFO mapreduce.Job: Job job_1587388830783_1324 completed successfully
   [INFO] 2020-06-02 18:41:11.927  - [taskAppId=TASK-8-181-221]:[106] -  -> 20/06/02 18:41:11 INFO mapreduce.Job: Counters: 32
   		File System Counters
   			FILE: Number of bytes read=0
   			FILE: Number of bytes written=158024
   			FILE: Number of read operations=0
   			FILE: Number of large read operations=0
   			FILE: Number of write operations=0
   			HDFS: Number of bytes read=87
   			HDFS: Number of bytes written=21403878
   			HDFS: Number of read operations=4
   			HDFS: Number of large read operations=0
   			HDFS: Number of write operations=2
   		Job Counters 
   			Launched map tasks=1
   			Other local map tasks=1
   			Total time spent by all maps in occupied slots (ms)=20142
   			Total time spent by all reduces in occupied slots (ms)=0
   			Total time spent by all map tasks (ms)=10071
   			Total vcore-milliseconds taken by all map tasks=10071
   			Total megabyte-milliseconds taken by all map tasks=15469056
   		Map-Reduce Framework
   			Map input records=25900
   			Map output records=25900
   			Input split bytes=87
   			Spilled Records=0
   			Failed Shuffles=0
   			Merged Map outputs=0
   			GC time elapsed (ms)=135
   			CPU time spent (ms)=6170
   			Physical memory (bytes) snapshot=478605312
   			Virtual memory (bytes) snapshot=3037671424
   			Total committed heap usage (bytes)=487587840
   			Peak Map Physical memory (bytes)=478605312
   			Peak Map Virtual memory (bytes)=3037671424
   		File Input Format Counters 
   			Bytes Read=0
   		File Output Format Counters 
   			Bytes Written=21403878
   	20/06/02 18:41:11 INFO mapreduce.ImportJobBase: Transferred 20.4123 MB in 20.9098 seconds (999.639 KB/sec)
   	20/06/02 18:41:11 INFO mapreduce.ImportJobBase: Retrieved 25900 records.
   	20/06/02 18:41:11 INFO manager.OracleManager: Time zone has been set to GMT
   	20/06/02 18:41:11 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM CMIS.LC_APPL t WHERE 1=0
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column APPL_SEQ had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column TYP_SEQ had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column TYP_VER had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column PRO_PUR_AMT had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column FST_PCT had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column FST_PAY had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column APPLY_AMT had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column APPRV_AMT had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column MTH_AMT had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column DSR_RATIO had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column COLL_RATIO had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column SCORE had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column CONT_AMT had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column BASIC_INT_RAT had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column PRICE_INT_RAT had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column APPLY_FST_PAY had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column APPLY_FST_PCT had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column RISK_TIMES had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column RISK_AMT had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column GOODS_NUMB_POOL had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column ADVICE_AMT had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column INCOME_AMT had to be cast to a less precise type in Hive
   [INFO] 2020-06-02 18:41:15.004  - [taskAppId=TASK-8-181-221]:[106] -  -> 20/06/02 18:41:11 WARN hive.TableDefWriter: Column LMT_INIT_AMT had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column AUTO_OD_INT_RATE had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column AUTO_INT_RAT had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column CONSUME_SUM had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column CONSUME_CURR_SUM had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column RISK_COEFFICIENT had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column CASH_AMT had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column ANNUAL_INCOME_CAL had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 INFO hive.HiveImport: Loading uploaded data into Hive
   	
   	Logging initialized using configuration in jar:file:/opt/cloudera/parcels/CDH-5.11.0-1.cdh5.11.0.p0.34/jars/hive-common-1.1.0-cdh5.11.0.jar!/hive-log4j.properties
   	OK
   [INFO] 2020-06-02 18:41:21.063  - [taskAppId=TASK-8-181-221]:[106] -  -> Time taken: 2.599 seconds
   	Loading data to table sdata.s001_lc_appl partition (dt=20230825)
   	setfacl: Permission denied. user=taskctl is not the owner of inode=dt=20230825
   	setfacl: Permission denied. user=taskctl is not the owner of inode=dt=20230825
   	Partition sdata.s001_lc_appl{dt=20230825} stats: [numFiles=1, numRows=0, totalSize=21403878, rawDataSize=0]
   	OK
   	Time taken: 0.844 seconds
   	Exception in thread "main" java.lang.RuntimeException: core-site.xml not found
   [INFO] 2020-06-02 18:41:21.104  - [taskAppId=TASK-8-181-221]:[106] -  -> 	at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2577)
   		at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2503)
   		at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2409)
   		at org.apache.hadoop.conf.Configuration.set(Configuration.java:1144)
   		at org.apache.hadoop.conf.Configuration.set(Configuration.java:1116)
   		at org.apache.hadoop.conf.Configuration.setBoolean(Configuration.java:1454)
   		at org.apache.hadoop.util.GenericOptionsParser.processGeneralOptions(GenericOptionsParser.java:319)
   		at org.apache.hadoop.util.GenericOptionsParser.parseGeneralOptions(GenericOptionsParser.java:485)
   		at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:170)
   		at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:153)
   		at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:64)
   		at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
   		at org.apache.hadoop.fs.FsShell.main(FsShell.java:372)
   	SUE_DATE,PRODUCT_CHANNEL,SYS_CHANNEL,MANAGER_BRANCH" -m 1 --hive-import --hive-overwrite --hive-table "SDATA.S001_LC_APPL" --hive-partition-key "dt" --hive-partition-value "20230825" --hive-drop-import-delims  --fields-terminated-by "^" --lines-terminated-by "\n" --null-string "\\\N" --null-non-string "\\\N"
   	Metadata refresh completed: SDATA.S001_LC_APPL
   	/BIG_DATA/EDW/ZYXF_EDW/PYTHON_APP/log/20200602/source_to_sdata_S001_LC_APPL_20230825_0.log
   	 [2020-06-02 18:40:35] [INFO] [source_to_sdata.py S001 LC_APPL 20230825 0 null null null] 
   	Metadata refresh completed: SDATA.S001_LC_APPL
   	alter table SDATA.S001_LC_APPL drop if exists partition(dt='20230825')
   	Finished dropping the existing partition: SDATA.S001_LC_APPL
   	/BIG_DATA/EDW/ZYXF_EDW/PYTHON_APP/log/20200602/source_to_sdata_S001_LC_APPL_20230825_0.log
   	 [2020-06-02 18:40:40] [INFO] [source_to_sdata.py S001 LC_APPL 20230825 0 null null null] 
   	alter table SDATA.S001_LC_APPL drop if exists partition(dt='20230825')
   	Finished dropping the existing partition: SDATA.S001_LC_APPL
   	timer start.command timeout = 62400
   	Metadata refresh completed: SDATA.S001_LC_APPL
   	/BIG_DATA/EDW/ZYXF_EDW/PYTHON_APP/log/20200602/source_to_sdata_S001_LC_APPL_20230825_0.log
   	 [2020-06-02 18:41:16] [INFO] [source_to_sdata.py S001 LC_APPL 20230825 0 null null null] 
   	Metadata refresh completed: SDATA.S001_LC_APPL
   	all end
   	sql :  select count(1) from CMIS.LC_APPL
   	jdbc:oracle:thin:@10.94.30.50:1521:cmis
   	cmis/cmis@10.94.30.50:1521/cmis
   	EXEC SQL:select count(1) from CMIS.LC_APPL
   	*************
   	SQL ROWNUM:0 row(s)
   	/BIG_DATA/EDW/ZYXF_EDW/PYTHON_APP/log/20200602/source_to_sdata_S001_LC_APPL_20230825_0.log
   	 [2020-06-02 18:41:16] [INFO] [source_to_sdata.py S001 LC_APPL 20230825 0 null null null] 
   	select count(1) from CMIS.LC_APPL
   	Checked source system table LC_APPL row count: 25900
   	sql :  select count(1) from SDATA.S001_LC_APPL where dt='20230825'
   	/BIG_DATA/EDW/ZYXF_EDW/PYTHON_APP/log/20200602/source_to_sdata_S001_LC_APPL_20230825_0.log
   	 [2020-06-02 18:41:20] [INFO] [source_to_sdata.py S001 LC_APPL 20230825 0 null null null] 
   	select count(1) from SDATA.S001_LC_APPL where dt='20230825'
   	Checked target system table SDATA.S001_LC_APPL row count: 25900
   	Record-count comparison threshold: max = 0
   	/BIG_DATA/EDW/ZYXF_EDW/PYTHON_APP/log/20200602/source_to_sdata_S001_LC_APPL_20230825_0.log
   	 [2020-06-02 18:41:20] [INFO] [source_to_sdata.py S001 LC_APPL 20230825 0 null null null] 
   	Comparing source and target table row counts
   	compareSourceAndTarget success.
   	LC_APPL:25900
   	SDATA.S001_LC_APPL:25900
   	compareSourceAndTarget success.
   	sTableNum: 25900
   	tTableNum: 25900
   	Checking for column misalignment... checkColumnsCmd: hadoop fs -cat /user/hive/warehouse/sdata.db/s001_lc_appl/dt=20230825/p* |awk -F '^' '{print NF}' |grep -v 179 |wc -l
   	/BIG_DATA/EDW/ZYXF_EDW/PYTHON_APP/log/20200602/source_to_sdata_S001_LC_APPL_20230825_0.log
   	 [2020-06-02 18:41:20] [INFO] [source_to_sdata.py S001 LC_APPL 20230825 0 null null null] Checking for column misalignment... checkColumnsCmd: hadoop fs -cat /user/hive/warehouse/sdata.db/s001_lc_appl/dt=20230825/p* |awk -F '^' '{print NF}' |grep -v 179 |wc -l
   	Column misalignment check passed! checkColumns success. res: 0
   	
   	/BIG_DATA/EDW/ZYXF_EDW/PYTHON_APP/log/20200602/source_to_sdata_S001_LC_APPL_20230825_0.log
   	 [2020-06-02 18:41:21] [INFO] [source_to_sdata.py S001 LC_APPL 20230825 0 null null null] Column misalignment check passed! checkColumns success. res: 0
   	
   	input ID: SDATA.S001_LC_APPL
   [INFO] 2020-06-02 18:41:21.127  - [taskAppId=TASK-8-181-221]:[106] -  -> genDependence sql: INSERT INTO ETL_JOB_RUN_HIS VALUES(
   	                'SDATA.S001_LC_APPL', 'SDATA', 'S001', 'LC_APPL', 'SDATA.S001_LC_APPL', '/BIG_DATA/EDW/ZYXF_EDW/PYTHON_APP/SDATA/source_to_sdata.py', 'SDATA.S001_LC_APPL', '20230825', '0', '2020-06-02 18:41:21', '', ''
   	              );
   	          
   	genDependence successful. Job dependency generated!
   	/BIG_DATA/EDW/ZYXF_EDW/PYTHON_APP/log/20200602/source_to_sdata_S001_LC_APPL_20230825_0.log
   	 [2020-06-02 18:41:21] [INFO] [source_to_sdata.py S001 LC_APPL 20230825 0 null null null] 
   	Generated job dependency: SDATA.S001_LC_APPL,SDATA,S001,LC_APPL,SDATA.S001_LC_APPL,/BIG_DATA/EDW/ZYXF_EDW/PYTHON_APP/SDATA/source_to_sdata.py,SDATA.S001_LC_APPL,20230825,0,2020-06-02 18:41:21,,
   	/BIG_DATA/EDW/ZYXF_EDW/PYTHON_APP/log/20200602/source_to_sdata_S001_LC_APPL_20230825_0.log
   	 [2020-06-02 18:41:21] [INFO] [source_to_sdata.py S001 LC_APPL 20230825 0 null null null] 
   	S001_LC_APPL data extraction completed successfully!
   	
   	##################################################
   	S001_LC_APPL data extraction completed successfully!





[GitHub] [incubator-dolphinscheduler] dailidong commented on issue #2877: java.lang.RuntimeException: core-site.xml not found

Posted by GitBox <gi...@apache.org>.
dailidong commented on issue #2877:
URL: https://github.com/apache/incubator-dolphinscheduler/issues/2877#issuecomment-637453092


   please paste detailed exception info





[GitHub] [incubator-dolphinscheduler] hei-wei commented on issue #2877: java.lang.RuntimeException: core-site.xml not found

Posted by GitBox <gi...@apache.org>.
hei-wei commented on issue #2877:
URL: https://github.com/apache/incubator-dolphinscheduler/issues/2877#issuecomment-637471207


    The problem has been solved.
   The cause was that
   HADOOP_HOME=
   HADOOP_CONF_DIR=
   were empty; setting them correctly fixed it.
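
   For anyone hitting the same `core-site.xml not found` error, the fix above amounts to pointing the worker's task environment at a real Hadoop installation (e.g. in `conf/env/dolphinscheduler_env.sh`). A minimal sketch; the CDH parcel path is an assumption inferred from the parcel path in the pasted log, and `/etc/hadoop/conf` is a conventional default, so adjust both to your cluster:

```shell
# Hypothetical values: the parcel path is inferred from the log above
# (/opt/cloudera/parcels/CDH-5.11.0-1.cdh5.11.0.p0.34); adjust to your cluster.
export HADOOP_HOME=/opt/cloudera/parcels/CDH-5.11.0-1.cdh5.11.0.p0.34/lib/hadoop
export HADOOP_CONF_DIR=/etc/hadoop/conf
export PATH=$HADOOP_HOME/bin:$PATH

# Sanity check: Hadoop's Configuration.loadResource() throws the
# "core-site.xml not found" RuntimeException when this file cannot be
# resolved via the classpath built from HADOOP_CONF_DIR.
[ -f "$HADOOP_CONF_DIR/core-site.xml" ] && echo "core-site.xml found" \
  || echo "core-site.xml MISSING - hadoop commands will fail"
```

   If the variables are empty, any `hadoop fs` call launched by a task (like the `hadoop fs -rm` and `hadoop fs -cat` commands in the log) fails with exactly this exception.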





[GitHub] [incubator-dolphinscheduler] hei-wei closed issue #2877: java.lang.RuntimeException: core-site.xml not found

Posted by GitBox <gi...@apache.org>.
hei-wei closed issue #2877:
URL: https://github.com/apache/incubator-dolphinscheduler/issues/2877


   





[GitHub] [incubator-dolphinscheduler] hei-wei commented on issue #2877: java.lang.RuntimeException: core-site.xml not found

Posted by GitBox <gi...@apache.org>.
hei-wei commented on issue #2877:
URL: https://github.com/apache/incubator-dolphinscheduler/issues/2877#issuecomment-637457310


   hive表字段hive_columns_list:['INSTU_CDE', 'APPL_SEQ', 'APPL_CDE', 'APPLY_DT', 'BCH_CDE', 'FORM', 'CUST_ID', 'ID_TYP', 'ID_TYP_OTH', 'ID_NO', 'CUST_NAME', 'MSS_IND', 'REJ_IND', 'LOAN_TYP', 'TYP_SEQ', 'TYP_VER', 'LOAN_PROM', 'PRO_PUR_AMT', 'FST_PCT', 'FST_PAY', 'LOAN_CCY', 'PURPOSE', 'OTHER_PURPOSE', 'APPLY_AMT', 'APPRV_AMT', 'APPLY_TNR', 'APPRV_TNR', 'LOAN_FREQ', 'MTH_AMT', 'DSR_RATIO', 'COLL_RATIO', 'WF_APPR_STS', 'APP_ORIGIN', 'DOC_CHANNEL', 'SCORE', 'SCORE_GRADE', 'COMPL_RESULT', 'SIGN_DT', 'CONT_NO', 'COException in thread "main" java.lang.RuntimeException: core-site.xml not found
   [INFO] 2020-06-02 18:40:40.519  - [taskAppId=TASK-8-181-221]:[106] -  -> 	at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2577)
   		at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2503)
   		at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2409)
   		at org.apache.hadoop.conf.Configuration.set(Configuration.java:1144)
   		at org.apache.hadoop.conf.Configuration.set(Configuration.java:1116)
   		at org.apache.hadoop.conf.Configuration.setBoolean(Configuration.java:1454)
   		at org.apache.hadoop.util.GenericOptionsParser.processGeneralOptions(GenericOptionsParser.java:319)
   		at org.apache.hadoop.util.GenericOptionsParser.parseGeneralOptions(GenericOptionsParser.java:485)
   		at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:170)
   		at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:153)
   		at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:64)
   		at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
   		at org.apache.hadoop.fs.FsShell.main(FsShell.java:372)
   	NT_AMT', 'MTD_CDE', 'RATE_TYP', 'MTD_MODE', 'REPC_OPT', 'BASIC_INT_RAT', 'PRICE_INT_RAT', 'SUPER_COOPR', 'COOPR_CDE', 'COOPR_NAME', 'COOPR_ZONE', 'COOPR_TEL', 'COOPR_SUB', 'SALER_NAME', 'SALER_MOBILE', 'OPERATOR_CDE', 'OPERATOR_NAME', 'OPERATOR_TEL', 'CRT_USR', 'CRT_DT', 'LAST_CHG_DT', 'LAST_CHG_USR', 'TYP_GRP', 'GUTR_OPT', 'CRT_BCH', 'LOAN_OPT', 'IN_STS', 'OUT_STS', 'VEH_CHASSIS', 'APPLY_TNR_TYP', 'APPRV_TNR_TYP', 'CRT_BCH_IND', 'RATE_RULE', 'DUE_DAY_OPT', 'DUE_DAY', 'APP_IN_ADVICE', 'APPLY_FST_PAY', 'APPLY_FST_PCT', 'AUTO_WF_IND', 'ACCT_OPT', 'ADVICE_CDE', 'IS_UPLOAD_CONT', 'IS_UPLOAD_LOAN', 'RISK_FLAG', 'MARKETING_CHANNEL', 'CODE_WORD', 'FLOAT_MODE', 'RISK_TIMES', 'RISK_ADVICE', 'NEW_CUST', 'RISK_AMT', 'GOODS_NUMB_POOL', 'GOODS_PROVINCE', 'GOODS_CITY', 'GOODS_AREA', 'GOODS_ADDR', 'SALER_REP_ID', 'SALER_REP_NAME', 'SALER_REP_MOBILE', 'REFEREE_NAME', 'REFEREE_NUIT', 'REFEREE_MOBILE', 'LEARN_ZY_MODE', 'OTHER_BANK_INSTU_DESC', 'OTHER_DESC', 'ADVICE_AMT', 'INCOME_AMT', 'SYS_FLAG', 'COOPR_CONT_NO', 'PAY_KIND', 'PAY_DT', 'QUICK_PAY_FLAG', 'PAY_TERM', 'SERNO', 'LMT_INIT_AMT', 'AUTO_MTD_CDE', 'AUTO_TNR_TYP', 'AUTO_TNR', 'EXCEP_IF', 'EXCEP_REASON', 'SIGN_MODEL', 'CANCLE_RESON', 'IS_OFTEN_CNO', 'OFTEN_CARD_NO', 'IS_UPLOAD_BILL', 'RISK_PRE_LEVEL', 'AUTO_RATE_MODE', 'AUTO_FIXED_OD_IND', 'AUTO_OD_INT_RATE', 'AUTO_INT_RAT', 'PLOAN_TNR_TYP', 'PLOAN_TNR', 'ADMIT_SERNO', 'REMARKS', 'APPLY_TYPE', 'CONSUME_SUM', 'APPLICANT_MOBILE', 'CONSUME_CURR_SUM', 'COOPPF_APPL_CDE', 'IS_REAL_APPLY', 'SPECIAL_CONT_FLAG', 'RISK_COEFFICIENT', 'CASH_AMT', 'SUP_CREDIT_CARD', 'IS_SIGLE_REPLAN', 'LOAN_EFFECT_POINT', 'IS_UPLOAD_DOC', 'CREDIT_CHECK_SIGN', 'CREDIT_CHECK_TIME', 'SERVER_NODE', 'IS_WHITE_LIST', 'WHITE_TYPE', 'IS_ADVICE_FREEZE', 'FREEZE_FLAG_TYP', 'FREEZ_REASON_CUST', 'CUST_TYPE', 'MARKET_CHANNEL', 'PACKAGE_CHANNEL', 'ANNUAL_INCOME_CAL', 'CUST_LEVEL', 'INDENT_TYPE', 'WL_COMPANY', 'CREDIT_RESULT', 'AC_INDENT_TYPE', 'ACCOUNT_MANAGER_LOGIN', 'PBOC_FLAG', 'THIRD_APPROVAL_FLAG', 
'THIRD_APPROVAL_RESULT', 'THIRD_APPROVAL_DATE', 'GZ_PRICE_INT_RAT', 'REFEREE_NO', 'RISK_TERM_SPCL_PROC_FLAG', 'ADVICE_AMORT_TERM', 'LPR_RATE_TYPE', 'LPR_ISSUE_MAPRICE', 'LPR_FLOAT_RATE', 'LPR_ISSUE_DATE', 'PRODUCT_CHANNEL', 'SYS_CHANNEL', 'MANAGER_BRANCH'] len:179
   	
   	Source table LC_APPL and target table SDATA.S001_LC_APPL have matching column counts and column names.
   	/BIG_DATA/EDW/ZYXF_EDW/PYTHON_APP/log/20200602/source_to_sdata_S001_LC_APPL_20230825_0.log
   	 [2020-06-02 18:40:35] [INFO] [source_to_sdata.py S001 LC_APPL 20230825 0 null null null] 
   	Source table LC_APPL and target table SDATA.S001_LC_APPL have matching column counts and column names.
   	Command to check for and remove the temporary data file before execution:  hadoop fs -rm -r /user/taskctl/CMIS.LC_APPL
   	Checking for and removing the temporary data file before execution
   	/BIG_DATA/EDW/ZYXF_EDW/PYTHON_APP/log/20200602/source_to_sdata_S001_LC_APPL_20230825_0.log
   	 [2020-06-02 18:40:35] [INFO] [source_to_sdata.py S001 LC_APPL 20230825 0 null null null] 
   	Command to check for and remove the temporary data file before execution: hadoop fs -rm -r /user/taskctl/CMIS.LC_APPL
   	 sqoop import --connect "jdbc:oracle:thin:@10.94.30.50:1521:cmis" --username="cmis" --password="cmis" --outdir \/BIG_DATA/EDW/ZYXF_EDW/PYTHON_APP/tmp/.sqoop/java/ --table "CMIS.LC_APPL" --columns "INSTU_CDE,APPL_SEQ,APPL_CDE,APPLY_DT,BCH_CDE,FORM,CUST_ID,ID_TYP,ID_TYP_OTH,ID_NO,CUST_NAME,MSS_IND,REJ_IND,LOAN_TYP,TYP_SEQ,TYP_VER,LOAN_PROM,PRO_PUR_AMT,FST_PCT,FST_PAY,LOAN_CCY,PURPOSE,OTHER_PURPOSE,APPLY_AMT,APPRV_AMT,APPLY_TNR,APPRV_TNR,LOAN_FREQ,MTH_AMT,DSR_RATIO,COLL_RATIO,WF_APPR_STS,APP_ORIGIN,DOC_CHANNEL,SCORE,SCORE_GRADE,COMPL_RESULT,SIGN_DT,CONT_NO,CONT_AMT,MTD_CDE,RATE_TYP,MTD_MODE,REPC_OPT,BASIC_INT_RAT,PRICE_INT_RAT,SUPER_COOPR,COOPR_CDE,COOPR_NAME,COOPR_ZONE,COOPR_TEL,COOPR_SUB,SALER_NAME,SALER_MOBILE,OPERATOR_CDE,OPERATOR_NAME,OPERATOR_TEL,CRT_USR,CRT_DT,LAST_CHG_DT,LAST_CHG_USR,TYP_GRP,GUTR_OPT,CRT_BCH,LOAN_OPT,IN_STS,OUT_STS,VEH_CHASSIS,APPLY_TNR_TYP,APPRV_TNR_TYP,CRT_BCH_IND,RATE_RULE,DUE_DAY_OPT,DUE_DAY,APP_IN_ADVICE,APPLY_FST_PAY,APPLY_FST_PCT,AUTO_WF_IND,ACCT_OPT,ADVICE_CDE,IS_UPLOAD_CONT,IS_UPLOAD_LOAN,RISK_FLAG,MARKETING_CHANNEL,CODE_WORD,FLOAT_MODE,RISK_TIMES,RISK_ADVICE,NEW_CUST,RISK_AMT,GOODS_NUMB_POOL,GOODS_PROVINCE,GOODS_CITY,GOODS_AREA,GOODS_ADDR,SALER_REP_ID,SALER_REP_NAME,SALER_REP_MOBILE,REFEREE_NAME,REFEREE_NUIT,REFEREE_MOBILE,LEARN_ZY_MODE,OTHER_BANK_INSTU_DESC,OTHER_DESC,ADVICE_AMT,INCOME_AMT,SYS_FLAG,COOPR_CONT_NO,PAY_KIND,PAY_DT,QUICK_PAY_FLAG,PAY_TERM,SERNO,LMT_INIT_AMT,AUTO_MTD_CDE,AUTO_TNR_TYP,AUTO_TNR,EXCEP_IF,EXCEP_REASON,SIGN_MODEL,CANCLE_RESON,IS_OFTEN_CNO,OFTEN_CARD_NO,IS_UPLOAD_BILL,RISK_PRE_LEVEL,AUTO_RATE_MODE,AUTO_FIXED_OD_IND,AUTO_OD_INT_RATE,AUTO_INT_RAT,PLOAN_TNR_TYP,PLOAN_TNR,ADMIT_SERNO,REMARKS,APPLY_TYPE,CONSUME_SUM,APPLICANT_MOBILE,CONSUME_CURR_SUM,COOPPF_APPL_CDE,IS_REAL_APPLY,SPECIAL_CONT_FLAG,RISK_COEFFICIENT,CASH_AMT,SUP_CREDIT_CARD,IS_SIGLE_REPLAN,LOAN_EFFECT_POINT,IS_UPLOAD_DOC,CREDIT_CHECK_SIGN,CREDIT_CHECK_TIME,SERVER_NODE,IS_WHITE_LIST,WHITE_TYPE,IS_ADVICE_FREEZE,FREEZE_FLAG_TYP,FREEZ_REASON_CUST,CUST_T
YPE,MARKET_CHANNEL,PACKAGE_CHANNEL,ANNUAL_INCOME_CAL,CUST_LEVEL,INDENT_TYPE,WL_COMPANY,CREDIT_RESULT,AC_INDENT_TYPE,ACCOUNT_MANAGER_LOGIN,PBOC_FLAG,THIRD_APPROVAL_FLAG,THIRD_APPROVAL_RESULT,THIRD_APPROVAL_DATE,GZ_PRICE_INT_RAT,REFEREE_NO,RISK_TERM_SPCL_PROC_FLAG,ADVICE_AMORT_TERM,LPR_RATE_TYPE,LPR_ISSUE_MAPRICE,LPR_FLOAT_RATE,LPR_ISSUE_DATE,PRODUCT_CHANNEL,SYS_CHANNEL,MANAGER_BRANCH" -m 1 --hive-import --hive-overwrite --hive-table "SDATA.S001_LC_APPL" --hive-partition-key "dt" --hive-partition-value "20230825" --hive-drop-import-delims  --fields-terminated-by "^" --lines-terminated-by "\n" --null-string "\\\N" --null-non-string "\\\N"
   	/BIG_DATA/EDW/ZYXF_EDW/PYTHON_APP/log/20200602/source_to_sdata_S001_LC_APPL_20230825_0.log
   	 [2020-06-02 18:40:35] [INFO] [source_to_sdata.py S001 LC_APPL 20230825 0 null null null] 
   	Full data extraction command: sqoop import --connect "jdbc:oracle:thin:@10.94.30.50:1521:cmis" --username="cmis" --password="cmis" --outdir \/BIG_DATA/EDW/ZYXF_EDW/PYTHON_APP/tmp/.sqoop/java/ --table "CMIS.LC_APPL" --columns "INSTU_CDE,APPL_SEQ,APPL_CDE,APPLY_DT,BCH_CDE,FORM,CUST_ID,ID_TYP,ID_TYP_OTH,ID_NO,CUST_NAME,MSS_IND,REJ_IND,LOAN_TYP,TYP_SEQ,TYP_VER,LOAN_PROM,PRO_PUR_AMT,FST_PCT,FST_PAY,LOAN_CCY,PURPOSE,OTHER_PURPOSE,APPLY_AMT,APPRV_AMT,APPLY_TNR,APPRV_TNR,LOAN_FREQ,MTH_AMT,DSR_RATIO,COLL_RATIO,WF_APPR_STS,APP_ORIGIN,DOC_CHANNEL,SCORE,SCORE_GRADE,COMPL_RESULT,SIGN_DT,CONT_NO,CONT_AMT,MTD_CDE,RATE_TYP,MTD_MODE,REPC_OPT,BASIC_INT_RAT,PRICE_INT_RAT,SUPER_COOPR,COOPR_CDE,COOPR_NAME,COOPR_ZONE,COOPR_TEL,COOPR_SUB,SALER_NAME,SALER_MOBILE,OPERATOR_CDE,OPERATOR_NAME,OPERATOR_TEL,CRT_USR,CRT_DT,LAST_CHG_DT,LAST_CHG_USR,TYP_GRP,GUTR_OPT,CRT_BCH,LOAN_OPT,IN_STS,OUT_STS,VEH_CHASSIS,APPLY_TNR_TYP,APPRV_TNR_TYP,CRT_BCH_IND,RATE_RULE,DUE_DAY_OPT,DUE_DAY,APP_IN_ADVICE,APPLY_FST_PAY,APPLY_FST_PCT,AUTO_WF_IND,ACCT_OPT,ADVICE_CDE,IS_UPLOAD_CONT,IS_UPLOAD_LOAN,RISK_FLAG,MARKETING_CHANNEL,CODE_WORD,FLOAT_MODE,RISK_TIMES,RISK_ADVICE,NEW_CUST,RISK_AMT,GOODS_NUMB_POOL,GOODS_PROVINCE,GOODS_CITY,GOODS_AREA,GOODS_ADDR,SALER_REP_ID,SALER_REP_NAME,SALER_REP_MOBILE,REFEREE_NAME,REFEREE_NUIT,REFEREE_MOBILE,LEARN_ZY_MODE,OTHER_BANK_INSTU_DESC,OTHER_DESC,ADVICE_AMT,INCOME_AMT,SYS_FLAG,COOPR_CONT_NO,PAY_KIND,PAY_DT,QUICK_PAY_FLAG,PAY_TERM,SERNO,LMT_INIT_AMT,AUTO_MTD_CDE,AUTO_TNR_TYP,AUTO_TNR,EXCEP_IF,EXCEP_REASON,SIGN_MODEL,CANCLE_RESON,IS_OFTEN_CNO,OFTEN_CARD_NO,IS_UPLOAD_BILL,RISK_PRE_LEVEL,AUTO_RATE_MODE,AUTO_FIXED_OD_IND,AUTO_OD_INT_RATE,AUTO_INT_RAT,PLOAN_TNR_TYP,PLOAN_TNR,ADMIT_SERNO,REMARKS,APPLY_TYPE,CONSUME_SUM,APPLICANT_MOBILE,CONSUME_CURR_SUM,COOPPF_APPL_CDE,IS_REAL_APPLY,SPECIAL_CONT_FLAG,RISK_COEFFICIENT,CASH_AMT,SUP_CREDIT_CARD,IS_SIGLE_REPLAN,LOAN_EFFECT_POINT,IS_UPLOAD_DOC,CREDIT_CHECK_SIGN,CREDIT_CHECK_TIME,SERVER_NODE,IS_WHITE_LIST,WHITE_TYPE,IS_ADVICE_FREEZE,FREEZE_FLAG_TYP,FREEZ_REASON_
CUST,CUST_TYPE,MARKET_CHANNEL,PACKAGE_CHANNEL,ANNUAL_INCOME_CAL,CUST_LEVEL,INDENT_TYPE,WL_COMPANY,CREDIT_RESULT,AC_INDENT_TYPE,ACCOUNT_MANAGER_LOGIN,PBOC_FLAG,THIRD_APPROVAL_FLAG,THIRD_APPROVAL_RESULT,THIRD_APPROVAL_DATE,GZ_PRICE_INT_RAT,REFEREE_NO,RISK_TERM_SPCL_PROC_FLAG,ADVICE_AMORT_TERM,LPR_RATE_TYPE,LPR_ISSUE_MAPRICE,LPR_FLOAT_RATE,LPR_ISWarning: /opt/cloudera/parcels/CDH-5.11.0-1.cdh5.11.0.p0.34/bin/../lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
   [INFO] 2020-06-02 18:40:41.924  - [taskAppId=TASK-8-181-221]:[106] -  -> Please set $ACCUMULO_HOME to the root of your Accumulo installation.
   	20/06/02 18:40:41 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.11.0
   [INFO] 2020-06-02 18:40:47.819  - [taskAppId=TASK-8-181-221]:[106] -  -> 20/06/02 18:40:41 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
   	20/06/02 18:40:42 INFO oracle.OraOopManagerFactory: Data Connector for Oracle and Hadoop is disabled.
   	20/06/02 18:40:42 INFO manager.SqlManager: Using default fetchSize of 1000
   	20/06/02 18:40:42 INFO tool.CodeGenTool: Beginning code generation
   	20/06/02 18:40:47 INFO manager.OracleManager: Time zone has been set to GMT
   [INFO] 2020-06-02 18:40:50.180  - [taskAppId=TASK-8-181-221]:[106] -  -> 20/06/02 18:40:47 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM CMIS.LC_APPL t WHERE 1=0
   	20/06/02 18:40:47 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /opt/cloudera/parcels/CDH-5.11.0-1.cdh5.11.0.p0.34
   	Note: /tmp/sqoop-taskctl/compile/9d598051260fa086176de9cf3c454c60/CMIS_LC_APPL.java uses or overrides a deprecated API.
   [INFO] 2020-06-02 18:40:52.307  - [taskAppId=TASK-8-181-221]:[106] -  -> Note: Recompile with -Xlint:deprecation for details.
   	20/06/02 18:40:50 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-taskctl/compile/9d598051260fa086176de9cf3c454c60/CMIS.LC_APPL.jar
   	20/06/02 18:40:50 INFO manager.OracleManager: Time zone has been set to GMT
   	20/06/02 18:40:50 INFO mapreduce.ImportJobBase: Beginning import of CMIS.LC_APPL
   	20/06/02 18:40:50 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
   	20/06/02 18:40:50 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
   	20/06/02 18:40:51 INFO client.RMProxy: Connecting to ResourceManager at vm15211/10.94.152.11:8032
   	20/06/02 18:40:52 INFO db.DBInputFormat: Using read commited transaction isolation
   [INFO] 2020-06-02 18:40:57.730  - [taskAppId=TASK-8-181-221]:[106] -  -> 20/06/02 18:40:52 INFO mapreduce.JobSubmitter: number of splits:1
   	20/06/02 18:40:52 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1587388830783_1324
   	20/06/02 18:40:52 INFO impl.YarnClientImpl: Submitted application application_1587388830783_1324
   	20/06/02 18:40:52 INFO mapreduce.Job: The url to track the job: http://vm15211:8088/proxy/application_1587388830783_1324/
   	20/06/02 18:40:52 INFO mapreduce.Job: Running job: job_1587388830783_1324
   	20/06/02 18:40:57 INFO mapreduce.Job: Job job_1587388830783_1324 running in uber mode : false
   [INFO] 2020-06-02 18:41:10.802  - [taskAppId=TASK-8-181-221]:[106] -  -> 20/06/02 18:40:57 INFO mapreduce.Job:  map 0% reduce 0%
   	20/06/02 18:41:10 INFO mapreduce.Job:  map 100% reduce 0%
   [INFO] 2020-06-02 18:41:11.810  - [taskAppId=TASK-8-181-221]:[106] -  -> 20/06/02 18:41:11 INFO mapreduce.Job: Job job_1587388830783_1324 completed successfully
   [INFO] 2020-06-02 18:41:11.927  - [taskAppId=TASK-8-181-221]:[106] -  -> 20/06/02 18:41:11 INFO mapreduce.Job: Counters: 32
   		File System Counters
   			FILE: Number of bytes read=0
   			FILE: Number of bytes written=158024
   			FILE: Number of read operations=0
   			FILE: Number of large read operations=0
   			FILE: Number of write operations=0
   			HDFS: Number of bytes read=87
   			HDFS: Number of bytes written=21403878
   			HDFS: Number of read operations=4
   			HDFS: Number of large read operations=0
   			HDFS: Number of write operations=2
   		Job Counters 
   			Launched map tasks=1
   			Other local map tasks=1
   			Total time spent by all maps in occupied slots (ms)=20142
   			Total time spent by all reduces in occupied slots (ms)=0
   			Total time spent by all map tasks (ms)=10071
   			Total vcore-milliseconds taken by all map tasks=10071
   			Total megabyte-milliseconds taken by all map tasks=15469056
   		Map-Reduce Framework
   			Map input records=25900
   			Map output records=25900
   			Input split bytes=87
   			Spilled Records=0
   			Failed Shuffles=0
   			Merged Map outputs=0
   			GC time elapsed (ms)=135
   			CPU time spent (ms)=6170
   			Physical memory (bytes) snapshot=478605312
   			Virtual memory (bytes) snapshot=3037671424
   			Total committed heap usage (bytes)=487587840
   			Peak Map Physical memory (bytes)=478605312
   			Peak Map Virtual memory (bytes)=3037671424
   		File Input Format Counters 
   			Bytes Read=0
   		File Output Format Counters 
   			Bytes Written=21403878
   	20/06/02 18:41:11 INFO mapreduce.ImportJobBase: Transferred 20.4123 MB in 20.9098 seconds (999.639 KB/sec)
   	20/06/02 18:41:11 INFO mapreduce.ImportJobBase: Retrieved 25900 records.
   	20/06/02 18:41:11 INFO manager.OracleManager: Time zone has been set to GMT
   	20/06/02 18:41:11 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM CMIS.LC_APPL t WHERE 1=0
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column APPL_SEQ had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column TYP_SEQ had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column TYP_VER had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column PRO_PUR_AMT had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column FST_PCT had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column FST_PAY had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column APPLY_AMT had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column APPRV_AMT had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column MTH_AMT had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column DSR_RATIO had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column COLL_RATIO had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column SCORE had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column CONT_AMT had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column BASIC_INT_RAT had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column PRICE_INT_RAT had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column APPLY_FST_PAY had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column APPLY_FST_PCT had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column RISK_TIMES had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column RISK_AMT had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column GOODS_NUMB_POOL had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column ADVICE_AMT had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column INCOME_AMT had to be cast to a less precise type in Hive
   [INFO] 2020-06-02 18:41:15.004  - [taskAppId=TASK-8-181-221]:[106] -  -> 20/06/02 18:41:11 WARN hive.TableDefWriter: Column LMT_INIT_AMT had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column AUTO_OD_INT_RATE had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column AUTO_INT_RAT had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column CONSUME_SUM had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column CONSUME_CURR_SUM had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column RISK_COEFFICIENT had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column CASH_AMT had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 WARN hive.TableDefWriter: Column ANNUAL_INCOME_CAL had to be cast to a less precise type in Hive
   	20/06/02 18:41:11 INFO hive.HiveImport: Loading uploaded data into Hive
   	
   	Logging initialized using configuration in jar:file:/opt/cloudera/parcels/CDH-5.11.0-1.cdh5.11.0.p0.34/jars/hive-common-1.1.0-cdh5.11.0.jar!/hive-log4j.properties
   	OK
   [INFO] 2020-06-02 18:41:21.063  - [taskAppId=TASK-8-181-221]:[106] -  -> Time taken: 2.599 seconds
   	Loading data to table sdata.s001_lc_appl partition (dt=20230825)
   	setfacl: Permission denied. user=taskctl is not the owner of inode=dt=20230825
   	setfacl: Permission denied. user=taskctl is not the owner of inode=dt=20230825
   	Partition sdata.s001_lc_appl{dt=20230825} stats: [numFiles=1, numRows=0, totalSize=21403878, rawDataSize=0]
   	OK
   	Time taken: 0.844 seconds
   	**Exception in thread "main" java.lang.RuntimeException: core-site.xml not found
   [INFO] 2020-06-02 18:41:21.104  - [taskAppId=TASK-8-181-221]:[106] -  -> 	at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2577)**
   		at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2503)
   		at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2409)
   		at org.apache.hadoop.conf.Configuration.set(Configuration.java:1144)
   		at org.apache.hadoop.conf.Configuration.set(Configuration.java:1116)
   		at org.apache.hadoop.conf.Configuration.setBoolean(Configuration.java:1454)
   		at org.apache.hadoop.util.GenericOptionsParser.processGeneralOptions(GenericOptionsParser.java:319)
   		at org.apache.hadoop.util.GenericOptionsParser.parseGeneralOptions(GenericOptionsParser.java:485)
   		at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:170)
   		at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:153)
   		at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:64)
   		at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
   		at org.apache.hadoop.fs.FsShell.main(FsShell.java:372)
   	SUE_DATE,PRODUCT_CHANNEL,SYS_CHANNEL,MANAGER_BRANCH" -m 1 --hive-import --hive-overwrite --hive-table "SDATA.S001_LC_APPL" --hive-partition-key "dt" --hive-partition-value "20230825" --hive-drop-import-delims  --fields-terminated-by "^" --lines-terminated-by "\n" --null-string "\\\N" --null-non-string "\\\N"
   	Metadata refresh complete: SDATA.S001_LC_APPL
   	/BIG_DATA/EDW/ZYXF_EDW/PYTHON_APP/log/20200602/source_to_sdata_S001_LC_APPL_20230825_0.log
   	 [2020-06-02 18:40:35] [INFO] [source_to_sdata.py S001 LC_APPL 20230825 0 null null null] 
   	Metadata refresh complete: SDATA.S001_LC_APPL
   	alter table SDATA.S001_LC_APPL drop if exists partition(dt='20230825')
   	Dropped existing partition (if any): SDATA.S001_LC_APPL
   	/BIG_DATA/EDW/ZYXF_EDW/PYTHON_APP/log/20200602/source_to_sdata_S001_LC_APPL_20230825_0.log
   	 [2020-06-02 18:40:40] [INFO] [source_to_sdata.py S001 LC_APPL 20230825 0 null null null] 
   	alter table SDATA.S001_LC_APPL drop if exists partition(dt='20230825')
   	Dropped existing partition (if any): SDATA.S001_LC_APPL
   	timer start.command timeout = 62400
   	Metadata refresh complete: SDATA.S001_LC_APPL
   	/BIG_DATA/EDW/ZYXF_EDW/PYTHON_APP/log/20200602/source_to_sdata_S001_LC_APPL_20230825_0.log
   	 [2020-06-02 18:41:16] [INFO] [source_to_sdata.py S001 LC_APPL 20230825 0 null null null] 
   	Metadata refresh complete: SDATA.S001_LC_APPL
   	all end
   	sql :  select count(1) from CMIS.LC_APPL
   	jdbc:oracle:thin:@10.94.30.50:1521:cmis
   	cmis/cmis@10.94.30.50:1521/cmis
   	EXEC SQL:select count(1) from CMIS.LC_APPL
   	*************
   	SQL ROWNUM:0 row(s)
   	/BIG_DATA/EDW/ZYXF_EDW/PYTHON_APP/log/20200602/source_to_sdata_S001_LC_APPL_20230825_0.log
   	 [2020-06-02 18:41:16] [INFO] [source_to_sdata.py S001 LC_APPL 20230825 0 null null null] 
   	select count(1) from CMIS.LC_APPL
   	Checked source system table LC_APPL row count: 25900
   	sql :  select count(1) from SDATA.S001_LC_APPL where dt='20230825'
   	/BIG_DATA/EDW/ZYXF_EDW/PYTHON_APP/log/20200602/source_to_sdata_S001_LC_APPL_20230825_0.log
   	 [2020-06-02 18:41:20] [INFO] [source_to_sdata.py S001 LC_APPL 20230825 0 null null null] 
   	select count(1) from SDATA.S001_LC_APPL where dt='20230825'
   	Checked target system table SDATA.S001_LC_APPL row count: 25900
   	Row-count comparison threshold: max = 0
   	/BIG_DATA/EDW/ZYXF_EDW/PYTHON_APP/log/20200602/source_to_sdata_S001_LC_APPL_20230825_0.log
   	 [2020-06-02 18:41:20] [INFO] [source_to_sdata.py S001 LC_APPL 20230825 0 null null null] 
   	Comparing source and target table row counts
   	compareSourceAndTarget success.
   	LC_APPL:25900
   	SDATA.S001_LC_APPL:25900
   	compareSourceAndTarget success.
   	sTableNum: 25900
   	tTableNum: 25900
   	Checking for column misalignment... checkColumnsCmd: hadoop fs -cat /user/hive/warehouse/sdata.db/s001_lc_appl/dt=20230825/p* |awk -F '^' '{print NF}' |grep -v 179 |wc -l
   	/BIG_DATA/EDW/ZYXF_EDW/PYTHON_APP/log/20200602/source_to_sdata_S001_LC_APPL_20230825_0.log
   	 [2020-06-02 18:41:20] [INFO] [source_to_sdata.py S001 LC_APPL 20230825 0 null null null] Checking for column misalignment... checkColumnsCmd: hadoop fs -cat /user/hive/warehouse/sdata.db/s001_lc_appl/dt=20230825/p* |awk -F '^' '{print NF}' |grep -v 179 |wc -l
   	Column misalignment check passed! checkColumns success. res: 0
   	
   	/BIG_DATA/EDW/ZYXF_EDW/PYTHON_APP/log/20200602/source_to_sdata_S001_LC_APPL_20230825_0.log
   	 [2020-06-02 18:41:21] [INFO] [source_to_sdata.py S001 LC_APPL 20230825 0 null null null] Column misalignment check passed! checkColumns success. res: 0
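The checkColumnsCmd in the log counts the '^'-separated fields on every exported row and flags rows whose field count differs from 179 (0 means the check passed). A minimal standalone sketch of the same check follows; the function name is mine, not from the ETL script:

```shell
# count_misaligned EXPECTED -- reads rows on stdin and prints how many rows
# do not split into EXPECTED caret-separated fields (0 means the check passes).
count_misaligned() {
  awk -F '^' -v n="$1" 'NF != n { c++ } END { print c + 0 }'
}

# Example: three rows, one of which has only two fields.
printf 'a^b^c\na^b\nx^y^z\n' | count_misaligned 3   # prints 1
```

Note that a single-character awk FS such as '^' is taken literally, so no escaping is needed, matching the command in the log.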
   	
   	input ID: SDATA.S001_LC_APPL
   [INFO] 2020-06-02 18:41:21.127  - [taskAppId=TASK-8-181-221]:[106] -  -> genDependence sql: INSERT INTO ETL_JOB_RUN_HIS VALUES(
   	                'SDATA.S001_LC_APPL', 'SDATA', 'S001', 'LC_APPL', 'SDATA.S001_LC_APPL', '/BIG_DATA/EDW/ZYXF_EDW/PYTHON_APP/SDATA/source_to_sdata.py', 'SDATA.S001_LC_APPL', '20230825', '0', '2020-06-02 18:41:21', '', ''
   	              );
   	          
   	genDependence successful. Job dependency generated!
   	/BIG_DATA/EDW/ZYXF_EDW/PYTHON_APP/log/20200602/source_to_sdata_S001_LC_APPL_20230825_0.log
   	 [2020-06-02 18:41:21] [INFO] [source_to_sdata.py S001 LC_APPL 20230825 0 null null null] 
   	Generated job dependency: SDATA.S001_LC_APPL,SDATA,S001,LC_APPL,SDATA.S001_LC_APPL,/BIG_DATA/EDW/ZYXF_EDW/PYTHON_APP/SDATA/source_to_sdata.py,SDATA.S001_LC_APPL,20230825,0,2020-06-02 18:41:21,,
   	/BIG_DATA/EDW/ZYXF_EDW/PYTHON_APP/log/20200602/source_to_sdata_S001_LC_APPL_20230825_0.log
   	 [2020-06-02 18:41:21] [INFO] [source_to_sdata.py S001 LC_APPL 20230825 0 null null null] 
   	S001_LC_APPL data extraction completed successfully!
   	
   	##################################################
   	S001_LC_APPL data extraction completed successfully!
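For the `core-site.xml not found` failure in the trailing `hadoop fs` call above, one thing worth verifying is that the Hadoop configuration directory visible to the DolphinScheduler worker's tenant user actually contains core-site.xml. A minimal diagnostic sketch; the fallback path is an assumption (a common CDH default), not taken from the log:

```shell
# check_core_site DIR -- report whether core-site.xml exists under DIR.
check_core_site() {
  if [ -f "$1/core-site.xml" ]; then
    echo "found: $1/core-site.xml"
  else
    echo "missing: $1/core-site.xml"
  fi
}

# The hadoop CLI resolves its config dir from HADOOP_CONF_DIR when set;
# /etc/hadoop/conf is a common default on CDH hosts (assumption).
check_core_site "${HADOOP_CONF_DIR:-/etc/hadoop/conf}"
```

If the file is missing from the directory the worker process sees (e.g. because HADOOP_CONF_DIR is unset in the task's environment), the `hadoop fs` invocation can fail exactly as in the stack trace even though earlier steps in the same task succeeded.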

