Posted to notifications@linkis.apache.org by "aiceflower (via GitHub)" <gi...@apache.org> on 2023/06/02 10:26:30 UTC

[GitHub] [linkis] aiceflower opened a new issue, #4596: [Bug] hadoop lzo-related error logs are generated after hive tasks are executed

aiceflower opened a new issue, #4596:
URL: https://github.com/apache/linkis/issues/4596

   ### Search before asking
   
   - [X] I searched the [issues](https://github.com/apache/linkis/issues) and found no similar issues.
   
   
   ### Linkis Component
   
   linkis-engineconn-plugin
   
   ### Steps to reproduce
   
   Linkis version: 1.4.0
   Command executed: linkis-cli -submitUser hadoop -engineType hive-3.1.3 -codeType hql -code "show tables"
   Problem description: when a Hive task is executed, the task succeeds and returns its results, but the log contains error output related to the hadoop-lzo package: java.lang.ClassNotFoundException: Class com.hadoop.compression.lzo.LzoCodec not found
   The log output is as follows:
   =====Java Start Command=====
exec /data/apps/jdk1.8.0_192/bin/java -server -Xms32m -Xmx2048m -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/data/Install/LinkisInstall/linkis-dev-1.4.0.20230530-205658/LinkisInstall/logs/linkis-cli -XX:ErrorFile=/data/Install/LinkisInstall/linkis-dev-1.4.0.20230530-205658/LinkisInstall/logs/linkis-cli/ps_err_pid%p.log -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=80 -XX:+DisableExplicitGC    -classpath /data/Install/LinkisInstall/linkis-dev-1.4.0.20230530-205658/LinkisInstall/conf/linkis-cli:/data/Install/LinkisInstall/linkis-dev-1.4.0.20230530-205658/LinkisInstall/lib/linkis-computation-governance/linkis-client/linkis-cli/*:/data/Install/LinkisInstall/linkis-dev-1.4.0.20230530-205658/LinkisInstall/lib/linkis-commons/public-module/*:.:/data/apps/jdk1.8.0_192/lib/dt.jar:/data/apps/jdk1.8.0_192/lib/tools.jar -Dconf.root=/data/Install/LinkisInstall/linkis-dev-1.4.0.20230530-205658/LinkisInstall/conf/linkis-cli -Dconf.file=linkis-cli.properties -Dlog.path=/data/Install/LinkisInstall/linkis-dev-1.4.0.20230530-205658/LinkisInstall/logs/linkis-cli -Dlog.file=linkis-client.hadoop.log.20230602181002173313922  org.apache.linkis.cli.application.LinkisClientApplication '-submitUser hadoop -engineType hive-3.1.3 -codeType hql -code show tables'
   [INFO] LogFile path: /data/Install/LinkisInstall/linkis-dev-1.4.0.20230530-205658/LinkisInstall/logs/linkis-cli/linkis-client.hadoop.log.20230602181002173313922
   [INFO] User does not provide usr-configuration file. Will use default config
   [INFO] user does not specify proxy-user, will use current submit-user "hadoop" by default.
   [INFO] connecting to linkis gateway:http://127.0.0.1:9001
   JobId:9
   TaskId:9
   ExecId:exec_id018010linkis-cg-entrancewds05:9104LINKISCLI_hadoop_hive_0
   [INFO] Job is successfully submitted!
   
   2023-06-02 18:10:03.010 INFO Program is substituting variables for you
   2023-06-02 18:10:03.010 INFO Variables substitution ended successfully
   2023-06-02 18:10:03.010 WARN The code you submit will not be limited by the limit
   2023-06-02 18:10:03.010 INFO Job with jobId : 9 and execID : LINKISCLI_hadoop_hive_0 submitted
   2023-06-02 18:10:03.010 INFO Your job is Scheduled. Please wait it to run.
   2023-06-02 18:10:03.010 INFO You have submitted a new job, script code (after variable substitution) is
   ************************************SCRIPT CODE************************************
   show tables
   ************************************SCRIPT CODE************************************
   2023-06-02 18:10:03.010 INFO Your job is accepted,  jobID is LINKISCLI_hadoop_hive_0 and jobReqId is 9 in ServiceInstance(linkis-cg-entrance, wds05:9104). Please wait it to be scheduled
   2023-06-02 18:10:03.010 INFO job is scheduled.
   2023-06-02 18:10:03.010 INFO Your job is being scheduled by orchestrator.
   2023-06-02 18:10:03.010 INFO Your job is Running now. Please wait it to complete.
   2023-06-02 18:10:03.010 INFO job is running.
   2023-06-02 18:10:03.010 INFO JobRequest (9) was submitted to Orchestrator.
   2023-06-02 18:10:19.010 INFO Succeed to create new ec : ServiceInstance(linkis-cg-engineconn, wds05:39041)
   2023-06-02 18:10:19.010 INFO Task submit to ec: ServiceInstance(linkis-cg-engineconn, wds05:39041) get engineConnExecId is: 1
   2023-06-02 18:10:19.010 INFO EngineConn local log path: ServiceInstance(linkis-cg-engineconn, wds05:39041) /appcom/tmp/hadoop/20230602/hive/337850bb-b19f-4e95-b642-e2538b429f58/logs
   HiveEngineExecutor_0 >> show tables
   Time taken: 160 ms, begin to fetch results.
   Fetched  1 col(s) : 0 row(s) in hive
   2023-06-02 18:10:19.135 WARN  [Linkis-Default-Scheduler-Thread-2] org.apache.linkis.engineconn.computation.executor.hook.executor.ExecuteOnceHook 53 beforeExecutorExecute [JobId-9] - execute once become effective, register lock listener
   2023-06-02 18:10:19.657 WARN  [Linkis-Default-Scheduler-Thread-2] org.apache.linkis.engineplugin.hive.executor.HiveDriverProxy 97 $anonfun$tryAndWarn$1 [JobId-9] - java.lang.reflect.InvocationTargetException: null
           at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_192]
           at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_192]
           at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_192]
           at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_192]
           at org.apache.linkis.engineplugin.hive.executor.HiveDriverProxy.$anonfun$getResults$1(HiveEngineConnExecutor.scala:650) ~[linkis-engineplugin-hive-1.4.0-SNAPSHOT.jar:1.4.0-SNAPSHOT]
           at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23) ~[scala-library-2.12.17.jar:?]
           at org.apache.linkis.common.utils.Utils$.tryCatch(Utils.scala:49) ~[linkis-common-1.4.0-SNAPSHOT.jar:1.4.0-SNAPSHOT]
           at org.apache.linkis.common.utils.Utils$.tryAndWarn(Utils.scala:85) ~[linkis-common-1.4.0-SNAPSHOT.jar:1.4.0-SNAPSHOT]
           at org.apache.linkis.engineplugin.hive.executor.HiveDriverProxy.getResults(HiveEngineConnExecutor.scala:647) ~[linkis-engineplugin-hive-1.4.0-SNAPSHOT.jar:1.4.0-SNAPSHOT]
           at org.apache.linkis.engineplugin.hive.executor.HiveEngineConnExecutor.sendResultSet(HiveEngineConnExecutor.scala:322) ~[linkis-engineplugin-hive-1.4.0-SNAPSHOT.jar:1.4.0-SNAPSHOT]
           at org.apache.linkis.engineplugin.hive.executor.HiveEngineConnExecutor.org$apache$linkis$engineplugin$hive$executor$HiveEngineConnExecutor$$executeHQL(HiveEngineConnExecutor.scala:273) ~[linkis-engineplugin-hive-1.4.0-SNAPSHOT.jar:1.4.0-SNAPSHOT]
           at org.apache.linkis.engineplugin.hive.executor.HiveEngineConnExecutor$$anon$1.run(HiveEngineConnExecutor.scala:166) ~[linkis-engineplugin-hive-1.4.0-SNAPSHOT.jar:1.4.0-SNAPSHOT]
           at org.apache.linkis.engineplugin.hive.executor.HiveEngineConnExecutor$$anon$1.run(HiveEngineConnExecutor.scala:159) ~[linkis-engineplugin-hive-1.4.0-SNAPSHOT.jar:1.4.0-SNAPSHOT]
           at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_192]
           at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_192]
           at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1878) ~[hadoop-common-3.3.4.jar:?]
           at org.apache.linkis.engineplugin.hive.executor.HiveEngineConnExecutor.executeLine(HiveEngineConnExecutor.scala:159) ~[linkis-engineplugin-hive-1.4.0-SNAPSHOT.jar:1.4.0-SNAPSHOT]
           at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor.$anonfun$toExecuteTask$10(ComputationExecutor.scala:204) ~[linkis-computation-engineconn-1.4.0-SNAPSHOT.jar:1.4.0-SNAPSHOT]
           at org.apache.linkis.common.utils.Utils$.tryCatch(Utils.scala:49) ~[linkis-common-1.4.0-SNAPSHOT.jar:1.4.0-SNAPSHOT]
           at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor.$anonfun$toExecuteTask$9(ComputationExecutor.scala:204) ~[linkis-computation-engineconn-1.4.0-SNAPSHOT.jar:1.4.0-SNAPSHOT]
           at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor.$anonfun$toExecuteTask$9$adapted(ComputationExecutor.scala:196) ~[linkis-computation-engineconn-1.4.0-SNAPSHOT.jar:1.4.0-SNAPSHOT]
           at scala.collection.immutable.Range.foreach(Range.scala:158) ~[scala-library-2.12.17.jar:?]
           at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor.$anonfun$toExecuteTask$1(ComputationExecutor.scala:196) ~[linkis-computation-engineconn-1.4.0-SNAPSHOT.jar:1.4.0-SNAPSHOT]
           at org.apache.linkis.common.utils.Utils$.tryFinally(Utils.scala:77) ~[linkis-common-1.4.0-SNAPSHOT.jar:1.4.0-SNAPSHOT]
           at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor.toExecuteTask(ComputationExecutor.scala:249) ~[linkis-computation-engineconn-1.4.0-SNAPSHOT.jar:1.4.0-SNAPSHOT]
           at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor.$anonfun$execute$2(ComputationExecutor.scala:264) ~[linkis-computation-engineconn-1.4.0-SNAPSHOT.jar:1.4.0-SNAPSHOT]
           at org.apache.linkis.common.utils.Utils$.tryFinally(Utils.scala:77) ~[linkis-common-1.4.0-SNAPSHOT.jar:1.4.0-SNAPSHOT]
           at org.apache.linkis.engineconn.acessible.executor.entity.AccessibleExecutor.ensureIdle(AccessibleExecutor.scala:62) ~[linkis-accessible-executor-1.4.0-SNAPSHOT.jar:1.4.0-SNAPSHOT]
           at org.apache.linkis.engineconn.acessible.executor.entity.AccessibleExecutor.ensureIdle(AccessibleExecutor.scala:56) ~[linkis-accessible-executor-1.4.0-SNAPSHOT.jar:1.4.0-SNAPSHOT]
           at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor.ensureOp(ComputationExecutor.scala:142) ~[linkis-computation-engineconn-1.4.0-SNAPSHOT.jar:1.4.0-SNAPSHOT]
           at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor.$anonfun$execute$1(ComputationExecutor.scala:264) ~[linkis-computation-engineconn-1.4.0-SNAPSHOT.jar:1.4.0-SNAPSHOT]
           at org.apache.linkis.common.utils.Utils$.tryFinally(Utils.scala:77) ~[linkis-common-1.4.0-SNAPSHOT.jar:1.4.0-SNAPSHOT]
           at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor.execute(ComputationExecutor.scala:271) ~[linkis-computation-engineconn-1.4.0-SNAPSHOT.jar:1.4.0-SNAPSHOT]
           at org.apache.linkis.engineconn.computation.executor.service.TaskExecutionServiceImpl.$anonfun$executeTask$1(TaskExecutionServiceImpl.scala:403) ~[linkis-computation-engineconn-1.4.0-SNAPSHOT.jar:1.4.0-SNAPSHOT]
           at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.17.jar:?]
           at org.apache.linkis.common.utils.Utils$.tryFinally(Utils.scala:77) ~[linkis-common-1.4.0-SNAPSHOT.jar:1.4.0-SNAPSHOT]
           at org.apache.linkis.engineconn.computation.executor.service.TaskExecutionServiceImpl.org$apache$linkis$engineconn$computation$executor$service$TaskExecutionServiceImpl$$executeTask(TaskExecutionServiceImpl.scala:414) ~[linkis-computation-engineconn-1.4.0-SNAPSHOT.jar:1.4.0-SNAPSHOT]
           at org.apache.linkis.engineconn.computation.executor.service.TaskExecutionServiceImpl$$anon$2.$anonfun$run$3(TaskExecutionServiceImpl.scala:330) ~[linkis-computation-engineconn-1.4.0-SNAPSHOT.jar:1.4.0-SNAPSHOT]
           at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.17.jar:?]
           at org.apache.linkis.common.utils.Utils$.tryCatch(Utils.scala:49) ~[linkis-common-1.4.0-SNAPSHOT.jar:1.4.0-SNAPSHOT]
           at org.apache.linkis.common.utils.Utils$.tryAndWarn(Utils.scala:85) ~[linkis-common-1.4.0-SNAPSHOT.jar:1.4.0-SNAPSHOT]
           at org.apache.linkis.engineconn.computation.executor.service.TaskExecutionServiceImpl$$anon$2.run(TaskExecutionServiceImpl.scala:328) ~[linkis-computation-engineconn-1.4.0-SNAPSHOT.jar:1.4.0-SNAPSHOT]
           at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_192]
           at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_192]
           at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_192]
           at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293) ~[?:1.8.0_192]
           at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_192]
           at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_192]
           at java.lang.Thread.run(Thread.java:748) ~[?:1.8.0_192]
   Caused by: java.io.IOException: java.lang.RuntimeException: Error in configuring object
           at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:602) ~[hive-exec-3.1.3.jar:3.1.3]
           at org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:509) ~[hive-exec-3.1.3.jar:3.1.3]
           at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:146) ~[hive-exec-3.1.3.jar:3.1.3]
           at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:2691) ~[hive-exec-3.1.3.jar:3.1.3]
           at org.apache.hadoop.hive.ql.reexec.ReExecDriver.getResults(ReExecDriver.java:229) ~[hive-exec-3.1.3.jar:3.1.3]
           ... 49 more
   Caused by: java.lang.RuntimeException: Error in configuring object
           at org.apache.hive.common.util.ReflectionUtil.setJobConf(ReflectionUtil.java:115) ~[hive-common-3.1.3.jar:3.1.3]
           at org.apache.hive.common.util.ReflectionUtil.setConf(ReflectionUtil.java:103) ~[hive-common-3.1.3.jar:3.1.3]
           at org.apache.hive.common.util.ReflectionUtil.newInstance(ReflectionUtil.java:87) ~[hive-common-3.1.3.jar:3.1.3]
           at org.apache.hadoop.hive.ql.exec.FetchOperator.getInputFormatFromCache(FetchOperator.java:221) ~[hive-exec-3.1.3.jar:3.1.3]
           at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextSplits(FetchOperator.java:381) ~[hive-exec-3.1.3.jar:3.1.3]
           at org.apache.hadoop.hive.ql.exec.FetchOperator.getRecordReader(FetchOperator.java:314) ~[hive-exec-3.1.3.jar:3.1.3]
           at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:540) ~[hive-exec-3.1.3.jar:3.1.3]
           at org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:509) ~[hive-exec-3.1.3.jar:3.1.3]
           at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:146) ~[hive-exec-3.1.3.jar:3.1.3]
           at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:2691) ~[hive-exec-3.1.3.jar:3.1.3]
           at org.apache.hadoop.hive.ql.reexec.ReExecDriver.getResults(ReExecDriver.java:229) ~[hive-exec-3.1.3.jar:3.1.3]
           ... 49 more
   Caused by: java.lang.reflect.InvocationTargetException
           at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_192]
           at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_192]
           at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_192]
           at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_192]
           at org.apache.hive.common.util.ReflectionUtil.setJobConf(ReflectionUtil.java:112) ~[hive-common-3.1.3.jar:3.1.3]
           at org.apache.hive.common.util.ReflectionUtil.setConf(ReflectionUtil.java:103) ~[hive-common-3.1.3.jar:3.1.3]
           at org.apache.hive.common.util.ReflectionUtil.newInstance(ReflectionUtil.java:87) ~[hive-common-3.1.3.jar:3.1.3]
           at org.apache.hadoop.hive.ql.exec.FetchOperator.getInputFormatFromCache(FetchOperator.java:221) ~[hive-exec-3.1.3.jar:3.1.3]
           at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextSplits(FetchOperator.java:381) ~[hive-exec-3.1.3.jar:3.1.3]
           at org.apache.hadoop.hive.ql.exec.FetchOperator.getRecordReader(FetchOperator.java:314) ~[hive-exec-3.1.3.jar:3.1.3]
           at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:540) ~[hive-exec-3.1.3.jar:3.1.3]
           at org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:509) ~[hive-exec-3.1.3.jar:3.1.3]
           at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:146) ~[hive-exec-3.1.3.jar:3.1.3]
           at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:2691) ~[hive-exec-3.1.3.jar:3.1.3]
           at org.apache.hadoop.hive.ql.reexec.ReExecDriver.getResults(ReExecDriver.java:229) ~[hive-exec-3.1.3.jar:3.1.3]
           ... 49 more
   Caused by: java.lang.IllegalArgumentException: Compression codec com.hadoop.compression.lzo.LzoCodec not found.
           at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:139) ~[hadoop-common-3.3.4.jar:?]
           at org.apache.hadoop.io.compress.CompressionCodecFactory.<init>(CompressionCodecFactory.java:180) ~[hadoop-common-3.3.4.jar:?]
           at org.apache.hadoop.mapred.TextInputFormat.configure(TextInputFormat.java:45) ~[hadoop-mapreduce-client-core-3.3.4.jar:?]
           at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_192]
           at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_192]
           at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_192]
           at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_192]
           at org.apache.hive.common.util.ReflectionUtil.setJobConf(ReflectionUtil.java:112) ~[hive-common-3.1.3.jar:3.1.3]
           at org.apache.hive.common.util.ReflectionUtil.setConf(ReflectionUtil.java:103) ~[hive-common-3.1.3.jar:3.1.3]
           at org.apache.hive.common.util.ReflectionUtil.newInstance(ReflectionUtil.java:87) ~[hive-common-3.1.3.jar:3.1.3]
           at org.apache.hadoop.hive.ql.exec.FetchOperator.getInputFormatFromCache(FetchOperator.java:221) ~[hive-exec-3.1.3.jar:3.1.3]
           at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextSplits(FetchOperator.java:381) ~[hive-exec-3.1.3.jar:3.1.3]
           at org.apache.hadoop.hive.ql.exec.FetchOperator.getRecordReader(FetchOperator.java:314) ~[hive-exec-3.1.3.jar:3.1.3]
           at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:540) ~[hive-exec-3.1.3.jar:3.1.3]
           at org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:509) ~[hive-exec-3.1.3.jar:3.1.3]
           at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:146) ~[hive-exec-3.1.3.jar:3.1.3]
           at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:2691) ~[hive-exec-3.1.3.jar:3.1.3]
           at org.apache.hadoop.hive.ql.reexec.ReExecDriver.getResults(ReExecDriver.java:229) ~[hive-exec-3.1.3.jar:3.1.3]
           ... 49 more
   Caused by: java.lang.ClassNotFoundException: Class com.hadoop.compression.lzo.LzoCodec not found
           at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2592) ~[hadoop-common-3.3.4.jar:?]
           at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:132) ~[hadoop-common-3.3.4.jar:?]
           at org.apache.hadoop.io.compress.CompressionCodecFactory.<init>(CompressionCodecFactory.java:180) ~[hadoop-common-3.3.4.jar:?]
           at org.apache.hadoop.mapred.TextInputFormat.configure(TextInputFormat.java:45) ~[hadoop-mapreduce-client-core-3.3.4.jar:?]
           at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_192]
           at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_192]
           at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_192]
           at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_192]
           at org.apache.hive.common.util.ReflectionUtil.setJobConf(ReflectionUtil.java:112) ~[hive-common-3.1.3.jar:3.1.3]
           at org.apache.hive.common.util.ReflectionUtil.setConf(ReflectionUtil.java:103) ~[hive-common-3.1.3.jar:3.1.3]
           at org.apache.hive.common.util.ReflectionUtil.newInstance(ReflectionUtil.java:87) ~[hive-common-3.1.3.jar:3.1.3]
           at org.apache.hadoop.hive.ql.exec.FetchOperator.getInputFormatFromCache(FetchOperator.java:221) ~[hive-exec-3.1.3.jar:3.1.3]
           at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextSplits(FetchOperator.java:381) ~[hive-exec-3.1.3.jar:3.1.3]
           at org.apache.hadoop.hive.ql.exec.FetchOperator.getRecordReader(FetchOperator.java:314) ~[hive-exec-3.1.3.jar:3.1.3]
           at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:540) ~[hive-exec-3.1.3.jar:3.1.3]
           at org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:509) ~[hive-exec-3.1.3.jar:3.1.3]
           at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:146) ~[hive-exec-3.1.3.jar:3.1.3]
           at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:2691) ~[hive-exec-3.1.3.jar:3.1.3]
           at org.apache.hadoop.hive.ql.reexec.ReExecDriver.getResults(ReExecDriver.java:229) ~[hive-exec-3.1.3.jar:3.1.3]
           ... 49 more
   2023-06-02 18:10:19.010 INFO Congratulations! Your job : LINKISCLI_hadoop_hive_0 executed with status succeed and 2 results.
    2023-06-02 18:10:19.010 INFO Task creation time: 2023-06-02 18:10:03, Task scheduling time: 2023-06-02 18:10:03, Task start time: 2023-06-02 18:10:03, Task end time: 2023-06-02 18:10:19
   2023-06-02 18:10:19.010 INFO Task submit to Orchestrator time:2023-06-02 18:10:03, Task request EngineConn time:2023-06-02 18:10:03, Task submit to EngineConn time:2023-06-02 18:10:19
    2023-06-02 18:10:19.010 INFO Your task 9 total time spent is: 16.1 s
   2023-06-02 18:10:19.010 INFO Congratulations. Your job completed with status Success.
   2023-06-02 18:10:19.010 INFO job is completed.
   
   [INFO] Job execute successfully! Will try get execute result
   ============Result:================
   TaskId:9
   ExecId: exec_id018010linkis-cg-entrancewds05:9104LINKISCLI_hadoop_hive_0
   User:hadoop
   Current job status:SUCCEED
   extraMsg:
   result:
   
   [INFO] Retrieving result-set, may take time if result-set is large, please do not exit program.
   ============ RESULT SET 1 ============
   ----------- META DATA ------------
   dataType        comment columnName
   string  from deserializer       tab_name
   ------------ END OF META DATA ------------
   
   ############Execute Success!!!########
   
   ### Expected behavior
   
   The task should execute normally, with no error output in the logs.
   
   ### Your environment
   
   jdk 1.8
   linkis 1.4.0
   hive 3.1.3
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit a PR?
   
   - [ ] Yes I am willing to submit a PR!


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: notifications-unsubscribe@linkis.apache.org.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: notifications-unsubscribe@linkis.apache.org
For additional commands, e-mail: notifications-help@linkis.apache.org


[GitHub] [linkis] casionone closed issue #4596: [Bug] hadoop lzo-related error logs are generated after hive tasks are executed

Posted by "casionone (via GitHub)" <gi...@apache.org>.
casionone closed issue #4596: [Bug] hadoop lzo-related error logs are generated after hive tasks are executed
URL: https://github.com/apache/linkis/issues/4596




[GitHub] [linkis] aiceflower commented on issue #4596: [Bug] hadoop lzo-related error logs are generated after hive tasks are executed

Posted by "aiceflower (via GitHub)" <gi...@apache.org>.
aiceflower commented on issue #4596:
URL: https://github.com/apache/linkis/issues/4596#issuecomment-1598278647

   Remove the lzo-related codec entries from $HADOOP_HOME/etc/hadoop/core-site.xml
   ![企业微信截图_16859517257917](https://github.com/apache/linkis/assets/22620332/2cc76ff8-c2b0-43e1-a1dc-883830ead11b)
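
   The stack trace shows Hadoop's CompressionCodecFactory failing to instantiate com.hadoop.compression.lzo.LzoCodec, which it finds listed in the codec configuration of core-site.xml. A minimal sketch of the kind of entry to look for is below; the property values are illustrative (your codec list will differ), and the fix the commenter describes amounts to deleting the com.hadoop.compression.lzo.* class names from it:

   ```xml
   <!-- Illustrative core-site.xml fragment (values are examples only).
        CompressionCodecFactory instantiates every class listed here, so the
        two lzo classes below trigger ClassNotFoundException whenever the
        hadoop-lzo jar is not on the process classpath. Removing them (or
        adding the jar) silences the error. -->
   <property>
     <name>io.compression.codecs</name>
     <value>org.apache.hadoop.io.compress.GzipCodec,
            org.apache.hadoop.io.compress.DefaultCodec,
            com.hadoop.compression.lzo.LzoCodec,
            com.hadoop.compression.lzo.LzopCodec,
            org.apache.hadoop.io.compress.BZip2Codec</value>
   </property>
   ```

   The alternative, if lzo-compressed data is actually in use, is to keep the entries and put the hadoop-lzo jar (plus its native library) on the EngineConn's classpath instead.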
   




[GitHub] [linkis] github-actions[bot] commented on issue #4596: [Bug] hadoop lzo-related error logs are generated after hive tasks are executed

Posted by "github-actions[bot] (via GitHub)" <gi...@apache.org>.
github-actions[bot] commented on issue #4596:
URL: https://github.com/apache/linkis/issues/4596#issuecomment-1573507176

   ## :blush:  Welcome to the Apache Linkis community!!
   We are glad that you are contributing by opening this issue.
   
   Please make sure to include all the relevant context.
   We will be here shortly.
   
   If you are interested in contributing to our website project, please let us know!
   You can check out our contributing guide on
    :point_right:  [How to Participate in Project Contribution](https://linkis.apache.org/community/how-to-contribute).
   
   
   ### Community
   
   |WeChat Assistant|WeChat Public Account|
   |-|-|
   |<img src="https://linkis.apache.org/Images/wedatasphere_contact_01.png" width="128"/>|<img src="https://linkis.apache.org/Images/gzh_01.png" width="128"/>|
   
   
   ### Mailing Lists
   |Name|Description|Subscribe|Unsubscribe|Archive|
   |:-----|:--------|:------|:-------|:-----|
   | [dev@linkis.apache.org](mailto:dev@linkis.apache.org) | community activity information | [subscribe](mailto:dev-subscribe@linkis.apache.org) | [unsubscribe](mailto:dev-unsubscribe@linkis.apache.org) | [archive](http://mail-archives.apache.org/mod_mbox/linkis-dev) |

