Posted to commits@seatunnel.apache.org by GitBox <gi...@apache.org> on 2022/04/02 06:30:06 UTC

[GitHub] [incubator-seatunnel] chenhu opened a new issue #1649: [Bug] Errors running on spark3.1.3 with kerberos

URL: https://github.com/apache/incubator-seatunnel/issues/1649


   ### Search before asking
   
   - [X] I had searched in the [issues](https://github.com/apache/incubator-seatunnel/issues?q=is%3Aissue+label%3A%22bug%22) and found no similar issues.
   
   
   ### What happened
   
   When I run the SeaTunnel example on Spark 3.1.3 with Kerberos, the errors below occur.
   Notes:
   1. I ran `kinit` with the principal successfully.
   2. I ran the example with two different masters: one was `yarn` and the other was `local[2]`.
   
   
   ### SeaTunnel Version
   
   2.10
   
   ### SeaTunnel Config
   
   ```conf
   #####
   ##### the example file in path of ~/config/spark.batch.conf.template
   #####
   ```
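   For readers without the distribution at hand, the stock `spark.batch.conf.template` referenced above is roughly of this shape (a from-memory sketch, not the reporter's exact file; values are illustrative, though they match the submit log below):

   ```conf
   env {
     # Spark job settings; the submit log below shows these same values.
     spark.app.name = "SeaTunnel"
     spark.executor.instances = 2
     spark.executor.cores = 1
     spark.executor.memory = "1g"
   }

   source {
     # Generates fake rows so the example runs without external systems.
     Fake {
       result_table_name = "my_dataset"
     }
   }

   transform {
   }

   sink {
     # Prints the dataset to stdout; this is the plugin the log fails to load.
     Console {}
   }
   ```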
   
   
   ### Running Command
   
   ```shell
    # First run:
   source /home/omm/clients/spark2x/bigdata_env
   kinit -kt /home/omm/clients/keytabs/les_task_user/user.keytab les_task_user@HADOOP.COM
   ./bin/start-seatunnel-spark.sh --master local[2] --deploy-mode client --config config/spark.batch.conf.template
   
   #######
   
    # Second run:
   source /home/omm/clients/spark2x/bigdata_env
   kinit -kt /home/omm/clients/keytabs/les_task_user/user.keytab les_task_user@HADOOP.COM
   ./bin/start-seatunnel-spark.sh --master local[2] --deploy-mode client --config config/spark.batch.conf.template
   ```
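   Aside: the note under "What happened" says the example was run with two different masters (`yarn` and `local[2]`), but both commands above are the identical `local[2]` invocation, presumably a paste slip. A hedged sketch of what the second invocation was likely meant to be (the install path is assumed from the log; the command is only printed here, since actually submitting requires the reporter's YARN cluster and Kerberos keytab):

   ```shell
   # Hypothetical second run with --master yarn (paths copied from the report).
   # Built and printed rather than executed, because a real submit needs a
   # live YARN cluster plus the Kerberos keytab from the kinit step above.
   SEATUNNEL_HOME="/home/omm/seatunnel"
   CMD="$SEATUNNEL_HOME/bin/start-seatunnel-spark.sh \
     --master yarn \
     --deploy-mode client \
     --config config/spark.batch.conf.template"
   echo "$CMD"
   ```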
   
   
   ### Error Exception
   
   ```log
   [omm@CDJJ-CN01 seatunnel]$ source /home/omm/clients/spark2x/bigdata_env
   [omm@CDJJ-CN01 seatunnel]$ kinit -kt /home/omm/clients/keytabs/les_task_user/user.keytab les_task_user@HADOOP.COM
   [omm@CDJJ-CN01 seatunnel]$ ./bin/start-seatunnel-spark.sh --master local[2] --deploy-mode client --config config/spark.batch.conf.template
   
   [INFO] spark conf: --conf "spark.app.name=SeaTunnel" --conf "spark.executor.memory=1g" --conf "spark.executor.cores=1" --conf "spark.executor.instances=2"
   Warning: Ignoring non-Spark config property: "spark.executor.memory
   Warning: Ignoring non-Spark config property: "spark.app.name
   Warning: Ignoring non-Spark config property: "spark.executor.instances
   Warning: Ignoring non-Spark config property: "spark.executor.cores
   2022-04-02 20:01:33,791 | WARN  | main | The configuration key 'spark.yarn.access.hadoopFileSystems' has been deprecated as of Spark 3.0 and may be removed in the future. Please use the new key 'spark.kerberos.access.hadoopFileSystems' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:01:33,795 | WARN  | main | The configuration key 'spark.yarn.kerberos.relogin.period' has been deprecated as of Spark 3.0 and may be removed in the future. Please use the new key 'spark.kerberos.relogin.period' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:01:33,796 | WARN  | main | The configuration key 'spark.executor.plugins' has been deprecated as of Spark 3.0.0 and may be removed in the future. Feature replaced with new plugin API. See Monitoring documentation. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:01:33,796 | WARN  | main | The configuration key 'spark.reducer.maxReqSizeShuffleToMem' has been deprecated as of Spark 2.3 and may be removed in the future. Please use the new key 'spark.network.maxRemoteBlockSizeFetchToMem' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:01:33,918 | WARN  | main | The configuration key 'spark.yarn.access.hadoopFileSystems' has been deprecated as of Spark 3.0 and may be removed in the future. Please use the new key 'spark.kerberos.access.hadoopFileSystems' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:01:33,919 | WARN  | main | The configuration key 'spark.yarn.kerberos.relogin.period' has been deprecated as of Spark 3.0 and may be removed in the future. Please use the new key 'spark.kerberos.relogin.period' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:01:33,919 | WARN  | main | The configuration key 'spark.executor.plugins' has been deprecated as of Spark 3.0.0 and may be removed in the future. Feature replaced with new plugin API. See Monitoring documentation. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:01:33,919 | WARN  | main | The configuration key 'spark.reducer.maxReqSizeShuffleToMem' has been deprecated as of Spark 2.3 and may be removed in the future. Please use the new key 'spark.network.maxRemoteBlockSizeFetchToMem' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:01:34,955 | WARN  | main | The configuration key 'spark.executor.plugins' has been deprecated as of Spark 3.0.0 and may be removed in the future. Feature replaced with new plugin API. See Monitoring documentation. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:01:34,955 | WARN  | main | The configuration key 'spark.reducer.maxReqSizeShuffleToMem' has been deprecated as of Spark 2.3 and may be removed in the future. Please use the new key 'spark.network.maxRemoteBlockSizeFetchToMem' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:01:34,956 | WARN  | main | The configuration key 'spark.yarn.kerberos.relogin.period' has been deprecated as of Spark 3.0 and may be removed in the future. Please use the new key 'spark.kerberos.relogin.period' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:01:34,956 | WARN  | main | The configuration key 'spark.yarn.access.hadoopFileSystems' has been deprecated as of Spark 3.0 and may be removed in the future. Please use the new key 'spark.kerberos.access.hadoopFileSystems' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:01:34,986 | WARN  | main | The configuration key 'spark.executor.plugins' has been deprecated as of Spark 3.0.0 and may be removed in the future. Feature replaced with new plugin API. See Monitoring documentation. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:01:34,987 | WARN  | main | The configuration key 'spark.reducer.maxReqSizeShuffleToMem' has been deprecated as of Spark 2.3 and may be removed in the future. Please use the new key 'spark.network.maxRemoteBlockSizeFetchToMem' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:01:34,987 | WARN  | main | The configuration key 'spark.yarn.kerberos.relogin.period' has been deprecated as of Spark 3.0 and may be removed in the future. Please use the new key 'spark.kerberos.relogin.period' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:01:34,987 | WARN  | main | The configuration key 'spark.yarn.access.hadoopFileSystems' has been deprecated as of Spark 3.0 and may be removed in the future. Please use the new key 'spark.kerberos.access.hadoopFileSystems' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:01:34,989 | WARN  | main | The configuration key 'spark.yarn.access.hadoopFileSystems' has been deprecated as of Spark 3.0 and may be removed in the future. Please use the new key 'spark.kerberos.access.hadoopFileSystems' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:01:34,989 | WARN  | main | The configuration key 'spark.yarn.kerberos.relogin.period' has been deprecated as of Spark 3.0 and may be removed in the future. Please use the new key 'spark.kerberos.relogin.period' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:01:34,990 | WARN  | main | The configuration key 'spark.executor.plugins' has been deprecated as of Spark 3.0.0 and may be removed in the future. Feature replaced with new plugin API. See Monitoring documentation. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:01:34,990 | WARN  | main | The configuration key 'spark.reducer.maxReqSizeShuffleToMem' has been deprecated as of Spark 2.3 and may be removed in the future. Please use the new key 'spark.network.maxRemoteBlockSizeFetchToMem' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:01:35,037 | WARN  | main | Note that spark.local.dir will be overridden by the value set by the cluster manager (via SPARK_LOCAL_DIRS in mesos/standalone/kubernetes and LOCAL_DIRS in YARN). | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:01:35,868 | WARN  | main | The configuration key 'spark.executor.plugins' has been deprecated as of Spark 3.0.0 and may be removed in the future. Feature replaced with new plugin API. See Monitoring documentation. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:01:35,868 | WARN  | main | The configuration key 'spark.reducer.maxReqSizeShuffleToMem' has been deprecated as of Spark 2.3 and may be removed in the future. Please use the new key 'spark.network.maxRemoteBlockSizeFetchToMem' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:01:35,869 | WARN  | main | The configuration key 'spark.yarn.kerberos.relogin.period' has been deprecated as of Spark 3.0 and may be removed in the future. Please use the new key 'spark.kerberos.relogin.period' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:01:35,869 | WARN  | main | The configuration key 'spark.yarn.access.hadoopFileSystems' has been deprecated as of Spark 3.0 and may be removed in the future. Please use the new key 'spark.kerberos.access.hadoopFileSystems' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:01:35,874 | WARN  | main | The configuration key 'spark.executor.plugins' has been deprecated as of Spark 3.0.0 and may be removed in the future. Feature replaced with new plugin API. See Monitoring documentation. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:01:35,875 | WARN  | main | The configuration key 'spark.reducer.maxReqSizeShuffleToMem' has been deprecated as of Spark 2.3 and may be removed in the future. Please use the new key 'spark.network.maxRemoteBlockSizeFetchToMem' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:01:35,875 | WARN  | main | The configuration key 'spark.yarn.kerberos.relogin.period' has been deprecated as of Spark 3.0 and may be removed in the future. Please use the new key 'spark.kerberos.relogin.period' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:01:35,875 | WARN  | main | The configuration key 'spark.yarn.access.hadoopFileSystems' has been deprecated as of Spark 3.0 and may be removed in the future. Please use the new key 'spark.kerberos.access.hadoopFileSystems' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:01:35,876 | WARN  | main | The configuration key 'spark.executor.plugins' has been deprecated as of Spark 3.0.0 and may be removed in the future. Feature replaced with new plugin API. See Monitoring documentation. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:01:35,876 | WARN  | main | The configuration key 'spark.reducer.maxReqSizeShuffleToMem' has been deprecated as of Spark 2.3 and may be removed in the future. Please use the new key 'spark.network.maxRemoteBlockSizeFetchToMem' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:01:35,877 | WARN  | main | The configuration key 'spark.yarn.kerberos.relogin.period' has been deprecated as of Spark 3.0 and may be removed in the future. Please use the new key 'spark.kerberos.relogin.period' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:01:35,877 | WARN  | main | The configuration key 'spark.yarn.access.hadoopFileSystems' has been deprecated as of Spark 3.0 and may be removed in the future. Please use the new key 'spark.kerberos.access.hadoopFileSystems' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
    2022-04-02 20:01:36,034 | WARN  | main | The jar file:/home/omm/seatunnel/lib/seatunnel-core-spark.jar has been added already. Overwriting of added jars is not supported in the current version. | org.apache.spark.SparkContext.logWarning(Logging.scala:69)
   2022-04-02 20:01:53,286 | ERROR | dispatcher-BlockManagerMaster | Ignoring error | org.apache.spark.rpc.netty.Inbox.logError(Logging.scala:94)
   java.lang.NullPointerException
           at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:524)
           at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:116)
           at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103)
           at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213)
           at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100)
           at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75)
           at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41)
           at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
           at java.util.concurrent.FutureTask.run(FutureTask.java:266)
           at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
           at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
           at java.lang.Thread.run(Thread.java:748)
   2022-04-02 20:01:53,286 | WARN  | executor-heartbeater | Issue communicating with driver in heartbeater | org.apache.spark.executor.Executor.logWarning(Logging.scala:90)
   org.apache.spark.SparkException: Exception thrown in awaitResult:
           at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301)
           at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
           at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103)
           at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87)
           at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80)
           at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:591)
           at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1036)
           at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:213)
           at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
           at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2112)
           at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46)
           at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
           at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
           at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
           at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
           at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
           at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
           at java.lang.Thread.run(Thread.java:748)
   Caused by: java.lang.NullPointerException
           at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:524)
           at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:116)
           at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103)
           at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213)
           at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100)
           at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75)
           at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41)
           at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
           at java.util.concurrent.FutureTask.run(FutureTask.java:266)
           ... 3 more
   2022-04-02 20:02:03,249 | ERROR | dispatcher-BlockManagerMaster | Ignoring error | org.apache.spark.rpc.netty.Inbox.logError(Logging.scala:94)
   java.lang.NullPointerException
           at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:524)
           at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:116)
           at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103)
           at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213)
           at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100)
           at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75)
           at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41)
           at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
           at java.util.concurrent.FutureTask.run(FutureTask.java:266)
           at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
           at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
           at java.lang.Thread.run(Thread.java:748)
   2022-04-02 20:02:03,249 | WARN  | executor-heartbeater | Issue communicating with driver in heartbeater | org.apache.spark.executor.Executor.logWarning(Logging.scala:90)
   org.apache.spark.SparkException: Exception thrown in awaitResult:
           at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301)
           at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
           at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103)
           at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87)
           at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80)
           at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:591)
           at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1036)
           at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:213)
           at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
           at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2112)
           at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46)
           at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
           at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
           at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
           at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
           at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
           at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
           at java.lang.Thread.run(Thread.java:748)
   Caused by: java.lang.NullPointerException
           at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:524)
           at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:116)
           at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103)
           at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213)
           at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100)
           at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75)
           at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41)
           at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
           at java.util.concurrent.FutureTask.run(FutureTask.java:266)
           ... 3 more
   2022-04-02 20:02:13,248 | WARN  | executor-heartbeater | Issue communicating with driver in heartbeater | org.apache.spark.executor.Executor.logWarning(Logging.scala:90)
   org.apache.spark.SparkException: Exception thrown in awaitResult:
           at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301)
           at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
           at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103)
           at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87)
           at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80)
           at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:591)
           at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1036)
           at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:213)
           at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
           at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2112)
           at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46)
           at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
           at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
           at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
           at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
           at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
           at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
           at java.lang.Thread.run(Thread.java:748)
   Caused by: java.lang.NullPointerException
           at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:524)
           at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:116)
           at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103)
           at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213)
           at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100)
           at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75)
           at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41)
           at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
           at java.util.concurrent.FutureTask.run(FutureTask.java:266)
           ... 3 more
   2022-04-02 20:02:13,248 | ERROR | dispatcher-BlockManagerMaster | Ignoring error | org.apache.spark.rpc.netty.Inbox.logError(Logging.scala:94)
   java.lang.NullPointerException
           at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:524)
           at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:116)
           at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103)
           at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213)
           at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100)
           at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75)
           at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41)
           at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
           at java.util.concurrent.FutureTask.run(FutureTask.java:266)
           at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
           at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
           at java.lang.Thread.run(Thread.java:748)
   2022-04-02 20:02:16,728 | WARN  | main | The SQL config 'spark.sql.hive.verifyPartitionPath' has been deprecated in Spark v3.0 and may be removed in the future. This config is replaced by 'spark.files.ignoreMissingFiles'. | org.apache.spark.sql.internal.SQLConf.logWarning(Logging.scala:69)
   2022-04-02 20:02:16,729 | WARN  | main | The SQL config 'spark.sql.execution.arrow.fallback.enabled' has been deprecated in Spark v3.0 and may be removed in the future. Use 'spark.sql.execution.arrow.pyspark.fallback.enabled' instead of it. | org.apache.spark.sql.internal.SQLConf.logWarning(Logging.scala:69)
   Hive Session ID = 81da1997-ec51-4a96-92e6-27724527f440
   2022-04-02 20:02:17,060 | WARN  | main | load mapred-default.xml, HIVE_CONF_DIR env not found! | org.apache.hadoop.hive.ql.session.SessionState.loadMapredDefaultXml(SessionState.java:1461)
   2022-04-02 20:02:17,110 | WARN  | main | METASTORE_FILTER_HOOK will be ignored, since hive.security.authorization.manager is set to instance of HiveAuthorizerFactory. | org.apache.hadoop.hive.ql.session.SessionState.setAuthorizerV2Config(SessionState.java:1046)
   2022-04-02 20:02:18,020 | WARN  | main | The SQL config 'spark.sql.hive.verifyPartitionPath' has been deprecated in Spark v3.0 and may be removed in the future. This config is replaced by 'spark.files.ignoreMissingFiles'. | org.apache.spark.sql.internal.SQLConf.logWarning(Logging.scala:69)
   2022-04-02 20:02:18,023 | WARN  | main | The SQL config 'spark.sql.execution.arrow.fallback.enabled' has been deprecated in Spark v3.0 and may be removed in the future. Use 'spark.sql.execution.arrow.pyspark.fallback.enabled' instead of it. | org.apache.spark.sql.internal.SQLConf.logWarning(Logging.scala:69)
   2022-04-02 20:02:18,027 | WARN  | main | The SQL config 'spark.sql.hive.verifyPartitionPath' has been deprecated in Spark v3.0 and may be removed in the future. This config is replaced by 'spark.files.ignoreMissingFiles'. | org.apache.spark.sql.internal.SQLConf.logWarning(Logging.scala:69)
   2022-04-02 20:02:18,028 | WARN  | main | The SQL config 'spark.sql.execution.arrow.fallback.enabled' has been deprecated in Spark v3.0 and may be removed in the future. Use 'spark.sql.execution.arrow.pyspark.fallback.enabled' instead of it. | org.apache.spark.sql.internal.SQLConf.logWarning(Logging.scala:69)
   2022-04-02 20:02:18,029 | WARN  | main | The SQL config 'spark.sql.hive.verifyPartitionPath' has been deprecated in Spark v3.0 and may be removed in the future. This config is replaced by 'spark.files.ignoreMissingFiles'. | org.apache.spark.sql.internal.SQLConf.logWarning(Logging.scala:69)
   2022-04-02 20:02:18,030 | WARN  | main | The SQL config 'spark.sql.execution.arrow.fallback.enabled' has been deprecated in Spark v3.0 and may be removed in the future. Use 'spark.sql.execution.arrow.pyspark.fallback.enabled' instead of it. | org.apache.spark.sql.internal.SQLConf.logWarning(Logging.scala:69)
   2022-04-02 20:02:18,662 | WARN  | main | The SQL config 'spark.sql.hive.verifyPartitionPath' has been deprecated in Spark v3.0 and may be removed in the future. This config is replaced by 'spark.files.ignoreMissingFiles'. | org.apache.spark.sql.SQLConf.logWarning(Logging.scala:69)
   2022-04-02 20:02:18,663 | WARN  | main | The SQL config 'spark.sql.execution.arrow.fallback.enabled' has been deprecated in Spark v3.0 and may be removed in the future. Use 'spark.sql.execution.arrow.pyspark.fallback.enabled' instead of it. | org.apache.spark.sql.SQLConf.logWarning(Logging.scala:69)
   2022-04-02 20:02:19,790 | WARN  | main | Error when load plugin: [org.apache.seatunnel.spark.sink.Console] | org.apache.seatunnel.config.ConfigBuilder.createPluginInstanceIgnoreCase(ConfigBuilder.java:147)
   java.util.ServiceConfigurationError: org.apache.seatunnel.spark.BaseSparkSink: Provider org.apache.seatunnel.spark.sink.Kafka could not be instantiated
           at java.util.ServiceLoader.fail(ServiceLoader.java:232)
           at java.util.ServiceLoader.access$100(ServiceLoader.java:185)
           at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:384)
           at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
           at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
           at org.apache.seatunnel.config.ConfigBuilder.createPluginInstanceIgnoreCase(ConfigBuilder.java:137)
           at org.apache.seatunnel.config.ConfigBuilder.lambda$createPlugins$0(ConfigBuilder.java:170)
           at java.util.ArrayList.forEach(ArrayList.java:1259)
           at org.apache.seatunnel.config.ConfigBuilder.createPlugins(ConfigBuilder.java:168)
           at org.apache.seatunnel.Seatunnel.entryPoint(Seatunnel.java:101)
           at org.apache.seatunnel.Seatunnel.run(Seatunnel.java:65)
           at org.apache.seatunnel.SeatunnelSpark.main(SeatunnelSpark.java:29)
           at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
           at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
           at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
           at java.lang.reflect.Method.invoke(Method.java:498)
           at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
           at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:993)
           at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:183)
           at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:206)
           at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:93)
           at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1081)
           at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1090)
           at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   Caused by: java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging$class
           at org.apache.seatunnel.spark.sink.Kafka.<init>(Kafka.scala:31)
           at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
           at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
           at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
           at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
           at java.lang.Class.newInstance(Class.java:442)
           at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380)
           ... 21 more
   Caused by: java.lang.ClassNotFoundException: org.apache.spark.internal.Logging$class
           at java.net.URLClassLoader.findClass(URLClassLoader.java:402)
           at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
           at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
           ... 28 more
   2022-04-02 20:02:19,797 | WARN  | main | Error when load plugin: [org.apache.seatunnel.spark.sink.Console] | org.apache.seatunnel.config.ConfigBuilder.createPluginInstanceIgnoreCase(ConfigBuilder.java:147)
   java.util.ServiceConfigurationError: org.apache.seatunnel.spark.BaseSparkSink: Provider org.apache.seatunnel.spark.sink.Hive could not be instantiated
           at java.util.ServiceLoader.fail(ServiceLoader.java:232)
           at java.util.ServiceLoader.access$100(ServiceLoader.java:185)
           at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:384)
           at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
           at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
           at org.apache.seatunnel.config.ConfigBuilder.createPluginInstanceIgnoreCase(ConfigBuilder.java:137)
           at org.apache.seatunnel.config.ConfigBuilder.lambda$createPlugins$0(ConfigBuilder.java:170)
           at java.util.ArrayList.forEach(ArrayList.java:1259)
           at org.apache.seatunnel.config.ConfigBuilder.createPlugins(ConfigBuilder.java:168)
           at org.apache.seatunnel.Seatunnel.entryPoint(Seatunnel.java:101)
           at org.apache.seatunnel.Seatunnel.run(Seatunnel.java:65)
           at org.apache.seatunnel.SeatunnelSpark.main(SeatunnelSpark.java:29)
           at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
           at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
           at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
           at java.lang.reflect.Method.invoke(Method.java:498)
           at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
           at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:993)
           at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:183)
           at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:206)
           at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:93)
           at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1081)
           at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1090)
           at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   Caused by: java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging$class
           at org.apache.seatunnel.spark.sink.Hive.<init>(Hive.scala:29)
           at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
           at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
           at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
           at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
           at java.lang.Class.newInstance(Class.java:442)
           at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380)
           ... 21 more
   Caused by: java.lang.ClassNotFoundException: org.apache.spark.internal.Logging$class
           at java.net.URLClassLoader.findClass(URLClassLoader.java:402)
           at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
           at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
           ... 28 more
   2022-04-02 20:02:19,804 | WARN  | main | Error when load plugin: [org.apache.seatunnel.spark.sink.Console] | org.apache.seatunnel.config.ConfigBuilder.createPluginInstanceIgnoreCase(ConfigBuilder.java:147)
   java.util.ServiceConfigurationError: org.apache.seatunnel.spark.BaseSparkSink: Provider org.apache.seatunnel.spark.sink.Phoenix could not be instantiated
           at java.util.ServiceLoader.fail(ServiceLoader.java:232)
           at java.util.ServiceLoader.access$100(ServiceLoader.java:185)
           at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:384)
           at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
           at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
           at org.apache.seatunnel.config.ConfigBuilder.createPluginInstanceIgnoreCase(ConfigBuilder.java:137)
           at org.apache.seatunnel.config.ConfigBuilder.lambda$createPlugins$0(ConfigBuilder.java:170)
           at java.util.ArrayList.forEach(ArrayList.java:1259)
           at org.apache.seatunnel.config.ConfigBuilder.createPlugins(ConfigBuilder.java:168)
           at org.apache.seatunnel.Seatunnel.entryPoint(Seatunnel.java:101)
           at org.apache.seatunnel.Seatunnel.run(Seatunnel.java:65)
           at org.apache.seatunnel.SeatunnelSpark.main(SeatunnelSpark.java:29)
           at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
           at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
           at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
           at java.lang.reflect.Method.invoke(Method.java:498)
           at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
           at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:993)
           at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:183)
           at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:206)
           at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:93)
           at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1081)
           at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1090)
           at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   Caused by: java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging$class
           at org.apache.seatunnel.spark.sink.Phoenix.<init>(Phoenix.scala:29)
           at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
           at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
           at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
           at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
           at java.lang.Class.newInstance(Class.java:442)
           at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380)
           ... 21 more
   Caused by: java.lang.ClassNotFoundException: org.apache.spark.internal.Logging$class
           at java.net.URLClassLoader.findClass(URLClassLoader.java:402)
           at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
           at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
           ... 28 more
   2022-04-02 20:02:19,812 | WARN  | main | Error when load plugin: [org.apache.seatunnel.spark.sink.Console] | org.apache.seatunnel.config.ConfigBuilder.createPluginInstanceIgnoreCase(ConfigBuilder.java:147)
   java.util.ServiceConfigurationError: org.apache.seatunnel.spark.BaseSparkSink: Provider org.apache.seatunnel.spark.sink.Redis could not be instantiated
           at java.util.ServiceLoader.fail(ServiceLoader.java:232)
           at java.util.ServiceLoader.access$100(ServiceLoader.java:185)
           at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:384)
           at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
           at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
           at org.apache.seatunnel.config.ConfigBuilder.createPluginInstanceIgnoreCase(ConfigBuilder.java:137)
           at org.apache.seatunnel.config.ConfigBuilder.lambda$createPlugins$0(ConfigBuilder.java:170)
           at java.util.ArrayList.forEach(ArrayList.java:1259)
           at org.apache.seatunnel.config.ConfigBuilder.createPlugins(ConfigBuilder.java:168)
           at org.apache.seatunnel.Seatunnel.entryPoint(Seatunnel.java:101)
           at org.apache.seatunnel.Seatunnel.run(Seatunnel.java:65)
           at org.apache.seatunnel.SeatunnelSpark.main(SeatunnelSpark.java:29)
           at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
           at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
           at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
           at java.lang.reflect.Method.invoke(Method.java:498)
           at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
           at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:993)
           at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:183)
           at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:206)
           at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:93)
           at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1081)
           at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1090)
           at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   Caused by: java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging$class
           at org.apache.seatunnel.spark.sink.Redis.<init>(Redis.scala:33)
           at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
           at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
           at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
           at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
           at java.lang.Class.newInstance(Class.java:442)
           at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380)
           ... 21 more
   Caused by: java.lang.ClassNotFoundException: org.apache.spark.internal.Logging$class
           at java.net.URLClassLoader.findClass(URLClassLoader.java:402)
           at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
           at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
           ... 28 more
   2022-04-02 20:02:20,984 | WARN  | main | The enable mv value "null" is invalid. Using the default value "false" | org.apache.carbondata.core.util.CarbonProperties.validateEnableMV(CarbonProperties.java:511)
   2022-04-02 20:02:20,998 | WARN  | main | The value "LOCALLOCK" configured for key carbon.lock.type is invalid for current file system. Use the default value HDFSLOCK instead. | org.apache.carbondata.core.util.CarbonProperties.validateAndConfigureLockType(CarbonProperties.java:440)
   +------------------+
   |raw_message       |
   +------------------+
   |Hello garyelephant|
   |Hello rickyhuo    |
   |Hello kid-xiong   |
   +------------------+
   ```
   
   
   ### Flink or Spark Version
   
   Spark 3.1.3
   
   ### Java or Scala Version
   
   Java 8
   
   ### Screenshots
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of Conduct](https://www.apache.org/foundation/policies/conduct)
   





[GitHub] [incubator-seatunnel] chenhu commented on issue #1649: [Bug] Errors running on spark3.1.3 with kerberos

Posted by GitBox <gi...@apache.org>.
chenhu commented on issue #1649:
URL: https://github.com/apache/incubator-seatunnel/issues/1649#issuecomment-1086564190


   The exception when running with `--master yarn`:
   
   [omm@CDJJ-CN01 seatunnel]$ source /home/omm/clients/spark2x/bigdata_env
   [omm@CDJJ-CN01 seatunnel]$ kinit -kt /home/omm/clients/keytabs/les_task_user/user.keytab les_task_user@HADOOP.COM
   [omm@CDJJ-CN01 seatunnel]$ ./bin/start-seatunnel-spark.sh --master yarn --deploy-mode client --config config/spark.batch.conf.template
   [INFO] spark conf: --conf "spark.app.name=SeaTunnel" --conf "spark.executor.memory=1g" --conf "spark.executor.cores=1" --conf "spark.executor.instances=2"
   Warning: Ignoring non-Spark config property: "spark.executor.memory
   Warning: Ignoring non-Spark config property: "spark.app.name
   Warning: Ignoring non-Spark config property: "spark.executor.instances
   Warning: Ignoring non-Spark config property: "spark.executor.cores
   2022-04-02 20:12:22,398 | WARN  | main | The configuration key 'spark.yarn.access.hadoopFileSystems' has been deprecated as of Spark 3.0 and may be removed in the future. Please use the new key 'spark.kerberos.access.hadoopFileSystems' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:22,402 | WARN  | main | The configuration key 'spark.yarn.kerberos.relogin.period' has been deprecated as of Spark 3.0 and may be removed in the future. Please use the new key 'spark.kerberos.relogin.period' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:22,403 | WARN  | main | The configuration key 'spark.executor.plugins' has been deprecated as of Spark 3.0.0 and may be removed in the future. Feature replaced with new plugin API. See Monitoring documentation. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:22,403 | WARN  | main | The configuration key 'spark.reducer.maxReqSizeShuffleToMem' has been deprecated as of Spark 2.3 and may be removed in the future. Please use the new key 'spark.network.maxRemoteBlockSizeFetchToMem' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:22,511 | WARN  | main | The configuration key 'spark.yarn.access.hadoopFileSystems' has been deprecated as of Spark 3.0 and may be removed in the future. Please use the new key 'spark.kerberos.access.hadoopFileSystems' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:22,512 | WARN  | main | The configuration key 'spark.yarn.kerberos.relogin.period' has been deprecated as of Spark 3.0 and may be removed in the future. Please use the new key 'spark.kerberos.relogin.period' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:22,512 | WARN  | main | The configuration key 'spark.executor.plugins' has been deprecated as of Spark 3.0.0 and may be removed in the future. Feature replaced with new plugin API. See Monitoring documentation. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:22,512 | WARN  | main | The configuration key 'spark.reducer.maxReqSizeShuffleToMem' has been deprecated as of Spark 2.3 and may be removed in the future. Please use the new key 'spark.network.maxRemoteBlockSizeFetchToMem' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:23,525 | WARN  | main | The configuration key 'spark.executor.plugins' has been deprecated as of Spark 3.0.0 and may be removed in the future. Feature replaced with new plugin API. See Monitoring documentation. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:23,525 | WARN  | main | The configuration key 'spark.reducer.maxReqSizeShuffleToMem' has been deprecated as of Spark 2.3 and may be removed in the future. Please use the new key 'spark.network.maxRemoteBlockSizeFetchToMem' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:23,526 | WARN  | main | The configuration key 'spark.yarn.kerberos.relogin.period' has been deprecated as of Spark 3.0 and may be removed in the future. Please use the new key 'spark.kerberos.relogin.period' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:23,526 | WARN  | main | The configuration key 'spark.yarn.access.hadoopFileSystems' has been deprecated as of Spark 3.0 and may be removed in the future. Please use the new key 'spark.kerberos.access.hadoopFileSystems' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:23,569 | WARN  | main | The configuration key 'spark.executor.plugins' has been deprecated as of Spark 3.0.0 and may be removed in the future. Feature replaced with new plugin API. See Monitoring documentation. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:23,569 | WARN  | main | The configuration key 'spark.reducer.maxReqSizeShuffleToMem' has been deprecated as of Spark 2.3 and may be removed in the future. Please use the new key 'spark.network.maxRemoteBlockSizeFetchToMem' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:23,569 | WARN  | main | The configuration key 'spark.yarn.kerberos.relogin.period' has been deprecated as of Spark 3.0 and may be removed in the future. Please use the new key 'spark.kerberos.relogin.period' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:23,569 | WARN  | main | The configuration key 'spark.yarn.access.hadoopFileSystems' has been deprecated as of Spark 3.0 and may be removed in the future. Please use the new key 'spark.kerberos.access.hadoopFileSystems' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:23,571 | WARN  | main | The configuration key 'spark.yarn.access.hadoopFileSystems' has been deprecated as of Spark 3.0 and may be removed in the future. Please use the new key 'spark.kerberos.access.hadoopFileSystems' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:23,571 | WARN  | main | The configuration key 'spark.yarn.kerberos.relogin.period' has been deprecated as of Spark 3.0 and may be removed in the future. Please use the new key 'spark.kerberos.relogin.period' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:23,572 | WARN  | main | The configuration key 'spark.executor.plugins' has been deprecated as of Spark 3.0.0 and may be removed in the future. Feature replaced with new plugin API. See Monitoring documentation. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:23,572 | WARN  | main | The configuration key 'spark.reducer.maxReqSizeShuffleToMem' has been deprecated as of Spark 2.3 and may be removed in the future. Please use the new key 'spark.network.maxRemoteBlockSizeFetchToMem' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:23,620 | WARN  | main | Note that spark.local.dir will be overridden by the value set by the cluster manager (via SPARK_LOCAL_DIRS in mesos/standalone/kubernetes and LOCAL_DIRS in YARN). | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:24,530 | WARN  | main | The configuration key 'spark.executor.plugins' has been deprecated as of Spark 3.0.0 and may be removed in the future. Feature replaced with new plugin API. See Monitoring documentation. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:24,531 | WARN  | main | The configuration key 'spark.reducer.maxReqSizeShuffleToMem' has been deprecated as of Spark 2.3 and may be removed in the future. Please use the new key 'spark.network.maxRemoteBlockSizeFetchToMem' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:24,531 | WARN  | main | The configuration key 'spark.yarn.kerberos.relogin.period' has been deprecated as of Spark 3.0 and may be removed in the future. Please use the new key 'spark.kerberos.relogin.period' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:24,531 | WARN  | main | The configuration key 'spark.yarn.access.hadoopFileSystems' has been deprecated as of Spark 3.0 and may be removed in the future. Please use the new key 'spark.kerberos.access.hadoopFileSystems' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:24,536 | WARN  | main | The configuration key 'spark.executor.plugins' has been deprecated as of Spark 3.0.0 and may be removed in the future. Feature replaced with new plugin API. See Monitoring documentation. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:24,537 | WARN  | main | The configuration key 'spark.reducer.maxReqSizeShuffleToMem' has been deprecated as of Spark 2.3 and may be removed in the future. Please use the new key 'spark.network.maxRemoteBlockSizeFetchToMem' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:24,537 | WARN  | main | The configuration key 'spark.yarn.kerberos.relogin.period' has been deprecated as of Spark 3.0 and may be removed in the future. Please use the new key 'spark.kerberos.relogin.period' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:24,537 | WARN  | main | The configuration key 'spark.yarn.access.hadoopFileSystems' has been deprecated as of Spark 3.0 and may be removed in the future. Please use the new key 'spark.kerberos.access.hadoopFileSystems' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:24,541 | WARN  | main | The configuration key 'spark.executor.plugins' has been deprecated as of Spark 3.0.0 and may be removed in the future. Feature replaced with new plugin API. See Monitoring documentation. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:24,542 | WARN  | main | The configuration key 'spark.reducer.maxReqSizeShuffleToMem' has been deprecated as of Spark 2.3 and may be removed in the future. Please use the new key 'spark.network.maxRemoteBlockSizeFetchToMem' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:24,542 | WARN  | main | The configuration key 'spark.yarn.kerberos.relogin.period' has been deprecated as of Spark 3.0 and may be removed in the future. Please use the new key 'spark.kerberos.relogin.period' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:24,543 | WARN  | main | The configuration key 'spark.yarn.access.hadoopFileSystems' has been deprecated as of Spark 3.0 and may be removed in the future. Please use the new key 'spark.kerberos.access.hadoopFileSystems' instead. | org.apache.spark.SparkConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:24,865 | WARN  | main | spark.yarn.security.credentials.hbase.enabled is deprecated.  Please use spark.security.credentials.hbase.enabled instead. | org.apache.spark.deploy.security.HadoopDelegationTokenManager.logWarning(Logging.scala:69)
   2022-04-02 20:12:26,484 | WARN  | main | Got unknown resource type: yarn.io/gpu; skipping | org.apache.hadoop.yarn.api.records.impl.pb.ResourcePBImpl.initResources(ResourcePBImpl.java:145)
   2022-04-02 20:12:42,937 | WARN  | dispatcher-event-loop-2 | Requesting driver to remove executor 2 for reason Container from a bad node: container_e01_1648207397321_0039_01_000002 on host: CDJJ-HDFS-CH-Hbase-Yarn02. Exit status: 1. Diagnostics: [2022-04-02 20:12:41.134]Exception from container-launch.
   Container id: container_e01_1648207397321_0039_01_000002
   Exit code: 1
   Exception message: Launch container failed
   Shell output: main : command provided 1
   main : run as user is les_task_user
   main : requested yarn user is les_task_user
   Getting exit code file...
   Creating script paths...
   Writing pid file...
   Writing to tmp file /srv/BigData/hadoop/data2/nm/localdir/nmPrivate/application_1648207397321_0039/container_e01_1648207397321_0039_01_000002/container_e01_1648207397321_0039_01_000002.pid.tmp
   Writing to cgroup task files...
   Creating local dirs...
   Launching container...
   
   
   [2022-04-02 20:12:41.138]Container exited with a non-zero exit code 1. Error file: prelaunch.err.
   Last 4096 bytes of prelaunch.err :
   Last 4096 bytes of stderr :
   g.apache.log4j.FileAppender.setFile(FileAppender.java:294)
           at org.apache.log4j.RollingFileAppender.setFile(RollingFileAppender.java:207)
           at org.wcc.framework.log.SizeRollingFileAppender.setFile(SizeRollingFileAppender.java:65)
           at com.huawei.spark.utils.SizeRollingZipFileATime4AuditAppender.setFile(SizeRollingZipFileATime4AuditAppender.java:19)
           at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
           at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
           at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
           at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
           at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
           at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
           at org.apache.log4j.PropertyConfigurator.parseCatsAndRenderers(PropertyConfigurator.java:672)
           at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:516)
           at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
           at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
           at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
           at org.slf4j.impl.Log4jLoggerFactory.<init>(Log4jLoggerFactory.java:66)
           at org.slf4j.impl.StaticLoggerBinder.<init>(StaticLoggerBinder.java:72)
           at org.slf4j.impl.StaticLoggerBinder.<clinit>(StaticLoggerBinder.java:45)
           at org.slf4j.LoggerFactory.bind(LoggerFactory.java:150)
           at org.slf4j.LoggerFactory.performInitialization(LoggerFactory.java:124)
           at org.slf4j.LoggerFactory.getILoggerFactory(LoggerFactory.java:417)
           at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:362)
           at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:388)
           at com.huawei.hadoop.dynalogger.DynaLog4jWatcher.<clinit>(DynaLog4jWatcher.java:37)
           at org.apache.spark.executor.CoarseGrainedExecutorBackend$.<init>(CoarseGrainedExecutorBackend.scala:360)
           at org.apache.spark.executor.CoarseGrainedExecutorBackend$.<clinit>(CoarseGrainedExecutorBackend.scala)
           at org.apache.spark.executor.YarnCoarseGrainedExecutorBackend$.main(YarnCoarseGrainedExecutorBackend.scala:79)
           at org.apache.spark.executor.YarnCoarseGrainedExecutorBackend.main(YarnCoarseGrainedExecutorBackend.scala)
   SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
   Exception in thread "main" java.lang.IllegalArgumentException: Can't get Kerberos realm
           at org.apache.hadoop.security.HadoopKerberosName.setConfiguration(HadoopKerberosName.java:65)
           at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:318)
           at org.apache.hadoop.security.UserGroupInformation.setConfiguration(UserGroupInformation.java:364)
           at org.apache.spark.deploy.SparkHadoopUtil.<init>(SparkHadoopUtil.scala:50)
           at org.apache.spark.deploy.SparkHadoopUtil$.instance$lzycompute(SparkHadoopUtil.scala:413)
           at org.apache.spark.deploy.SparkHadoopUtil$.instance(SparkHadoopUtil.scala:413)
           at org.apache.spark.deploy.SparkHadoopUtil$.get(SparkHadoopUtil.scala:434)
           at org.apache.spark.executor.CoarseGrainedExecutorBackend$.run(CoarseGrainedExecutorBackend.scala:396)
           at org.apache.spark.executor.YarnCoarseGrainedExecutorBackend$.main(YarnCoarseGrainedExecutorBackend.scala:81)
           at org.apache.spark.executor.YarnCoarseGrainedExecutorBackend.main(YarnCoarseGrainedExecutorBackend.scala)
   Caused by: java.lang.reflect.InvocationTargetException
           at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
           at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
           at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
           at java.lang.reflect.Method.invoke(Method.java:498)
           at org.apache.hadoop.security.authentication.util.KerberosUtil.getDefaultRealm(KerberosUtil.java:110)
           at org.apache.hadoop.security.HadoopKerberosName.setConfiguration(HadoopKerberosName.java:63)
           ... 9 more
   Caused by: KrbException: Cannot locate default realm
           at sun.security.krb5.Config.getDefaultRealm(Config.java:1137)
           ... 15 more
   
   
   [2022-04-02 20:12:41.139]Container exited with a non-zero exit code 1. Error file: prelaunch.err.
   Last 4096 bytes of prelaunch.err :
   Last 4096 bytes of stderr :
   g.apache.log4j.FileAppender.setFile(FileAppender.java:294)
           at org.apache.log4j.RollingFileAppender.setFile(RollingFileAppender.java:207)
           at org.wcc.framework.log.SizeRollingFileAppender.setFile(SizeRollingFileAppender.java:65)
           at com.huawei.spark.utils.SizeRollingZipFileATime4AuditAppender.setFile(SizeRollingZipFileATime4AuditAppender.java:19)
           at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
           at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
           at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
           at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
           at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
           at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
           at org.apache.log4j.PropertyConfigurator.parseCatsAndRenderers(PropertyConfigurator.java:672)
           at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:516)
           at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
           at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
           at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
           at org.slf4j.impl.Log4jLoggerFactory.<init>(Log4jLoggerFactory.java:66)
           at org.slf4j.impl.StaticLoggerBinder.<init>(StaticLoggerBinder.java:72)
           at org.slf4j.impl.StaticLoggerBinder.<clinit>(StaticLoggerBinder.java:45)
           at org.slf4j.LoggerFactory.bind(LoggerFactory.java:150)
           at org.slf4j.LoggerFactory.performInitialization(LoggerFactory.java:124)
           at org.slf4j.LoggerFactory.getILoggerFactory(LoggerFactory.java:417)
           at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:362)
           at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:388)
           at com.huawei.hadoop.dynalogger.DynaLog4jWatcher.<clinit>(DynaLog4jWatcher.java:37)
           at org.apache.spark.executor.CoarseGrainedExecutorBackend$.<init>(CoarseGrainedExecutorBackend.scala:360)
           at org.apache.spark.executor.CoarseGrainedExecutorBackend$.<clinit>(CoarseGrainedExecutorBackend.scala)
           at org.apache.spark.executor.YarnCoarseGrainedExecutorBackend$.main(YarnCoarseGrainedExecutorBackend.scala:79)
           at org.apache.spark.executor.YarnCoarseGrainedExecutorBackend.main(YarnCoarseGrainedExecutorBackend.scala)
   SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
   Exception in thread "main" java.lang.IllegalArgumentException: Can't get Kerberos realm
           at org.apache.hadoop.security.HadoopKerberosName.setConfiguration(HadoopKerberosName.java:65)
           at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:318)
           at org.apache.hadoop.security.UserGroupInformation.setConfiguration(UserGroupInformation.java:364)
           at org.apache.spark.deploy.SparkHadoopUtil.<init>(SparkHadoopUtil.scala:50)
           at org.apache.spark.deploy.SparkHadoopUtil$.instance$lzycompute(SparkHadoopUtil.scala:413)
           at org.apache.spark.deploy.SparkHadoopUtil$.instance(SparkHadoopUtil.scala:413)
           at org.apache.spark.deploy.SparkHadoopUtil$.get(SparkHadoopUtil.scala:434)
           at org.apache.spark.executor.CoarseGrainedExecutorBackend$.run(CoarseGrainedExecutorBackend.scala:396)
           at org.apache.spark.executor.YarnCoarseGrainedExecutorBackend$.main(YarnCoarseGrainedExecutorBackend.scala:81)
           at org.apache.spark.executor.YarnCoarseGrainedExecutorBackend.main(YarnCoarseGrainedExecutorBackend.scala)
   Caused by: java.lang.reflect.InvocationTargetException
           at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
           at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
           at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
           at java.lang.reflect.Method.invoke(Method.java:498)
           at org.apache.hadoop.security.authentication.util.KerberosUtil.getDefaultRealm(KerberosUtil.java:110)
           at org.apache.hadoop.security.HadoopKerberosName.setConfiguration(HadoopKerberosName.java:63)
           ... 9 more
   Caused by: KrbException: Cannot locate default realm
           at sun.security.krb5.Config.getDefaultRealm(Config.java:1137)
           ... 15 more
   
   
   . | org.apache.spark.scheduler.cluster.YarnSchedulerBackend$YarnSchedulerEndpoint.logWarning(Logging.scala:69)
   2022-04-02 20:12:42,941 | WARN  | dispatcher-event-loop-3 | Requesting driver to remove executor 1 for reason Container from a bad node: container_e01_1648207397321_0039_01_000003 on host: CDJJ-HDFS-CH-Hbase-Yarn07. Exit status: 1. Diagnostics: [2022-04-02 20:12:40.912]Exception from container-launch.
   Container id: container_e01_1648207397321_0039_01_000003
   Exit code: 1
   Exception message: Launch container failed
   Shell output: main : command provided 1
   main : run as user is les_task_user
   main : requested yarn user is les_task_user
   Getting exit code file...
   Creating script paths...
   Writing pid file...
   Writing to tmp file /srv/BigData/hadoop/data1/nm/localdir/nmPrivate/application_1648207397321_0039/container_e01_1648207397321_0039_01_000003/container_e01_1648207397321_0039_01_000003.pid.tmp
   Writing to cgroup task files...
   Creating local dirs...
   Launching container...
   
   
   [2022-04-02 20:12:40.917]Container exited with a non-zero exit code 1. Error file: prelaunch.err.
    [... identical stack trace elided: SLF4J/log4j init, then Exception in thread "main" java.lang.IllegalArgumentException: Can't get Kerberos realm, caused by KrbException: Cannot locate default realm — same as the first occurrence above ...]
   
   
   [2022-04-02 20:12:40.918]Container exited with a non-zero exit code 1. Error file: prelaunch.err.
    [... identical stack trace elided: SLF4J/log4j init, then Exception in thread "main" java.lang.IllegalArgumentException: Can't get Kerberos realm, caused by KrbException: Cannot locate default realm — same as the first occurrence above ...]
   
   
   . | org.apache.spark.scheduler.cluster.YarnSchedulerBackend$YarnSchedulerEndpoint.logWarning(Logging.scala:69)
   2022-04-02 20:12:49,178 | WARN  | dispatcher-event-loop-44 | Requesting driver to remove executor 4 for reason Container from a bad node: container_e01_1648207397321_0039_01_000004 on host: CDJJ-HDFS-CH-Hbase-Yarn07. Exit status: 1. Diagnostics: [2022-04-02 20:12:47.712]Exception from container-launch.
   Container id: container_e01_1648207397321_0039_01_000004
   Exit code: 1
   Exception message: Launch container failed
   Shell output: main : command provided 1
   main : run as user is les_task_user
   main : requested yarn user is les_task_user
   Getting exit code file...
   Creating script paths...
   Writing pid file...
   Writing to tmp file /srv/BigData/hadoop/data1/nm/localdir/nmPrivate/application_1648207397321_0039/container_e01_1648207397321_0039_01_000004/container_e01_1648207397321_0039_01_000004.pid.tmp
   Writing to cgroup task files...
   Creating local dirs...
   Launching container...
   
   
   [2022-04-02 20:12:47.716]Container exited with a non-zero exit code 1. Error file: prelaunch.err.
    [... identical stack trace elided: SLF4J/log4j init, then Exception in thread "main" java.lang.IllegalArgumentException: Can't get Kerberos realm, caused by KrbException: Cannot locate default realm — same as the first occurrence above ...]
   
   
   [2022-04-02 20:12:47.718]Container exited with a non-zero exit code 1. Error file: prelaunch.err.
    [... identical stack trace elided: SLF4J/log4j init, then Exception in thread "main" java.lang.IllegalArgumentException: Can't get Kerberos realm, caused by KrbException: Cannot locate default realm — same as the first occurrence above ...]
   
   
   . | org.apache.spark.scheduler.cluster.YarnSchedulerBackend$YarnSchedulerEndpoint.logWarning(Logging.scala:69)
   2022-04-02 20:12:52,196 | WARN  | dispatcher-event-loop-32 | Requesting driver to remove executor 3 for reason Container from a bad node: container_e01_1648207397321_0039_01_000005 on host: CDJJ-HDFS-CH-Hbase-Yarn04. Exit status: 1. Diagnostics: [2022-04-02 20:12:50.116]Exception from container-launch.
   Container id: container_e01_1648207397321_0039_01_000005
   Exit code: 1
   Exception message: Launch container failed
   Shell output: main : command provided 1
   main : run as user is les_task_user
   main : requested yarn user is les_task_user
   Getting exit code file...
   Creating script paths...
   Writing pid file...
   Writing to tmp file /srv/BigData/hadoop/data2/nm/localdir/nmPrivate/application_1648207397321_0039/container_e01_1648207397321_0039_01_000005/container_e01_1648207397321_0039_01_000005.pid.tmp
   Writing to cgroup task files...
   Creating local dirs...
   Launching container...
   
   
   [2022-04-02 20:12:50.120]Container exited with a non-zero exit code 1. Error file: prelaunch.err.
    [... identical stack trace elided: SLF4J/log4j init, then Exception in thread "main" java.lang.IllegalArgumentException: Can't get Kerberos realm, caused by KrbException: Cannot locate default realm — same as the first occurrence above ...]
   
   
   [2022-04-02 20:12:50.122]Container exited with a non-zero exit code 1. Error file: prelaunch.err.
    [... identical stack trace elided: SLF4J/log4j init, then Exception in thread "main" java.lang.IllegalArgumentException: Can't get Kerberos realm, caused by KrbException: Cannot locate default realm — same as the first occurrence above ...]
   
   
   . | org.apache.spark.scheduler.cluster.YarnSchedulerBackend$YarnSchedulerEndpoint.logWarning(Logging.scala:69)
   2022-04-02 20:12:54,937 | WARN  | spark-listener-group-shared | fail in fi SparkAMRegistedListener->onApplicationStart, JobHistory will not aggregate the AM log of this application | org.apache.spark.fi.listeners.SparkAMRegistedListener.logWarning(Logging.scala:69)
   2022-04-02 20:12:56,392 | WARN  | main | The SQL config 'spark.sql.hive.verifyPartitionPath' has been deprecated in Spark v3.0 and may be removed in the future. This config is replaced by 'spark.files.ignoreMissingFiles'. | org.apache.spark.sql.internal.SQLConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:56,394 | WARN  | main | The SQL config 'spark.sql.execution.arrow.fallback.enabled' has been deprecated in Spark v3.0 and may be removed in the future. Use 'spark.sql.execution.arrow.pyspark.fallback.enabled' instead of it. | org.apache.spark.sql.internal.SQLConf.logWarning(Logging.scala:69)
   Hive Session ID = e9973811-6745-44d7-9c17-a3ca5676ca60
   2022-04-02 20:12:56,559 | WARN  | main | load mapred-default.xml, HIVE_CONF_DIR env not found! | org.apache.hadoop.hive.ql.session.SessionState.loadMapredDefaultXml(SessionState.java:1461)
   2022-04-02 20:12:56,604 | WARN  | main | METASTORE_FILTER_HOOK will be ignored, since hive.security.authorization.manager is set to instance of HiveAuthorizerFactory. | org.apache.hadoop.hive.ql.session.SessionState.setAuthorizerV2Config(SessionState.java:1046)
   2022-04-02 20:12:57,767 | WARN  | main | The SQL config 'spark.sql.hive.verifyPartitionPath' has been deprecated in Spark v3.0 and may be removed in the future. This config is replaced by 'spark.files.ignoreMissingFiles'. | org.apache.spark.sql.internal.SQLConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:57,770 | WARN  | main | The SQL config 'spark.sql.execution.arrow.fallback.enabled' has been deprecated in Spark v3.0 and may be removed in the future. Use 'spark.sql.execution.arrow.pyspark.fallback.enabled' instead of it. | org.apache.spark.sql.internal.SQLConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:57,772 | WARN  | main | The SQL config 'spark.sql.hive.verifyPartitionPath' has been deprecated in Spark v3.0 and may be removed in the future. This config is replaced by 'spark.files.ignoreMissingFiles'. | org.apache.spark.sql.internal.SQLConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:57,773 | WARN  | main | The SQL config 'spark.sql.execution.arrow.fallback.enabled' has been deprecated in Spark v3.0 and may be removed in the future. Use 'spark.sql.execution.arrow.pyspark.fallback.enabled' instead of it. | org.apache.spark.sql.internal.SQLConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:57,774 | WARN  | main | The SQL config 'spark.sql.hive.verifyPartitionPath' has been deprecated in Spark v3.0 and may be removed in the future. This config is replaced by 'spark.files.ignoreMissingFiles'. | org.apache.spark.sql.internal.SQLConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:57,775 | WARN  | main | The SQL config 'spark.sql.execution.arrow.fallback.enabled' has been deprecated in Spark v3.0 and may be removed in the future. Use 'spark.sql.execution.arrow.pyspark.fallback.enabled' instead of it. | org.apache.spark.sql.internal.SQLConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:58,254 | WARN  | main | The SQL config 'spark.sql.hive.verifyPartitionPath' has been deprecated in Spark v3.0 and may be removed in the future. This config is replaced by 'spark.files.ignoreMissingFiles'. | org.apache.spark.sql.SQLConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:58,255 | WARN  | main | The SQL config 'spark.sql.execution.arrow.fallback.enabled' has been deprecated in Spark v3.0 and may be removed in the future. Use 'spark.sql.execution.arrow.pyspark.fallback.enabled' instead of it. | org.apache.spark.sql.SQLConf.logWarning(Logging.scala:69)
   2022-04-02 20:12:59,383 | WARN  | main | Error when load plugin: [org.apache.seatunnel.spark.sink.Console] | org.apache.seatunnel.config.ConfigBuilder.createPluginInstanceIgnoreCase(ConfigBuilder.java:147)
   java.util.ServiceConfigurationError: org.apache.seatunnel.spark.BaseSparkSink: Provider org.apache.seatunnel.spark.sink.Kafka could not be instantiated
           at java.util.ServiceLoader.fail(ServiceLoader.java:232)
           at java.util.ServiceLoader.access$100(ServiceLoader.java:185)
           at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:384)
           at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
           at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
           at org.apache.seatunnel.config.ConfigBuilder.createPluginInstanceIgnoreCase(ConfigBuilder.java:137)
           at org.apache.seatunnel.config.ConfigBuilder.lambda$createPlugins$0(ConfigBuilder.java:170)
           at java.util.ArrayList.forEach(ArrayList.java:1259)
           at org.apache.seatunnel.config.ConfigBuilder.createPlugins(ConfigBuilder.java:168)
           at org.apache.seatunnel.Seatunnel.entryPoint(Seatunnel.java:101)
           at org.apache.seatunnel.Seatunnel.run(Seatunnel.java:65)
           at org.apache.seatunnel.SeatunnelSpark.main(SeatunnelSpark.java:29)
           at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
           at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
           at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
           at java.lang.reflect.Method.invoke(Method.java:498)
           at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
           at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:993)
           at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:183)
           at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:206)
           at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:93)
           at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1081)
           at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1090)
           at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   Caused by: java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging$class
           at org.apache.seatunnel.spark.sink.Kafka.<init>(Kafka.scala:31)
           at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
           at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
           at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
           at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
           at java.lang.Class.newInstance(Class.java:442)
           at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380)
           ... 21 more
   Caused by: java.lang.ClassNotFoundException: org.apache.spark.internal.Logging$class
           at java.net.URLClassLoader.findClass(URLClassLoader.java:402)
           at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
           at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
           ... 28 more
   2022-04-02 20:12:59,395 | WARN  | main | Error when load plugin: [org.apache.seatunnel.spark.sink.Console] | org.apache.seatunnel.config.ConfigBuilder.createPluginInstanceIgnoreCase(ConfigBuilder.java:147)
   java.util.ServiceConfigurationError: org.apache.seatunnel.spark.BaseSparkSink: Provider org.apache.seatunnel.spark.sink.Hive could not be instantiated
           at java.util.ServiceLoader.fail(ServiceLoader.java:232)
           at java.util.ServiceLoader.access$100(ServiceLoader.java:185)
           at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:384)
           at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
           at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
           at org.apache.seatunnel.config.ConfigBuilder.createPluginInstanceIgnoreCase(ConfigBuilder.java:137)
           at org.apache.seatunnel.config.ConfigBuilder.lambda$createPlugins$0(ConfigBuilder.java:170)
           at java.util.ArrayList.forEach(ArrayList.java:1259)
           at org.apache.seatunnel.config.ConfigBuilder.createPlugins(ConfigBuilder.java:168)
           at org.apache.seatunnel.Seatunnel.entryPoint(Seatunnel.java:101)
           at org.apache.seatunnel.Seatunnel.run(Seatunnel.java:65)
           at org.apache.seatunnel.SeatunnelSpark.main(SeatunnelSpark.java:29)
           at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
           at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
           at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
           at java.lang.reflect.Method.invoke(Method.java:498)
           at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
           at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:993)
           at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:183)
           at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:206)
           at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:93)
           at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1081)
           at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1090)
           at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   Caused by: java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging$class
           at org.apache.seatunnel.spark.sink.Hive.<init>(Hive.scala:29)
           at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
           at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
           at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
           at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
           at java.lang.Class.newInstance(Class.java:442)
           at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380)
           ... 21 more
   Caused by: java.lang.ClassNotFoundException: org.apache.spark.internal.Logging$class
           at java.net.URLClassLoader.findClass(URLClassLoader.java:402)
           at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
           at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
           ... 28 more
   2022-04-02 20:12:59,400 | WARN  | main | Error when load plugin: [org.apache.seatunnel.spark.sink.Console] | org.apache.seatunnel.config.ConfigBuilder.createPluginInstanceIgnoreCase(ConfigBuilder.java:147)
   java.util.ServiceConfigurationError: org.apache.seatunnel.spark.BaseSparkSink: Provider org.apache.seatunnel.spark.sink.Phoenix could not be instantiated
           at java.util.ServiceLoader.fail(ServiceLoader.java:232)
           at java.util.ServiceLoader.access$100(ServiceLoader.java:185)
           at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:384)
           at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
           at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
           at org.apache.seatunnel.config.ConfigBuilder.createPluginInstanceIgnoreCase(ConfigBuilder.java:137)
           at org.apache.seatunnel.config.ConfigBuilder.lambda$createPlugins$0(ConfigBuilder.java:170)
           at java.util.ArrayList.forEach(ArrayList.java:1259)
           at org.apache.seatunnel.config.ConfigBuilder.createPlugins(ConfigBuilder.java:168)
           at org.apache.seatunnel.Seatunnel.entryPoint(Seatunnel.java:101)
           at org.apache.seatunnel.Seatunnel.run(Seatunnel.java:65)
           at org.apache.seatunnel.SeatunnelSpark.main(SeatunnelSpark.java:29)
           at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
           at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
           at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
           at java.lang.reflect.Method.invoke(Method.java:498)
           at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
           at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:993)
           at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:183)
           at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:206)
           at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:93)
           at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1081)
           at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1090)
           at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   Caused by: java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging$class
           at org.apache.seatunnel.spark.sink.Phoenix.<init>(Phoenix.scala:29)
           at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
           at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
           at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
           at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
           at java.lang.Class.newInstance(Class.java:442)
           at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380)
           ... 21 more
   Caused by: java.lang.ClassNotFoundException: org.apache.spark.internal.Logging$class
           at java.net.URLClassLoader.findClass(URLClassLoader.java:402)
           at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
           at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
           ... 28 more
   2022-04-02 20:12:59,405 | WARN  | main | Error when load plugin: [org.apache.seatunnel.spark.sink.Console] | org.apache.seatunnel.config.ConfigBuilder.createPluginInstanceIgnoreCase(ConfigBuilder.java:147)
   java.util.ServiceConfigurationError: org.apache.seatunnel.spark.BaseSparkSink: Provider org.apache.seatunnel.spark.sink.Redis could not be instantiated
           at java.util.ServiceLoader.fail(ServiceLoader.java:232)
           at java.util.ServiceLoader.access$100(ServiceLoader.java:185)
           at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:384)
           at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
           at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
           at org.apache.seatunnel.config.ConfigBuilder.createPluginInstanceIgnoreCase(ConfigBuilder.java:137)
           at org.apache.seatunnel.config.ConfigBuilder.lambda$createPlugins$0(ConfigBuilder.java:170)
           at java.util.ArrayList.forEach(ArrayList.java:1259)
           at org.apache.seatunnel.config.ConfigBuilder.createPlugins(ConfigBuilder.java:168)
           at org.apache.seatunnel.Seatunnel.entryPoint(Seatunnel.java:101)
           at org.apache.seatunnel.Seatunnel.run(Seatunnel.java:65)
           at org.apache.seatunnel.SeatunnelSpark.main(SeatunnelSpark.java:29)
           at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
           at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
           at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
           at java.lang.reflect.Method.invoke(Method.java:498)
           at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
           at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:993)
           at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:183)
           at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:206)
           at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:93)
           at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1081)
           at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1090)
           at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   Caused by: java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging$class
           at org.apache.seatunnel.spark.sink.Redis.<init>(Redis.scala:33)
           at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
           at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
           at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
           at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
           at java.lang.Class.newInstance(Class.java:442)
           at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380)
           ... 21 more
   Caused by: java.lang.ClassNotFoundException: org.apache.spark.internal.Logging$class
           at java.net.URLClassLoader.findClass(URLClassLoader.java:402)
           at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
           at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
           ... 28 more
   2022-04-02 20:13:00,265 | WARN  | dispatcher-event-loop-56 | Requesting driver to remove executor 1 for reason Container from a bad node: container_e01_1648207397321_0039_02_000002 on host: CDJJ-HDFS-CH-Hbase-Yarn02. Exit status: 1. Diagnostics: [2022-04-02 20:12:58.847]Exception from container-launch.
   Container id: container_e01_1648207397321_0039_02_000002
   Exit code: 1
   Exception message: Launch container failed
   Shell output: main : command provided 1
   main : run as user is les_task_user
   main : requested yarn user is les_task_user
   Getting exit code file...
   Creating script paths...
   Writing pid file...
   Writing to tmp file /srv/BigData/hadoop/data1/nm/localdir/nmPrivate/application_1648207397321_0039/container_e01_1648207397321_0039_02_000002/container_e01_1648207397321_0039_02_000002.pid.tmp
   Writing to cgroup task files...
   Creating local dirs...
   Launching container...
   
   
   [2022-04-02 20:12:58.850]Container exited with a non-zero exit code 1. Error file: prelaunch.err.
   Last 4096 bytes of prelaunch.err :
   Last 4096 bytes of stderr :
   g.apache.log4j.FileAppender.setFile(FileAppender.java:294)
           at org.apache.log4j.RollingFileAppender.setFile(RollingFileAppender.java:207)
           at org.wcc.framework.log.SizeRollingFileAppender.setFile(SizeRollingFileAppender.java:65)
           at com.huawei.spark.utils.SizeRollingZipFileATime4AuditAppender.setFile(SizeRollingZipFileATime4AuditAppender.java:19)
           at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
           at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
           at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
           at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
           at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
           at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
           at org.apache.log4j.PropertyConfigurator.parseCatsAndRenderers(PropertyConfigurator.java:672)
           at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:516)
           at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
           at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
           at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
           at org.slf4j.impl.Log4jLoggerFactory.<init>(Log4jLoggerFactory.java:66)
           at org.slf4j.impl.StaticLoggerBinder.<init>(StaticLoggerBinder.java:72)
           at org.slf4j.impl.StaticLoggerBinder.<clinit>(StaticLoggerBinder.java:45)
           at org.slf4j.LoggerFactory.bind(LoggerFactory.java:150)
           at org.slf4j.LoggerFactory.performInitialization(LoggerFactory.java:124)
           at org.slf4j.LoggerFactory.getILoggerFactory(LoggerFactory.java:417)
           at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:362)
           at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:388)
           at com.huawei.hadoop.dynalogger.DynaLog4jWatcher.<clinit>(DynaLog4jWatcher.java:37)
           at org.apache.spark.executor.CoarseGrainedExecutorBackend$.<init>(CoarseGrainedExecutorBackend.scala:360)
           at org.apache.spark.executor.CoarseGrainedExecutorBackend$.<clinit>(CoarseGrainedExecutorBackend.scala)
           at org.apache.spark.executor.YarnCoarseGrainedExecutorBackend$.main(YarnCoarseGrainedExecutorBackend.scala:79)
           at org.apache.spark.executor.YarnCoarseGrainedExecutorBackend.main(YarnCoarseGrainedExecutorBackend.scala)
   SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
   Exception in thread "main" java.lang.IllegalArgumentException: Can't get Kerberos realm
           at org.apache.hadoop.security.HadoopKerberosName.setConfiguration(HadoopKerberosName.java:65)
           at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:318)
           at org.apache.hadoop.security.UserGroupInformation.setConfiguration(UserGroupInformation.java:364)
           at org.apache.spark.deploy.SparkHadoopUtil.<init>(SparkHadoopUtil.scala:50)
           at org.apache.spark.deploy.SparkHadoopUtil$.instance$lzycompute(SparkHadoopUtil.scala:413)
           at org.apache.spark.deploy.SparkHadoopUtil$.instance(SparkHadoopUtil.scala:413)
           at org.apache.spark.deploy.SparkHadoopUtil$.get(SparkHadoopUtil.scala:434)
           at org.apache.spark.executor.CoarseGrainedExecutorBackend$.run(CoarseGrainedExecutorBackend.scala:396)
           at org.apache.spark.executor.YarnCoarseGrainedExecutorBackend$.main(YarnCoarseGrainedExecutorBackend.scala:81)
           at org.apache.spark.executor.YarnCoarseGrainedExecutorBackend.main(YarnCoarseGrainedExecutorBackend.scala)
   Caused by: java.lang.reflect.InvocationTargetException
           at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
           at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
           at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
           at java.lang.reflect.Method.invoke(Method.java:498)
           at org.apache.hadoop.security.authentication.util.KerberosUtil.getDefaultRealm(KerberosUtil.java:110)
           at org.apache.hadoop.security.HadoopKerberosName.setConfiguration(HadoopKerberosName.java:63)
           ... 9 more
   Caused by: KrbException: Cannot locate default realm
           at sun.security.krb5.Config.getDefaultRealm(Config.java:1137)
           ... 15 more
   
   
   [2022-04-02 20:12:58.853]Container exited with a non-zero exit code 1. Error file: prelaunch.err.
   Last 4096 bytes of prelaunch.err :
   Last 4096 bytes of stderr :
   g.apache.log4j.FileAppender.setFile(FileAppender.java:294)
           at org.apache.log4j.RollingFileAppender.setFile(RollingFileAppender.java:207)
           at org.wcc.framework.log.SizeRollingFileAppender.setFile(SizeRollingFileAppender.java:65)
           at com.huawei.spark.utils.SizeRollingZipFileATime4AuditAppender.setFile(SizeRollingZipFileATime4AuditAppender.java:19)
           at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
           at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
           at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
           at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
           at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
           at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
           at org.apache.log4j.PropertyConfigurator.parseCatsAndRenderers(PropertyConfigurator.java:672)
           at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:516)
           at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
           at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
           at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
           at org.slf4j.impl.Log4jLoggerFactory.<init>(Log4jLoggerFactory.java:66)
           at org.slf4j.impl.StaticLoggerBinder.<init>(StaticLoggerBinder.java:72)
           at org.slf4j.impl.StaticLoggerBinder.<clinit>(StaticLoggerBinder.java:45)
           at org.slf4j.LoggerFactory.bind(LoggerFactory.java:150)
           at org.slf4j.LoggerFactory.performInitialization(LoggerFactory.java:124)
           at org.slf4j.LoggerFactory.getILoggerFactory(LoggerFactory.java:417)
           at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:362)
           at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:388)
           at com.huawei.hadoop.dynalogger.DynaLog4jWatcher.<clinit>(DynaLog4jWatcher.java:37)
           at org.apache.spark.executor.CoarseGrainedExecutorBackend$.<init>(CoarseGrainedExecutorBackend.scala:360)
           at org.apache.spark.executor.CoarseGrainedExecutorBackend$.<clinit>(CoarseGrainedExecutorBackend.scala)
           at org.apache.spark.executor.YarnCoarseGrainedExecutorBackend$.main(YarnCoarseGrainedExecutorBackend.scala:79)
           at org.apache.spark.executor.YarnCoarseGrainedExecutorBackend.main(YarnCoarseGrainedExecutorBackend.scala)
   SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
   Exception in thread "main" java.lang.IllegalArgumentException: Can't get Kerberos realm
           at org.apache.hadoop.security.HadoopKerberosName.setConfiguration(HadoopKerberosName.java:65)
           at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:318)
           at org.apache.hadoop.security.UserGroupInformation.setConfiguration(UserGroupInformation.java:364)
           at org.apache.spark.deploy.SparkHadoopUtil.<init>(SparkHadoopUtil.scala:50)
           at org.apache.spark.deploy.SparkHadoopUtil$.instance$lzycompute(SparkHadoopUtil.scala:413)
           at org.apache.spark.deploy.SparkHadoopUtil$.instance(SparkHadoopUtil.scala:413)
           at org.apache.spark.deploy.SparkHadoopUtil$.get(SparkHadoopUtil.scala:434)
           at org.apache.spark.executor.CoarseGrainedExecutorBackend$.run(CoarseGrainedExecutorBackend.scala:396)
           at org.apache.spark.executor.YarnCoarseGrainedExecutorBackend$.main(YarnCoarseGrainedExecutorBackend.scala:81)
           at org.apache.spark.executor.YarnCoarseGrainedExecutorBackend.main(YarnCoarseGrainedExecutorBackend.scala)
   Caused by: java.lang.reflect.InvocationTargetException
           at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
           at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
           at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
           at java.lang.reflect.Method.invoke(Method.java:498)
           at org.apache.hadoop.security.authentication.util.KerberosUtil.getDefaultRealm(KerberosUtil.java:110)
           at org.apache.hadoop.security.HadoopKerberosName.setConfiguration(HadoopKerberosName.java:63)
           ... 9 more
   Caused by: KrbException: Cannot locate default realm
           at sun.security.krb5.Config.getDefaultRealm(Config.java:1137)
           ... 15 more
   
   
   . | org.apache.spark.scheduler.cluster.YarnSchedulerBackend$YarnSchedulerEndpoint.logWarning(Logging.scala:69)
   2022-04-02 20:13:00,561 | WARN  | main | The enable mv value "null" is invalid. Using the default value "false" | org.apache.carbondata.core.util.CarbonProperties.validateEnableMV(CarbonProperties.java:511)
   2022-04-02 20:13:00,574 | WARN  | main | The value "LOCALLOCK" configured for key carbon.lock.type is invalid for current file system. Use the default value HDFSLOCK instead. | org.apache.carbondata.core.util.CarbonProperties.validateAndConfigureLockType(CarbonProperties.java:440)
   +------------------+
   |raw_message       |
   +------------------+
   |Hello garyelephant|
   |Hello rickyhuo    |
   |Hello kid-xiong   |
   +------------------+
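
For context on the trace above: `KrbException: Cannot locate default realm` is raised when the executor JVM cannot find a readable krb5.conf containing a `default_realm` entry (the JVM looks at `-Djava.security.krb5.conf`, then `$KRB5_CONFIG`, then `/etc/krb5.conf`). A minimal sketch of what the JVM needs to resolve, using an illustrative sample file and the realm from the principal above (the path `/tmp/krb5.conf.sample` is hypothetical):

```shell
# Hypothetical sample of the [libdefaults] section the JVM must be able to read.
cat > /tmp/krb5.conf.sample <<'EOF'
[libdefaults]
  default_realm = HADOOP.COM
EOF

# Extract the realm the way a quick sanity check might, to confirm
# the file actually carries a default_realm entry.
awk -F'= *' '/default_realm/ {print $2}' /tmp/krb5.conf.sample
```

If this entry is missing on the YARN NodeManager hosts (or the file is not shipped to / visible from the executor containers), every executor fails at `UserGroupInformation.initialize` exactly as logged above.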
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@seatunnel.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-seatunnel] CalvinKirs commented on issue #1649: [Bug] Errors running on spark3.1.3 with kerberos

Posted by GitBox <gi...@apache.org>.
CalvinKirs commented on issue #1649:
URL: https://github.com/apache/incubator-seatunnel/issues/1649#issuecomment-1086568133


   Hi, we don't support Spark 3 for now; the discussion about Spark 3 is here: #875

