Posted to dev@kyuubi.apache.org by GitBox <gi...@apache.org> on 2022/02/17 08:35:12 UTC

[GitHub] [incubator-kyuubi] wForget opened a new issue #1924: [Bug] SparkContext stopped abnormally, but the KyuubiEngine did not stop.

wForget opened a new issue #1924:
URL: https://github.com/apache/incubator-kyuubi/issues/1924


   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of Conduct](https://www.apache.org/foundation/policies/conduct)
   
   
   ### Search before asking
   
   - [X] I have searched in the [issues](https://github.com/apache/incubator-kyuubi/issues?q=is%3Aissue) and found no similar issues.
   
   
   ### Describe the bug
   
   The `SparkContext` was stopped abnormally (by an `OutOfMemoryError` in a listener thread, see the engine log below), but the `KyuubiEngine` did not stop with it.
   
   Subsequent requests then consistently fail with this error:
   ```
   22/02/17 14:45:56 ERROR SparkThriftBinaryFrontendService: Error opening session: 
   org.apache.kyuubi.KyuubiSQLException: Cannot call methods on a stopped SparkContext.
   This stopped SparkContext was created at:
   
   org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:940)
   org.apache.kyuubi.engine.spark.SparkSQLEngine$.createSpark(SparkSQLEngine.scala:102)
   org.apache.kyuubi.engine.spark.SparkSQLEngine$.main(SparkSQLEngine.scala:154)
   org.apache.kyuubi.engine.spark.SparkSQLEngine.main(SparkSQLEngine.scala)
   sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
   sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   java.lang.reflect.Method.invoke(Method.java:498)
   org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:732)
   
   The currently active SparkContext was created at:
   
   org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:940)
   org.apache.kyuubi.engine.spark.SparkSQLEngine$.createSpark(SparkSQLEngine.scala:102)
   org.apache.kyuubi.engine.spark.SparkSQLEngine$.main(SparkSQLEngine.scala:154)
   org.apache.kyuubi.engine.spark.SparkSQLEngine.main(SparkSQLEngine.scala)
   sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
   sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   java.lang.reflect.Method.invoke(Method.java:498)
   org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:732)
            
   	at org.apache.kyuubi.KyuubiSQLException$.apply(KyuubiSQLException.scala:69)
   	at org.apache.kyuubi.KyuubiSQLException$.apply(KyuubiSQLException.scala:73)
   	at org.apache.kyuubi.engine.spark.session.SparkSQLSessionManager.openSession(SparkSQLSessionManager.scala:96)
   	at org.apache.kyuubi.service.AbstractBackendService.openSession(AbstractBackendService.scala:45)
   	at org.apache.kyuubi.service.ThriftBinaryFrontendService.getSessionHandle(ThriftBinaryFrontendService.scala:199)
   	at org.apache.kyuubi.engine.spark.SparkThriftBinaryFrontendService.OpenSession(SparkThriftBinaryFrontendService.scala:75)
   	at org.apache.hive.service.rpc.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1497)
   	at org.apache.hive.service.rpc.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1482)
   	at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:38)
   	at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
   	at org.apache.kyuubi.service.authentication.TSetIpAddressProcessor.process(TSetIpAddressProcessor.scala:36)
   	at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:310)
   	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
   	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
   	at java.lang.Thread.run(Thread.java:745)
   Caused by: java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
   This stopped SparkContext was created at:
   
   org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:940)
   org.apache.kyuubi.engine.spark.SparkSQLEngine$.createSpark(SparkSQLEngine.scala:102)
   org.apache.kyuubi.engine.spark.SparkSQLEngine$.main(SparkSQLEngine.scala:154)
   org.apache.kyuubi.engine.spark.SparkSQLEngine.main(SparkSQLEngine.scala)
   sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
   sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   java.lang.reflect.Method.invoke(Method.java:498)
   org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:732)
   
   The currently active SparkContext was created at:
   
   org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:940)
   org.apache.kyuubi.engine.spark.SparkSQLEngine$.createSpark(SparkSQLEngine.scala:102)
   org.apache.kyuubi.engine.spark.SparkSQLEngine$.main(SparkSQLEngine.scala:154)
   org.apache.kyuubi.engine.spark.SparkSQLEngine.main(SparkSQLEngine.scala)
   sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
   sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   java.lang.reflect.Method.invoke(Method.java:498)
   org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:732)
            
   	at org.apache.spark.SparkContext.assertNotStopped(SparkContext.scala:118)
   	at org.apache.spark.sql.SparkSession.<init>(SparkSession.scala:105)
   	at org.apache.spark.sql.SparkSession.newSession(SparkSession.scala:248)
   	at org.apache.kyuubi.engine.spark.session.SparkSQLSessionManager.openSession(SparkSQLSessionManager.scala:71)
   	... 12 more
   ```
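   
   Every subsequent `openSession` hits the dead context, so the engine is effectively a zombie until someone kills it by hand. What I would expect is for the engine to stop itself once its `SparkContext` is stopped. A minimal sketch of that idea (`stopEngine` is a hypothetical hook, not Kyuubi's actual API):
   
   ```scala
   import org.apache.spark.SparkContext
   import org.apache.spark.scheduler.{SparkListener, SparkListenerApplicationEnd}
   
   // Sketch only: when the SparkContext goes away, tear the whole engine down,
   // so clients get a fresh engine instead of "stopped SparkContext" errors.
   class StopEngineOnContextStop(stopEngine: () => Unit) extends SparkListener {
     override def onApplicationEnd(end: SparkListenerApplicationEnd): Unit = {
       stopEngine()
     }
   }
   
   // Registration, assuming `sc` is the engine's SparkContext and
   // `engine.stop()` is the (hypothetical) shutdown entry point:
   // sc.addSparkListener(new StopEngineOnContextStop(() => engine.stop()))
   ```
   
   One caveat: in this incident the listener bus thread itself died of an OOM, so an `onApplicationEnd` callback may never fire; a fail-fast check such as `sc.isStopped` inside `openSession` would still catch it.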
   
   ### Affects Version(s)
   
   1.4.0
   
   ### Kyuubi Server Log Output
   
   _No response_
   
   ### Kyuubi Engine Log Output
   
   ```
   22/02/17 05:25:38 ERROR Utils: uncaught error in thread spark-listener-group-shared, stopping SparkContext
   java.lang.OutOfMemoryError: GC overhead limit exceeded
   	at org.apache.spark.util.ListenerBus.postToAll(ListenerBus.scala:115)
   	at org.apache.spark.util.ListenerBus.postToAll$(ListenerBus.scala:101)
   	at org.apache.spark.scheduler.AsyncEventQueue.super$postToAll(AsyncEventQueue.scala:105)
   	at org.apache.spark.scheduler.AsyncEventQueue.$anonfun$dispatch$1(AsyncEventQueue.scala:105)
   	at org.apache.spark.scheduler.AsyncEventQueue$$Lambda$786/1199632447.apply$mcJ$sp(Unknown Source)
   	at scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.java:23)
   	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
   	at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:100)
   	at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.$anonfun$run$1(AsyncEventQueue.scala:96)
   	at org.apache.spark.scheduler.AsyncEventQueue$$anon$2$$Lambda$785/1136622736.apply$mcV$sp(Unknown Source)
   	at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1381)
   	at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.run(AsyncEventQueue.scala:96)
   22/02/17 05:25:38 INFO MemoryStore: Block broadcast_81928 stored as values in memory (estimated size 189.1 KiB, free 3.3 MiB)
   22/02/17 05:25:38 INFO BaseAllMetadataTableScan: Scanning metadata table iceberg_catalog.venus_log_iceberg.log_qijian_rec_service_19_prod_index with filter true.
   22/02/17 05:25:38 INFO MemoryStore: Block broadcast_81927 stored as values in memory (estimated size 189.2 KiB, free 3.1 MiB)
   22/02/17 05:25:38 INFO SparkContext: Created broadcast 81925 from broadcast at SparkBatchScan.java:141
   22/02/17 05:25:38 INFO MemoryStore: Block broadcast_81926 stored as values in memory (estimated size 189.1 KiB, free 2.9 MiB)
   22/02/17 05:25:38 INFO BlockManagerInfo: Removed broadcast_81807_piece0 on node63-66-75-bjzyx.qiyi.hadoop:21049 in memory (size: 27.6 KiB, free: 5.9 GiB)
   22/02/17 05:25:38 INFO MemoryStore: Block broadcast_81929 stored as values in memory (estimated size 189.3 KiB, free 3.0 MiB)
   22/02/17 05:25:38 INFO MemoryStore: Block broadcast_81928_piece0 stored as bytes in memory (estimated size 27.4 KiB, free 2.9 MiB)
   22/02/17 05:25:38 INFO MemoryStore: Block broadcast_81927_piece0 stored as bytes in memory (estimated size 27.5 KiB, free 2.9 MiB)
   22/02/17 05:25:38 INFO BlockManagerInfo: Added broadcast_81928_piece0 in memory on node63-66-75-bjzyx.qiyi.hadoop:21049 (size: 27.4 KiB, free: 5.9 GiB)
   22/02/17 05:25:38 INFO BlockManagerInfo: Added broadcast_81927_piece0 in memory on node63-66-75-bjzyx.qiyi.hadoop:21049 (size: 27.5 KiB, free: 5.9 GiB)
   22/02/17 05:25:38 INFO AsyncEventQueue: Process of event SparkListenerTaskEnd(36426,0,ResultTask,Success,org.apache.spark.scheduler.TaskInfo@714e7ab2,org.apache.spark.executor.ExecutorMetrics@7d4ff150,org.apache.spark.executor.TaskMetrics@8a4bd49) by listener ExecutorAllocationListener took 19.560030747s.
   22/02/17 05:25:38 INFO SparkContext: Created broadcast 81928 from broadcast at SparkBatchScan.java:141
   22/02/17 05:25:38 INFO SparkContext: Created broadcast 81927 from broadcast at SparkBatchScan.java:141
   22/02/17 05:25:38 INFO MemoryStore: Block broadcast_81926_piece0 stored as bytes in memory (estimated size 27.4 KiB, free 2.9 MiB)
   22/02/17 05:25:49 INFO BlockManagerInfo: Added broadcast_81926_piece0 in memory on node63-66-75-bjzyx.qiyi.hadoop:21049 (size: 27.4 KiB, free: 5.9 GiB)
   22/02/17 05:25:49 INFO MemoryStore: Block broadcast_81930 stored as values in memory (estimated size 189.1 KiB, free 2.7 MiB)
   22/02/17 05:25:49 INFO SparkContext: Created broadcast 81926 from broadcast at SparkBatchScan.java:141
   22/02/17 05:25:49 INFO MemoryStore: Block broadcast_81929_piece0 stored as bytes in memory (estimated size 27.5 KiB, free 2.7 MiB)
   22/02/17 05:25:49 ERROR Utils: throw uncaught fatal error in thread spark-listener-group-shared
   java.lang.OutOfMemoryError: GC overhead limit exceeded
   	at org.apache.spark.util.ListenerBus.postToAll(ListenerBus.scala:115)
   	at org.apache.spark.util.ListenerBus.postToAll$(ListenerBus.scala:101)
   	at org.apache.spark.scheduler.AsyncEventQueue.super$postToAll(AsyncEventQueue.scala:105)
   	at org.apache.spark.scheduler.AsyncEventQueue.$anonfun$dispatch$1(AsyncEventQueue.scala:105)
   	at org.apache.spark.scheduler.AsyncEventQueue$$Lambda$786/1199632447.apply$mcJ$sp(Unknown Source)
   	at scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.java:23)
   	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
   	at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:100)
   	at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.$anonfun$run$1(AsyncEventQueue.scala:96)
   	at org.apache.spark.scheduler.AsyncEventQueue$$anon$2$$Lambda$785/1136622736.apply$mcV$sp(Unknown Source)
   	at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1381)
   	at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.run(AsyncEventQueue.scala:96)
   Exception in thread "spark-listener-group-shared" java.lang.OutOfMemoryError: GC overhead limit exceeded
   	at org.apache.spark.util.ListenerBus.postToAll(ListenerBus.scala:115)
   	at org.apache.spark.util.ListenerBus.postToAll$(ListenerBus.scala:101)
   	at org.apache.spark.scheduler.AsyncEventQueue.super$postToAll(AsyncEventQueue.scala:105)
   	at org.apache.spark.scheduler.AsyncEventQueue.$anonfun$dispatch$1(AsyncEventQueue.scala:105)
   	at org.apache.spark.scheduler.AsyncEventQueue$$Lambda$786/1199632447.apply$mcJ$sp(Unknown Source)
   	at scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.java:23)
   	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
   	at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:100)
   	at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.$anonfun$run$1(AsyncEventQueue.scala:96)
   	at org.apache.spark.scheduler.AsyncEventQueue$$anon$2$$Lambda$785/1136622736.apply$mcV$sp(Unknown Source)
   	at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1381)
   	at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.run(AsyncEventQueue.scala:96)
   22/02/17 05:25:49 INFO AsyncEventQueue: Process of event SparkListenerTaskStart(36428,0,org.apache.spark.scheduler.TaskInfo@1512044a) by listener ExecutorAllocationListener took 10.76045231s.
   22/02/17 05:25:49 INFO BlockManagerInfo: Added broadcast_81929_piece0 in memory on node63-66-75-bjzyx.qiyi.hadoop:21049 (size: 27.5 KiB, free: 5.9 GiB)
   22/02/17 05:25:49 INFO SparkContext: Created broadcast 81929 from broadcast at SparkBatchScan.java:141
   22/02/17 05:25:49 INFO ExecuteStatement: Processing venus's query[f1179070-6f49-4fa6-885f-6731891ae2bd]: RUNNING_STATE -> ERROR_STATE, statement: CALL ** time taken: 369.671 seconds
   22/02/17 05:25:49 ERROR ExecuteStatement: Error operating EXECUTE_STATEMENT: java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
   This stopped SparkContext was created at:
   
   org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:940)
   org.apache.kyuubi.engine.spark.SparkSQLEngine$.createSpark(SparkSQLEngine.scala:102)
   org.apache.kyuubi.engine.spark.SparkSQLEngine$.main(SparkSQLEngine.scala:154)
   org.apache.kyuubi.engine.spark.SparkSQLEngine.main(SparkSQLEngine.scala)
   sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
   sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   java.lang.reflect.Method.invoke(Method.java:498)
   org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:732)
   
   The currently active SparkContext was created at:
   
   org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:940)
   org.apache.kyuubi.engine.spark.SparkSQLEngine$.createSpark(SparkSQLEngine.scala:102)
   org.apache.kyuubi.engine.spark.SparkSQLEngine$.main(SparkSQLEngine.scala:154)
   org.apache.kyuubi.engine.spark.SparkSQLEngine.main(SparkSQLEngine.scala)
   sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
   sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   java.lang.reflect.Method.invoke(Method.java:498)
   org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:732)
            
   	at org.apache.spark.SparkContext.assertNotStopped(SparkContext.scala:118)
   	at org.apache.spark.SparkContext.broadcast(SparkContext.scala:1506)
   	at org.apache.spark.api.java.JavaSparkContext.broadcast(JavaSparkContext.scala:546)
   	at org.apache.iceberg.spark.source.SparkBatchScan.planInputPartitions(SparkBatchScan.java:141)
   	at org.apache.spark.sql.execution.datasources.v2.BatchScanExec.partitions$lzycompute(BatchScanExec.scala:43)
   	at org.apache.spark.sql.execution.datasources.v2.BatchScanExec.partitions(BatchScanExec.scala:43)
   	at org.apache.spark.sql.execution.datasources.v2.DataSourceV2ScanExecBase.supportsColumnar(DataSourceV2ScanExecBase.scala:87)
   	at org.apache.spark.sql.execution.datasources.v2.DataSourceV2ScanExecBase.supportsColumnar$(DataSourceV2ScanExecBase.scala:86)
   	at org.apache.spark.sql.execution.datasources.v2.BatchScanExec.supportsColumnar(BatchScanExec.scala:29)
   	at org.apache.spark.sql.execution.datasources.v2.DataSourceV2Strategy.apply(DataSourceV2Strategy.scala:113)
   	at org.apache.spark.sql.catalyst.planning.QueryPlanner.$anonfun$plan$1(QueryPlanner.scala:63)
   	at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:486)
   	at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:492)
   	at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:491)
   	at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:93)
   	at org.apache.spark.sql.execution.SparkStrategies.plan(SparkStrategies.scala:66)
   	at org.apache.spark.sql.catalyst.planning.QueryPlanner.$anonfun$plan$3(QueryPlanner.scala:78)
   	at scala.collection.TraversableOnce$folder$1.apply(TraversableOnce.scala:196)
   	at scala.collection.TraversableOnce$folder$1.apply(TraversableOnce.scala:194)
   	at scala.collection.Iterator.foreach(Iterator.scala:943)
   	at scala.collection.Iterator.foreach$(Iterator.scala:943)
   	at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
   	at scala.collection.TraversableOnce.foldLeft(TraversableOnce.scala:199)
   	at scala.collection.TraversableOnce.foldLeft$(TraversableOnce.scala:192)
   	at scala.collection.AbstractIterator.foldLeft(Iterator.scala:1431)
   	at org.apache.spark.sql.catalyst.planning.QueryPlanner.$anonfun$plan$2(QueryPlanner.scala:75)
   	at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:486)
   	at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:492)
   	at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:93)
   	at org.apache.spark.sql.execution.SparkStrategies.plan(SparkStrategies.scala:66)
   	at org.apache.spark.sql.catalyst.planning.QueryPlanner.$anonfun$plan$3(QueryPlanner.scala:78)
   	at scala.collection.TraversableOnce$folder$1.apply(TraversableOnce.scala:196)
   	at scala.collection.TraversableOnce$folder$1.apply(TraversableOnce.scala:194)
   	at scala.collection.Iterator.foreach(Iterator.scala:943)
   	at scala.collection.Iterator.foreach$(Iterator.scala:943)
   	at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
   	at scala.collection.TraversableOnce.foldLeft(TraversableOnce.scala:199)
   	at scala.collection.TraversableOnce.foldLeft$(TraversableOnce.scala:192)
   	at scala.collection.AbstractIterator.foldLeft(Iterator.scala:1431)
   	at org.apache.spark.sql.catalyst.planning.QueryPlanner.$anonfun$plan$2(QueryPlanner.scala:75)
   	at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:486)
   	at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:492)
   	at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:93)
   	at org.apache.spark.sql.execution.SparkStrategies.plan(SparkStrategies.scala:66)
   	at org.apache.spark.sql.catalyst.planning.QueryPlanner.$anonfun$plan$3(QueryPlanner.scala:78)
   	at scala.collection.TraversableOnce$folder$1.apply(TraversableOnce.scala:196)
   	at scala.collection.TraversableOnce$folder$1.apply(TraversableOnce.scala:194)
   	at scala.collection.Iterator.foreach(Iterator.scala:943)
   	at scala.collection.Iterator.foreach$(Iterator.scala:943)
   	at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
   	at scala.collection.TraversableOnce.foldLeft(TraversableOnce.scala:199)
   	at scala.collection.TraversableOnce.foldLeft$(TraversableOnce.scala:192)
   	at scala.collection.AbstractIterator.foldLeft(Iterator.scala:1431)
   	at org.apache.spark.sql.catalyst.planning.QueryPlanner.$anonfun$plan$2(QueryPlanner.scala:75)
   	at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:486)
   	at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:492)
   	at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:93)
   	at org.apache.spark.sql.execution.SparkStrategies.plan(SparkStrategies.scala:66)
   	at org.apache.spark.sql.catalyst.planning.QueryPlanner.$anonfun$plan$3(QueryPlanner.scala:78)
   	at scala.collection.TraversableOnce$folder$1.apply(TraversableOnce.scala:196)
   	at scala.collection.TraversableOnce$folder$1.apply(TraversableOnce.scala:194)
   	at scala.collection.Iterator.foreach(Iterator.scala:943)
   	at scala.collection.Iterator.foreach$(Iterator.scala:943)
   	at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
   	at scala.collection.TraversableOnce.foldLeft(TraversableOnce.scala:199)
   	at scala.collection.TraversableOnce.foldLeft$(TraversableOnce.scala:192)
   	at scala.collection.AbstractIterator.foldLeft(Iterator.scala:1431)
   	at org.apache.spark.sql.catalyst.planning.QueryPlanner.$anonfun$plan$2(QueryPlanner.scala:75)
   	at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:486)
   	at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:492)
   	at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:93)
   	at org.apache.spark.sql.execution.SparkStrategies.plan(SparkStrategies.scala:66)
   	at org.apache.spark.sql.catalyst.planning.QueryPlanner.$anonfun$plan$3(QueryPlanner.scala:78)
   	at scala.collection.TraversableOnce$folder$1.apply(TraversableOnce.scala:196)
   	at scala.collection.TraversableOnce$folder$1.apply(TraversableOnce.scala:194)
   	at scala.collection.Iterator.foreach(Iterator.scala:943)
   	at scala.collection.Iterator.foreach$(Iterator.scala:943)
   	at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
   	at scala.collection.TraversableOnce.foldLeft(TraversableOnce.scala:199)
   	at scala.collection.TraversableOnce.foldLeft$(TraversableOnce.scala:192)
   	at scala.collection.AbstractIterator.foldLeft(Iterator.scala:1431)
   	at org.apache.spark.sql.catalyst.planning.QueryPlanner.$anonfun$plan$2(QueryPlanner.scala:75)
   	at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:486)
   	at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:492)
   	at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:93)
   	at org.apache.spark.sql.execution.SparkStrategies.plan(SparkStrategies.scala:66)
   	at org.apache.spark.sql.catalyst.planning.QueryPlanner.$anonfun$plan$3(QueryPlanner.scala:78)
   	at scala.collection.TraversableOnce$folder$1.apply(TraversableOnce.scala:196)
   	at scala.collection.TraversableOnce$folder$1.apply(TraversableOnce.scala:194)
   	at scala.collection.Iterator.foreach(Iterator.scala:943)
   	at scala.collection.Iterator.foreach$(Iterator.scala:943)
   	at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
   	at scala.collection.TraversableOnce.foldLeft(TraversableOnce.scala:199)
   	at scala.collection.TraversableOnce.foldLeft$(TraversableOnce.scala:192)
   	at scala.collection.AbstractIterator.foldLeft(Iterator.scala:1431)
   	at org.apache.spark.sql.catalyst.planning.QueryPlanner.$anonfun$plan$2(QueryPlanner.scala:75)
   	at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:486)
   	at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:492)
   	at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:93)
   	at org.apache.spark.sql.execution.SparkStrategies.plan(SparkStrategies.scala:66)
   	at org.apache.spark.sql.catalyst.planning.QueryPlanner.$anonfun$plan$3(QueryPlanner.scala:78)
   	at scala.collection.TraversableOnce$folder$1.apply(TraversableOnce.scala:196)
   	at scala.collection.TraversableOnce$folder$1.apply(TraversableOnce.scala:194)
   	at scala.collection.Iterator.foreach(Iterator.scala:943)
   	at scala.collection.Iterator.foreach$(Iterator.scala:943)
   	at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
   	at scala.collection.TraversableOnce.foldLeft(TraversableOnce.scala:199)
   	at scala.collection.TraversableOnce.foldLeft$(TraversableOnce.scala:192)
   	at scala.collection.AbstractIterator.foldLeft(Iterator.scala:1431)
   	at org.apache.spark.sql.catalyst.planning.QueryPlanner.$anonfun$plan$2(QueryPlanner.scala:75)
   	at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:486)
   	at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:492)
   	at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:93)
   	at org.apache.spark.sql.execution.SparkStrategies.plan(SparkStrategies.scala:66)
   	at org.apache.spark.sql.catalyst.planning.QueryPlanner.$anonfun$plan$3(QueryPlanner.scala:78)
   	at scala.collection.TraversableOnce$folder$1.apply(TraversableOnce.scala:196)
   	at scala.collection.TraversableOnce$folder$1.apply(TraversableOnce.scala:194)
   	at scala.collection.Iterator.foreach(Iterator.scala:943)
   	at scala.collection.Iterator.foreach$(Iterator.scala:943)
   	at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
   	at scala.collection.TraversableOnce.foldLeft(TraversableOnce.scala:199)
   	at scala.collection.TraversableOnce.foldLeft$(TraversableOnce.scala:192)
   	at scala.collection.AbstractIterator.foldLeft(Iterator.scala:1431)
   	at org.apache.spark.sql.catalyst.planning.QueryPlanner.$anonfun$plan$2(QueryPlanner.scala:75)
   	at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:486)
   	at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:492)
   	at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:93)
   	at org.apache.spark.sql.execution.SparkStrategies.plan(SparkStrategies.scala:66)
   	at org.apache.spark.sql.catalyst.planning.QueryPlanner.$anonfun$plan$3(QueryPlanner.scala:78)
   	at scala.collection.TraversableOnce$folder$1.apply(TraversableOnce.scala:196)
   	at scala.collection.TraversableOnce$folder$1.apply(TraversableOnce.scala:194)
   	at scala.collection.Iterator.foreach(Iterator.scala:943)
   	at scala.collection.Iterator.foreach$(Iterator.scala:943)
   	at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
   	at scala.collection.TraversableOnce.foldLeft(TraversableOnce.scala:199)
   	at scala.collection.TraversableOnce.foldLeft$(TraversableOnce.scala:192)
   	at scala.collection.AbstractIterator.foldLeft(Iterator.scala:1431)
   	at org.apache.spark.sql.catalyst.planning.QueryPlanner.$anonfun$plan$2(QueryPlanner.scala:75)
   	at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:486)
   	at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:492)
   	at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:93)
   	at org.apache.spark.sql.execution.SparkStrategies.plan(SparkStrategies.scala:66)
   	at org.apache.spark.sql.catalyst.planning.QueryPlanner.$anonfun$plan$3(QueryPlanner.scala:78)
   	at scala.collection.TraversableOnce$folder$1.apply(TraversableOnce.scala:196)
   	at scala.collection.TraversableOnce$folder$1.apply(TraversableOnce.scala:194)
   	at scala.collection.Iterator.foreach(Iterator.scala:943)
   	at scala.collection.Iterator.foreach$(Iterator.scala:943)
   	at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
   	at scala.collection.TraversableOnce.foldLeft(TraversableOnce.scala:199)
   	at scala.collection.TraversableOnce.foldLeft$(TraversableOnce.scala:192)
   	at scala.collection.AbstractIterator.foldLeft(Iterator.scala:1431)
   	at org.apache.spark.sql.catalyst.planning.QueryPlanner.$anonfun$plan$2(QueryPlanner.scala:75)
   	at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:486)
   	at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:492)
   	at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:93)
   	at org.apache.spark.sql.execution.SparkStrategies.plan(SparkStrategies.scala:66)
   	at org.apache.spark.sql.catalyst.planning.QueryPlanner.$anonfun$plan$3(QueryPlanner.scala:78)
   	at scala.collection.TraversableOnce$folder$1.apply(TraversableOnce.scala:196)
   	at scala.collection.TraversableOnce$folder$1.apply(TraversableOnce.scala:194)
   	at scala.collection.Iterator.foreach(Iterator.scala:943)
   	at scala.collection.Iterator.foreach$(Iterator.scala:943)
   	at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
   	at scala.collection.TraversableOnce.foldLeft(TraversableOnce.scala:199)
   	at scala.collection.TraversableOnce.foldLeft$(TraversableOnce.scala:192)
   	at scala.collection.AbstractIterator.foldLeft(Iterator.scala:1431)
   	at org.apache.spark.sql.catalyst.planning.QueryPlanner.$anonfun$plan$2(QueryPlanner.scala:75)
   	at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:486)
   	at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:492)
   	at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:93)
   	at org.apache.spark.sql.execution.SparkStrategies.plan(SparkStrategies.scala:66)
   	at org.apache.spark.sql.execution.QueryExecution$.createSparkPlan(QueryExecution.scala:391)
   	at org.apache.spark.sql.execution.QueryExecution.$anonfun$sparkPlan$1(QueryExecution.scala:104)
   	at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:111)
   	at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$1(QueryExecution.scala:143)
   	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:772)
   	at org.apache.spark.sql.execution.QueryExecution.executePhase(QueryExecution.scala:143)
   	at org.apache.spark.sql.execution.QueryExecution.sparkPlan$lzycompute(QueryExecution.scala:104)
   	at org.apache.spark.sql.execution.QueryExecution.sparkPlan(QueryExecution.scala:97)
   	at org.apache.spark.sql.execution.QueryExecution.$anonfun$executedPlan$1(QueryExecution.scala:117)
   	at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:111)
   	at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$1(QueryExecution.scala:143)
   	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:772)
   	at org.apache.spark.sql.execution.QueryExecution.executePhase(QueryExecution.scala:143)
   	at org.apache.spark.sql.execution.QueryExecution.executedPlan$lzycompute(QueryExecution.scala:117)
   	at org.apache.spark.sql.execution.QueryExecution.executedPlan(QueryExecution.scala:110)
   	at org.apache.spark.sql.execution.QueryExecution.$anonfun$simpleString$2(QueryExecution.scala:161)
   	at org.apache.spark.sql.execution.ExplainUtils$.processPlan(ExplainUtils.scala:115)
   	at org.apache.spark.sql.execution.QueryExecution.simpleString(QueryExecution.scala:161)
   	at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$explainString(QueryExecution.scala:206)
   	at org.apache.spark.sql.execution.QueryExecution.explainString(QueryExecution.scala:175)
   	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:98)
   	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
   	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
   	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:772)
   	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
   	at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3685)
   	at org.apache.spark.sql.Dataset.collectAsList(Dataset.scala:2976)
   	at org.apache.iceberg.spark.actions.BaseDeleteOrphanFilesSparkAction.doExecute(BaseDeleteOrphanFilesSparkAction.java:166)
   	at org.apache.iceberg.spark.actions.BaseSparkAction.withJobGroupInfo(BaseSparkAction.java:101)
   	at org.apache.iceberg.spark.actions.BaseDeleteOrphanFilesSparkAction.execute(BaseDeleteOrphanFilesSparkAction.java:141)
   	at org.apache.iceberg.spark.actions.BaseDeleteOrphanFilesSparkAction.execute(BaseDeleteOrphanFilesSparkAction.java:76)
   	at org.apache.iceberg.spark.procedures.RemoveOrphanFilesProcedure.lambda$call$1(RemoveOrphanFilesProcedure.java:105)
   	at org.apache.iceberg.spark.procedures.BaseProcedure.execute(BaseProcedure.java:85)
   	at org.apache.iceberg.spark.procedures.BaseProcedure.withIcebergTable(BaseProcedure.java:78)
   	at org.apache.iceberg.spark.procedures.RemoveOrphanFilesProcedure.call(RemoveOrphanFilesProcedure.java:86)
   	at org.apache.spark.sql.execution.datasources.v2.CallExec.run(CallExec.scala:33)
   	at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result$lzycompute(V2CommandExec.scala:40)
   	at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result(V2CommandExec.scala:40)
   	at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.executeCollect(V2CommandExec.scala:46)
   	at org.apache.spark.sql.Dataset.$anonfun$logicalPlan$1(Dataset.scala:228)
   	at org.apache.spark.sql.Dataset.$anonfun$withAction$1(Dataset.scala:3687)
   	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103)
   	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
   	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
   	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:772)
   	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
   	at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3685)
   	at org.apache.spark.sql.Dataset.<init>(Dataset.scala:228)
   	at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:99)
   	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:772)
   	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:96)
   	at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:615)
   	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:772)
   	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:610)
   	at org.apache.kyuubi.engine.spark.operation.ExecuteStatement.$anonfun$executeStatement$1(ExecuteStatement.scala:100)
   	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
   	at org.apache.kyuubi.engine.spark.operation.ExecuteStatement.withLocalProperties(ExecuteStatement.scala:159)
   	at org.apache.kyuubi.engine.spark.operation.ExecuteStatement.org$apache$kyuubi$engine$spark$operation$ExecuteStatement$$executeStatement(ExecuteStatement.scala:94)
   	at org.apache.kyuubi.engine.spark.operation.ExecuteStatement$$anon$1.run(ExecuteStatement.scala:127)
   	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
   	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
   	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
   	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
   	at java.lang.Thread.run(Thread.java:745)
   
   ```
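   
   For what it's worth, the trigger visible above is a GC-overhead OOM in the `spark-listener-group-shared` thread (note `ExecutorAllocationListener` taking 10-19 seconds per event), which makes Spark stop the context. Independent of the engine-lifecycle problem, the usual first mitigation is more driver heap; the listener event queue size is also tunable, at the cost of holding more events in memory. A sketch with illustrative values (not taken from this cluster):
   
   ```scala
   // Illustrative values only; not from the failing cluster.
   import org.apache.spark.sql.SparkSession
   
   val spark = SparkSession.builder()
     // driver memory must really be set at launch time
     // (e.g. spark-submit --driver-memory 8g); named here for reference only
     .config("spark.driver.memory", "8g")
     // default is 10000; a larger queue absorbs bursts from slow listeners,
     // but each queued event also costs heap
     .config("spark.scheduler.listenerbus.eventqueue.capacity", "20000")
     .getOrCreate()
   ```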
   
   
   ### Kyuubi Server Configurations
   
   _No response_
   
   ### Kyuubi Engine Configurations
   
   _No response_
   
   ### Additional context
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!





[GitHub] [incubator-kyuubi] ulysses-you commented on issue #1924: [Bug] SparkContext stopped abnormally, but the KyuubiEngine did not stop.

ulysses-you commented on issue #1924:
URL: https://github.com/apache/incubator-kyuubi/issues/1924#issuecomment-1042698130


   @wForget have you tried the latest version 1.4.1 or the master branch? See issue https://github.com/apache/incubator-kyuubi/issues/1800





[GitHub] [incubator-kyuubi] wForget closed issue #1924: [Bug] SparkContext stopped abnormally, but the KyuubiEngine did not stop.

wForget closed issue #1924:
URL: https://github.com/apache/incubator-kyuubi/issues/1924


   





[GitHub] [incubator-kyuubi] wForget commented on issue #1924: [Bug] SparkContext stopped abnormally, but the KyuubiEngine did not stop.

wForget commented on issue #1924:
URL: https://github.com/apache/incubator-kyuubi/issues/1924#issuecomment-1042706994


   > @wForget have you tried the latest version 1.4.1 or the master branch? See issue #1800
   
   Great, thanks, I will try it.

