Posted to commits@seatunnel.apache.org by GitBox <gi...@apache.org> on 2022/02/18 04:11:56 UTC

[GitHub] [incubator-seatunnel] jiefzz opened a new issue #1284: [Bug] [seatunnel-core-spark] ClassNotFoundException throw while run with spark3.0.3

jiefzz opened a new issue #1284:
URL: https://github.com/apache/incubator-seatunnel/issues/1284


   ### Search before asking
   
   - [X] I had searched in the [issues](https://github.com/apache/incubator-seatunnel/issues?q=is%3Aissue+label%3A%22bug%22) and found no similar issues.
   
   
   ### What happened
   
   
   
   ```
   2022-02-17 17:38:50,472 INFO cluster.YarnClientSchedulerBackend: Application application_1644485376937_0125 has started running.
   2022-02-17 17:38:50,481 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 12710.
   2022-02-17 17:38:50,481 INFO netty.NettyBlockTransferService: Server created on 10.199.142.10:12710
   2022-02-17 17:38:50,483 INFO storage.BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
   2022-02-17 17:38:50,492 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 10.199.142.10, 12710, None)
   2022-02-17 17:38:50,496 INFO storage.BlockManagerMasterEndpoint: Registering block manager 10.199.142.10:12710 with 366.3 MiB RAM, BlockManagerId(driver, 10.199.142.10, 12710, None)
   2022-02-17 17:38:50,499 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 10.199.142.10, 12710, None)
   2022-02-17 17:38:50,500 INFO storage.BlockManager: Initialized BlockManager: BlockManagerId(driver, 10.199.142.10, 12710, None)
   2022-02-17 17:38:50,814 INFO ui.ServerInfo: Adding filter to /metrics/json: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
   2022-02-17 17:38:50,817 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@68b366e2{/metrics/json,null,AVAILABLE,@Spark}
   2022-02-17 17:38:50,856 INFO history.SingleEventLogFileWriter: Logging events to hdfs:/user/spark303/applicationHistory/application_1644485376937_0125.lz4.inprogress
   2022-02-17 17:38:51,006 INFO cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(spark-client://YarnAM)
   2022-02-17 17:38:53,519 INFO resource.ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 512, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
   2022-02-17 17:38:54,296 INFO cluster.YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.199.198.44:50562) with ID 2
   2022-02-17 17:38:54,415 INFO storage.BlockManagerMasterEndpoint: Registering block manager j-w1:35491 with 93.3 MiB RAM, BlockManagerId(2, j-w1, 35491, None)
   2022-02-17 17:38:56,237 INFO cluster.YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.199.199.229:53648) with ID 1
   2022-02-17 17:38:56,321 INFO cluster.YarnClientSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.8
   2022-02-17 17:38:56,363 INFO storage.BlockManagerMasterEndpoint: Registering block manager j-w2:50684 with 93.3 MiB RAM, BlockManagerId(1, j-w2, 50684, None)
   2022-02-17 17:38:56,457 WARN config.ConfigBuilder: Error when load plugin: [org.apache.seatunnel.spark.sink.Console]
   java.util.ServiceConfigurationError: org.apache.seatunnel.spark.BaseSparkSink: Provider org.apache.seatunnel.spark.sink.Kafka could not be instantiated
           at java.util.ServiceLoader.fail(ServiceLoader.java:232)
           at java.util.ServiceLoader.access$100(ServiceLoader.java:185)
           at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:384)
           at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
           at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
           at org.apache.seatunnel.config.ConfigBuilder.createPluginInstanceIgnoreCase(ConfigBuilder.java:137)
           at org.apache.seatunnel.config.ConfigBuilder.lambda$createPlugins$0(ConfigBuilder.java:170)
           at java.util.ArrayList.forEach(ArrayList.java:1257)
           at org.apache.seatunnel.config.ConfigBuilder.createPlugins(ConfigBuilder.java:168)
           at org.apache.seatunnel.Seatunnel.entryPoint(Seatunnel.java:97)
           at org.apache.seatunnel.Seatunnel.run(Seatunnel.java:61)
           at org.apache.seatunnel.SeatunnelSpark.main(SeatunnelSpark.java:29)
           at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
           at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
           at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
           at java.lang.reflect.Method.invoke(Method.java:498)
           at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
           at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:928)
           at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
           at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
           at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
           at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
           at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
           at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   Caused by: java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging$class
           at org.apache.seatunnel.spark.sink.Kafka.<init>(Kafka.scala:31)
           at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
           at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
           at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
           at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
           at java.lang.Class.newInstance(Class.java:442)
           at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380)
           ... 21 more
   Caused by: java.lang.ClassNotFoundException: org.apache.spark.internal.Logging$class
           at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
           at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
           at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
           ... 28 more
   2022-02-17 17:38:56,465 WARN config.ConfigBuilder: Error when load plugin: [org.apache.seatunnel.spark.sink.Console]
   java.util.ServiceConfigurationError: org.apache.seatunnel.spark.BaseSparkSink: Provider org.apache.seatunnel.spark.sink.Hive could not be instantiated
           at java.util.ServiceLoader.fail(ServiceLoader.java:232)
           at java.util.ServiceLoader.access$100(ServiceLoader.java:185)
           at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:384)
           at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
           at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
           at org.apache.seatunnel.config.ConfigBuilder.createPluginInstanceIgnoreCase(ConfigBuilder.java:137)
           at org.apache.seatunnel.config.ConfigBuilder.lambda$createPlugins$0(ConfigBuilder.java:170)
           at java.util.ArrayList.forEach(ArrayList.java:1257)
           at org.apache.seatunnel.config.ConfigBuilder.createPlugins(ConfigBuilder.java:168)
           at org.apache.seatunnel.Seatunnel.entryPoint(Seatunnel.java:97)
           at org.apache.seatunnel.Seatunnel.run(Seatunnel.java:61)
           at org.apache.seatunnel.SeatunnelSpark.main(SeatunnelSpark.java:29)
           at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
           at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
           at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
           at java.lang.reflect.Method.invoke(Method.java:498)
           at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
           at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:928)
           at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
           at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
           at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
           at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
           at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
           at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   Caused by: java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging$class
           at org.apache.seatunnel.spark.sink.Hive.<init>(Hive.scala:29)
           at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
           at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
           at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
           at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
           at java.lang.Class.newInstance(Class.java:442)
           at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380)
           ... 21 more
   Caused by: java.lang.ClassNotFoundException: org.apache.spark.internal.Logging$class
           at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
           at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
           at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
           ... 28 more
   2022-02-17 17:38:56,467 WARN config.ConfigBuilder: Error when load plugin: [org.apache.seatunnel.spark.sink.Console]
   java.util.ServiceConfigurationError: org.apache.seatunnel.spark.BaseSparkSink: Provider org.apache.seatunnel.spark.sink.Phoenix could not be instantiated
           at java.util.ServiceLoader.fail(ServiceLoader.java:232)
           at java.util.ServiceLoader.access$100(ServiceLoader.java:185)
           at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:384)
           at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
           at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
           at org.apache.seatunnel.config.ConfigBuilder.createPluginInstanceIgnoreCase(ConfigBuilder.java:137)
           at org.apache.seatunnel.config.ConfigBuilder.lambda$createPlugins$0(ConfigBuilder.java:170)
           at java.util.ArrayList.forEach(ArrayList.java:1257)
           at org.apache.seatunnel.config.ConfigBuilder.createPlugins(ConfigBuilder.java:168)
           at org.apache.seatunnel.Seatunnel.entryPoint(Seatunnel.java:97)
           at org.apache.seatunnel.Seatunnel.run(Seatunnel.java:61)
           at org.apache.seatunnel.SeatunnelSpark.main(SeatunnelSpark.java:29)
           at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
           at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
           at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
           at java.lang.reflect.Method.invoke(Method.java:498)
           at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
           at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:928)
           at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
           at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
           at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
           at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
           at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
           at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   Caused by: java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging$class
           at org.apache.seatunnel.spark.sink.Phoenix.<init>(Phoenix.scala:29)
           at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
           at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
           at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
           at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
           at java.lang.Class.newInstance(Class.java:442)
           at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380)
           ... 21 more
   Caused by: java.lang.ClassNotFoundException: org.apache.spark.internal.Logging$class
           at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
           at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
           at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
           ... 28 more
   2022-02-17 17:38:56,469 WARN config.ConfigBuilder: Error when load plugin: [org.apache.seatunnel.spark.sink.Console]
   java.util.ServiceConfigurationError: org.apache.seatunnel.spark.BaseSparkSink: Provider org.apache.seatunnel.spark.sink.Redis could not be instantiated
           at java.util.ServiceLoader.fail(ServiceLoader.java:232)
           at java.util.ServiceLoader.access$100(ServiceLoader.java:185)
           at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:384)
           at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
           at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
           at org.apache.seatunnel.config.ConfigBuilder.createPluginInstanceIgnoreCase(ConfigBuilder.java:137)
           at org.apache.seatunnel.config.ConfigBuilder.lambda$createPlugins$0(ConfigBuilder.java:170)
           at java.util.ArrayList.forEach(ArrayList.java:1257)
           at org.apache.seatunnel.config.ConfigBuilder.createPlugins(ConfigBuilder.java:168)
           at org.apache.seatunnel.Seatunnel.entryPoint(Seatunnel.java:97)
           at org.apache.seatunnel.Seatunnel.run(Seatunnel.java:61)
           at org.apache.seatunnel.SeatunnelSpark.main(SeatunnelSpark.java:29)
           at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
           at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
           at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
           at java.lang.reflect.Method.invoke(Method.java:498)
           at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
           at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:928)
           at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
           at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
           at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
           at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
           at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
           at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   Caused by: java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging$class
           at org.apache.seatunnel.spark.sink.Redis.<init>(Redis.scala:31)
           at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
           at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
           at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
           at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
           at java.lang.Class.newInstance(Class.java:442)
           at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380)
           ... 21 more
   Caused by: java.lang.ClassNotFoundException: org.apache.spark.internal.Logging$class
           at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
           at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
           at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
           ... 28 more
   
   ```
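   
   My guess at the root cause: the connector classes that fail to load here were compiled against Scala 2.11, while spark-3.0.3-bin-hadoop3.2 ships Scala 2.12. Scala 2.11 compiles the concrete methods of a trait such as `org.apache.spark.internal.Logging` into a synthetic helper class named `Logging$class`, which a Scala 2.12 build no longer generates, hence the `ClassNotFoundException`. A minimal, hypothetical Scala sketch of that encoding difference (the names below are made up and are not SeaTunnel or Spark source):
   
   ```scala
   // Hypothetical names, illustration only: this is NOT SeaTunnel or Spark code.
   trait GreetingLogging {
     // A concrete trait method. scalac 2.11 moves this body into a synthetic static
     // helper class named GreetingLogging$class; scalac 2.12 emits it as a default
     // method on the interface and never generates GreetingLogging$class at all.
     def logInfo(msg: => String): Unit = println(s"INFO $msg")
   }
   
   // A class compiled with scalac 2.11 initializes the trait by calling
   // GreetingLogging$class.$init$(this) and forwards logInfo to
   // GreetingLogging$class.logInfo(this, msg). Run that 2.11 bytecode on a Scala 2.12
   // classpath (Spark 3.x) and the helper class is simply missing, which is exactly
   // the NoClassDefFoundError / ClassNotFoundException for
   // org.apache.spark.internal.Logging$class shown in the log above.
   class ConsoleDemoSink extends GreetingLogging {
     def output(row: String): Unit = logInfo(s"console sink received: $row")
   }
   
   object ConsoleDemoSink {
     def main(args: Array[String]): Unit = new ConsoleDemoSink().output("hello")
   }
   ```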
   
   ### SeaTunnel Version
   
   v2.0.5, built from the dev branch (HotSpot JDK 1.8.0_202)
   
   ### SeaTunnel Config
   
   ```conf
   env {
     spark.driver.host = "10.199.142.10"
     spark.app.name = "SeaTunnel_test_2.0.5_src_n3__testhost"
     spark.executor.instances = 2
     spark.executor.cores = 1
     spark.executor.memory = "512m"
     spark.sql.catalogImplementation = "hive"
   }
   source {
     hive {
       pre_sql = "select * from t_cicd.import_first__gw_activitylog where dt='2022-02-10'"
       result_table_name = "my_dataset"
     }
   }
   transform {
     # do nothing
   }
   sink {
     Console {}
   }
   ```
   
   
   ### Running Command
   
   ```shell
   seatunnel-dist-2.0.5-SNAPSHOT-2.12.10]$ ./bin/start-seatunnel-spark.sh --master yarn --deploy-mode client --config config/test.sql1.conf
   ```
   
   
   ### Error Exception
   
   ```log
   2022-02-17 17:38:56,469 WARN config.ConfigBuilder: Error when load plugin: [org.apache.seatunnel.spark.sink.Console]
   java.util.ServiceConfigurationError: org.apache.seatunnel.spark.BaseSparkSink: Provider org.apache.seatunnel.spark.sink.Redis could not be instantiated
           at java.util.ServiceLoader.fail(ServiceLoader.java:232)
           at java.util.ServiceLoader.access$100(ServiceLoader.java:185)
           at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:384)
           at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
           at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
           at org.apache.seatunnel.config.ConfigBuilder.createPluginInstanceIgnoreCase(ConfigBuilder.java:137)
           at org.apache.seatunnel.config.ConfigBuilder.lambda$createPlugins$0(ConfigBuilder.java:170)
           at java.util.ArrayList.forEach(ArrayList.java:1257)
           at org.apache.seatunnel.config.ConfigBuilder.createPlugins(ConfigBuilder.java:168)
           at org.apache.seatunnel.Seatunnel.entryPoint(Seatunnel.java:97)
           at org.apache.seatunnel.Seatunnel.run(Seatunnel.java:61)
           at org.apache.seatunnel.SeatunnelSpark.main(SeatunnelSpark.java:29)
           at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
           at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
           at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
           at java.lang.reflect.Method.invoke(Method.java:498)
           at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
           at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:928)
           at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
           at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
           at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
           at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
           at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
           at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   Caused by: java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging$class
           at org.apache.seatunnel.spark.sink.Redis.<init>(Redis.scala:31)
           at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
           at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
           at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
           at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
           at java.lang.Class.newInstance(Class.java:442)
           at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380)
           ... 21 more
   Caused by: java.lang.ClassNotFoundException: org.apache.spark.internal.Logging$class
           at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
           at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
           at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
           ... 28 more
   ```
   
   
   ### Flink or Spark Version
   
   - spark-3.0.3-bin-hadoop3.2
   - hadoop-3.2.2
   
   ### Java or Scala Version
   
   _No response_
   
   ### Screenshots
   
   I cannot paste a screenshot from inside my company, sorry.
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of Conduct](https://www.apache.org/foundation/policies/conduct)
   





[GitHub] [incubator-seatunnel] jiefzz commented on issue #1284: [Bug] [seatunnel-core-spark] ClassNotFoundException throw while run with spark3.0.3

Posted by GitBox <gi...@apache.org>.
jiefzz commented on issue #1284:
URL: https://github.com/apache/incubator-seatunnel/issues/1284#issuecomment-1046518116


   Thanks for the reply, @leo65535.
   
   We updated the dependency versions in our local copy like this:
   ```
   diff --git a/pom.xml b/pom.xml
   index 54e46b48..e3eb341d 100644
   --- a/pom.xml
   +++ b/pom.xml
   @@ -63,12 +63,12 @@
        <properties>
            <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
            <java.version>1.8</java.version>
   -        <scala.version>2.11.8</scala.version>
   -        <scala.binary.version>2.11</scala.binary.version>
   +        <scala.version>2.12.10</scala.version>
   +        <scala.binary.version>2.12</scala.binary.version>
            <maven.compiler.source>${java.version}</maven.compiler.source>
            <maven.compiler.target>${java.version}</maven.compiler.target>
   -        <spark.version>2.4.0</spark.version>
   -        <spark.binary.version>2.4</spark.binary.version>
   +        <spark.version>3.0.1</spark.version>
   +        <spark.binary.version>3.0</spark.binary.version>
            <neo4j.connector.spark.version>4.1.0</neo4j.connector.spark.version>
            <flink.version>1.13.5</flink.version>
            <hudi.version>0.10.0</hudi.version>
   @@ -98,11 +98,11 @@
            <flink-shaded-hadoop-2.version>2.7.5-7.0</flink-shaded-hadoop-2.version>
            <parquet-avro.version>1.10.0</parquet-avro.version>
            <transport.version>6.3.1</transport.version>
   -        <elasticsearch-spark.version>6.8.3</elasticsearch-spark.version>
   +<!--        <elasticsearch-spark.version>6.8.3</elasticsearch-spark.version>-->
            <clickhouse-jdbc.version>0.2</clickhouse-jdbc.version>
            <hbase-spark.version>1.0.0</hbase-spark.version>
            <kudu-spark.version>1.7.0</kudu-spark.version>
   -        <mongo-spark.version>2.2.0</mongo-spark.version>
   +        <mongo-spark.version>2.4.1</mongo-spark.version>
            <spark-redis.version>2.6.0</spark-redis.version>
            <commons-lang3.version>3.4</commons-lang3.version>
            <maven-assembly-plugin.version>2.4</maven-assembly-plugin.version>
   ```
   
   It actually works with the demo config file pasted above (select from Hive, no transform actions, and the Console sink to print the result), but I don't know whether this `java.lang.ClassNotFoundException: org.apache.spark.internal.Logging$class` has any side effects or not; the only rough check I could think of is sketched below.
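   
   A throwaway probe (only a sketch, not SeaTunnel API; the class names are taken from the log above) that can be pasted into a spark-shell started with the SeaTunnel plugin jars on its classpath:
   
   ```scala
   // 1) Is the Scala 2.11 trait helper class present at all? On a Spark 3.x /
   //    Scala 2.12 classpath this prints "missing", which by itself is harmless.
   val helper = "org.apache.spark.internal.Logging$class"
   println(
     try { Class.forName(helper); s"$helper present (Scala 2.11 Spark build)" }
     catch { case _: ClassNotFoundException => s"$helper missing (Scala 2.12 Spark build)" })
   
   // 2) Which of the sink providers from the log can be instantiated now? A provider
   //    that still carries Scala 2.11 bytecode fails here the same way it did inside
   //    ServiceLoader, so any remaining failure would point at a jar left un-rebuilt.
   Seq("org.apache.seatunnel.spark.sink.Console",
       "org.apache.seatunnel.spark.sink.Kafka",
       "org.apache.seatunnel.spark.sink.Hive").foreach { name =>
     val outcome =
       try { Class.forName(name).newInstance(); "OK" } // same code path ServiceLoader uses on JDK 8
       catch { case t: Throwable => s"${t.getClass.getName}: ${t.getMessage}" }
     println(s"$name -> $outcome")
   }
   ```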
   
   Can you give me some tips?





[GitHub] [incubator-seatunnel] leo65535 commented on issue #1284: [Bug] [seatunnel-core-spark] ClassNotFoundException throw while run with spark3.0.3

Posted by GitBox <gi...@apache.org>.
leo65535 commented on issue #1284:
URL: https://github.com/apache/incubator-seatunnel/issues/1284#issuecomment-1044242812


   Hi @jiefzz, SeaTunnel doesn't support Spark 3.x currently.

