Posted to commits@seatunnel.apache.org by GitBox <gi...@apache.org> on 2022/08/24 00:36:38 UTC

[GitHub] [incubator-seatunnel] yihui8776 commented on issue #2496: [Bug] docker spark docker-compose up ERROR: services.seatunnel.depends_on contains an invalid type, it should be an array

yihui8776 commented on issue #2496:
URL: https://github.com/apache/incubator-seatunnel/issues/2496#issuecomment-1225028418

   I updated the docker-compose file and started it up, but got another exception:
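   The compose change itself was just switching depends_on to the array (list) form, roughly like this sketch (service names match what shows up in the logs below):

       services:
         seatunnel:
           depends_on:
             - datasource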
   
   
    docker-compose up
   Starting spark_datasource_1 ... done
   Starting spark_seatunnel_1  ... done
   Attaching to spark_datasource_1, spark_seatunnel_1
   seatunnel_1   | Execute SeaTunnel Spark Job: ${SPARK_HOME}/bin/spark-submit --class "org.apache.seatunnel.core.spark.SeatunnelSpark" --name "seatunnel" --master "local[4]" --deploy-mode "client" --conf "spark.streaming.batchDuration=5" --conf "spark.app.name=seatunnel" --conf "spark.ui.port=13000" /seatunnel/lib/seatunnel-core-spark.jar --master local[4] --deploy-mode client --config /application.conf -t
   spark_datasource_1 exited with code 0
   seatunnel_1   | 2022-08-24 00:32:43 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
   seatunnel_1   | 2022-08-24 00:32:43 INFO  ConfigBuilder:54 - Loading config file: /application.conf
   seatunnel_1   | 2022-08-24 00:32:43 INFO  ConfigBuilder:65 - parsed config file: {
   seatunnel_1   |     "env" : {
   seatunnel_1   |         "spark.streaming.batchDuration" : 5,
   seatunnel_1   |         "spark.app.name" : "seatunnel",
   seatunnel_1   |         "spark.ui.port" : 13000
   seatunnel_1   |     },
   seatunnel_1   |     "source" : [
   seatunnel_1   |         {
   seatunnel_1   |             "path" : "/tmp/source",
   seatunnel_1   |             "format" : "text",
   seatunnel_1   |             "result_table_name" : "test",
   seatunnel_1   |             "plugin_name" : "file"
   seatunnel_1   |         }
   seatunnel_1   |     ],
   seatunnel_1   |     "transform" : [
   seatunnel_1   |         {
   seatunnel_1   |             "delimiter" : ",",
   seatunnel_1   |             "fields" : [
   seatunnel_1   |                 "msg",
   seatunnel_1   |                 "name"
   seatunnel_1   |             ],
   seatunnel_1   |             "plugin_name" : "split"
   seatunnel_1   |         }
   seatunnel_1   |     ],
   seatunnel_1   |     "sink" : [
   seatunnel_1   |         {
   seatunnel_1   |             "plugin_name" : "console"
   seatunnel_1   |         }
   seatunnel_1   |     ]
   seatunnel_1   | }
   seatunnel_1   |
   seatunnel_1   | 2022-08-24 00:32:43 INFO  SparkContext:54 - Running Spark version 2.4.0
   seatunnel_1   | 2022-08-24 00:32:43 INFO  SparkContext:54 - Submitted application: seatunnel
   seatunnel_1   | 2022-08-24 00:32:43 INFO  SecurityManager:54 - Changing view acls to: root
   seatunnel_1   | 2022-08-24 00:32:43 INFO  SecurityManager:54 - Changing modify acls to: root
   seatunnel_1   | 2022-08-24 00:32:43 INFO  SecurityManager:54 - Changing view acls groups to:
   seatunnel_1   | 2022-08-24 00:32:43 INFO  SecurityManager:54 - Changing modify acls groups to:
   seatunnel_1   | 2022-08-24 00:32:43 INFO  SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
   seatunnel_1   | 2022-08-24 00:32:43 INFO  Utils:54 - Successfully started service 'sparkDriver' on port 39869.
   seatunnel_1   | 2022-08-24 00:32:43 INFO  SparkEnv:54 - Registering MapOutputTracker
   seatunnel_1   | 2022-08-24 00:32:43 INFO  SparkEnv:54 - Registering BlockManagerMaster
   seatunnel_1   | 2022-08-24 00:32:43 INFO  BlockManagerMasterEndpoint:54 - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
   seatunnel_1   | 2022-08-24 00:32:43 INFO  BlockManagerMasterEndpoint:54 - BlockManagerMasterEndpoint up
   seatunnel_1   | 2022-08-24 00:32:43 INFO  DiskBlockManager:54 - Created local directory at /tmp/blockmgr-d8ef9afb-34d9-4def-9242-8062271d871b
   seatunnel_1   | 2022-08-24 00:32:44 INFO  MemoryStore:54 - MemoryStore started with capacity 366.3 MB
   seatunnel_1   | 2022-08-24 00:32:44 INFO  SparkEnv:54 - Registering OutputCommitCoordinator
   seatunnel_1   | 2022-08-24 00:32:44 INFO  log:192 - Logging initialized @1402ms
   seatunnel_1   | 2022-08-24 00:32:44 INFO  Server:351 - jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
   seatunnel_1   | 2022-08-24 00:32:44 INFO  Server:419 - Started @1461ms
   seatunnel_1   | 2022-08-24 00:32:44 INFO  AbstractConnector:278 - Started ServerConnector@64d43929{HTTP/1.1,[http/1.1]}{0.0.0.0:13000}
   seatunnel_1   | 2022-08-24 00:32:44 INFO  Utils:54 - Successfully started service 'SparkUI' on port 13000.
   seatunnel_1   | 2022-08-24 00:32:44 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@22d6cac2{/jobs,null,AVAILABLE,@Spark}
   seatunnel_1   | 2022-08-24 00:32:44 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@2fa3be26{/jobs/json,null,AVAILABLE,@Spark}
   seatunnel_1   | 2022-08-24 00:32:44 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4287d447{/jobs/job,null,AVAILABLE,@Spark}
   seatunnel_1   | 2022-08-24 00:32:44 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4e6d7365{/jobs/job/json,null,AVAILABLE,@Spark}
   seatunnel_1   | 2022-08-24 00:32:44 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@7c0da600{/stages,null,AVAILABLE,@Spark}
   seatunnel_1   | 2022-08-24 00:32:44 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@d4602a{/stages/json,null,AVAILABLE,@Spark}
   seatunnel_1   | 2022-08-24 00:32:44 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@21ae6e73{/stages/stage,null,AVAILABLE,@Spark}
   seatunnel_1   | 2022-08-24 00:32:44 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@36a7abe1{/stages/stage/json,null,AVAILABLE,@Spark}
   seatunnel_1   | 2022-08-24 00:32:44 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@64a896b0{/stages/pool,null,AVAILABLE,@Spark}
   seatunnel_1   | 2022-08-24 00:32:44 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@e044b4a{/stages/pool/json,null,AVAILABLE,@Spark}
   seatunnel_1   | 2022-08-24 00:32:44 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@11a82d0f{/storage,null,AVAILABLE,@Spark}
   seatunnel_1   | 2022-08-24 00:32:44 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1adb7478{/storage/json,null,AVAILABLE,@Spark}
   seatunnel_1   | 2022-08-24 00:32:44 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3ae66c85{/storage/rdd,null,AVAILABLE,@Spark}
   seatunnel_1   | 2022-08-24 00:32:44 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@16943e88{/storage/rdd/json,null,AVAILABLE,@Spark}
   seatunnel_1   | 2022-08-24 00:32:44 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4604b900{/environment,null,AVAILABLE,@Spark}
   seatunnel_1   | 2022-08-24 00:32:44 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@73d6d0c{/environment/json,null,AVAILABLE,@Spark}
   seatunnel_1   | 2022-08-24 00:32:44 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@e36bb2a{/executors,null,AVAILABLE,@Spark}
   seatunnel_1   | 2022-08-24 00:32:44 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3961a41a{/executors/json,null,AVAILABLE,@Spark}
   seatunnel_1   | 2022-08-24 00:32:44 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5a4ed68f{/executors/threadDump,null,AVAILABLE,@Spark}
   seatunnel_1   | 2022-08-24 00:32:44 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@367795c7{/executors/threadDump/json,null,AVAILABLE,@Spark}
   seatunnel_1   | 2022-08-24 00:32:44 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@d2387c8{/static,null,AVAILABLE,@Spark}
   seatunnel_1   | 2022-08-24 00:32:44 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@538cd0f2{/,null,AVAILABLE,@Spark}
   seatunnel_1   | 2022-08-24 00:32:44 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@238ad8c{/api,null,AVAILABLE,@Spark}
   seatunnel_1   | 2022-08-24 00:32:44 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@654c1a54{/jobs/job/kill,null,AVAILABLE,@Spark}
   seatunnel_1   | 2022-08-24 00:32:44 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5bdaf2ce{/stages/stage/kill,null,AVAILABLE,@Spark}
   seatunnel_1   | 2022-08-24 00:32:44 INFO  SparkUI:54 - Bound SparkUI to 0.0.0.0, and started at http://bb8919f4e108:13000
   seatunnel_1   | 2022-08-24 00:32:44 INFO  SparkContext:54 - Added JAR file:/seatunnel/lib/seatunnel-core-spark.jar at spark://bb8919f4e108:39869/jars/seatunnel-core-spark.jar with timestamp 1661301164210
   seatunnel_1   | 2022-08-24 00:32:44 INFO  Executor:54 - Starting executor ID driver on host localhost
   seatunnel_1   | 2022-08-24 00:32:44 INFO  Utils:54 - Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 40693.
   seatunnel_1   | 2022-08-24 00:32:44 INFO  NettyBlockTransferService:54 - Server created on bb8919f4e108:40693
   seatunnel_1   | 2022-08-24 00:32:44 INFO  BlockManager:54 - Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
   seatunnel_1   | 2022-08-24 00:32:44 INFO  BlockManagerMaster:54 - Registering BlockManager BlockManagerId(driver, bb8919f4e108, 40693, None)
   seatunnel_1   | 2022-08-24 00:32:44 INFO  BlockManagerMasterEndpoint:54 - Registering block manager bb8919f4e108:40693 with 366.3 MB RAM, BlockManagerId(driver, bb8919f4e108, 40693, None)
   seatunnel_1   | 2022-08-24 00:32:44 INFO  BlockManagerMaster:54 - Registered BlockManager BlockManagerId(driver, bb8919f4e108, 40693, None)
   seatunnel_1   | 2022-08-24 00:32:44 INFO  BlockManager:54 - Initialized BlockManager: BlockManagerId(driver, bb8919f4e108, 40693, None)
   seatunnel_1   | 2022-08-24 00:32:44 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@35ff8fc9{/metrics/json,null,AVAILABLE,@Spark}
   seatunnel_1   | 2022-08-24 00:32:44 INFO  AbstractPluginDiscovery:77 - Load BaseSparkSource Plugin from /seatunnel/connectors/spark
   seatunnel_1   | 2022-08-24 00:32:44 INFO  AbstractPluginDiscovery:77 - Load BaseSparkTransform Plugin from /seatunnel/connectors/spark
   seatunnel_1   | 2022-08-24 00:32:44 INFO  AbstractPluginDiscovery:77 - Load BaseSparkSink Plugin from /seatunnel/connectors/spark
   seatunnel_1   | 2022-08-24 00:32:44 INFO  SparkEnvironment:93 - register plugins :[]
   seatunnel_1   | 2022-08-24 00:32:44 ERROR Seatunnel:61 -
   seatunnel_1   |
   seatunnel_1   | ===============================================================================
   seatunnel_1   |
   seatunnel_1   |
   seatunnel_1   | 2022-08-24 00:32:44 ERROR Seatunnel:64 - Fatal Error,
   seatunnel_1   |
   seatunnel_1   | 2022-08-24 00:32:44 ERROR Seatunnel:66 - Please submit bug report in https://github.com/apache/incubator-seatunnel/issues
   seatunnel_1   |
   seatunnel_1   | 2022-08-24 00:32:44 ERROR Seatunnel:68 - Reason:Plugin PluginIdentifier{engineType='spark', pluginType='source', pluginName='file'} not found.
   seatunnel_1   |
   seatunnel_1   | 2022-08-24 00:32:44 ERROR Seatunnel:69 - Exception StackTrace:java.lang.RuntimeException: Plugin PluginIdentifier{engineType='spark', pluginType='source', pluginName='file'} not found.
   seatunnel_1   |         at org.apache.seatunnel.plugin.discovery.AbstractPluginDiscovery.createPluginInstance(AbstractPluginDiscovery.java:122)
   seatunnel_1   |         at org.apache.seatunnel.core.spark.config.SparkExecutionContext.lambda$getSources$0(SparkExecutionContext.java:67)
   seatunnel_1   |         at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
   seatunnel_1   |         at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1384)
   seatunnel_1   |         at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482)
   seatunnel_1   |         at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472)
   seatunnel_1   |         at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
   seatunnel_1   |         at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
   seatunnel_1   |         at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:566)
   seatunnel_1   |         at org.apache.seatunnel.core.spark.config.SparkExecutionContext.getSources(SparkExecutionContext.java:70)
   seatunnel_1   |         at org.apache.seatunnel.core.spark.command.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:55)
   seatunnel_1   |         at org.apache.seatunnel.core.base.Seatunnel.run(Seatunnel.java:40)
   seatunnel_1   |         at org.apache.seatunnel.core.spark.SeatunnelSpark.main(SeatunnelSpark.java:33)
   seatunnel_1   |         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   seatunnel_1   |         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
   seatunnel_1   |         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   seatunnel_1   |         at java.lang.reflect.Method.invoke(Method.java:498)
   seatunnel_1   |         at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
   seatunnel_1   |         at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
   seatunnel_1   |         at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
   seatunnel_1   |         at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
   seatunnel_1   |         at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
   seatunnel_1   |         at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
   seatunnel_1   |         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
   seatunnel_1   |         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   seatunnel_1   |
   seatunnel_1   | 2022-08-24 00:32:44 ERROR Seatunnel:70 -
   seatunnel_1   | ===============================================================================
   seatunnel_1   |
   seatunnel_1   |
   seatunnel_1   |
   seatunnel_1   | Exception in thread "main" java.lang.RuntimeException: Plugin PluginIdentifier{engineType='spark', pluginType='source', pluginName='file'} not found.
   seatunnel_1   |         at org.apache.seatunnel.plugin.discovery.AbstractPluginDiscovery.createPluginInstance(AbstractPluginDiscovery.java:122)
   seatunnel_1   |         at org.apache.seatunnel.core.spark.config.SparkExecutionContext.lambda$getSources$0(SparkExecutionContext.java:67)
   seatunnel_1   |         at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
   seatunnel_1   |         at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1384)
   seatunnel_1   |         at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482)
   seatunnel_1   |         at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472)
   seatunnel_1   |         at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
   seatunnel_1   |         at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
   seatunnel_1   |         at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:566)
   seatunnel_1   |         at org.apache.seatunnel.core.spark.config.SparkExecutionContext.getSources(SparkExecutionContext.java:70)
   seatunnel_1   |         at org.apache.seatunnel.core.spark.command.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:55)
   seatunnel_1   |         at org.apache.seatunnel.core.base.Seatunnel.run(Seatunnel.java:40)
   seatunnel_1   |         at org.apache.seatunnel.core.spark.SeatunnelSpark.main(SeatunnelSpark.java:33)
   seatunnel_1   |         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   seatunnel_1   |         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
   seatunnel_1   |         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   seatunnel_1   |         at java.lang.reflect.Method.invoke(Method.java:498)
   seatunnel_1   |         at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
   seatunnel_1   |         at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
   seatunnel_1   |         at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
   seatunnel_1   |         at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
   seatunnel_1   |         at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
   seatunnel_1   |         at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
   seatunnel_1   |         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
   seatunnel_1   |         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   seatunnel_1   | 2022-08-24 00:32:44 INFO  SparkContext:54 - Invoking stop() from shutdown hook
   seatunnel_1   | 2022-08-24 00:32:44 INFO  AbstractConnector:318 - Stopped Spark@64d43929{HTTP/1.1,[http/1.1]}{0.0.0.0:13000}
   seatunnel_1   | 2022-08-24 00:32:44 INFO  SparkUI:54 - Stopped Spark web UI at http://bb8919f4e108:13000
   seatunnel_1   | 2022-08-24 00:32:44 INFO  MapOutputTrackerMasterEndpoint:54 - MapOutputTrackerMasterEndpoint stopped!
   seatunnel_1   | 2022-08-24 00:32:44 INFO  MemoryStore:54 - MemoryStore cleared
   seatunnel_1   | 2022-08-24 00:32:44 INFO  BlockManager:54 - BlockManager stopped
   seatunnel_1   | 2022-08-24 00:32:44 INFO  BlockManagerMaster:54 - BlockManagerMaster stopped
   seatunnel_1   | 2022-08-24 00:32:44 INFO  OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:54 - OutputCommitCoordinator stopped!
   seatunnel_1   | 2022-08-24 00:32:44 INFO  SparkContext:54 - Successfully stopped SparkContext
   seatunnel_1   | 2022-08-24 00:32:44 INFO  ShutdownHookManager:54 - Shutdown hook called
   seatunnel_1   | 2022-08-24 00:32:44 INFO  ShutdownHookManager:54 - Deleting directory /tmp/spark-37613600-1da8-4fc4-9485-a8b32d082e43
   seatunnel_1   | 2022-08-24 00:32:44 INFO  ShutdownHookManager:54 - Deleting directory /tmp/spark-8aa66517-49d0-4537-b32a-51a2ba5ef41a  
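   Judging from the "register plugins :[]" line above, it looks like nothing was discovered under /seatunnel/connectors/spark, which might explain why the file source plugin cannot be found (either the connector jar is missing from that directory or the plugin name does not match what is on disk). A quick way to check, just a sketch assuming the seatunnel service from the compose file, is something like:

       docker-compose run --rm --entrypoint ls seatunnel /seatunnel/connectors/spark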


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@seatunnel.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org