Posted to commits@seatunnel.apache.org by GitBox <gi...@apache.org> on 2021/12/29 09:06:17 UTC

[GitHub] [incubator-seatunnel] leo65535 edited a comment on pull request #885: [SeaTunnel #884] Fix NullPointerException caused by engine type

leo65535 edited a comment on pull request #885:
URL: https://github.com/apache/incubator-seatunnel/pull/885#issuecomment-1002470990


   hi @simon824, here is the latest result
   
   ### command
   ```
   ./bin/start-seatunnel-spark.sh -c config/application.conf -e client -m local -i app=test -t
   ```
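   
   (For reference, the raw `config/application.conf` isn't shown here; based on the parsed output below, the transform block presumably looks roughly like this, though the exact HOCON layout may differ across versions:)
   
   ```
   transform {
     sql {
       # registered table to query
       table_name = "user_view"
       # the query concatenates the optional variables ${?table_name} and ${?sql};
       # they stay unresolved unless values are supplied for them
       sql = ${?table_name}"select * from user_view where city = '"${?sql}"'"
     }
   }
   ```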
   
   ### result
   ```
   [dcadmin@dcadmin seatunnel-dist-2.0.5-SNAPSHOT-2.11.8]$ vi config/application.conf 
   [dcadmin@dcadmin seatunnel-dist-2.0.5-SNAPSHOT-2.11.8]$ ./bin/start-seatunnel-spark.sh -c config/application.conf -e client -m local -i app=test -t
   
   [INFO] spark conf: --conf "spark.app.name=seatunnel"
   Warning: Ignoring non-spark config property: "spark.app.name=seatunnel"
   2021-12-29 15:31:55 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
   2021-12-29 15:31:55 INFO  ConfigBuilder:78 - Loading config file: config/application.conf
   2021-12-29 15:31:55 INFO  ConfigBuilder:89 - parsed config file: {
       "env" : {
           "spark.app.name" : "seatunnel"
       },
       "source" : [
           {
               "result_table_name" : "my_dataset",
               "plugin_name" : "Fake"
           }
       ],
       "transform" : [
           {
               "plugin_name" : "sql",
               "table_name" : "user_view",
               "sql" : ${?table_name}"select * from user_view where city = '"${?sql}"'"
           }
       ],
       "sink" : [
           {
               "plugin_name" : "Console"
           }
       ]
   }
   
   2021-12-29 15:31:55 INFO  SparkContext:54 - Running Spark version 2.4.0
   2021-12-29 15:31:55 INFO  SparkContext:54 - Submitted application: seatunnel
   2021-12-29 15:31:55 INFO  SecurityManager:54 - Changing view acls to: dcadmin
   2021-12-29 15:31:55 INFO  SecurityManager:54 - Changing modify acls to: dcadmin
   2021-12-29 15:31:55 INFO  SecurityManager:54 - Changing view acls groups to: 
   2021-12-29 15:31:55 INFO  SecurityManager:54 - Changing modify acls groups to: 
   2021-12-29 15:31:55 INFO  SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(dcadmin); groups with view permissions: Set(); users  with modify permissions: Set(dcadmin); groups with modify permissions: Set()
   2021-12-29 15:31:56 INFO  Utils:54 - Successfully started service 'sparkDriver' on port 44517.
   2021-12-29 15:31:56 INFO  SparkEnv:54 - Registering MapOutputTracker
   2021-12-29 15:31:56 INFO  SparkEnv:54 - Registering BlockManagerMaster
   2021-12-29 15:31:56 INFO  BlockManagerMasterEndpoint:54 - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
   2021-12-29 15:31:56 INFO  BlockManagerMasterEndpoint:54 - BlockManagerMasterEndpoint up
   2021-12-29 15:31:56 INFO  DiskBlockManager:54 - Created local directory at /tmp/blockmgr-9bc23ab1-2048-4316-a7ca-fd88799ab719
   2021-12-29 15:31:56 INFO  MemoryStore:54 - MemoryStore started with capacity 366.3 MB
   2021-12-29 15:31:56 INFO  SparkEnv:54 - Registering OutputCommitCoordinator
   2021-12-29 15:31:56 INFO  log:192 - Logging initialized @3512ms
   2021-12-29 15:31:56 INFO  Server:351 - jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
   2021-12-29 15:31:56 INFO  Server:419 - Started @3615ms
   2021-12-29 15:31:56 INFO  AbstractConnector:278 - Started ServerConnector@6079cf5{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
   2021-12-29 15:31:56 INFO  Utils:54 - Successfully started service 'SparkUI' on port 4040.
   2021-12-29 15:31:56 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6edaa77a{/jobs,null,AVAILABLE,@Spark}
   2021-12-29 15:31:56 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@749f539e{/jobs/json,null,AVAILABLE,@Spark}
   2021-12-29 15:31:56 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5ca1f591{/jobs/job,null,AVAILABLE,@Spark}
   2021-12-29 15:31:56 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6ef81f31{/jobs/job/json,null,AVAILABLE,@Spark}
   2021-12-29 15:31:56 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6075b2d3{/stages,null,AVAILABLE,@Spark}
   2021-12-29 15:31:56 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@33abde31{/stages/json,null,AVAILABLE,@Spark}
   2021-12-29 15:31:56 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@997d532{/stages/stage,null,AVAILABLE,@Spark}
   2021-12-29 15:31:56 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@7a18e8d{/stages/stage/json,null,AVAILABLE,@Spark}
   2021-12-29 15:31:56 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3028e50e{/stages/pool,null,AVAILABLE,@Spark}
   2021-12-29 15:31:56 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5560bcdf{/stages/pool/json,null,AVAILABLE,@Spark}
   2021-12-29 15:31:56 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@b558294{/storage,null,AVAILABLE,@Spark}
   2021-12-29 15:31:56 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@bb095{/storage/json,null,AVAILABLE,@Spark}
   2021-12-29 15:31:56 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@777c350f{/storage/rdd,null,AVAILABLE,@Spark}
   2021-12-29 15:31:56 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@27aae97b{/storage/rdd/json,null,AVAILABLE,@Spark}
   2021-12-29 15:31:56 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4c9e38{/environment,null,AVAILABLE,@Spark}
   2021-12-29 15:31:56 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5d1e09bc{/environment/json,null,AVAILABLE,@Spark}
   2021-12-29 15:31:56 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4bdc8b5d{/executors,null,AVAILABLE,@Spark}
   2021-12-29 15:31:56 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3bcd426c{/executors/json,null,AVAILABLE,@Spark}
   2021-12-29 15:31:56 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5f14a673{/executors/threadDump,null,AVAILABLE,@Spark}
   2021-12-29 15:31:56 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@726a17c4{/executors/threadDump/json,null,AVAILABLE,@Spark}
   2021-12-29 15:31:56 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5dc3fcb7{/static,null,AVAILABLE,@Spark}
   2021-12-29 15:31:56 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6b3871d6{/,null,AVAILABLE,@Spark}
   2021-12-29 15:31:56 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@37095ded{/api,null,AVAILABLE,@Spark}
   2021-12-29 15:31:56 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@7923f5b3{/jobs/job/kill,null,AVAILABLE,@Spark}
   2021-12-29 15:31:56 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6b63d445{/stages/stage/kill,null,AVAILABLE,@Spark}
   2021-12-29 15:31:56 INFO  SparkUI:54 - Bound SparkUI to 0.0.0.0, and started at http://dcadmin.work:4040
   2021-12-29 15:31:56 INFO  SparkContext:54 - Added JAR file:///work/projects/opensource/seatunnel/seatunnel-dist/seatunnel-dist-2.0.5-SNAPSHOT-2.11.8/lib/seatunnel-core-flink.jar at spark://dcadmin.work:44517/jars/seatunnel-core-flink.jar with timestamp 1640763116838
   2021-12-29 15:31:56 INFO  SparkContext:54 - Added JAR file:///work/projects/opensource/seatunnel/seatunnel-dist/seatunnel-dist-2.0.5-SNAPSHOT-2.11.8/lib/seatunnel-core-spark.jar at spark://dcadmin.work:44517/jars/seatunnel-core-spark.jar with timestamp 1640763116839
   2021-12-29 15:31:56 INFO  SparkContext:54 - Added JAR file:///work/projects/opensource/seatunnel/seatunnel-dist/seatunnel-dist-2.0.5-SNAPSHOT-2.11.8/lib/seatunnel-core-sql-2.0.5-SNAPSHOT-2.11.8.jar at spark://dcadmin.work:44517/jars/seatunnel-core-sql-2.0.5-SNAPSHOT-2.11.8.jar with timestamp 1640763116840
   2021-12-29 15:31:56 WARN  SparkContext:66 - The jar file:/work/projects/opensource/seatunnel/seatunnel-dist/seatunnel-dist-2.0.5-SNAPSHOT-2.11.8/lib/seatunnel-core-spark.jar has been added already. Overwriting of added jars is not supported in the current version.
   2021-12-29 15:31:56 INFO  Executor:54 - Starting executor ID driver on host localhost
   2021-12-29 15:31:57 INFO  Utils:54 - Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 40581.
   2021-12-29 15:31:57 INFO  NettyBlockTransferService:54 - Server created on dcadmin.work:40581
   2021-12-29 15:31:57 INFO  BlockManager:54 - Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
   2021-12-29 15:31:57 INFO  BlockManagerMaster:54 - Registering BlockManager BlockManagerId(driver, dcadmin.work, 40581, None)
   2021-12-29 15:31:57 INFO  BlockManagerMasterEndpoint:54 - Registering block manager dcadmin.work:40581 with 366.3 MB RAM, BlockManagerId(driver, dcadmin.work, 40581, None)
   2021-12-29 15:31:57 INFO  BlockManagerMaster:54 - Registered BlockManager BlockManagerId(driver, dcadmin.work, 40581, None)
   2021-12-29 15:31:57 INFO  BlockManager:54 - Initialized BlockManager: BlockManagerId(driver, dcadmin.work, 40581, None)
   2021-12-29 15:31:57 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@77a2aa4a{/metrics/json,null,AVAILABLE,@Spark}
   2021-12-29 15:31:57 WARN  StreamingContext:66 - spark.master should be set as local[n], n > 1 in local mode if you have receivers to get data, otherwise Spark jobs will not get resources to process the received data.
   2021-12-29 15:31:57 ERROR Seatunnel:114 - Plugin[org.apache.seatunnel.spark.transform.Sql] contains invalid config, error: need to Config#resolve(), see the API docs for Config#resolve(); substitution not resolved: ConfigConcatenation(${?table_name}"select * from user_view where city = '"${?sql}"'") 
   
   2021-12-29 15:31:57 INFO  SparkContext:54 - Invoking stop() from shutdown hook
   2021-12-29 15:31:57 INFO  AbstractConnector:318 - Stopped Spark@6079cf5{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
   2021-12-29 15:31:57 INFO  SparkUI:54 - Stopped Spark web UI at http://dcadmin.work:4040
   2021-12-29 15:31:57 INFO  MapOutputTrackerMasterEndpoint:54 - MapOutputTrackerMasterEndpoint stopped!
   2021-12-29 15:31:57 INFO  MemoryStore:54 - MemoryStore cleared
   2021-12-29 15:31:57 INFO  BlockManager:54 - BlockManager stopped
   2021-12-29 15:31:57 INFO  BlockManagerMaster:54 - BlockManagerMaster stopped
   2021-12-29 15:31:57 INFO  OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:54 - OutputCommitCoordinator stopped!
   2021-12-29 15:31:57 INFO  SparkContext:54 - Successfully stopped SparkContext
   2021-12-29 15:31:57 INFO  ShutdownHookManager:54 - Shutdown hook called
   2021-12-29 15:31:57 INFO  ShutdownHookManager:54 - Deleting directory /tmp/spark-3f3ab240-66f9-4ab3-baad-050542b74e86
   2021-12-29 15:31:57 INFO  ShutdownHookManager:54 - Deleting directory /tmp/spark-4231815e-59e3-49fe-a258-7d41780e030f
   ```
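   
   So the run ends with a config error from Typesafe Config rather than the NullPointerException this PR fixes: the `sql` value still contains the unresolved substitutions `${?table_name}` and `${?sql}` when the Sql plugin's config is validated (i.e. `Config#resolve()` hasn't been applied to that value, and no values were supplied for those variables). Assuming the script accepts multiple `-i key=value` pairs and they're picked up at config-parse time, supplying them might be enough to get past this, e.g.:
   
   ```
   # hypothetical variable values, only to satisfy the two placeholders in the transform block
   ./bin/start-seatunnel-spark.sh -c config/application.conf -e client -m local \
     -i app=test -i table_name=user_view -i sql=beijing -t
   ```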


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@seatunnel.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org