Posted to commits@seatunnel.apache.org by GitBox <gi...@apache.org> on 2022/05/12 07:34:49 UTC
[GitHub] [incubator-seatunnel] vegastar002 commented on issue #1858: v2.1.1 run say no driver
vegastar002 commented on issue #1858:
URL: https://github.com/apache/incubator-seatunnel/issues/1858#issuecomment-1124634960
And here is another error I hit:
### my.conf
Path: /opt/apache-seatunnel-incubating-2.1.1/config/my.conf
```
env {
  # You can set spark configuration here
  # see available properties defined by spark: https://spark.apache.org/docs/latest/configuration.html#available-properties
  spark.app.name = "SeaTunnel"
  spark.executor.instances = 2
  spark.executor.cores = 1
  spark.executor.memory = "1g"
}

source {
  jdbc {
    driver = "org.postgresql.Driver"
    url = "jdbc:postgresql://192.168.9.97:25432/postgres"
    table = "ds_data_source_config"
    result_table_name = "ds_data_source_config_log"
    user = "postgres"
    password = "12qwaszx"
  }
}

transform {
  sql {
    sql = "select id, name from postgres.public.ds_data_source_config"
  }
}

sink {
  clickhouse {
    host = "192.168.9.103:8123"
    clickhouse.socket_timeout = 50000
    database = "default"
    table = "dw2"
    fields = ["id","name"]
    username = "root"
    password = "1234567"
    bulk_size = 20000
  }
}
```
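For reference, a sketch of a corrected transform block (an assumption based on how SeaTunnel documents the jdbc source: it registers its output as a Spark temp view under `result_table_name`). The `sql` transform is executed by Spark SQL, which resolves temp views rather than the Postgres catalog, so the query would reference the view name instead of the schema-qualified Postgres table:

```
transform {
  sql {
    # Hypothetical fix: query the temp view registered by the source
    # ("result_table_name" above), not the Postgres-qualified table name.
    sql = "select id, name from ds_data_source_config_log"
  }
}
```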
### run log
```
2022-05-12 15:29:03 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2022-05-12 15:29:03 INFO ConfigBuilder:59 - Loading config file: ./config/my.conf
2022-05-12 15:29:03 INFO ConfigBuilder:70 - parsed config file: {
  "env" : {
    "spark.app.name" : "SeaTunnel",
    "spark.executor.instances" : 2,
    "spark.executor.cores" : 1,
    "spark.executor.memory" : "1g"
  },
  "source" : [
    {
      "password" : "12qwaszx",
      "driver" : "org.postgresql.Driver",
      "result_table_name" : "ds_data_source_config_log",
      "plugin_name" : "jdbc",
      "user" : "postgres",
      "url" : "jdbc:postgresql://192.168.9.97:25432/postgres",
      "table" : "ds_data_source_config"
    }
  ],
  "transform" : [
    {
      "plugin_name" : "sql",
      "sql" : "select id, name from postgres.public.ds_data_source_config"
    }
  ],
  "sink" : [
    {
      "database" : "default",
      "password" : "1234567",
      "clickhouse.socket_timeout" : 50000,
      "host" : "192.168.9.103:8123",
      "bulk_size" : 20000,
      "fields" : [
        "id",
        "name"
      ],
      "plugin_name" : "clickhouse",
      "table" : "dw2",
      "username" : "root"
    }
  ]
}
2022-05-12 15:29:03 INFO SparkContext:54 - Running Spark version 2.4.0
2022-05-12 15:29:03 INFO SparkContext:54 - Submitted application: SeaTunnel
2022-05-12 15:29:03 INFO SecurityManager:54 - Changing view acls to: root
2022-05-12 15:29:03 INFO SecurityManager:54 - Changing modify acls to: root
2022-05-12 15:29:03 INFO SecurityManager:54 - Changing view acls groups to:
2022-05-12 15:29:03 INFO SecurityManager:54 - Changing modify acls groups to:
2022-05-12 15:29:03 INFO SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
2022-05-12 15:29:04 INFO Utils:54 - Successfully started service 'sparkDriver' on port 43662.
2022-05-12 15:29:04 INFO SparkEnv:54 - Registering MapOutputTracker
2022-05-12 15:29:04 INFO SparkEnv:54 - Registering BlockManagerMaster
2022-05-12 15:29:04 INFO BlockManagerMasterEndpoint:54 - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2022-05-12 15:29:04 INFO BlockManagerMasterEndpoint:54 - BlockManagerMasterEndpoint up
2022-05-12 15:29:04 INFO DiskBlockManager:54 - Created local directory at /tmp/blockmgr-b3534c8a-00b6-41ff-9e0b-08641cee5b5e
2022-05-12 15:29:04 INFO MemoryStore:54 - MemoryStore started with capacity 366.3 MB
2022-05-12 15:29:04 INFO SparkEnv:54 - Registering OutputCommitCoordinator
2022-05-12 15:29:04 INFO log:192 - Logging initialized @2632ms
2022-05-12 15:29:04 INFO Server:351 - jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
2022-05-12 15:29:04 INFO Server:419 - Started @2750ms
2022-05-12 15:29:04 INFO AbstractConnector:278 - Started ServerConnector@1500e009{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2022-05-12 15:29:04 INFO Utils:54 - Successfully started service 'SparkUI' on port 4040.
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6f815e7f{/jobs,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1ad926d3{/jobs/json,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3a43d133{/jobs/job,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5f2afe62{/jobs/job/json,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@c96a4ea{/stages,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@28782602{/stages/json,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@60c16548{/stages/stage,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@38b972d7{/stages/stage/json,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5339bbad{/stages/pool,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3935e9a8{/stages/pool/json,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@288a4658{/storage,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5b56b654{/storage/json,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@452c8a40{/storage/rdd,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@534243e4{/storage/rdd/json,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@29006752{/environment,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@470a9030{/environment/json,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@66d57c1b{/executors,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@27494e46{/executors/json,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@d59970a{/executors/threadDump,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1e411d81{/executors/threadDump/json,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@53b98ff6{/static,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@31611954{/,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3e598df9{/api,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@42cc13a0{/jobs/job/kill,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@32fdec40{/stages/stage/kill,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO SparkUI:54 - Bound SparkUI to 0.0.0.0, and started at http://keep1:4040
2022-05-12 15:29:04 INFO SparkContext:54 - Added JAR file:/opt/apache-seatunnel-incubating-2.1.1/lib/seatunnel-core-spark.jar at spark://keep1:43662/jars/seatunnel-core-spark.jar with timestamp 1652340544644
2022-05-12 15:29:04 INFO Executor:54 - Starting executor ID driver on host localhost
2022-05-12 15:29:04 INFO Utils:54 - Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 46722.
2022-05-12 15:29:04 INFO NettyBlockTransferService:54 - Server created on keep1:46722
2022-05-12 15:29:04 INFO BlockManager:54 - Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
2022-05-12 15:29:04 INFO BlockManagerMaster:54 - Registering BlockManager BlockManagerId(driver, keep1, 46722, None)
2022-05-12 15:29:04 INFO BlockManagerMasterEndpoint:54 - Registering block manager keep1:46722 with 366.3 MB RAM, BlockManagerId(driver, keep1, 46722, None)
2022-05-12 15:29:04 INFO BlockManagerMaster:54 - Registered BlockManager BlockManagerId(driver, keep1, 46722, None)
2022-05-12 15:29:04 INFO BlockManager:54 - Initialized BlockManager: BlockManagerId(driver, keep1, 46722, None)
2022-05-12 15:29:05 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6fc3e1a4{/metrics/json,null,AVAILABLE,@Spark}
2022-05-12 15:29:05 INFO ClickHouseDriver:49 - Driver registered
2022-05-12 15:29:06 INFO AsciiArtUtils:69 - [SeaTunnel ASCII-art startup banner; alignment lost in the plain-text archive]
2022-05-12 15:29:06 INFO ExecutionFactory:80 - current execution is [org.apache.seatunnel.spark.batch.SparkBatchExecution]
2022-05-12 15:29:06 INFO SharedState:54 - Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/opt/apache-seatunnel-incubating-2.1.1/spark-warehouse/').
2022-05-12 15:29:06 INFO SharedState:54 - Warehouse path is 'file:/opt/apache-seatunnel-incubating-2.1.1/spark-warehouse/'.
2022-05-12 15:29:06 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4af70944{/SQL,null,AVAILABLE,@Spark}
2022-05-12 15:29:06 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@35267fd4{/SQL/json,null,AVAILABLE,@Spark}
2022-05-12 15:29:06 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@42373389{/SQL/execution,null,AVAILABLE,@Spark}
2022-05-12 15:29:06 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@9b21bd3{/SQL/execution/json,null,AVAILABLE,@Spark}
2022-05-12 15:29:06 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@7903d448{/static/sql,null,AVAILABLE,@Spark}
2022-05-12 15:29:06 INFO StateStoreCoordinatorRef:54 - Registered StateStoreCoordinator endpoint
2022-05-12 15:29:07 INFO Version:133 - Elasticsearch Hadoop v6.8.3 [8a5f44bf7d]
2022-05-12 15:29:11 ERROR Seatunnel:69 -
===============================================================================
2022-05-12 15:29:11 ERROR Seatunnel:72 - Fatal Error,
2022-05-12 15:29:11 ERROR Seatunnel:74 - Please submit bug report in https://github.com/apache/incubator-seatunnel/issues
2022-05-12 15:29:11 ERROR Seatunnel:76 - Reason:Execute Spark task error
2022-05-12 15:29:11 ERROR Seatunnel:77 - Exception StackTrace:java.lang.RuntimeException: Execute Spark task error
    at org.apache.seatunnel.command.spark.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:61)
    at org.apache.seatunnel.command.spark.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:36)
    at org.apache.seatunnel.Seatunnel.run(Seatunnel.java:48)
    at org.apache.seatunnel.SeatunnelSpark.main(SeatunnelSpark.java:27)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: org.apache.spark.sql.catalyst.parser.ParseException:
mismatched input '.' expecting <EOF>(line 1, pos 36)
== SQL ==
select id, name from postgres.public.ds_data_source_config
------------------------------------^^^
    at org.apache.spark.sql.catalyst.parser.ParseException.withCommand(ParseDriver.scala:241)
    at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parse(ParseDriver.scala:117)
    at org.apache.spark.sql.execution.SparkSqlParser.parse(SparkSqlParser.scala:48)
    at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parsePlan(ParseDriver.scala:69)
    at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
    at org.apache.seatunnel.spark.transform.Sql.process(Sql.scala:27)
    at org.apache.seatunnel.spark.SparkEnvironment.transformProcess(SparkEnvironment.java:144)
    at org.apache.seatunnel.spark.batch.SparkBatchExecution.start(SparkBatchExecution.java:50)
    at org.apache.seatunnel.command.spark.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:58)
    ... 15 more
2022-05-12 15:29:11 ERROR Seatunnel:78 -
===============================================================================
Exception in thread "main" java.lang.RuntimeException: Execute Spark task error
    at org.apache.seatunnel.command.spark.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:61)
    at org.apache.seatunnel.command.spark.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:36)
    at org.apache.seatunnel.Seatunnel.run(Seatunnel.java:48)
    at org.apache.seatunnel.SeatunnelSpark.main(SeatunnelSpark.java:27)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: org.apache.spark.sql.catalyst.parser.ParseException:
mismatched input '.' expecting <EOF>(line 1, pos 36)
== SQL ==
select id, name from postgres.public.ds_data_source_config
------------------------------------^^^
    at org.apache.spark.sql.catalyst.parser.ParseException.withCommand(ParseDriver.scala:241)
    at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parse(ParseDriver.scala:117)
    at org.apache.spark.sql.execution.SparkSqlParser.parse(SparkSqlParser.scala:48)
    at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parsePlan(ParseDriver.scala:69)
    at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
    at org.apache.seatunnel.spark.transform.Sql.process(Sql.scala:27)
    at org.apache.seatunnel.spark.SparkEnvironment.transformProcess(SparkEnvironment.java:144)
    at org.apache.seatunnel.spark.batch.SparkBatchExecution.start(SparkBatchExecution.java:50)
    at org.apache.seatunnel.command.spark.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:58)
    ... 15 more
2022-05-12 15:29:11 INFO SparkContext:54 - Invoking stop() from shutdown hook
2022-05-12 15:29:11 INFO AbstractConnector:318 - Stopped Spark@1500e009{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2022-05-12 15:29:11 INFO SparkUI:54 - Stopped Spark web UI at http://keep1:4040
2022-05-12 15:29:11 INFO MapOutputTrackerMasterEndpoint:54 - MapOutputTrackerMasterEndpoint stopped!
2022-05-12 15:29:11 INFO MemoryStore:54 - MemoryStore cleared
2022-05-12 15:29:11 INFO BlockManager:54 - BlockManager stopped
2022-05-12 15:29:11 INFO BlockManagerMaster:54 - BlockManagerMaster stopped
2022-05-12 15:29:11 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:54 - OutputCommitCoordinator stopped!
2022-05-12 15:29:11 INFO SparkContext:54 - Successfully stopped SparkContext
2022-05-12 15:29:11 INFO ShutdownHookManager:54 - Shutdown hook called
2022-05-12 15:29:11 INFO ShutdownHookManager:54 - Deleting directory /tmp/spark-b2898721-d3a1-410e-bb0a-b59cc05a383d
2022-05-12 15:29:11 INFO ShutdownHookManager:54 - Deleting directory /tmp/spark-533a29f1-c9b4-4e5c-9542-478724607b6c
```
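A note on the stack trace above: the failure happens in the `sql` transform (`org.apache.seatunnel.spark.transform.Sql.process`), not in the JDBC source, so the PostgreSQL driver is not the problem here. Spark 2.4's SQL parser accepts at most a two-part `database.table` name, so the second dot in the fully qualified Postgres name is rejected exactly where the caret points (a simplified sketch of the grammar rule, not the exact SqlBase.g4 production):

```
-- Spark 2.4 tableIdentifier (simplified): (db=identifier '.')? table=identifier
select id, name from postgres.public.ds_data_source_config
--                           ^29    ^36  second '.' -> "mismatched input '.'"
```

In other words, the schema-qualified name belongs in the Postgres-side JDBC config, while the Spark SQL transform should reference a name Spark itself knows (such as a registered temp view).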