Posted to commits@seatunnel.apache.org by GitBox <gi...@apache.org> on 2022/05/12 03:16:30 UTC
[GitHub] [incubator-seatunnel] vegastar002 opened a new issue, #1858: v2.1.1 run say no driver
vegastar002 opened a new issue, #1858:
URL: https://github.com/apache/incubator-seatunnel/issues/1858
### version
apache-seatunnel-incubating-2.1.1
spark-2.4.0-bin-hadoop2.6
### my.conf
path:
/opt/apache-seatunnel-incubating-2.1.1/config/my.conf
```
env {
  spark.app.name = "SeaTunnel"
  spark.executor.instances = 2
  spark.executor.cores = 1
  spark.executor.memory = "1g"
}

source {
  jdbc {
    driver = "oracle.jdbc.driver.OracleDriver"
    url = "jdbc:oracle:thin://192.168.9.26:1521/dwe"
    table = "TEST_DW"
    result_table_name = "TEST_DW_log"
    user = "123"
    password = "123"
  }
}

transform {
  sql {
    sql = "SELECT BH , DM , DWLB , JC ,QC FROM DSECS.TEST_DW "
  }
}

sink {
  clickhouse {
    host = "192.168.9.103:8123"
    clickhouse.socket_timeout = 50000
    database = "default"
    table = "dw"
    fields = ["BH", "DM", "DWLB", "JC", "QC"]
    username = "root"
    password = "1234567"
    bulk_size = 20000
  }
}
```
### step
Run the command:
```shell
./bin/start-seatunnel-spark.sh --master local[4] --deploy-mode client --config ./config/my.conf
```
The error report:
```
2022-05-12 10:47:03 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2022-05-12 10:47:04 INFO ConfigBuilder:59 - Loading config file: ./config/my.conf
2022-05-12 10:47:04 INFO ConfigBuilder:70 - parsed config file: {
"env" : {
"spark.app.name" : "SeaTunnel",
"spark.executor.instances" : 2,
"spark.executor.cores" : 1,
"spark.executor.memory" : "1g"
},
"source" : [
{
"password" : "dsecs",
"driver" : "oracle.jdbc.driver.OracleDriver",
"result_table_name" : "TEST_DW_log",
"plugin_name" : "jdbc",
"user" : "dsecs",
"url" : "jdbc:oracle:thin://192.168.9.236:1521/DSECS",
"table" : "TEST_DW"
}
],
"transform" : [
{
"plugin_name" : "sql",
"sql" : "SELECT BH , DM , DWLB , JC ,QC FROM DSECS.TEST_DW "
}
],
"sink" : [
{
"database" : "default",
"password" : "1234567",
"clickhouse.socket_timeout" : 50000,
"host" : "192.168.9.103:8123",
"bulk_size" : 20000,
"fields" : [
"BH",
"DM",
"DWLB",
"JC",
"QC"
],
"plugin_name" : "clickhouse",
"table" : "dw",
"username" : "root"
}
]
}
2022-05-12 10:47:04 INFO SparkContext:54 - Running Spark version 2.4.0
2022-05-12 10:47:04 INFO SparkContext:54 - Submitted application: SeaTunnel
2022-05-12 10:47:04 INFO SecurityManager:54 - Changing view acls to: root
2022-05-12 10:47:04 INFO SecurityManager:54 - Changing modify acls to: root
2022-05-12 10:47:04 INFO SecurityManager:54 - Changing view acls groups to:
2022-05-12 10:47:04 INFO SecurityManager:54 - Changing modify acls groups to:
2022-05-12 10:47:04 INFO SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
2022-05-12 10:47:05 INFO Utils:54 - Successfully started service 'sparkDriver' on port 46513.
2022-05-12 10:47:05 INFO SparkEnv:54 - Registering MapOutputTracker
2022-05-12 10:47:05 INFO SparkEnv:54 - Registering BlockManagerMaster
2022-05-12 10:47:05 INFO BlockManagerMasterEndpoint:54 - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2022-05-12 10:47:05 INFO BlockManagerMasterEndpoint:54 - BlockManagerMasterEndpoint up
2022-05-12 10:47:05 INFO DiskBlockManager:54 - Created local directory at /tmp/blockmgr-02b28f86-bfbc-468d-8daa-bee1e2dd2528
2022-05-12 10:47:05 INFO MemoryStore:54 - MemoryStore started with capacity 366.3 MB
2022-05-12 10:47:05 INFO SparkEnv:54 - Registering OutputCommitCoordinator
2022-05-12 10:47:05 INFO log:192 - Logging initialized @4054ms
2022-05-12 10:47:05 INFO Server:351 - jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
2022-05-12 10:47:05 INFO Server:419 - Started @4204ms
2022-05-12 10:47:05 INFO AbstractConnector:278 - Started ServerConnector@194152cf{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2022-05-12 10:47:05 INFO Utils:54 - Successfully started service 'SparkUI' on port 4040.
2022-05-12 10:47:05 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4604b900{/jobs,null,AVAILABLE,@Spark}
2022-05-12 10:47:05 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@c827db{/jobs/json,null,AVAILABLE,@Spark}
2022-05-12 10:47:05 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@377c68c6{/jobs/job,null,AVAILABLE,@Spark}
2022-05-12 10:47:05 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@238ad8c{/jobs/job/json,null,AVAILABLE,@Spark}
2022-05-12 10:47:05 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@430fa4ef{/stages,null,AVAILABLE,@Spark}
2022-05-12 10:47:05 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1761de10{/stages/json,null,AVAILABLE,@Spark}
2022-05-12 10:47:05 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@22df874e{/stages/stage,null,AVAILABLE,@Spark}
2022-05-12 10:47:05 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@42d236fb{/stages/stage/json,null,AVAILABLE,@Spark}
2022-05-12 10:47:05 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1ce93c18{/stages/pool,null,AVAILABLE,@Spark}
2022-05-12 10:47:05 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@19f21b6b{/stages/pool/json,null,AVAILABLE,@Spark}
2022-05-12 10:47:05 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1532c619{/storage,null,AVAILABLE,@Spark}
2022-05-12 10:47:05 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@46044faa{/storage/json,null,AVAILABLE,@Spark}
2022-05-12 10:47:05 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1358b28e{/storage/rdd,null,AVAILABLE,@Spark}
2022-05-12 10:47:05 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1a78dacd{/storage/rdd/json,null,AVAILABLE,@Spark}
2022-05-12 10:47:05 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@19f9d595{/environment,null,AVAILABLE,@Spark}
2022-05-12 10:47:05 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@7de4a01f{/environment/json,null,AVAILABLE,@Spark}
2022-05-12 10:47:05 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@2bfeb1ef{/executors,null,AVAILABLE,@Spark}
2022-05-12 10:47:05 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@778ca8ef{/executors/json,null,AVAILABLE,@Spark}
2022-05-12 10:47:05 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@208e9ef6{/executors/threadDump,null,AVAILABLE,@Spark}
2022-05-12 10:47:05 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@78b236a0{/executors/threadDump/json,null,AVAILABLE,@Spark}
2022-05-12 10:47:05 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@261d8190{/static,null,AVAILABLE,@Spark}
2022-05-12 10:47:05 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@56f2bbea{/,null,AVAILABLE,@Spark}
2022-05-12 10:47:05 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@78f9ed3e{/api,null,AVAILABLE,@Spark}
2022-05-12 10:47:05 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@7f4037ed{/jobs/job/kill,null,AVAILABLE,@Spark}
2022-05-12 10:47:05 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@24e8de5c{/stages/stage/kill,null,AVAILABLE,@Spark}
2022-05-12 10:47:05 INFO SparkUI:54 - Bound SparkUI to 0.0.0.0, and started at http://keep1:4040
2022-05-12 10:47:05 INFO SparkContext:54 - Added JAR file:/opt/apache-seatunnel-incubating-2.1.1/lib/seatunnel-core-spark.jar at spark://keep1:46513/jars/seatunnel-core-spark.jar with timestamp 1652323625721
2022-05-12 10:47:05 INFO Executor:54 - Starting executor ID driver on host localhost
2022-05-12 10:47:06 INFO Utils:54 - Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 44944.
2022-05-12 10:47:06 INFO NettyBlockTransferService:54 - Server created on keep1:44944
2022-05-12 10:47:06 INFO BlockManager:54 - Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
2022-05-12 10:47:06 INFO BlockManagerMaster:54 - Registering BlockManager BlockManagerId(driver, keep1, 44944, None)
2022-05-12 10:47:06 INFO BlockManagerMasterEndpoint:54 - Registering block manager keep1:44944 with 366.3 MB RAM, BlockManagerId(driver, keep1, 44944, None)
2022-05-12 10:47:06 INFO BlockManagerMaster:54 - Registered BlockManager BlockManagerId(driver, keep1, 44944, None)
2022-05-12 10:47:06 INFO BlockManager:54 - Initialized BlockManager: BlockManagerId(driver, keep1, 44944, None)
2022-05-12 10:47:06 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@16c8b7bd{/metrics/json,null,AVAILABLE,@Spark}
2022-05-12 10:47:07 INFO ClickHouseDriver:42 - Driver registered
2022-05-12 10:47:07 INFO AsciiArtUtils:69 - ##
2022-05-12 10:47:07 INFO AsciiArtUtils:69 - ******** ############## ##
2022-05-12 10:47:07 INFO AsciiArtUtils:69 - *######## ############## ##
2022-05-12 10:47:07 INFO AsciiArtUtils:69 - *#*** **** ## ##
2022-05-12 10:47:07 INFO AsciiArtUtils:69 - *#* ## ##
2022-05-12 10:47:07 INFO AsciiArtUtils:69 - *#* ## ##
2022-05-12 10:47:07 INFO AsciiArtUtils:69 - *#* ****** ******* ## ## ## ## ****** ## ****** ****** ##
2022-05-12 10:47:07 INFO AsciiArtUtils:69 - *##** **#####* ####*##* ## ## ## ##**##**#* ##**##**#* **#####* ##
2022-05-12 10:47:07 INFO AsciiArtUtils:69 - *##*** *#** **#* *** *#* ## ## ## ##**** *#* ##**** *#* *#** **#* ##
2022-05-12 10:47:07 INFO AsciiArtUtils:69 - **##*** *#* *#* *#* ## ## ## ##** *#* ##** *#* *#* *#* ##
2022-05-12 10:47:07 INFO AsciiArtUtils:69 - ***##** *#* *#* ## ## ## ## ##* ## ##* ## *#* *#* ##
2022-05-12 10:47:07 INFO AsciiArtUtils:69 - **#** *########* *****## ## ## ## ## ## ## ## *########* ##
2022-05-12 10:47:07 INFO AsciiArtUtils:69 - **#* *########* **##***## ## ## ## ## ## ## ## *########* ##
2022-05-12 10:47:07 INFO AsciiArtUtils:69 - *#* *#* *#** ## ## ## ## ## ## ## ## *#* ##
2022-05-12 10:47:07 INFO AsciiArtUtils:69 - #* *#* *#* ## ## ## *## ## ## ## ## *#* ##
2022-05-12 10:47:07 INFO AsciiArtUtils:69 - *#* *#** *#* ##* ## *#* **## ## ## ## ## *#** ##
2022-05-12 10:47:07 INFO AsciiArtUtils:69 - ***** ***#* **#*** *** *#*****##* ## *#* ****## ## ## ## ## **#*** *** ##
2022-05-12 10:47:07 INFO AsciiArtUtils:69 - ########** **####### *#####**##* ## *#####**## ## ## ## ## **####### ##
2022-05-12 10:47:07 INFO AsciiArtUtils:69 - ********* ***#**** ********** ## ****** ## ## ## ## ## ***#**** ##
2022-05-12 10:47:08 INFO ExecutionFactory:80 - current execution is [org.apache.seatunnel.spark.batch.SparkBatchExecution]
2022-05-12 10:47:08 INFO SharedState:54 - Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/opt/apache-seatunnel-incubating-2.1.1/spark-warehouse').
2022-05-12 10:47:08 INFO SharedState:54 - Warehouse path is 'file:/opt/apache-seatunnel-incubating-2.1.1/spark-warehouse'.
2022-05-12 10:47:08 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@622fdb81{/SQL,null,AVAILABLE,@Spark}
2022-05-12 10:47:08 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1f3165e7{/SQL/json,null,AVAILABLE,@Spark}
2022-05-12 10:47:08 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@15b82644{/SQL/execution,null,AVAILABLE,@Spark}
2022-05-12 10:47:08 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@20576557{/SQL/execution/json,null,AVAILABLE,@Spark}
2022-05-12 10:47:08 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3b1ed14b{/static/sql,null,AVAILABLE,@Spark}
2022-05-12 10:47:09 INFO StateStoreCoordinatorRef:54 - Registered StateStoreCoordinator endpoint
2022-05-12 10:47:09 INFO Version:133 - Elasticsearch Hadoop v6.8.3 [8a5f44bf7d]
2022-05-12 10:47:09 ERROR Seatunnel:69 -
===============================================================================
2022-05-12 10:47:09 ERROR Seatunnel:72 - Fatal Error,
2022-05-12 10:47:09 ERROR Seatunnel:74 - Please submit bug report in https://github.com/apache/incubator-seatunnel/issues
2022-05-12 10:47:09 ERROR Seatunnel:76 - Reason:Execute Spark task error
2022-05-12 10:47:09 ERROR Seatunnel:77 - Exception StackTrace:java.lang.RuntimeException: Execute Spark task error
at org.apache.seatunnel.command.spark.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:61)
at org.apache.seatunnel.command.spark.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:36)
at org.apache.seatunnel.Seatunnel.run(Seatunnel.java:48)
at org.apache.seatunnel.SeatunnelSpark.main(SeatunnelSpark.java:27)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: oracle.jdbc.driver.OracleDriver
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.apache.spark.sql.execution.datasources.jdbc.DriverRegistry$.register(DriverRegistry.scala:45)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$5.apply(JDBCOptions.scala:99)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$5.apply(JDBCOptions.scala:99)
at scala.Option.foreach(Option.scala:257)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:99)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:35)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:32)
at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:318)
at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:223)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:211)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:167)
at org.apache.seatunnel.spark.jdbc.source.Jdbc.getData(Jdbc.scala:31)
at org.apache.seatunnel.spark.jdbc.source.Jdbc.getData(Jdbc.scala:28)
at org.apache.seatunnel.spark.SparkEnvironment.registerInputTempView(SparkEnvironment.java:126)
at org.apache.seatunnel.spark.batch.SparkBatchExecution.lambda$start$0(SparkBatchExecution.java:45)
at java.util.ArrayList.forEach(ArrayList.java:1257)
at org.apache.seatunnel.spark.batch.SparkBatchExecution.start(SparkBatchExecution.java:45)
at org.apache.seatunnel.command.spark.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:58)
... 15 more
2022-05-12 10:47:09 ERROR Seatunnel:78 -
===============================================================================
Exception in thread "main" java.lang.RuntimeException: Execute Spark task error
at org.apache.seatunnel.command.spark.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:61)
at org.apache.seatunnel.command.spark.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:36)
at org.apache.seatunnel.Seatunnel.run(Seatunnel.java:48)
at org.apache.seatunnel.SeatunnelSpark.main(SeatunnelSpark.java:27)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: oracle.jdbc.driver.OracleDriver
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.apache.spark.sql.execution.datasources.jdbc.DriverRegistry$.register(DriverRegistry.scala:45)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$5.apply(JDBCOptions.scala:99)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$5.apply(JDBCOptions.scala:99)
at scala.Option.foreach(Option.scala:257)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:99)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:35)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:32)
at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:318)
at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:223)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:211)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:167)
at org.apache.seatunnel.spark.jdbc.source.Jdbc.getData(Jdbc.scala:31)
at org.apache.seatunnel.spark.jdbc.source.Jdbc.getData(Jdbc.scala:28)
at org.apache.seatunnel.spark.SparkEnvironment.registerInputTempView(SparkEnvironment.java:126)
at org.apache.seatunnel.spark.batch.SparkBatchExecution.lambda$start$0(SparkBatchExecution.java:45)
at java.util.ArrayList.forEach(ArrayList.java:1257)
at org.apache.seatunnel.spark.batch.SparkBatchExecution.start(SparkBatchExecution.java:45)
at org.apache.seatunnel.command.spark.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:58)
... 15 more
2022-05-12 10:47:09 INFO SparkContext:54 - Invoking stop() from shutdown hook
2022-05-12 10:47:09 INFO AbstractConnector:318 - Stopped Spark@194152cf{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2022-05-12 10:47:09 INFO SparkUI:54 - Stopped Spark web UI at http://keep1:4040
2022-05-12 10:47:09 INFO MapOutputTrackerMasterEndpoint:54 - MapOutputTrackerMasterEndpoint stopped!
2022-05-12 10:47:09 INFO MemoryStore:54 - MemoryStore cleared
2022-05-12 10:47:09 INFO BlockManager:54 - BlockManager stopped
2022-05-12 10:47:09 INFO BlockManagerMaster:54 - BlockManagerMaster stopped
2022-05-12 10:47:09 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:54 - OutputCommitCoordinator stopped!
2022-05-12 10:47:09 INFO SparkContext:54 - Successfully stopped SparkContext
2022-05-12 10:47:09 INFO ShutdownHookManager:54 - Shutdown hook called
2022-05-12 10:47:09 INFO ShutdownHookManager:54 - Deleting directory /tmp/spark-3a41298b-5c74-45ad-84ea-01de872c8f51
2022-05-12 10:47:09 INFO ShutdownHookManager:54 - Deleting directory /tmp/spark-f7ee6d32-ec39-4433-979a-542dcf335132
```
Then I copied ojdbc6-11.2.0.3.jar and ojdbc14-10.2.0.1.jar to
/opt/apache-seatunnel-incubating-2.1.1/lib
but it still reports this error.
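For reference, Spark can also be pointed at a driver jar through its own configuration; below is a minimal sketch of the `env` block using Spark's `spark.jars` property. Whether SeaTunnel 2.1.1 forwards this key to `spark-submit` is an assumption, and the jar path is a placeholder for your actual driver jar:

```
env {
  spark.app.name = "SeaTunnel"
  # assumption: spark.* keys here are passed through to spark-submit;
  # the jar path below is a placeholder for your actual driver jar
  spark.jars = "/opt/apache-seatunnel-incubating-2.1.1/lib/ojdbc6-11.2.0.3.jar"
}
```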
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: commits-unsubscribe@seatunnel.apache.org.
For queries about this service, please contact Infrastructure at:
users@infra.apache.org
[GitHub] [incubator-seatunnel] chenhu commented on issue #1858: v2.1.1 run say no driver
Posted by GitBox <gi...@apache.org>.
chenhu commented on issue #1858:
URL: https://github.com/apache/incubator-seatunnel/issues/1858#issuecomment-1128326997
> then I copy ojdbc6-11.2.0.3.jar and ojdbc14-10.2.0.1.jar to /opt/apache-seatunnel-incubating-2.1.1/lib
>
> but still report this error.
Maybe you should create the corresponding plugin dir under the path ${SEATUNNEL_HOME}/plugins. E.g. for the Oracle driver jars, create the dir ${SEATUNNEL_HOME}/plugins/oracle/lib/ and put your jars there.
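The suggested layout can be sketched as a few shell commands. Assumptions: `SEATUNNEL_HOME` points at your install, and the jar path/filename below are placeholders for the driver jar you actually have:

```shell
# Create the plugin lib dir suggested above and drop the driver jar into it.
SEATUNNEL_HOME="${SEATUNNEL_HOME:-/opt/apache-seatunnel-incubating-2.1.1}"
PLUGIN_LIB="${SEATUNNEL_HOME}/plugins/oracle/lib"
mkdir -p "${PLUGIN_LIB}"
# copy your Oracle JDBC driver jar(s) here (source path is a placeholder)
cp /path/to/ojdbc6-11.2.0.3.jar "${PLUGIN_LIB}/" 2>/dev/null || true
ls "${PLUGIN_LIB}"
```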
[GitHub] [incubator-seatunnel] ruanwenjun commented on issue #1858: v2.1.1 run say no driver
Posted by GitBox <gi...@apache.org>.
ruanwenjun commented on issue #1858:
URL: https://github.com/apache/incubator-seatunnel/issues/1858#issuecomment-1125585919
@vegastar002 From the log, this error is caused by the source table not existing; you need to check whether `ds_data_source_config` exists in your database.
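A quick way to do that check (the `psql` client and the `to_regclass` function are assumptions; any SQL client against the source database works):

```shell
# to_regclass() returns NULL when the relation does not exist under that name.
SQL="SELECT to_regclass('public.ds_data_source_config');"
echo "$SQL"
# run it against the source database from the config (uncomment; needs psql):
# psql "host=192.168.9.97 port=25432 user=postgres dbname=postgres" -c "$SQL"
```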
[GitHub] [incubator-seatunnel] vegastar002 commented on issue #1858: v2.1.1 run say no driver
Posted by GitBox <gi...@apache.org>.
vegastar002 commented on issue #1858:
URL: https://github.com/apache/incubator-seatunnel/issues/1858#issuecomment-1124634960
And another error:
### my.conf
path:
/opt/apache-seatunnel-incubating-2.1.1/config/my.conf
```
env {
  # You can set spark configuration here
  # see available properties defined by spark: https://spark.apache.org/docs/latest/configuration.html#available-properties
  spark.app.name = "SeaTunnel"
  spark.executor.instances = 2
  spark.executor.cores = 1
  spark.executor.memory = "1g"
}

source {
  jdbc {
    driver = "org.postgresql.Driver"
    url = "jdbc:postgresql://192.168.9.97:25432/postgres"
    table = "ds_data_source_config"
    result_table_name = "ds_data_source_config_log"
    user = "postgres"
    password = "12qwaszx"
  }
}

transform {
  sql {
    sql = "select id, name from postgres.public.ds_data_source_config"
  }
}

sink {
  clickhouse {
    host = "192.168.9.103:8123"
    clickhouse.socket_timeout = 50000
    database = "default"
    table = "dw2"
    fields = ["id", "name"]
    username = "root"
    password = "1234567"
    bulk_size = 20000
  }
}
```
Running it reports:
```
2022-05-12 15:29:03 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2022-05-12 15:29:03 INFO ConfigBuilder:59 - Loading config file: ./config/my.conf
2022-05-12 15:29:03 INFO ConfigBuilder:70 - parsed config file: {
"env" : {
"spark.app.name" : "SeaTunnel",
"spark.executor.instances" : 2,
"spark.executor.cores" : 1,
"spark.executor.memory" : "1g"
},
"source" : [
{
"password" : "12qwaszx",
"driver" : "org.postgresql.Driver",
"result_table_name" : "ds_data_source_config_log",
"plugin_name" : "jdbc",
"user" : "postgres",
"url" : "jdbc:postgresql://192.168.9.97:25432/postgres",
"table" : "ds_data_source_config"
}
],
"transform" : [
{
"plugin_name" : "sql",
"sql" : "select id, name from postgres.public.ds_data_source_config"
}
],
"sink" : [
{
"database" : "default",
"password" : "1234567",
"clickhouse.socket_timeout" : 50000,
"host" : "192.168.9.103:8123",
"bulk_size" : 20000,
"fields" : [
"id",
"name"
],
"plugin_name" : "clickhouse",
"table" : "dw2",
"username" : "root"
}
]
}
2022-05-12 15:29:03 INFO SparkContext:54 - Running Spark version 2.4.0
2022-05-12 15:29:03 INFO SparkContext:54 - Submitted application: SeaTunnel
2022-05-12 15:29:03 INFO SecurityManager:54 - Changing view acls to: root
2022-05-12 15:29:03 INFO SecurityManager:54 - Changing modify acls to: root
2022-05-12 15:29:03 INFO SecurityManager:54 - Changing view acls groups to:
2022-05-12 15:29:03 INFO SecurityManager:54 - Changing modify acls groups to:
2022-05-12 15:29:03 INFO SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
2022-05-12 15:29:04 INFO Utils:54 - Successfully started service 'sparkDriver' on port 43662.
2022-05-12 15:29:04 INFO SparkEnv:54 - Registering MapOutputTracker
2022-05-12 15:29:04 INFO SparkEnv:54 - Registering BlockManagerMaster
2022-05-12 15:29:04 INFO BlockManagerMasterEndpoint:54 - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2022-05-12 15:29:04 INFO BlockManagerMasterEndpoint:54 - BlockManagerMasterEndpoint up
2022-05-12 15:29:04 INFO DiskBlockManager:54 - Created local directory at /tmp/blockmgr-b3534c8a-00b6-41ff-9e0b-08641cee5b5e
2022-05-12 15:29:04 INFO MemoryStore:54 - MemoryStore started with capacity 366.3 MB
2022-05-12 15:29:04 INFO SparkEnv:54 - Registering OutputCommitCoordinator
2022-05-12 15:29:04 INFO log:192 - Logging initialized @2632ms
2022-05-12 15:29:04 INFO Server:351 - jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
2022-05-12 15:29:04 INFO Server:419 - Started @2750ms
2022-05-12 15:29:04 INFO AbstractConnector:278 - Started ServerConnector@1500e009{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2022-05-12 15:29:04 INFO Utils:54 - Successfully started service 'SparkUI' on port 4040.
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6f815e7f{/jobs,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1ad926d3{/jobs/json,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3a43d133{/jobs/job,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5f2afe62{/jobs/job/json,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@c96a4ea{/stages,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@28782602{/stages/json,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@60c16548{/stages/stage,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@38b972d7{/stages/stage/json,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5339bbad{/stages/pool,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3935e9a8{/stages/pool/json,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@288a4658{/storage,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5b56b654{/storage/json,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@452c8a40{/storage/rdd,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@534243e4{/storage/rdd/json,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@29006752{/environment,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@470a9030{/environment/json,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@66d57c1b{/executors,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@27494e46{/executors/json,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@d59970a{/executors/threadDump,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1e411d81{/executors/threadDump/json,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@53b98ff6{/static,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@31611954{/,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3e598df9{/api,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@42cc13a0{/jobs/job/kill,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@32fdec40{/stages/stage/kill,null,AVAILABLE,@Spark}
2022-05-12 15:29:04 INFO SparkUI:54 - Bound SparkUI to 0.0.0.0, and started at http://keep1:4040
2022-05-12 15:29:04 INFO SparkContext:54 - Added JAR file:/opt/apache-seatunnel-incubating-2.1.1/lib/seatunnel-core-spark.jar at spark://keep1:43662/jars/seatunnel-core-spark.jar with timestamp 1652340544644
2022-05-12 15:29:04 INFO Executor:54 - Starting executor ID driver on host localhost
2022-05-12 15:29:04 INFO Utils:54 - Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 46722.
2022-05-12 15:29:04 INFO NettyBlockTransferService:54 - Server created on keep1:46722
2022-05-12 15:29:04 INFO BlockManager:54 - Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
2022-05-12 15:29:04 INFO BlockManagerMaster:54 - Registering BlockManager BlockManagerId(driver, keep1, 46722, None)
2022-05-12 15:29:04 INFO BlockManagerMasterEndpoint:54 - Registering block manager keep1:46722 with 366.3 MB RAM, BlockManagerId(driver, keep1, 46722, None)
2022-05-12 15:29:04 INFO BlockManagerMaster:54 - Registered BlockManager BlockManagerId(driver, keep1, 46722, None)
2022-05-12 15:29:04 INFO BlockManager:54 - Initialized BlockManager: BlockManagerId(driver, keep1, 46722, None)
2022-05-12 15:29:05 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6fc3e1a4{/metrics/json,null,AVAILABLE,@Spark}
2022-05-12 15:29:05 INFO ClickHouseDriver:49 - Driver registered
2022-05-12 15:29:06 INFO AsciiArtUtils:69 - (SeaTunnel ASCII-art startup banner omitted)
2022-05-12 15:29:06 INFO ExecutionFactory:80 - current execution is [org.apache.seatunnel.spark.batch.SparkBatchExecution]
2022-05-12 15:29:06 INFO SharedState:54 - Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/opt/apache-seatunnel-incubating-2.1.1/spark-warehouse/').
2022-05-12 15:29:06 INFO SharedState:54 - Warehouse path is 'file:/opt/apache-seatunnel-incubating-2.1.1/spark-warehouse/'.
2022-05-12 15:29:06 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4af70944{/SQL,null,AVAILABLE,@Spark}
2022-05-12 15:29:06 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@35267fd4{/SQL/json,null,AVAILABLE,@Spark}
2022-05-12 15:29:06 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@42373389{/SQL/execution,null,AVAILABLE,@Spark}
2022-05-12 15:29:06 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@9b21bd3{/SQL/execution/json,null,AVAILABLE,@Spark}
2022-05-12 15:29:06 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@7903d448{/static/sql,null,AVAILABLE,@Spark}
2022-05-12 15:29:06 INFO StateStoreCoordinatorRef:54 - Registered StateStoreCoordinator endpoint
2022-05-12 15:29:07 INFO Version:133 - Elasticsearch Hadoop v6.8.3 [8a5f44bf7d]
2022-05-12 15:29:11 ERROR Seatunnel:69 -
===============================================================================
2022-05-12 15:29:11 ERROR Seatunnel:72 - Fatal Error,
2022-05-12 15:29:11 ERROR Seatunnel:74 - Please submit bug report in https://github.com/apache/incubator-seatunnel/issues
2022-05-12 15:29:11 ERROR Seatunnel:76 - Reason:Execute Spark task error
2022-05-12 15:29:11 ERROR Seatunnel:77 - Exception StackTrace:java.lang.RuntimeException: Execute Spark task error
at org.apache.seatunnel.command.spark.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:61)
at org.apache.seatunnel.command.spark.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:36)
at org.apache.seatunnel.Seatunnel.run(Seatunnel.java:48)
at org.apache.seatunnel.SeatunnelSpark.main(SeatunnelSpark.java:27)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: org.apache.spark.sql.catalyst.parser.ParseException:
mismatched input '.' expecting <EOF>(line 1, pos 36)
== SQL ==
select id, name from postgres.public.ds_data_source_config
------------------------------------^^^
at org.apache.spark.sql.catalyst.parser.ParseException.withCommand(ParseDriver.scala:241)
at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parse(ParseDriver.scala:117)
at org.apache.spark.sql.execution.SparkSqlParser.parse(SparkSqlParser.scala:48)
at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parsePlan(ParseDriver.scala:69)
at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
at org.apache.seatunnel.spark.transform.Sql.process(Sql.scala:27)
at org.apache.seatunnel.spark.SparkEnvironment.transformProcess(SparkEnvironment.java:144)
at org.apache.seatunnel.spark.batch.SparkBatchExecution.start(SparkBatchExecution.java:50)
at org.apache.seatunnel.command.spark.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:58)
... 15 more
2022-05-12 15:29:11 ERROR Seatunnel:78 -
===============================================================================
Exception in thread "main" java.lang.RuntimeException: Execute Spark task error
at org.apache.seatunnel.command.spark.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:61)
at org.apache.seatunnel.command.spark.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:36)
at org.apache.seatunnel.Seatunnel.run(Seatunnel.java:48)
at org.apache.seatunnel.SeatunnelSpark.main(SeatunnelSpark.java:27)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: org.apache.spark.sql.catalyst.parser.ParseException:
mismatched input '.' expecting <EOF>(line 1, pos 36)
== SQL ==
select id, name from postgres.public.ds_data_source_config
------------------------------------^^^
at org.apache.spark.sql.catalyst.parser.ParseException.withCommand(ParseDriver.scala:241)
at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parse(ParseDriver.scala:117)
at org.apache.spark.sql.execution.SparkSqlParser.parse(SparkSqlParser.scala:48)
at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parsePlan(ParseDriver.scala:69)
at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
at org.apache.seatunnel.spark.transform.Sql.process(Sql.scala:27)
at org.apache.seatunnel.spark.SparkEnvironment.transformProcess(SparkEnvironment.java:144)
at org.apache.seatunnel.spark.batch.SparkBatchExecution.start(SparkBatchExecution.java:50)
at org.apache.seatunnel.command.spark.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:58)
... 15 more
2022-05-12 15:29:11 INFO SparkContext:54 - Invoking stop() from shutdown hook
2022-05-12 15:29:11 INFO AbstractConnector:318 - Stopped Spark@1500e009{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2022-05-12 15:29:11 INFO SparkUI:54 - Stopped Spark web UI at http://keep1:4040
2022-05-12 15:29:11 INFO MapOutputTrackerMasterEndpoint:54 - MapOutputTrackerMasterEndpoint stopped!
2022-05-12 15:29:11 INFO MemoryStore:54 - MemoryStore cleared
2022-05-12 15:29:11 INFO BlockManager:54 - BlockManager stopped
2022-05-12 15:29:11 INFO BlockManagerMaster:54 - BlockManagerMaster stopped
2022-05-12 15:29:11 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:54 - OutputCommitCoordinator stopped!
2022-05-12 15:29:11 INFO SparkContext:54 - Successfully stopped SparkContext
2022-05-12 15:29:11 INFO ShutdownHookManager:54 - Shutdown hook called
2022-05-12 15:29:11 INFO ShutdownHookManager:54 - Deleting directory /tmp/spark-b2898721-d3a1-410e-bb0a-b59cc05a383d
2022-05-12 15:29:11 INFO ShutdownHookManager:54 - Deleting directory /tmp/spark-533a29f1-c9b4-4e5c-9542-478724607b6c
```
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: commits-unsubscribe@seatunnel.apache.org
For queries about this service, please contact Infrastructure at:
users@infra.apache.org
[GitHub] [incubator-seatunnel] ruanwenjun commented on issue #1858: v2.1.1 run say no driver
Posted by GitBox <gi...@apache.org>.
ruanwenjun commented on issue #1858:
URL: https://github.com/apache/incubator-seatunnel/issues/1858#issuecomment-1125069412
@vegastar002 You need to change your SQL to `select id, name from ds_data_source_config_log`.
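To make the suggestion above concrete: the `sql` transform runs against the Spark temporary view that the source registers under `result_table_name`, not against the remote database, so the query must use that unqualified view name (a database-qualified name like `postgres.public.ds_data_source_config` is what triggers the `mismatched input '.'` parse error). A sketch, using the table/view names from the config earlier in this thread:

```
source {
  jdbc {
    # ... driver, url, user, password as before ...
    table = "TEST_DW"
    result_table_name = "TEST_DW_log"   # registered as a Spark temp view
  }
}
transform {
  sql {
    # Query the registered view; do not use a db-qualified name like DSECS.TEST_DW
    sql = "SELECT BH, DM, DWLB, JC, QC FROM TEST_DW_log"
  }
}
```

If the transform still reports `NoSuchTableException`, double-check that the name in the SQL exactly matches `result_table_name`.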
[GitHub] [incubator-seatunnel] vegastar002 commented on issue #1858: v2.1.1 run say no driver
Posted by GitBox <gi...@apache.org>.
vegastar002 commented on issue #1858:
URL: https://github.com/apache/incubator-seatunnel/issues/1858#issuecomment-1125566423
> select id, name from ds_data_source_config_log
@ruanwenjun No, it still fails with an error:
```
at org.apache.spark.sql.catalyst.analysis.Analyzer.execute(Analyzer.scala:121)
at org.apache.spark.sql.catalyst.analysis.Analyzer$$anonfun$executeAndCheck$1.apply(Analyzer.scala:106)
at org.apache.spark.sql.catalyst.analysis.Analyzer$$anonfun$executeAndCheck$1.apply(Analyzer.scala:105)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.markInAnalyzer(AnalysisHelper.scala:201)
at org.apache.spark.sql.catalyst.analysis.Analyzer.executeAndCheck(Analyzer.scala:105)
at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:57)
at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:55)
at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:47)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:79)
at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
at org.apache.seatunnel.spark.transform.Sql.process(Sql.scala:27)
at org.apache.seatunnel.spark.SparkEnvironment.transformProcess(SparkEnvironment.java:144)
at org.apache.seatunnel.spark.batch.SparkBatchExecution.start(SparkBatchExecution.java:50)
at org.apache.seatunnel.command.spark.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:58)
... 15 more
Caused by: org.apache.spark.sql.catalyst.analysis.NoSuchTableException: Table or view 'ds_data_source_config' not found in database 'default';
at org.apache.spark.sql.catalyst.catalog.ExternalCatalog$class.requireTableExists(ExternalCatalog.scala:48)
at org.apache.spark.sql.catalyst.catalog.InMemoryCatalog.requireTableExists(InMemoryCatalog.scala:45)
at org.apache.spark.sql.catalyst.catalog.InMemoryCatalog.getTable(InMemoryCatalog.scala:326)
at org.apache.spark.sql.catalyst.catalog.ExternalCatalogWithListener.getTable(ExternalCatalogWithListener.scala:138)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.lookupRelation(SessionCatalog.scala:701)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.org$apache$spark$sql$catalyst$analysis$Analyzer$ResolveRelations$$lookupTableFromCatalog(Analyzer.scala:730)
... 61 more
2022-05-13 09:20:25 INFO SparkContext:54 - Invoking stop() from shutdown hook
2022-05-13 09:20:25 INFO AbstractConnector:318 - Stopped Spark@d2387c8{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2022-05-13 09:20:25 INFO SparkUI:54 - Stopped Spark web UI at http://keep1:4040
2022-05-13 09:20:25 INFO MapOutputTrackerMasterEndpoint:54 - MapOutputTrackerMasterEndpoint stopped!
2022-05-13 09:20:25 INFO MemoryStore:54 - MemoryStore cleared
2022-05-13 09:20:25 INFO BlockManager:54 - BlockManager stopped
2022-05-13 09:20:25 INFO BlockManagerMaster:54 - BlockManagerMaster stopped
2022-05-13 09:20:25 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:54 - OutputCommitCoordinator stopped!
2022-05-13 09:20:25 INFO SparkContext:54 - Successfully stopped SparkContext
2022-05-13 09:20:25 INFO ShutdownHookManager:54 - Shutdown hook called
2022-05-13 09:20:25 INFO ShutdownHookManager:54 - Deleting directory /tmp/spark-f76f3f32-9c78-4359-9913-4615f4432083
2022-05-13 09:20:25 INFO ShutdownHookManager:54 - Deleting directory /tmp/spark-22189ef0-e2e1-45e1-a2f3-7029bb0e70fc
```
[GitHub] [incubator-seatunnel] ruanwenjun commented on issue #1858: v2.1.1 run say no driver
Posted by GitBox <gi...@apache.org>.
ruanwenjun commented on issue #1858:
URL: https://github.com/apache/incubator-seatunnel/issues/1858#issuecomment-1124554364
You need to copy the JDBC driver jars into Spark's `jars` directory.
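A sketch of that step, with example paths (substitute your real `SPARK_HOME` and the driver jar you downloaded, e.g. `ojdbc8.jar` from Oracle; the sandbox default and the `touch` stand-in below are only so the snippet runs end to end):

```shell
# The driver jar must land in $SPARK_HOME/jars so spark-submit can load
# oracle.jdbc.driver.OracleDriver on both the driver and the executors.
SPARK_HOME="${SPARK_HOME:-$(mktemp -d)}"     # e.g. /opt/spark-2.4.0-bin-hadoop2.6
mkdir -p "$SPARK_HOME/jars"

DRIVER_JAR="${DRIVER_JAR:-/tmp/ojdbc8.jar}"  # jar downloaded from Oracle
touch "$DRIVER_JAR"                          # stand-in; use the real jar in practice

cp "$DRIVER_JAR" "$SPARK_HOME/jars/"
ls "$SPARK_HOME/jars"
```

After copying, restart the job; no config change is needed for the driver to be picked up.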