Posted to commits@seatunnel.apache.org by GitBox <gi...@apache.org> on 2022/06/29 16:57:50 UTC

[GitHub] [incubator-seatunnel] hwfbest opened a new issue, #2089: [Bug] [connector-spark-jdbc-2.1.2] ClassNotFoundException: com.mysql.jdbc.Driver

hwfbest opened a new issue, #2089:
URL: https://github.com/apache/incubator-seatunnel/issues/2089

   ### Search before asking
   
   - [X] I had searched in the [issues](https://github.com/apache/incubator-seatunnel/issues?q=is%3Aissue+label%3A%22bug%22) and found no similar issues.
   
   
   ### What happened
   
   Exception when reading from the MySQL data source (ClassNotFoundException: com.mysql.jdbc.Driver).
   
   ### SeaTunnel Version
   
   2.1.2
   
   ### SeaTunnel Config
   
   ```conf
   env {  
   # seatunnel defined streaming batch duration in seconds
     spark.app.name = "SeaTunnel"
     spark.executor.instances = 1
     spark.executor.cores = 1
     spark.executor.memory = "1g"
     spark.sql.catalogImplementation = "hive"
     hive.exec.dynamic.partition = true
     hive.exec.dynamic.partition.mode = "nonstrict"
   
   }
   source {
     jdbc {
       driver = "com.mysql.jdbc.Driver"
       url = "jdbc:mysql://10.8.3.73:3306/zhyx?useUnicode=true&characterEncoding=utf8&useSSL=false"
       table = "employee"
       result_table_name = "result_employee"
       user = "root"
       password = "east@1234"
     }
     # hive {
     #   pre_sql = "select * from zhyx.han_001"
     #   result_table_name = "myTable"
     # }
   
   }
   transform {  
   #   sql {
   
   #     sql = "select id, name,substr(deg,1,6) month_id,deg as day_id from result_employee",
   
   # }
    }
   sink {  
     #   Hive {
     #     sql = "insert overwrite table zhyx.han_001 partition(month_id,day_id) select id,name,month_id,day_id from fake_age"
     # }
     console {
       serializer = "json"
     }
   }
   ```
   
   
   ### Running Command
   
   ```shell
   ./bin/start-seatunnel-spark.sh --master local --deploy-mode client --config ./config/application_002.conf
   ```
   
   
   ### Error Exception
   
   ```log
   22/06/30 00:40:59 INFO config.ExecutionFactory: current execution is [org.apache.seatunnel.spark.batch.SparkBatchExecution]
   22/06/30 00:40:59 INFO internal.SharedState: loading hive config file: file:/etc/hive/conf.cloudera.hive/hive-site.xml
   22/06/30 00:40:59 INFO internal.SharedState: spark.sql.warehouse.dir is not set, but hive.metastore.warehouse.dir is set. Setting spark.sql.warehouse.dir to the value of hive.metastore.warehouse.dir ('/user/hive/warehouse').
   22/06/30 00:40:59 INFO internal.SharedState: Warehouse path is '/user/hive/warehouse'.
   22/06/30 00:40:59 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3f362135{/SQL,null,AVAILABLE,@Spark}
   22/06/30 00:40:59 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@21eee94f{/SQL/json,null,AVAILABLE,@Spark}
   22/06/30 00:40:59 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2a097d77{/SQL/execution,null,AVAILABLE,@Spark}
   22/06/30 00:40:59 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@53c1179a{/SQL/execution/json,null,AVAILABLE,@Spark}
   22/06/30 00:40:59 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1a47a1e8{/static/sql,null,AVAILABLE,@Spark}
   22/06/30 00:41:00 INFO state.StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
   22/06/30 00:41:00 ERROR base.Seatunnel: 
   
   ===============================================================================
   
   
   22/06/30 00:41:00 ERROR base.Seatunnel: Fatal Error, 
   
   22/06/30 00:41:00 ERROR base.Seatunnel: Please submit bug report in https://github.com/apache/incubator-seatunnel/issues
   
   22/06/30 00:41:00 ERROR base.Seatunnel: Reason:Execute Spark task error 
   
   22/06/30 00:41:00 ERROR base.Seatunnel: Exception StackTrace:java.lang.RuntimeException: Execute Spark task error
           at org.apache.seatunnel.core.spark.command.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:79)
           at org.apache.seatunnel.core.base.Seatunnel.run(Seatunnel.java:39)
           at org.apache.seatunnel.core.spark.SeatunnelSpark.main(SeatunnelSpark.java:32)
           at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
           at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
           at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
           at java.lang.reflect.Method.invoke(Method.java:498)
           at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
           at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:851)
           at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
           at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
           at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
           at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:926)
           at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:935)
           at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   Caused by: java.lang.ClassNotFoundException: com.mysql.jdbc.Driver
           at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
           at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
           at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
           at org.apache.spark.sql.execution.datasources.jdbc.DriverRegistry$.register(DriverRegistry.scala:45)
           at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$5.apply(JDBCOptions.scala:99)
           at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$5.apply(JDBCOptions.scala:99)
           at scala.Option.foreach(Option.scala:257)
           at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:99)
           at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:35)
           at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:32)
           at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:317)
           at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:223)
           at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:211)
           at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:167)
           at org.apache.seatunnel.spark.jdbc.source.Jdbc.getData(Jdbc.scala:31)
           at org.apache.seatunnel.spark.jdbc.source.Jdbc.getData(Jdbc.scala:28)
           at org.apache.seatunnel.spark.SparkEnvironment.registerInputTempView(SparkEnvironment.java:141)
           at org.apache.seatunnel.spark.batch.SparkBatchExecution.lambda$start$0(SparkBatchExecution.java:45)
           at java.util.ArrayList.forEach(ArrayList.java:1257)
           at org.apache.seatunnel.spark.batch.SparkBatchExecution.start(SparkBatchExecution.java:45)
           at org.apache.seatunnel.core.spark.command.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:76)
           ... 14 more
    
   22/06/30 00:41:00 ERROR base.Seatunnel:
   ```
   
   
   ### Flink or Spark Version
   
   CDH 6.2.1
   Spark 2.4.0
   
   ### Java or Scala Version
   
   Java 1.8
   
   ### Screenshots
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of Conduct](https://www.apache.org/foundation/policies/conduct)
   




[GitHub] [incubator-seatunnel] Hisoka-X closed issue #2089: [Bug] [connector-spark-jdbc-2.1.2] ClassNotFoundException: com.mysql.jdbc.Driver

Posted by GitBox <gi...@apache.org>.
Hisoka-X closed issue #2089: [Bug] [connector-spark-jdbc-2.1.2] ClassNotFoundException: com.mysql.jdbc.Driver
URL: https://github.com/apache/incubator-seatunnel/issues/2089




[GitHub] [incubator-seatunnel] huhuhuHR commented on issue #2089: [Bug] [connector-spark-jdbc-2.1.2] ClassNotFoundException: com.mysql.jdbc.Driver

Posted by "huhuhuHR (via GitHub)" <gi...@apache.org>.
huhuhuHR commented on issue #2089:
URL: https://github.com/apache/incubator-seatunnel/issues/2089#issuecomment-1405026081

   Could the poster above share how you fixed it? I'm a beginner and ran into the same problem.




[GitHub] [incubator-seatunnel] hwfbest commented on issue #2089: [Bug] [connector-spark-jdbc-2.1.2] ClassNotFoundException: com.mysql.jdbc.Driver

Posted by GitBox <gi...@apache.org>.
hwfbest commented on issue #2089:
URL: https://github.com/apache/incubator-seatunnel/issues/2089#issuecomment-1170256321

   Resolved. After adding the MySQL JDBC jar to the Spark directory, the exception went away.
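
   For anyone hitting the same error, a minimal sketch of that fix follows. It assumes $SPARK_HOME points at the Spark install used by start-seatunnel-spark.sh and uses Connector/J 5.1.49 as a placeholder version; adjust both to your environment.

   ```shell
   # Download the MySQL JDBC driver (version is a placeholder; any 5.1.x
   # release provides the com.mysql.jdbc.Driver class named in the config).
   wget https://repo1.maven.org/maven2/mysql/mysql-connector-java/5.1.49/mysql-connector-java-5.1.49.jar

   # Copy it onto Spark's classpath, then re-run the SeaTunnel job.
   cp mysql-connector-java-5.1.49.jar "$SPARK_HOME/jars/"
   ./bin/start-seatunnel-spark.sh --master local --deploy-mode client --config ./config/application_002.conf
   ```

   Note that if you use MySQL Connector/J 8.x instead, the driver class in the jdbc source config is usually com.mysql.cj.jdbc.Driver.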


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@seatunnel.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org