Posted to issues@spark.apache.org by "Matthew Jones (JIRA)" <ji...@apache.org> on 2015/06/19 00:43:00 UTC
[jira] [Created] (SPARK-8463) No suitable driver found for write.jdbc
Matthew Jones created SPARK-8463:
------------------------------------
Summary: No suitable driver found for write.jdbc
Key: SPARK-8463
URL: https://issues.apache.org/jira/browse/SPARK-8463
Project: Spark
Issue Type: Bug
Components: SQL
Affects Versions: 1.4.0, 1.5.0
Environment: Mesos, Ubuntu
Reporter: Matthew Jones
I am getting a java.sql.SQLException: No suitable driver found for jdbc:mysql://dbhost/test when using df.write.jdbc.
I do not get this error when reading from the same database.
The following simple script reproduces the problem.
First, create a database called test containing a table called table1 with some rows in it. The user test (password secret) must have read/write permissions.
*testJDBC.scala:*
import java.util.Properties
import org.apache.spark.sql.Row
import org.apache.spark.sql.types.{StructField, StructType, IntegerType, StringType}

// sc and sqlContext are provided by spark-shell
val properties = new Properties()
properties.setProperty("user", "test")
properties.setProperty("password", "secret")

// Reading works: the rows in table1 are printed
val readTable = sqlContext.read.jdbc("jdbc:mysql://dbhost/test", "table1", properties)
readTable.show()

// Writing fails with "No suitable driver found"
val rows = sc.parallelize(List(Row(1, "write"), Row(2, "me")))
val writeTable = sqlContext.createDataFrame(rows, StructType(List(StructField("id", IntegerType), StructField("name", StringType))))
writeTable.write.jdbc("jdbc:mysql://dbhost/test", "table2", properties)
This is run using:
{{spark-shell --conf spark.executor.extraClassPath=/path/to/mysql-connector-java-5.1.35-bin.jar --driver-class-path /path/to/mysql-connector-java-5.1.35-bin.jar --jars /path/to/mysql-connector-java-5.1.35-bin.jar -i testJDBC.scala}}
The read works fine and prints the rows in the table. The write fails with {{java.sql.SQLException: No suitable driver found for jdbc:mysql://dbhost/test}}.
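A possible workaround, not verified against this cluster, is to name the JDBC driver class explicitly in the connection properties, so the write path does not depend on java.sql.DriverManager discovering the driver through the executor's classloader. The "driver" property key and the class name com.mysql.jdbc.Driver (the class shipped in mysql-connector-java 5.1.x) are assumptions here, not taken from the report:

```scala
import java.util.Properties

// Sketch of a workaround: pass the driver class name explicitly.
// Assumes the same spark-shell session as the script above, where
// sqlContext and writeTable are already defined.
val properties = new Properties()
properties.setProperty("user", "test")
properties.setProperty("password", "secret")
// Assumed property key; bypasses DriverManager's classloader-sensitive lookup
properties.setProperty("driver", "com.mysql.jdbc.Driver")

writeTable.write.jdbc("jdbc:mysql://dbhost/test", "table2", properties)
```

If the property is not honored by the Spark version in use, another commonly cited workaround is forcing registration with Class.forName("com.mysql.jdbc.Driver") before the write.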
I have tested this on a Mesos cluster with Spark 1.4.0 and the current master branch as of June 18th.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org