Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2015/08/16 17:40:45 UTC
[jira] [Commented] (SPARK-10036) DataFrameReader.jdbc and
DataFrameWriter.jdbc don't load the JDBC driver class before creating JDBC
connection
[ https://issues.apache.org/jira/browse/SPARK-10036?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14698727#comment-14698727 ]
Apache Spark commented on SPARK-10036:
--------------------------------------
User 'zsxwing' has created a pull request for this issue:
https://github.com/apache/spark/pull/8232
> DataFrameReader.jdbc and DataFrameWriter.jdbc don't load the JDBC driver class before creating JDBC connection
> --------------------------------------------------------------------------------------------------------------
>
> Key: SPARK-10036
> URL: https://issues.apache.org/jira/browse/SPARK-10036
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Reporter: Shixiong Zhu
>
> Here is the code to reproduce the issue, followed by the resulting stack trace:
> {code}
> val url = "jdbc:postgresql://.../mytest"
> import java.util.Properties
> val prop = new Properties()
> prop.put("driver", "org.postgresql.Driver")
> prop.put("user", "...")
> prop.put("password", "...")
> val df = sqlContext.read.jdbc(url, "mytest", prop)
> {code}
> {code}
> java.sql.SQLException: No suitable driver found for jdbc:postgresql://.../mytest
> at java.sql.DriverManager.getConnection(DriverManager.java:689)
> at java.sql.DriverManager.getConnection(DriverManager.java:208)
> at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:121)
> at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation.<init>(JDBCRelation.scala:91)
> at org.apache.spark.sql.DataFrameReader.jdbc(DataFrameReader.scala:200)
> at org.apache.spark.sql.DataFrameReader.jdbc(DataFrameReader.scala:130)
> {code}
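For context on the stack trace above: `java.sql.DriverManager` only consults drivers that have already registered themselves, which normally happens in the driver class's static initializer. Since the `driver` property was passed to Spark but `org.postgresql.Driver` was never actually loaded, `DriverManager` finds no registered driver matching the URL. A minimal, self-contained sketch of that behavior (the `NoSuitableDriverDemo` class and `describeFailure` helper are illustrative names, not part of Spark or JDBC):

```java
import java.sql.DriverManager;
import java.sql.SQLException;

public class NoSuitableDriverDemo {
    // Attempts a connection and returns the failure message, or "connected"
    // if a suitable driver happens to be registered already.
    static String describeFailure(String url) {
        try {
            DriverManager.getConnection(url).close();
            return "connected";
        } catch (SQLException e) {
            return e.getMessage();
        }
    }

    public static void main(String[] args) {
        // With no PostgreSQL driver loaded, DriverManager cannot match the
        // URL and throws "No suitable driver found for ...".
        System.out.println(describeFailure("jdbc:postgresql://localhost/mytest"));

        // Common user-side workaround until the fix lands: load the driver
        // class explicitly so its static initializer registers it, e.g.
        //   Class.forName("org.postgresql.Driver")
        // before calling sqlContext.read.jdbc(url, table, prop).
    }
}
```

As the issue title states, the fix is for Spark to load the class named by the `driver` property before creating the JDBC connection, rather than relying on it already being registered.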
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)