Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 04:34:37 UTC

[jira] [Resolved] (SPARK-17249) java.lang.IllegalStateException: Did not find registered driver with class org.apache.spark.sql.execution.datasources.jdbc.DriverWrapper

     [ https://issues.apache.org/jira/browse/SPARK-17249?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-17249.
----------------------------------
    Resolution: Incomplete

> java.lang.IllegalStateException: Did not find registered driver with class org.apache.spark.sql.execution.datasources.jdbc.DriverWrapper 
> -----------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-17249
>                 URL: https://issues.apache.org/jira/browse/SPARK-17249
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.6.0
>            Reporter: Graeme Edwards
>            Priority: Minor
>              Labels: bulk-closed
>
> This issue is a corner case related to SPARK-14162 that isn't fixed by that change.
> It occurs when:
> - We are using Oracle's ojdbc driver
> - Spark wraps the ojdbc driver in a DriverWrapper because the jar is added via the Spark class loader
> - We don't specify an explicit "driver" property (a minimal reproduction is sketched below)
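> As a concrete illustration, a minimal sketch of the call that hits the failure (the URL, table expression and variable names are placeholders, not taken from the original report):
> // ojdbc is only on the Spark class path (e.g. added via --jars), so Spark registers it through DriverWrapper
> val connectionString = "jdbc:oracle:thin:@//dbhost:1521/SERVICE"  // placeholder URL
> val query = "(select * from some_table) t"                        // placeholder table/subquery
> val props = new java.util.Properties()                            // note: no "driver" property set
> // the executors then fail with:
> //   java.lang.IllegalStateException: Did not find registered driver with class
> //   org.apache.spark.sql.execution.datasources.jdbc.DriverWrapper
> val result = sqlContext.read.jdbc(connectionString, query, props)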
> Then, in org/apache/spark/sql/execution/datasources/jdbc/JdbcUtils.scala (createConnectionFactory), the driver class is resolved as:
>   val driverClass: String = userSpecifiedDriverClass.getOrElse {
>     DriverManager.getDriver(url).getClass.getCanonicalName
>   }
> Since the driver is wrapped by a DriverWrapper, this resolves to "org.apache.spark.sql.execution.datasources.jdbc.DriverWrapper".
> That class name is passed to the Executor, which then tries to find a registered driver named "org.apache.spark.sql.execution.datasources.jdbc.DriverWrapper". However, the Executor is aware of the wrapping and compares against the wrapped driver's class name instead:
>   case d: DriverWrapper if d.wrapped.getClass.getCanonicalName == driverClass => d
> I think the fix is just to make the initialization of driverClass aware that the driver might be wrapped and, if so, to pass the wrapped driver's class name instead.
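> As a rough sketch of that idea (not an actual patch; it assumes DriverWrapper exposes the wrapped driver via its "wrapped" field, as in the Executor-side match above):
> val driverClass: String = userSpecifiedDriverClass.getOrElse {
>   DriverManager.getDriver(url) match {
>     // if Spark registered the driver through its wrapper, report the wrapped driver's class name
>     case wrapper: DriverWrapper => wrapper.wrapped.getClass.getCanonicalName
>     case driver => driver.getClass.getCanonicalName
>   }
> }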
> The problem can be worked around by setting the driver property for the jdbc call:
> val props = new java.util.Properties()
> props.put("driver", "oracle.jdbc.OracleDriver")
> val result = sqlContext.read.jdbc(connectionString, query, props)
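> An equivalent workaround, assuming the option-based JDBC reader path (using the same placeholder names as above), is to pass the driver class as an option:
> val result = sqlContext.read
>   .format("jdbc")
>   .option("url", connectionString)
>   .option("dbtable", query)
>   .option("driver", "oracle.jdbc.OracleDriver")  // explicit driver class avoids the DriverWrapper lookup
>   .load()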



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org