Posted to issues@spark.apache.org by "Yin Huai (JIRA)" <ji...@apache.org> on 2016/01/04 19:40:39 UTC

[jira] [Resolved] (SPARK-12579) User-specified JDBC driver should always take precedence

     [ https://issues.apache.org/jira/browse/SPARK-12579?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yin Huai resolved SPARK-12579.
------------------------------
       Resolution: Fixed
    Fix Version/s: 2.0.0

Issue resolved by pull request 10519
[https://github.com/apache/spark/pull/10519]

> User-specified JDBC driver should always take precedence
> --------------------------------------------------------
>
>                 Key: SPARK-12579
>                 URL: https://issues.apache.org/jira/browse/SPARK-12579
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>            Reporter: Josh Rosen
>            Assignee: Josh Rosen
>             Fix For: 2.0.0
>
>
> Spark SQL's JDBC data source allows users to specify an explicit JDBC driver to load using the {{driver}} argument, but in the current code it's possible that the user-specified driver will not be used when it comes time to actually create a JDBC connection.
> In a nutshell, the problem is that you might have multiple JDBC drivers on your classpath that claim to handle the same subprotocol, and there is no intuitive way to control which of those drivers takes precedence.
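
A minimal sketch of what setting the {{driver}} option looks like from the user's side, using the 2.0-style SparkSession API; the PostgreSQL URL, table name, and credentials below are hypothetical and only illustrate where the option is passed:

{code:scala}
import org.apache.spark.sql.SparkSession

object JdbcDriverOptionExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("jdbc-driver-option")
      .getOrCreate()

    // Name the driver class explicitly so this class, rather than whichever
    // registered driver DriverManager happens to resolve first for the
    // jdbc:postgresql subprotocol, is used to open the connection.
    val df = spark.read
      .format("jdbc")
      .option("url", "jdbc:postgresql://db-host:5432/mydb") // hypothetical URL
      .option("dbtable", "public.accounts")                 // hypothetical table
      .option("driver", "org.postgresql.Driver")
      .option("user", "spark_user")                         // hypothetical credentials
      .option("password", "secret")
      .load()

    df.show(5)
    spark.stop()
  }
}
{code}

The issue title captures the intended behavior: whatever class is named in {{driver}} should be the one that ends up creating the connection, regardless of which other drivers are registered with DriverManager.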



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org