Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2018/08/26 13:45:00 UTC
[jira] [Resolved] (SPARK-25013) JDBC urls with jdbc:mariadb don't work as expected
[ https://issues.apache.org/jira/browse/SPARK-25013?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon resolved SPARK-25013.
----------------------------------
Resolution: Won't Fix
I wouldn't add this to Spark for now unless there's a strong request from the community.
> JDBC urls with jdbc:mariadb don't work as expected
> --------------------------------------------------
>
> Key: SPARK-25013
> URL: https://issues.apache.org/jira/browse/SPARK-25013
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.3.1
> Reporter: Dieter Vekeman
> Priority: Minor
>
> When using the MariaDB JDBC driver, the JDBC connection url should be
> {code:java}
> jdbc:mariadb://localhost:3306/DB?user=someuser&password=somepassword
> {code}
> https://mariadb.com/kb/en/library/about-mariadb-connector-j/
> However, this does not work well in Spark (see below).
> *Workaround*
> The MariaDB driver also supports the {{jdbc:mysql}} scheme, which does work.
> The problem seems to have been described and identified in:
> https://jira.mariadb.org/browse/CONJ-421
> All works well in Spark when the connection string uses {{"jdbc:mysql:..."}}, but not with {{"jdbc:mariadb:..."}}, because the MySQL dialect is then not used.
> When that dialect is not used, the default identifier quote is {{"}}, not {{`}}.
> As a result, an internal query generated by Spark such as {{SELECT `i`,`ip` FROM tmp}} is executed as {{SELECT "i","ip" FROM tmp}} against the previously retrieved data types, causing the exception.
> The author of the comment says
> {quote}I'll make a pull request to spark so "jdbc:mariadb:" connection string can be handle{quote}
> Did the pull request get lost or should a new one be made?
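For anyone hitting this before an upstream change lands: Spark exposes a public JdbcDialect extension point, so the quoting mismatch described above can be worked around from user code without patching Spark. The following is a minimal sketch (the object name is illustrative, not an official Spark fix) that registers a dialect claiming {{jdbc:mariadb}} URLs and quoting identifiers with backticks, mirroring what the built-in MySQL dialect does for {{jdbc:mysql}} URLs:

```scala
import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects}

// Illustrative workaround, not part of Spark: a dialect that claims
// jdbc:mariadb URLs and backtick-quotes identifiers, as the built-in
// MySQL dialect does for jdbc:mysql URLs.
object MariaDbDialect extends JdbcDialect {
  override def canHandle(url: String): Boolean =
    url.toLowerCase.startsWith("jdbc:mariadb")

  // Quote with backticks so generated queries read
  // SELECT `i`,`ip` FROM tmp rather than SELECT "i","ip" FROM tmp.
  override def quoteIdentifier(colName: String): String =
    s"`${colName.replace("`", "``")}`"
}

// Register once, before any spark.read.jdbc(...) call that uses
// a jdbc:mariadb:// connection string.
JdbcDialects.registerDialect(MariaDbDialect)
```

Short of that, the {{jdbc:mysql}} workaround quoted above remains the simplest option, since MariaDB Connector/J also accepts that scheme.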
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org