Posted to issues@spark.apache.org by "PJ Fanning (JIRA)" <ji...@apache.org> on 2015/12/28 13:30:49 UTC
[jira] [Commented] (SPARK-9505) DataFrames : Mysql JDBC not support column names with special characters
[ https://issues.apache.org/jira/browse/SPARK-9505?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15072680#comment-15072680 ]
PJ Fanning commented on SPARK-9505:
-----------------------------------
There is a pull request for SPARK-12437 that looks like it would fix this.
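For context, the underlying problem is that Spark's JDBC reader builds its SELECT from the raw column names it discovers, so names like NAME% reach MySQL unquoted. A dialect-aware fix would quote each identifier before building the query. Below is a minimal sketch of MySQL-style backtick quoting (a hypothetical helper for illustration, not necessarily the exact code in the SPARK-12437 pull request):

```scala
object IdentifierQuoting {
  // MySQL-style identifier quoting: wrap the name in backticks and
  // double any embedded backticks, so special characters like # [ ] %
  // are treated as part of the column name rather than SQL syntax.
  def quoteIdentifier(colName: String): String =
    "`" + colName.replace("`", "``") + "`"

  def main(args: Array[String]): Unit = {
    println(quoteIdentifier("NAME%")) // `NAME%`
    println(quoteIdentifier("a`b"))   // `a``b`
  }
}
```

With quoting in place, a generated query such as SELECT `ID`, `NAME%` FROM `agent` is valid MySQL, whereas the unquoted NAME% triggers the syntax error reported above.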
> DataFrames : Mysql JDBC not support column names with special characters
> ------------------------------------------------------------------------
>
> Key: SPARK-9505
> URL: https://issues.apache.org/jira/browse/SPARK-9505
> Project: Spark
> Issue Type: Bug
> Components: Spark Core, SQL
> Affects Versions: 1.3.0
> Reporter: Pangjiu
> Priority: Blocker
>
> Hi all,
> I had the above issue when connecting to a MySQL database through SQLContext. If a MySQL table's column name contains special characters such as # [ ] %, it throws the exception: "You have an error in your SQL syntax".
> Below is the code:
> Class.forName("com.mysql.jdbc.Driver").newInstance()
> val url = "jdbc:mysql://localhost:3306/sakila?user=root&password=xxx"
> val driver = "com.mysql.jdbc.Driver"
> val sqlContext = new SQLContext(sc)
> val output = sqlContext.load("jdbc", Map(
>   "url" -> url,
>   "driver" -> driver,
>   "dbtable" -> "(SELECT `ID`, `NAME%` FROM `agent`) AS tableA"
> ))
> I hope DataFrames via SQLContext can support special characters soon, as this has become a blocker for us.
> Thanks
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org