Posted to issues@spark.apache.org by "fang fang chen (JIRA)" <ji...@apache.org> on 2015/08/25 11:50:46 UTC

[jira] [Updated] (SPARK-10220) org.apache.spark.sql.jdbc.JDBCRDD could not parse MySQL table columns named with reserved words

     [ https://issues.apache.org/jira/browse/SPARK-10220?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

fang fang chen updated SPARK-10220:
-----------------------------------
    Attachment: SPARK-10220.patch

> org.apache.spark.sql.jdbc.JDBCRDD could not parse MySQL table columns named with reserved words
> ----------------------------------------------------------------------------------------
>
>                 Key: SPARK-10220
>                 URL: https://issues.apache.org/jira/browse/SPARK-10220
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.3.0
>            Reporter: fang fang chen
>         Attachments: SPARK-10220.patch
>
>
> Steps to reproduce (assumes import java.util.HashMap, so the java.util.Map overload of sqlContext.load accepts the options):
>     val options: HashMap[String, String] = new HashMap
>     options.put("driver", "com.mysql.jdbc.Driver")
>     options.put("url", url_total)
>     options.put("dbtable", table) // one column of the table is named "desc", a MySQL reserved word
>     options.put("lowerBound", lower_bound.toString())
>     options.put("upperBound", upper_bound.toString())
>     options.put("numPartitions", partitions.toString())
>     options.put("partitionColumn", id)
>     val jdbcDF = sqlContext.load("jdbc", options)
>     jdbcDF.save(output)
> Exception:
> 15/08/24 19:02:34 ERROR executor.Executor: Exception in task 0.3 in stage 0.0 (TID 3)
> com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'desc,warning_stat,money_limit,real_name,region_lv1,region_lv2,region_lv3,region_' at line 1
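
A possible workaround on Spark 1.3, offered here as a sketch and not as a description of the attached SPARK-10220.patch: pass a subquery as "dbtable" that aliases the reserved-word column to a non-reserved name, so the column list that JDBCRDD places into its generated SELECT never contains the bare identifier desc. The table name my_table, the alias description, and the connection details below are hypothetical; the remaining options mirror the reproduce steps above.

    import java.util.HashMap

    // Hypothetical connection details; adjust for the real database.
    val url_total = "jdbc:mysql://localhost:3306/testdb?user=root&password=secret"

    val options: HashMap[String, String] = new HashMap
    options.put("driver", "com.mysql.jdbc.Driver")
    options.put("url", url_total)
    // Alias the reserved column `desc` inside a subquery and keep the
    // partition column in the projection so partitioned reads still work.
    options.put("dbtable",
      "(SELECT id, `desc` AS description, warning_stat, money_limit, real_name FROM my_table) AS t")
    options.put("partitionColumn", "id")
    options.put("lowerBound", "0")
    options.put("upperBound", "100000")
    options.put("numPartitions", "4")

    // sqlContext is an existing org.apache.spark.sql.SQLContext, as in the steps above.
    val jdbcDF = sqlContext.load("jdbc", options)
    jdbcDF.save("/tmp/jdbc_output")

With the alias in place, the schema Spark reads back from the subquery contains "description" rather than "desc", so the per-partition SQL it later generates no longer trips over the reserved word.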


