Posted to issues@spark.apache.org by "David Sabater (JIRA)" <ji...@apache.org> on 2015/07/02 13:32:04 UTC

[jira] [Commented] (SPARK-8616) SQLContext doesn't handle tricky column names when loading from JDBC

    [ https://issues.apache.org/jira/browse/SPARK-8616?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14611811#comment-14611811 ] 

David Sabater commented on SPARK-8616:
--------------------------------------

I would assume the error here is the lack of support for column names containing characters such as " ,;{}()=" (this includes whitespace, which was my initial issue).
If we are OK with restricting this, we just need to improve the error message when the exception is raised.

I would suggest revisiting this on the mailing list to see what opinions are out there. To make the proposed restriction concrete, a minimal sketch of an early validation follows; the helper name, the character set, and where it would hook into the JDBC relation code are all assumptions, not the current implementation:
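
    // Hypothetical guard: reject unsupported characters up front with a
    // clear message instead of letting the database error surface later.
    val unsupportedChars = " ,;{}()=".toSet

    def validateColumnName(name: String): Unit = {
      val bad = name.filter(unsupportedChars.contains)
      if (bad.nonEmpty) {
        throw new IllegalArgumentException(
          s"Column name '$name' contains unsupported character(s): " +
          bad.mkString("'", "', '", "'"))
      }
    }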

> SQLContext doesn't handle tricky column names when loading from JDBC
> --------------------------------------------------------------------
>
>                 Key: SPARK-8616
>                 URL: https://issues.apache.org/jira/browse/SPARK-8616
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.4.0
>         Environment: Ubuntu 14.04, Sqlite 3.8.7, Spark 1.4.0
>            Reporter: Gergely Svigruha
>
> Reproduce:
>  - create a table in a relational database (in my case sqlite) with a column name containing a space:
>  CREATE TABLE my_table (id INTEGER, "tricky column" TEXT);
>  - try to create a DataFrame using that table:
> sqlContext.read.format("jdbc").options(Map(
>   "url" -> "jdbs:sqlite:...",
>   "dbtable" -> "my_table")).load()
> java.sql.SQLException: [SQLITE_ERROR] SQL error or missing database (no such column: tricky)
> According to the SQL spec this should be valid:
> http://savage.net.au/SQL/sql-99.bnf.html#delimited%20identifier
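
In the meantime, a possible workaround is to push the quoting into a subquery passed as "dbtable", so the delimited identifier reaches the database verbatim and Spark only sees a safe alias. This is an untested sketch; the subquery alias and the SQLite URL details are assumptions:

    // Workaround sketch: quote the tricky column inside a subquery and
    // alias it, so the DataFrame schema uses the safe name "tricky_column".
    sqlContext.read.format("jdbc").options(Map(
      "url" -> "jdbc:sqlite:...",
      "dbtable" ->
        """(SELECT id, "tricky column" AS tricky_column FROM my_table) AS t""")).load()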



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org