Posted to issues@spark.apache.org by "Rick Hillegas (JIRA)" <ji...@apache.org> on 2015/09/25 22:14:04 UTC

[jira] [Commented] (SPARK-6649) DataFrame created through SQLContext.jdbc() fails if table columns must be quoted

    [ https://issues.apache.org/jira/browse/SPARK-6649?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14908609#comment-14908609 ] 

Rick Hillegas commented on SPARK-6649:
--------------------------------------

Hi Fred,

The backtick syntax seems to be a feature of HiveQL according to this discussion on the developer list: http://apache-spark-developers-list.1001551.n3.nabble.com/column-identifiers-in-Spark-SQL-td14280.html
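
For illustration, here is a minimal sketch of that backtick quoting against the table from this issue (my own example, not taken from the linked thread):

{code:title=backticks.scala|borderStyle=solid}
val sqlContext: SQLContext = ...   // set up as in the reproduction below
// Backticks let Spark SQL (HiveQL syntax) reference an identifier that
// is a reserved keyword, such as the COMMENT column:
val df = sqlContext.sql("SELECT `COMMENT` FROM TEST_TABLE")
{code}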

Thanks,
-Rick

> DataFrame created through SQLContext.jdbc() fails if table columns must be quoted
> ---------------------------------------------------------------------------------
>
>                 Key: SPARK-6649
>                 URL: https://issues.apache.org/jira/browse/SPARK-6649
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.3.0
>            Reporter: Frédéric Blanc
>            Priority: Minor
>
> If I want to import the contents of a table from Oracle that contains a column named COMMENT (a reserved keyword), I cannot use a DataFrame that maps all the columns of this table.
> {code:title=ddl.sql|borderStyle=solid}
> CREATE TABLE TEST_TABLE (
>     "COMMENT" VARCHAR2(10)
> );
> {code}
> {code:title=test.java|borderStyle=solid}
> SQLContext sqlContext = ...
> DataFrame df = sqlContext.jdbc(databaseURL, "TEST_TABLE");
> df.rdd();   // => fails if the table contains a column whose name is a reserved keyword
> {code}
> The same problem can be encountered if a reserved keyword is used as the table name.
> The JDBCRDD Scala class could be improved by having the columnList initializer append double quotes around each column name (line 225).
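> For example, a minimal, self-contained sketch of that suggestion (the helper name is hypothetical, not the actual JDBCRDD code; it assumes the target database delimits identifiers with double quotes, as Oracle does):
> {code:title=quoting-sketch.scala|borderStyle=solid}
> // Sketch only: wrap each column name in double quotes when building the
> // SELECT projection, so reserved words such as COMMENT survive on Oracle.
> def quotedColumnList(columns: Seq[String]): String =
>   columns.map(c => "\"" + c + "\"").mkString(", ")
>
> val sql = s"SELECT ${quotedColumnList(Seq("ID", "COMMENT"))} FROM TEST_TABLE"
> // => SELECT "ID", "COMMENT" FROM TEST_TABLE
> {code}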



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org