Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2016/08/03 11:53:20 UTC

[jira] [Commented] (SPARK-16874) CSV Reader : Can't resolve column name with a point

    [ https://issues.apache.org/jira/browse/SPARK-16874?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15405778#comment-15405778 ] 

Sean Owen commented on SPARK-16874:
-----------------------------------

This sounds related to, if not a duplicate of, issues like https://issues.apache.org/jira/browse/SPARK-12988. I think you now have to back-tick column names like this. It's not necessarily true that behavior stays the same across major releases.
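As a minimal sketch of the back-ticking idea: Spark SQL parses a bare dot in a column reference as a struct-field accessor, so a name like {{row.names}} must be wrapped in back-ticks to be matched literally (e.g. {{df.select("`row.names`")}}). The helper below is hypothetical, not part of Spark's API; it only illustrates the quoting convention (back-ticks inside a name are escaped by doubling, as Spark's own identifier quoting does).

{code}
def quote_col(name: str) -> str:
    """Wrap a column name in back-ticks so Spark SQL treats an
    embedded dot literally instead of as a struct accessor.
    Back-ticks inside the name are escaped by doubling them."""
    return "`" + name.replace("`", "``") + "`"

# Quote every header from the CSV before selecting, e.g.
# df.select(*[quote_col(c) for c in df.columns])
cols = ["row.names", "sbp", "tobacco"]
quoted = [quote_col(c) for c in cols]
# quoted -> ["`row.names`", "`sbp`", "`tobacco`"]
{code}

This avoids hand-writing a full schema just to work around the dotted column name.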

> CSV Reader : Can't resolve column name with a point
> ---------------------------------------------------
>
>                 Key: SPARK-16874
>                 URL: https://issues.apache.org/jira/browse/SPARK-16874
>             Project: Spark
>          Issue Type: Bug
>          Components: Input/Output
>    Affects Versions: 2.0.0
>            Reporter: Abou Haydar Elias
>            Priority: Minor
>
> I was porting some code from Spark 1.6.2 to 2.0.0, and while reading a CSV file I stumbled upon the following error:
> {code} org.apache.spark.sql.AnalysisException: Unable to resolve row.names given [row.names, sbp, tobacco, ldl, adiposity, famhist, typea, obesity, alcohol, age, chd];
> {code}
> There is, of course, a workaround using a predefined schema, but it's not very practical when there are many columns.
> At the following link I've provided a code snippet that reproduces the error on Spark 2.0, and another that shows it working correctly on 1.6.2 with the spark-csv package:
> https://gist.github.com/eliasah/5dfc6bc8ddcbe920311049e37a58855b



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org