Posted to issues@spark.apache.org by "Neil Dewar (JIRA)" <ji...@apache.org> on 2016/07/15 03:10:20 UTC

[jira] [Closed] (SPARK-16466) names() function allows creation of column name containing "-". filter() function subsequently fails

     [ https://issues.apache.org/jira/browse/SPARK-16466?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Neil Dewar closed SPARK-16466.
------------------------------
       Resolution: Fixed
    Fix Version/s: 1.6.2

Per Jongjoon, the issue does not occur in 1.6.2.

> names() function allows creation of column name containing "-".  filter() function subsequently fails
> -----------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-16466
>                 URL: https://issues.apache.org/jira/browse/SPARK-16466
>             Project: Spark
>          Issue Type: Bug
>          Components: SparkR
>    Affects Versions: 1.6.1
>         Environment: Databricks.com
>            Reporter: Neil Dewar
>            Priority: Minor
>             Fix For: 1.6.2
>
>
> If I assign names to a DataFrame using the names() function, it allows the introduction of "-" characters that cause the filter() function to subsequently fail.  It is unclear whether other special characters cause similar problems.
> Example:
> sdfCar <- createDataFrame(sqlContext, mtcars)
> names(sdfCar) <- c("mpg", "cyl", "disp", "hp", "drat", "wt", "qsec", "vs", "am", "gear", "carb-count") # note: carb renamed to carb-count
> sdfCar3 <- filter(sdfCar, carb-count==4)
> The above fails with the error: "failure: identifier expected carb-count==4".  The parser appears to treat the "-" in the column name as a minus sign.
> I am unsure whether the problem is that "-" is illegal in a column name, or that the filter() function should be able to handle "-" in a column name, but one or the other must be wrong.
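As a side note, a possible workaround in SparkR is to pass filter() a Column object rather than a string expression, so the condition never goes through the SQL identifier parser. This is an untested sketch against the reporter's example; it assumes a SparkR session with a `sqlContext` already set up, as in the report:

```r
library(SparkR)

# Recreate the reporter's DataFrame with the hyphenated column name.
sdfCar <- createDataFrame(sqlContext, mtcars)
names(sdfCar) <- c("mpg", "cyl", "disp", "hp", "drat", "wt",
                   "qsec", "vs", "am", "gear", "carb-count")

# R backticks let us reference the "carb-count" column as a Column
# object; the "-" is then never parsed as a minus sign.
sdfCar3 <- filter(sdfCar, sdfCar$`carb-count` == 4)
```

The same `$` plus backtick pattern should work with other special characters in column names, since the condition is built as a Column expression rather than parsed from a string.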



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
