Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2017/01/20 19:54:27 UTC

[jira] [Assigned] (SPARK-18823) Assignation by column name variable not available or bug?

     [ https://issues.apache.org/jira/browse/SPARK-18823?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-18823:
------------------------------------

    Assignee:     (was: Apache Spark)

> Assignation by column name variable not available or bug?
> ---------------------------------------------------------
>
>                 Key: SPARK-18823
>                 URL: https://issues.apache.org/jira/browse/SPARK-18823
>             Project: Spark
>          Issue Type: Question
>          Components: SparkR
>    Affects Versions: 2.0.2
>         Environment: RStudio Server in EC2 Instances (EMR Service of AWS) Emr 4. Or databricks (community.cloud.databricks.com) .
>            Reporter: Vicente Masip
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> I don't know whether this is a bug or whether it can be done with some existing function:
> Sometimes it is important to assign to a column whose name has to be accessed through a variable. Outside of SparkR, I have always done this with double brackets, like this:
> # df could be the faithful data set as a plain data.frame or data.table.
> # Accessing by variable name:
> myname <- "waiting"
> df[[myname]] <- c(1:nrow(df))
> # or even by column number:
> df[[2]] <- df$eruptions
> The error is not caused by the right-hand side of the "<-" assignment. The problem is that in SparkR I cannot assign to a column through a variable holding its name, or through its number, the way I do in the examples above. It doesn't matter whether I am modifying an existing column or creating a new one; the problem is the same.
> I have also tried this, with no results:
> df2 <- withColumn(df, "tmp", df$eruptions)
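> For reference, here is a minimal sketch of the pattern I am after, assuming SparkR's withColumn() accepts a column name held in a variable (the local session setup and the "waiting2" name are just illustrative):
> library(SparkR)
> sparkR.session()                              # start a local Spark session
> df <- createDataFrame(faithful)               # distribute the faithful data set
> myname <- "waiting2"                          # illustrative target column name
> df2 <- withColumn(df, myname, df$eruptions)   # name supplied via a variable
> head(select(df2, myname))                     # check that the column was created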



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org