Posted to issues@spark.apache.org by "Lijie Xu (JIRA)" <ji...@apache.org> on 2016/06/12 13:18:20 UTC
[jira] [Updated] (SPARK-15903) Support AllColumn expression in UDF functions
[ https://issues.apache.org/jira/browse/SPARK-15903?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Lijie Xu updated SPARK-15903:
-----------------------------
Description:
Sometimes we want to pass all of a DataFrame's columns to a UDF such as "concat()". We would like to write a simple $"*" instead of listing each column, as follows.
val df = sc.makeRDD(Array((1, "A"), (2, "B"), (3, "C"))).toDF("Id", "Name")
val result = df.select($"*", concat($"*").as("UDF")) // failed
val result = df.select($"*", concat($"Id", $"Name").as("UDF")) // passed
result.show()
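Until such an AllColumn expression is supported, one workaround is to expand the column names programmatically and splat them into concat's varargs parameter. A minimal sketch, assuming a local SparkContext/SQLContext (as in spark-shell on 1.6):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.functions.{col, concat}

val conf = new SparkConf().setAppName("concat-all-columns").setMaster("local[1]")
val sc = new SparkContext(conf)
val sqlContext = new SQLContext(sc)
import sqlContext.implicits._

val df = sc.makeRDD(Array((1, "A"), (2, "B"), (3, "C"))).toDF("Id", "Name")

// df.columns is an Array[String]; map each name to a Column and pass the
// whole array to concat via Scala's varargs expansion (`: _*`).
val result = df.select($"*", concat(df.columns.map(col): _*).as("UDF"))
result.show()
```

This achieves the same effect as the proposed $"*" syntax, at the cost of an explicit trip through df.columns.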
was:
Sometimes we want to pass all of a DataFrame's columns to a UDF such as "concat()". We would like to write a simple $"*" instead of listing each column, as follows.
val df = sc.makeRDD(Array((1, "A"), (2, "B"), (3, "C"))).toDF("Id", "Name")
val result = df.select($"*", concat($"*").as("UDF")) // failed
val result = df.select($"*", concat($"Id", $"Name").as("UDF")) // passed
result.show()
> Support AllColumn expression in UDF functions
> ---------------------------------------------
>
> Key: SPARK-15903
> URL: https://issues.apache.org/jira/browse/SPARK-15903
> Project: Spark
> Issue Type: Improvement
> Components: SQL
> Affects Versions: 1.6.1
> Reporter: Lijie Xu
> Priority: Minor
>
> Sometimes we want to pass all of a DataFrame's columns to a UDF such as "concat()". We would like to write a simple $"*" instead of listing each column, as follows.
> val df = sc.makeRDD(Array((1, "A"), (2, "B"), (3, "C"))).toDF("Id", "Name")
> val result = df.select($"*", concat($"*").as("UDF")) // failed
> val result = df.select($"*", concat($"Id", $"Name").as("UDF")) // passed
> result.show()
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org