Posted to issues@spark.apache.org by "Will Chen (JIRA)" <ji...@apache.org> on 2015/05/30 10:10:18 UTC
[jira] [Updated] (SPARK-7967) cannot resolve 'count' given input columns when using DataFrame.withColumn
[ https://issues.apache.org/jira/browse/SPARK-7967?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Will Chen updated SPARK-7967:
-----------------------------
Description:
Code:
val userDF = app_user_register_log
  .filter($"add_time" > startDay)
  .filter($"add_time" < endDay)
  .select("id").as("userReg")
  .join(activeDF.as("ad"), $"userReg.id" === $"ad.uid")
  .select("ad.uid", "ad.clientVerion", "ad.loc", "ad.auth_status",
          "ad.channel", "ad.bd_area", "ad.mobile_area", "ad.idcard_area")
  .withColumn("count", $"count") // Exception is thrown from this line
was:
Code:
val userDF = app_user_register_log
  .filter($"add_time" > startDay)
  .filter($"add_time" < endDay)
  .select("id").as("userReg")
  .join(activeDF.as("ad"), $"userReg.id" === $"ad.uid")
  .select("ad.uid", "ad.clientVerion", "ad.loc", "ad.auth_status",
          "ad.channel", "ad.bd_area", "ad.mobile_area", "ad.idcard_area")
  .withColumn("count", $"count")
> cannot resolve 'count' given input columns when using DataFrame.withColumn
> --------------------------------------------------------------------------
>
> Key: SPARK-7967
> URL: https://issues.apache.org/jira/browse/SPARK-7967
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.3.0
> Environment: spark 1.3.0 standalone
> Reporter: Will Chen
> Labels: dataFrame, sparksql
>
> Code:
> val userDF = app_user_register_log
>   .filter($"add_time" > startDay)
>   .filter($"add_time" < endDay)
>   .select("id").as("userReg")
>   .join(activeDF.as("ad"), $"userReg.id" === $"ad.uid")
>   .select("ad.uid", "ad.clientVerion", "ad.loc", "ad.auth_status",
>           "ad.channel", "ad.bd_area", "ad.mobile_area", "ad.idcard_area")
>   .withColumn("count", $"count") // Exception is thrown from this line
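
The error itself matches how the analyzer works: $"count" can only bind to a
column in the schema produced by the preceding select(...), and that
projection contains no column named "count", so resolution fails. Below is a
minimal sketch of the failure and one possible workaround, assuming a Spark
1.3-style shell with an existing SparkContext `sc`; the data and the column
names (uid, channel) are illustrative stand-ins, not taken from the original
report.

    import org.apache.spark.sql.SQLContext
    import org.apache.spark.sql.functions._

    // Assumes an existing SparkContext `sc`, as in the Spark 1.3 shell.
    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._

    // Toy frame standing in for the joined result above; names are illustrative.
    val df = Seq((1, "web"), (2, "ios"), (3, "web")).toDF("uid", "channel")

    // Fails at analysis time: the projection exposes only uid and channel,
    // so there is no "count" column for $"count" to bind to.
    // df.select("uid", "channel").withColumn("count", $"count")

    // One workaround: materialize the count explicitly as an aggregate column.
    val counted = df.groupBy($"channel").agg(count($"uid").as("count"))
    counted.show()

If the goal was only to add a placeholder column, withColumn("count", lit(0))
also resolves, because lit(...) builds a literal column that does not need to
bind to anything in the input schema.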
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org