Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2016/12/06 21:08:58 UTC

[jira] [Commented] (SPARK-17760) DataFrame's pivot doesn't see column created in groupBy

    [ https://issues.apache.org/jira/browse/SPARK-17760?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15726715#comment-15726715 ] 

Apache Spark commented on SPARK-17760:
--------------------------------------

User 'aray' has created a pull request for this issue:
https://github.com/apache/spark/pull/16177

> DataFrame's pivot doesn't see column created in groupBy
> -------------------------------------------------------
>
>                 Key: SPARK-17760
>                 URL: https://issues.apache.org/jira/browse/SPARK-17760
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 2.0.0
>            Environment: Databricks Community Edition, Spark 2.0.0, PySpark, Python 2.
>            Reporter: Alberto Bonsanto
>              Labels: easytest, newbie
>
> Related to [https://stackoverflow.com/questions/39817993/pivoting-with-missing-values]. I'm not completely sure if this is a bug or expected behavior.
> When you group by a column generated inside the {{groupBy}} call itself, the {{pivot}} method apparently cannot resolve that column during analysis.
> E.g.
> {code:none}
> from pyspark.sql.functions import col, hour, dayofyear
> df = (sc.parallelize([(1.0, "2016-03-30 01:00:00"), 
>                       (30.2, "2015-01-02 03:00:02")])
>         .toDF(["amount", "Date"])
>         .withColumn("Date", col("Date").cast("timestamp")))
> (df.withColumn("hour",hour("date"))
>    .groupBy(dayofyear("date").alias("date"))
>    .pivot("hour").sum("amount").show()){code}
> This throws the following exception:
> {quote}
> AnalysisException: u'resolved attribute(s) date#140688 missing from dayofyear(date)#140994,hour#140977,sum(`amount`)#140995 in operator !Aggregate \[dayofyear(cast(date#140688 as date))], [dayofyear(cast(date#140688 as date)) AS dayofyear(date)#140994, pivotfirst(hour#140977, sum(`amount`)#140995, 1, 3, 0, 0) AS __pivot_sum(`amount`) AS `sum(``amount``)`#141001\];'
> {quote}
> To work around it, you have to add the column {{date}} with {{withColumn}} before grouping and pivoting.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
