Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2022/01/02 00:45:00 UTC

[jira] [Commented] (SPARK-37799) Support of 'melt' function in spark

    [ https://issues.apache.org/jira/browse/SPARK-37799?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17467523#comment-17467523 ] 

Hyukjin Kwon commented on SPARK-37799:
--------------------------------------

[~ddavies1], sounds like a valid question, but let's discuss it on the dev (or user) mailing list first before filing a JIRA ticket here. I think the mailing lists are better places to collect feedback and investigate the need

> Support of 'melt' function in spark
> -----------------------------------
>
>                 Key: SPARK-37799
>                 URL: https://issues.apache.org/jira/browse/SPARK-37799
>             Project: Spark
>          Issue Type: Question
>          Components: Spark Core
>    Affects Versions: 3.2.0
>            Reporter: Daniel Davies
>            Priority: Minor
>
> Hello,
> Un-pivoting a DataFrame is currently supported in pandas with the 'melt' function, but isn't available in Spark. It's easy enough to reproduce this functionality using the functions module (e.g. via the melt function in pandas-on-Spark [here|https://github.com/apache/spark/blob/c92bd5cafe62ca5226176446735171cc877e805a/python/pyspark/pandas/frame.py#L9651]), but I was wondering whether a more native solution has been considered? It would make end-user code more lightweight at the very least, and I wonder whether it could be made more efficient than using the stack function or the struct-array-explode approach.
> I'm happy to try and make a PR if this is something that might be useful within Spark. No worries if not; the methods above work fine.
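For context, here is a minimal pure-Python sketch of the un-pivot ("melt") semantics being requested: id columns are repeated per output row, and each value column becomes a (variable, value) pair. The names `melt`, `id_vars`, and `value_vars` follow the pandas convention; this is an illustration of the operation, not a Spark API.

```python
def melt(rows, id_vars, value_vars, var_name="variable", value_name="value"):
    """Un-pivot a list of dict rows from wide to long format.

    Each input row produces len(value_vars) output rows: the id columns
    are copied through, and every value column becomes one
    (variable, value) pair.
    """
    out = []
    for row in rows:
        for col in value_vars:
            melted = {k: row[k] for k in id_vars}  # keep identifier columns
            melted[var_name] = col                 # which column this value came from
            melted[value_name] = row[col]          # the value itself
            out.append(melted)
    return out

# Wide input: one row, two value columns.
wide = [{"id": 1, "a": 10, "b": 20}]
long_rows = melt(wide, id_vars=["id"], value_vars=["a", "b"])
# long_rows == [{"id": 1, "variable": "a", "value": 10},
#               {"id": 1, "variable": "b", "value": 20}]
```

In Spark SQL today the same shape can be produced with the stack function mentioned above, e.g. `df.selectExpr("id", "stack(2, 'a', a, 'b', b) as (variable, value)")` for this two-column case.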



--
This message was sent by Atlassian Jira
(v8.20.1#820001)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org