Posted to issues@spark.apache.org by "Daniel Davies (Jira)" <ji...@apache.org> on 2022/01/02 19:01:00 UTC

[jira] [Comment Edited] (SPARK-37799) Support of 'melt' function in spark

    [ https://issues.apache.org/jira/browse/SPARK-37799?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17467685#comment-17467685 ] 

Daniel Davies edited comment on SPARK-37799 at 1/2/22, 7:00 PM:
----------------------------------------------------------------

Got it; thanks [~hyukjin.kwon]! I'll send an email there shortly.


was (Author: JIRAUSER282609):
Got it; thanks [~hyukjin.kwon]!

> Support of 'melt' function in spark
> -----------------------------------
>
>                 Key: SPARK-37799
>                 URL: https://issues.apache.org/jira/browse/SPARK-37799
>             Project: Spark
>          Issue Type: Question
>          Components: Spark Core
>    Affects Versions: 3.2.0
>            Reporter: Daniel Davies
>            Priority: Minor
>
> Hello,
> Un-pivoting a DataFrame is currently supported in pandas with the 'melt' function, but isn't available in Spark. It's easy enough to build this functionality from the functions module (e.g. the melt function in pandas-on-Spark [here|https://github.com/apache/spark/blob/c92bd5cafe62ca5226176446735171cc877e805a/python/pyspark/pandas/frame.py#L9651]), but I was wondering whether a more native solution had been considered? It would make end-user code more lightweight at the very least, and I wonder whether it could be made more efficient than using the stack function or a struct-array-explode combination.
> I'm happy to try and make a PR if this is something that might be useful within Spark. No worries if not; the methods above work fine.
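For reference, a minimal sketch of the stack-based un-pivot mentioned above. The DataFrame, column names, and values here are hypothetical examples, not from the issue; it assumes an "id" column to keep and two value columns "a" and "b" to melt.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical data: one identifier column plus two value columns to un-pivot.
df = spark.createDataFrame([(1, 2.0, 3.0), (2, 4.0, 5.0)], ["id", "a", "b"])

# stack(2, ...) emits two rows per input row: ('a', a) and ('b', b).
# The quoted literals become the 'variable' column; the bare column
# references become the 'value' column.
melted = df.selectExpr("id", "stack(2, 'a', a, 'b', b) as (variable, value)")

melted.show()
# +---+--------+-----+
# | id|variable|value|
# +---+--------+-----+
# |  1|       a|  2.0|
# |  1|       b|  3.0|
# |  2|       a|  4.0|
# |  2|       b|  5.0|
# +---+--------+-----+

The struct-array-explode variant reaches the same result by building an array of (variable, value) structs and exploding it; stack is usually the more direct spelling of the two.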



--
This message was sent by Atlassian Jira
(v8.20.1#820001)
