Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2018/10/08 05:45:00 UTC

[jira] [Commented] (SPARK-25599) Stateful aggregation in PySpark

    [ https://issues.apache.org/jira/browse/SPARK-25599?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16641373#comment-16641373 ] 

Hyukjin Kwon commented on SPARK-25599:
--------------------------------------

Are you proposing a UDAF for the Python side? If so, it might be a duplicate of SPARK-10915

> Stateful aggregation in PySpark
> -------------------------------
>
>                 Key: SPARK-25599
>                 URL: https://issues.apache.org/jira/browse/SPARK-25599
>             Project: Spark
>          Issue Type: New Feature
>          Components: PySpark
>    Affects Versions: 2.3.0
>            Reporter: Vincent Grosbois
>            Priority: Minor
>
> Hi!
>  
> From PySpark, I am trying to define a custom aggregator *that accumulates state*. Is this possible in Spark 2.3?
> AFAIK, it has been possible to define a custom UDAF in PySpark since Spark 2.3 (cf. [How to define and use a User-Defined Aggregate Function in Spark SQL?|https://stackoverflow.com/questions/32100973/how-to-define-and-use-a-user-defined-aggregate-function-in-spark-sql]), by calling {{pandas_udf}} with the {{PandasUDFType.GROUPED_AGG}} keyword (a sketch of this approach is given below the quoted description).
> However, given that it just takes a plain function as a parameter, I don't think it is possible to carry state around during the aggregation with this approach.
> From Scala, I see it is possible to have stateful aggregation by extending either {{UserDefinedAggregateFunction}} or {{org.apache.spark.sql.expressions.Aggregator}}, but is there a similar thing I can do on the Python side only?
> If not, is this planned for a future release?
> thanks!
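
For reference, below is a minimal sketch of the {{pandas_udf}} / {{PandasUDFType.GROUPED_AGG}} approach mentioned in the description (grouped-aggregate pandas UDFs were added in Spark 2.4, if I recall correctly; the DataFrame, the column names "id" and "v", and the mean aggregation are illustrative only). It also shows why no state can be carried: the UDF only receives the values of one group and must return a single scalar.

{code:python}
from pyspark.sql import SparkSession
from pyspark.sql.functions import pandas_udf, PandasUDFType

spark = SparkSession.builder.getOrCreate()

# Toy data: grouping key "id" and a numeric column "v" (illustrative only).
df = spark.createDataFrame([(1, 1.0), (1, 2.0), (2, 3.0)], ("id", "v"))

@pandas_udf("double", PandasUDFType.GROUPED_AGG)
def mean_udf(v):
    # v is a pandas Series holding all values of the column for one group.
    # The function must return a single scalar; there is no
    # initialize/update/merge hook, so no state survives between calls.
    return v.mean()

df.groupBy("id").agg(mean_udf(df["v"]).alias("mean_v")).show()
{code}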



