Posted to issues@spark.apache.org by "Allison Wang (Jira)" <ji...@apache.org> on 2023/10/03 20:55:00 UTC

[jira] [Updated] (SPARK-45401) Add a new method `cleanup` in the UDTF interface

     [ https://issues.apache.org/jira/browse/SPARK-45401?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Allison Wang updated SPARK-45401:
---------------------------------
    Summary: Add a new method `cleanup` in the UDTF interface  (was: Add a new method `cleanup` in UDTF interface)

> Add a new method `cleanup` in the UDTF interface
> ------------------------------------------------
>
>                 Key: SPARK-45401
>                 URL: https://issues.apache.org/jira/browse/SPARK-45401
>             Project: Spark
>          Issue Type: Sub-task
>          Components: PySpark
>    Affects Versions: 3.5.0, 4.0.0
>            Reporter: Allison Wang
>            Priority: Major
>
> Currently, the {{terminate}} method of a UDTF is always executed, regardless of whether the {{eval}} calls succeed. This is problematic: {{terminate}} still runs even after a failure has already aborted processing. We should execute {{terminate}} only when all {{eval}} calls succeed.
> But what if users want to perform cleanup actions during UDTF execution, such as closing connections? One option is to embed {{try...except}} logic within the {{eval}} call:
> {code:python}
> from typing import Any
>
> def eval(self, row: Any):
>     try:
>         run_code()
>     except Exception:
>         clean_up()
>         raise  # re-raise so the failure still surfaces after cleanup
> {code}
> However, running this {{try...except}} block on every {{eval}} call can be expensive, potentially hurting UDTF performance.
> To tackle this, we can introduce a new method in the UDTF interface that is called regardless of whether {{eval}} and {{terminate}} succeed. The logic would look like:
> {code:python}
> # Proposed execution model (pseudocode):
> try:
>     eval()       # called once per input row
>     terminate()  # runs only if every eval() call succeeded
> finally:
>     cleanup()    # new method: always runs, even on failure
> {code}
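> For illustration, here is a minimal sketch of a UDTF that would benefit from the proposed method. Only the no-argument {{cleanup}} signature follows the proposal above; the class body, including the use of {{sqlite3}} as a stand-in resource, is purely hypothetical:
> {code:python}
> import sqlite3
>
> class MyUDTF:
>     def __init__(self):
>         # Illustrative resource held for the lifetime of the UDTF.
>         self.conn = sqlite3.connect(":memory:")
>
>     def eval(self, row):
>         # Per-row work; no per-call try/except is needed under the proposal.
>         yield (str(row),)
>
>     def terminate(self):
>         # Invoked only after all eval() calls succeed.
>         yield ("done",)
>
>     def cleanup(self):
>         # Proposed method: invoked even if eval() or terminate() raised.
>         self.conn.close()
> {code}
> With this shape, resource release lives in one place instead of being repeated in a {{try...except}} around every row.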



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org