Posted to issues@spark.apache.org by "Maciej Szymkiewicz (Jira)" <ji...@apache.org> on 2020/01/26 02:20:00 UTC

[jira] [Updated] (SPARK-30629) cleanClosure on recursive call leads to node stack overflow

     [ https://issues.apache.org/jira/browse/SPARK-30629?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Maciej Szymkiewicz updated SPARK-30629:
---------------------------------------
    Description: 
This problem surfaced while handling SPARK-22817. In theory there are tests that cover this problem, but they seem to have been dead for some reason.

Reproducible example:

{code:r}
f <- function(x) {
  f(x)
}

newF <- cleanClosure(f)
{code}
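
For context, a minimal self-contained sketch of why an unguarded closure walk diverges on such a function (the {{walk}} helper below is purely illustrative and is not SparkR's {{processClosure}}): every function name found in the body is looked up and descended into, and nothing records which functions have already been visited.

{code:r}
# Illustrative sketch only, not SparkR's implementation.
# The walker follows every function referenced from the body of `fun`,
# but keeps no record of the functions it has already processed.
walk <- function(fun, env = environment(fun)) {
  for (name in all.names(body(fun))) {
    # Only follow bindings defined directly in the closure's environment.
    if (exists(name, envir = env, inherits = FALSE)) {
      obj <- get(name, envir = env, inherits = FALSE)
      if (is.function(obj)) {
        walk(obj, env)  # f's body mentions f itself, so this never returns
      }
    }
  }
}

f <- function(x) {
  f(x)
}
# walk(f)  # recurses without bound until R aborts with a stack overflow
{code}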




  was:
This problem surfaced while handling SPARK-22817. In theory there are tests that cover this problem, but they seem to have been dead for some reason.

Reproducible example:

{code:r}
f <- function(x) {
  f(x)
}

newF <- cleanClosure(f)
{code}


Just looking at the {{cleanClosure}} / {{processClosure}} pair, the function that is currently being processed is never added to {{checkedFuncs}}, so a recursive reference is followed again and again until the node stack overflows.
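
The usual remedy for such a walker is to register the function in the visited list *before* descending into it, so the self-reference is recognized and skipped. A hedged sketch of that pattern, reusing the illustrative {{walk}} from above ({{visited}} here merely stands in for the role {{checkedFuncs}} is meant to play; it does not mirror SparkR's bookkeeping):

{code:r}
# Illustrative sketch only; names and bookkeeping do not mirror SparkR's code.
walk <- function(fun, env = environment(fun), visited = list()) {
  # Skip functions that have already been processed.
  for (seen in visited) {
    if (identical(seen, fun)) return(invisible(NULL))
  }
  visited[[length(visited) + 1L]] <- fun  # register *before* recursing
  for (name in all.names(body(fun))) {
    if (exists(name, envir = env, inherits = FALSE)) {
      obj <- get(name, envir = env, inherits = FALSE)
      if (is.function(obj)) {
        walk(obj, env, visited)
      }
    }
  }
  invisible(NULL)
}

f <- function(x) {
  f(x)
}
walk(f)  # now terminates: the recursive reference to f is found in `visited`
{code}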


> cleanClosure on recursive call leads to node stack overflow
> -----------------------------------------------------------
>
>                 Key: SPARK-30629
>                 URL: https://issues.apache.org/jira/browse/SPARK-30629
>             Project: Spark
>          Issue Type: Bug
>          Components: SparkR
>    Affects Versions: 3.0.0
>            Reporter: Maciej Szymkiewicz
>            Priority: Major
>
> This problem surfaced while handling SPARK-22817. In theory there are tests that cover this problem, but they seem to have been dead for some reason.
> Reproducible example:
> {code:r}
> f <- function(x) {
>   f(x)
> }
> newF <- cleanClosure(f)
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org