Posted to issues@spark.apache.org by "Wenchen Fan (JIRA)" <ji...@apache.org> on 2017/11/17 16:45:04 UTC

[jira] [Resolved] (SPARK-22538) SQLTransformer.transform(inputDataFrame) uncaches inputDataFrame

     [ https://issues.apache.org/jira/browse/SPARK-22538?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Wenchen Fan resolved SPARK-22538.
---------------------------------
       Resolution: Fixed
    Fix Version/s: 2.3.0
                   2.2.2

Issue resolved by pull request 19772
[https://github.com/apache/spark/pull/19772]

> SQLTransformer.transform(inputDataFrame) uncaches inputDataFrame
> ----------------------------------------------------------------
>
>                 Key: SPARK-22538
>                 URL: https://issues.apache.org/jira/browse/SPARK-22538
>             Project: Spark
>          Issue Type: Bug
>          Components: ML, PySpark, SQL, Web UI
>    Affects Versions: 2.2.0
>            Reporter: MBA Learns to Code
>             Fix For: 2.2.2, 2.3.0
>
>
> When running the code below on PySpark v2.2.0, the cached input DataFrame df disappears from the SparkUI Storage tab after SQLTransformer.transform(...) is called on it.
> I don't yet know whether this is only a SparkUI display bug, or whether the input DataFrame df is indeed unpersisted from memory. If it is the latter, this could be a serious bug, because any new computation using new_df would have to redo all the work leading up to df.
> {code}
> import pandas
> import pyspark
> from pyspark.ml.feature import SQLTransformer
> spark = pyspark.sql.SparkSession.builder.getOrCreate()
> df = spark.createDataFrame(pandas.DataFrame(dict(x=[-1, 0, 1])))
> # after below step, SparkUI Storage shows 1 cached RDD
> df.cache(); df.count()
> # after below step, cached RDD disappears from SparkUI Storage
> new_df = SQLTransformer(statement='SELECT * FROM __THIS__').transform(df)
> {code}
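> A quick way to tell a UI-only glitch from a real unpersist is to inspect {{df.storageLevel}} before and after the transform. The sketch below is illustrative only and continues from the snippet above (it assumes {{df}}, {{spark}} and {{SQLTransformer}} from that snippet); the exact StorageLevel values in the comments are what a default {{cache()}} typically reports, not guaranteed output.
> {code}
> # While df is cached, storageLevel reports the level it was persisted with,
> # e.g. StorageLevel(True, True, False, True, 1), i.e. MEMORY_AND_DISK.
> print(df.storageLevel)
> new_df = SQLTransformer(statement='SELECT * FROM __THIS__').transform(df)
> # If df is genuinely unpersisted (not merely hidden from the UI), this now
> # prints StorageLevel(False, False, False, False, 1), i.e. not persisted.
> print(df.storageLevel)
> # A possible (if costly) workaround on versions without the 2.2.2 / 2.3.0
> # fix: re-cache and re-materialize df after the transform.
> df.cache(); df.count()
> {code}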



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org