Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2017/11/17 04:45:00 UTC

[jira] [Commented] (SPARK-22538) SQLTransformer.transform(inputDataFrame) uncaches inputDataFrame

    [ https://issues.apache.org/jira/browse/SPARK-22538?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16256468#comment-16256468 ] 

Apache Spark commented on SPARK-22538:
--------------------------------------

User 'viirya' has created a pull request for this issue:
https://github.com/apache/spark/pull/19772

> SQLTransformer.transform(inputDataFrame) uncaches inputDataFrame
> ----------------------------------------------------------------
>
>                 Key: SPARK-22538
>                 URL: https://issues.apache.org/jira/browse/SPARK-22538
>             Project: Spark
>          Issue Type: Bug
>          Components: ML, PySpark, SQL, Web UI
>    Affects Versions: 2.2.0
>            Reporter: MBA Learns to Code
>
> When the code below is run on PySpark v2.2.0, the cached input DataFrame df disappears from the Spark UI Storage tab after SQLTransformer.transform(...) is called on it.
> I don't yet know whether this is only a Spark UI bug or whether the input DataFrame df is indeed unpersisted from memory. If the latter, this could be a serious bug, because any new computation using new_df would have to redo all the work leading up to df.
> {code}
> import pandas
> import pyspark
> from pyspark.ml.feature import SQLTransformer
> spark = pyspark.sql.SparkSession.builder.getOrCreate()
> df = spark.createDataFrame(pandas.DataFrame(dict(x=[-1, 0, 1])))
> # after below step, SparkUI Storage shows 1 cached RDD
> df.cache(); df.count()
> # after below step, cached RDD disappears from SparkUI Storage
> new_df = SQLTransformer(statement='SELECT * FROM __THIS__').transform(df)
> {code}
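
A quick way to check whether df is really unpersisted (rather than this being only a UI issue) is to inspect DataFrame.storageLevel before and after the transform. Assuming Spark 2.1+ behavior, where storageLevel reports the currently effective storage level from the cache manager, a sketch along these lines should distinguish the two cases:

{code}
import pandas
import pyspark
from pyspark.ml.feature import SQLTransformer

spark = pyspark.sql.SparkSession.builder.getOrCreate()
df = spark.createDataFrame(pandas.DataFrame(dict(x=[-1, 0, 1])))

df.cache(); df.count()
# Assuming storageLevel reflects the effective level of the cached data,
# a cached df should report useMemory=True here.
print(df.storageLevel)

new_df = SQLTransformer(statement='SELECT * FROM __THIS__').transform(df)

# If transform() really unpersisted df (not just a Spark UI glitch), this
# should now print StorageLevel(False, False, False, False, 1).
print(df.storageLevel)
{code}

If the second print still shows the original level while the Storage tab is empty, the problem is likely limited to the UI; if it shows no storage level, the input DataFrame was actually uncached.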



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
