Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2015/08/27 19:58:46 UTC

[jira] [Resolved] (SPARK-10182) GeneralizedLinearModel doesn't unpersist cached data

     [ https://issues.apache.org/jira/browse/SPARK-10182?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-10182.
-------------------------------
       Resolution: Fixed
    Fix Version/s: 1.6.0

Issue resolved by pull request 8395
[https://github.com/apache/spark/pull/8395]
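
For context, the usual shape of such a fix (a hedged sketch only; see the linked pull request for the actual change) is to persist the input inside the algorithm only when the caller has not already done so, and to unpersist it once training finishes:

{code}
import org.apache.spark.rdd.RDD
import org.apache.spark.storage.StorageLevel
import org.apache.spark.mllib.regression.LabeledPoint

// Illustrative sketch, not the actual diff from PR 8395: cache the input
// only if the caller has not, and release that cache after training.
def runWithManagedCache[M](input: RDD[LabeledPoint])(train: RDD[LabeledPoint] => M): M = {
  val handlePersistence = input.getStorageLevel == StorageLevel.NONE
  if (handlePersistence) input.persist(StorageLevel.MEMORY_AND_DISK)
  try train(input)
  finally if (handlePersistence) input.unpersist()
}
{code}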

> GeneralizedLinearModel doesn't unpersist cached data
> ----------------------------------------------------
>
>                 Key: SPARK-10182
>                 URL: https://issues.apache.org/jira/browse/SPARK-10182
>             Project: Spark
>          Issue Type: Bug
>          Components: MLlib
>    Affects Versions: 1.4.1
>            Reporter: Vyacheslav Baranov
>            Assignee: Vyacheslav Baranov
>            Priority: Minor
>             Fix For: 1.6.0
>
>
> The problem can be reproduced in spark-shell with the following code snippet:
> {code}
> import org.apache.spark.SparkContext
> import org.apache.spark.mllib.classification.LogisticRegressionWithLBFGS
> import org.apache.spark.mllib.linalg.Vectors
> import org.apache.spark.mllib.regression.LabeledPoint
> val samples = Seq[LabeledPoint](
>   LabeledPoint(1.0, Vectors.dense(1.0, 0.0)),
>   LabeledPoint(1.0, Vectors.dense(0.0, 1.0)),
>   LabeledPoint(0.0, Vectors.dense(1.0, 1.0)),
>   LabeledPoint(0.0, Vectors.dense(0.0, 0.0))
> )
> val rdd = sc.parallelize(samples)
> // Train repeatedly; each run() call internally caches data that is never unpersisted
> for (i <- 0 until 10) {
>   val model = {
>     new LogisticRegressionWithLBFGS()
>       .setNumClasses(2)
>       .run(rdd)
>       .clearThreshold()
>   }
> }
> {code}
> After the code executes, 10 {{MapPartitionsRDD}} objects remain on the "Storage" tab in the Spark application UI.
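> A quick way to confirm the leak (a sketch assuming spark-shell, where {{sc}} is the predefined {{SparkContext}}; {{getPersistentRDDs}} is a developer API) is to count the persisted RDDs before and after the loop:
> {code}
> // Should print 0 before the loop and 10 afterwards, one leaked
> // MapPartitionsRDD per run() call.
> println(sc.getPersistentRDDs.size)
>
> // Manual workaround until the fix: drop the leaked cache entries.
> sc.getPersistentRDDs.values.foreach(_.unpersist())
> {code}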


