Posted to issues@spark.apache.org by "Joseph K. Bradley (JIRA)" <ji...@apache.org> on 2016/01/06 20:59:40 UTC

[jira] [Resolved] (SPARK-12006) GaussianMixture.train crashes if an initial model is not None

     [ https://issues.apache.org/jira/browse/SPARK-12006?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Joseph K. Bradley resolved SPARK-12006.
---------------------------------------
       Resolution: Fixed
    Fix Version/s: 1.4.2
                   1.6.1
                   1.5.3
                   2.0.0

Issue resolved by pull request 9986
[https://github.com/apache/spark/pull/9986]

> GaussianMixture.train crashes if an initial model is not None
> -------------------------------------------------------------
>
>                 Key: SPARK-12006
>                 URL: https://issues.apache.org/jira/browse/SPARK-12006
>             Project: Spark
>          Issue Type: Bug
>          Components: MLlib, PySpark
>    Affects Versions: 1.4.0, 1.5.0, 1.6.0
>            Reporter: Maciej Szymkiewicz
>            Assignee: Maciej Szymkiewicz
>             Fix For: 2.0.0, 1.5.3, 1.6.1, 1.4.2
>
>
> Steps to reproduce:
> {code}
> from pyspark.mllib.clustering import GaussianMixture
> from numpy import array
> data = sc.textFile("data/mllib/gmm_data.txt")
> parsedData = data.map(lambda line: array([float(x) for x in line.strip().split(' ')]))
> gmm = GaussianMixture.train(parsedData, 2)
> GaussianMixture.train(parsedData, 2, initialModel=gmm)
> {code}
> It looks like the source of the problem is that [{{initialModelWeights}}|https://github.com/apache/spark/blob/branch-1.6/python/pyspark/mllib/clustering.py#L349] is a NumPy array. In 1.5 / 1.6 this leads to a {{net.razorvine.pickle.PickleException}}; in 1.4 we get {{Method trainGaussianMixture(\[..., class org.apache.spark.mllib.linalg.DenseVector, class java.util.ArrayList, class java.util.ArrayList\]) does not exist}}
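
The root cause described above suggests the shape of a fix: NumPy types do not round-trip cleanly through the Python-to-JVM serialization layer, so the weights must be coerced to plain Python floats first. The sketch below is illustrative only (the helper name is hypothetical, and this is not the actual change from pull request 9986):

```python
import numpy as np

def to_java_friendly(weights):
    # A numpy.ndarray does not pickle into a type the JVM side expects,
    # so coerce it to a list of plain Python floats before the Py4J call.
    if isinstance(weights, np.ndarray):
        return [float(w) for w in weights]
    return list(weights)

# e.g. the initial model's weights would be converted before being
# handed to the JVM gateway:
safe_weights = to_java_friendly(np.array([0.5, 0.5]))
```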



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org