Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2014/08/25 07:47:57 UTC

[jira] [Commented] (SPARK-2495) Ability to re-create ML models

    [ https://issues.apache.org/jira/browse/SPARK-2495?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14108769#comment-14108769 ] 

Apache Spark commented on SPARK-2495:
-------------------------------------

User 'mengxr' has created a pull request for this issue:
https://github.com/apache/spark/pull/2112

> Ability to re-create ML models
> ------------------------------
>
>                 Key: SPARK-2495
>                 URL: https://issues.apache.org/jira/browse/SPARK-2495
>             Project: Spark
>          Issue Type: Improvement
>          Components: MLlib
>    Affects Versions: 1.0.1
>            Reporter: Alexander Albul
>            Assignee: Alexander Albul
>
> Hi everyone.
> Previously (prior to Spark 1.0) we were working with MLlib like this:
> 1) Calculate the model (costly operation)
> 2) Take the model and collect its fields, such as weights, intercept, etc.
> 3) Store the model somewhere in our own format
> 4) Do predictions by loading the model attributes, creating a new model, and predicting with it (a sketch of this workflow follows below the quoted description).
> Now I see that the models' constructors have the *private* modifier and cannot be called from outside.
> If you want to hide implementation details and keep these constructors as "developer api", why not at least provide a method which takes weights and an intercept (for example) and materializes the model?
> A good example of the kind of model I am talking about is *LinearRegressionModel*.
> I know that *LinearRegressionWithSGD* has a *createModel* method, but the problem is that it has the *protected* modifier as well.
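
For illustration, a minimal sketch of the workflow described above, assuming the (weights, intercept) constructor of LinearRegressionModel were publicly callable (or exposed through a factory method) as requested in this issue; the stored values below are hypothetical:

    import org.apache.spark.mllib.linalg.Vectors
    import org.apache.spark.mllib.regression.LinearRegressionModel

    // Steps 1)-2): after training, read the model's fields and store them in your own format:
    //   val trained   = LinearRegressionWithSGD.train(trainingData, numIterations)
    //   val weights   = trained.weights.toArray
    //   val intercept = trained.intercept

    // Steps 3)-4): later, rebuild an equivalent model from the stored attributes
    // and use it for prediction (hypothetical stored values):
    val storedWeights   = Array(0.5, -1.2, 3.0)
    val storedIntercept = 0.1
    val rebuilt = new LinearRegressionModel(Vectors.dense(storedWeights), storedIntercept)
    val prediction = rebuilt.predict(Vectors.dense(1.0, 2.0, 3.0))

This works only if the constructor (or an equivalent createModel-style method) is accessible outside the mllib package, which is exactly the change the reporter is asking for.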


