Posted to issues@spark.apache.org by "Ming Ma (JIRA)" <ji...@apache.org> on 2017/11/28 17:49:00 UTC

[jira] [Commented] (SPARK-15573) Backwards-compatible persistence for spark.ml

    [ https://issues.apache.org/jira/browse/SPARK-15573?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16269140#comment-16269140 ] 

Ming Ma commented on SPARK-15573:
---------------------------------

This is probably off topic since it isn't related to Spark version changes, but the ability to evolve features and models regularly seems quite important. Can anyone confirm how the scenario mentioned in http://apache-spark-user-list.1001560.n3.nabble.com/Spark-ML-Compatibility-between-features-and-models-td30100.html is currently handled? Thanks.

> Backwards-compatible persistence for spark.ml
> ---------------------------------------------
>
>                 Key: SPARK-15573
>                 URL: https://issues.apache.org/jira/browse/SPARK-15573
>             Project: Spark
>          Issue Type: Improvement
>          Components: ML
>            Reporter: Joseph K. Bradley
>
> This JIRA is for imposing backwards-compatible persistence for the DataFrames-based API for MLlib.  I.e., we want to be able to load models saved in previous versions of Spark.  We will not require loading models saved in later versions of Spark.
> This requires:
> * Putting unit tests in place to check loading models from previous versions
> * Notifying all committers active on MLlib to be aware of this requirement in the future
> The unit tests could be written as in spark.mllib, where we essentially copied and pasted the save() code every time it changed.  This happens rarely, so it should be acceptable, though other designs are fine.
> Subtasks of this JIRA should cover checking and adding tests for existing cases, such as KMeansModel (whose format changed between 1.6 and 2.0).
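For illustration only (not part of the original ticket), a compatibility test along the lines described above might look roughly like the Scala sketch below. It assumes a KMeansModel fixture saved by an older Spark release has been checked into the test resources; the path and object name are hypothetical.

    import org.apache.spark.ml.clustering.KMeansModel
    import org.apache.spark.sql.SparkSession

    // Hypothetical standalone check; in Spark itself this would live in a ScalaTest suite.
    object KMeansBackwardCompatCheck {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("KMeansBackwardCompatCheck")
          .master("local[2]")
          .getOrCreate()

        // Hypothetical fixture: a model directory written by an older Spark release.
        val oldModelPath = "src/test/resources/ml-models/1.6.3/kmeans"

        // Loading must succeed even though the on-disk layout may predate the current release.
        val model = KMeansModel.load(oldModelPath)

        // Basic sanity checks that the persisted state survived the version change.
        assert(model.clusterCenters.nonEmpty, "expected at least one cluster center")
        assert(model.getK == model.clusterCenters.length, "k should match the number of centers")

        spark.stop()
      }
    }

A handful of such fixtures, one per release whose on-disk format differed, would cover cases like the KMeansModel format change called out above.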


