Posted to user@flink.apache.org by Behrouz Derakhshan <be...@gmail.com> on 2016/04/08 16:04:46 UTC

Re: Flink ML 1.0.0 - Saving and Loading Models to Score a Single Feature Vector

Is there a reason the Predictor or Estimator classes don't have read and
write methods for saving and retrieving the model? I couldn't find Jira
issues for it. Does it make sense to create one?

BR,
Behrouz
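
If such read/write helpers were added, they could presumably be thin wrappers around the serializer-based formats discussed further down in this thread. A purely hypothetical sketch follows — MlrModelIO, save, and load are made-up names, not part of Flink ML; the formats and weightsOption are the ones used elsewhere in this thread:

    import org.apache.flink.api.common.ExecutionConfig
    import org.apache.flink.api.java.io.{TypeSerializerInputFormat, TypeSerializerOutputFormat}
    import org.apache.flink.api.scala._
    import org.apache.flink.ml.common.WeightVector
    import org.apache.flink.ml.regression.MultipleLinearRegression

    object MlrModelIO {

      // Persist the trained weights (if any) to `path` in Flink's binary serializer format.
      // Note: write() only declares a sink; env.execute() must still run the job.
      def save(mlr: MultipleLinearRegression, path: String): Unit = {
        val outputFormat = new TypeSerializerOutputFormat[WeightVector]
        outputFormat.setSerializer(
          createTypeInformation[WeightVector].createSerializer(new ExecutionConfig()))
        mlr.weightsOption.foreach(_.write(outputFormat, path))
      }

      // Read the weights back and hand them to a fresh predictor instance.
      def load(env: ExecutionEnvironment, path: String): MultipleLinearRegression = {
        val weights = env.readFile(
          new TypeSerializerInputFormat[WeightVector](createTypeInformation[WeightVector]), path)
        val mlr = MultipleLinearRegression()
        mlr.weightsOption = Some(weights)
        mlr
      }
    }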

On Wed, Mar 30, 2016 at 4:40 PM, Till Rohrmann <tr...@apache.org> wrote:

> Yes, Suneel is completely right. If the data does not implement
> IOReadableWritable, it is probably easier to use the
> TypeSerializerOutputFormat. What you need here to serialize the data is a
> TypeSerializer. You can obtain it the following way:
>
> val model = mlr.weightsOption.get
>
> val weightVectorTypeInfo = TypeInformation.of(classOf[WeightVector])
> val weightVectorSerializer = weightVectorTypeInfo.createSerializer(new ExecutionConfig())
> val outputFormat = new TypeSerializerOutputFormat[WeightVector]
> outputFormat.setSerializer(weightVectorSerializer)
>
> model.write(outputFormat, "path")
>
> Cheers,
> Till
> ​
>
> On Tue, Mar 29, 2016 at 8:22 PM, Suneel Marthi <sm...@apache.org> wrote:
>
>> You may want to use the FlinkMLTools.persist() methods, which use the
>> TypeSerializerOutputFormat and don't enforce IOReadableWritable.
>>
>>
>>
>> On Tue, Mar 29, 2016 at 2:12 PM, Sourigna Phetsarath <
>> gna.phetsarath@teamaol.com> wrote:
>>
>>> Till,
>>>
>>> Thank you for your reply.
>>>
>>> Having this issue though: WeightVector does not extend IOReadableWritable:
>>>
>>> public class SerializedOutputFormat<T extends IOReadableWritable>
>>>
>>> case class WeightVector(weights: Vector, intercept: Double) extends Serializable {}
>>>
>>>
>>> However, I will use the approach to write out the weights as text.
>>>
>>>
>>> On Tue, Mar 29, 2016 at 5:01 AM, Till Rohrmann <tr...@apache.org>
>>> wrote:
>>>
>>>> Hi Gna,
>>>>
>>>> there are no utilities yet to do that but you can do it manually. In
>>>> the end, a model is simply a Flink DataSet which you can serialize to
>>>> some file. Upon reading this DataSet you simply have to give it to
>>>> your algorithm to be used as the model. The following code snippet
>>>> illustrates this approach:
>>>>
>>>> mlr.fit(inputDS, parameters)
>>>>
>>>> // write model to disk using the SerializedOutputFormat
>>>> mlr.weightsOption.get.write(new SerializedOutputFormat[WeightVector], "path")
>>>>
>>>> // read the serialized model from disk
>>>> val model = env.readFile(new SerializedInputFormat[WeightVector], "path")
>>>>
>>>> // set the read model for the MLR algorithm
>>>> mlr.weightsOption = model
>>>>
>>>> Cheers,
>>>> Till
>>>> ​
>>>>
>>>> On Tue, Mar 29, 2016 at 10:46 AM, Simone Robutti <
>>>> simone.robutti@radicalbit.io> wrote:
>>>>
>>>>> To my knowledge there is nothing like that. PMML is not supported in
>>>>> any form and there's no custom saving format yet. If you really need a
>>>>> quick and dirty solution, it's not that hard to serialize the model into a
>>>>> file.
>>>>>
>>>>> 2016-03-28 17:59 GMT+02:00 Sourigna Phetsarath <
>>>>> gna.phetsarath@teamaol.com>:
>>>>>
>>>>>> Flinksters,
>>>>>>
>>>>>> Is there an example of saving a Trained Model, loading a Trained
>>>>>> Model and then scoring one or more feature vectors using Flink ML?
>>>>>>
>>>>>> All of the examples I've seen have shown only sequential fit and
>>>>>> predict.
>>>>>>
>>>>>> Thank you.
>>>>>>
>>>>>> -Gna
>>>>>> --
>>>>>>
>>>>>>
>>>>>> *Gna Phetsarath*System Architect // AOL Platforms // Data Services
>>>>>> // Applied Research Chapter
>>>>>> 770 Broadway, 5th Floor, New York, NY 10003
>>>>>> o: 212.402.4871 // m: 917.373.7363
>>>>>> vvmr: 8890237 aim: sphetsarath20 t: @sourigna
>>>>>>
>>>>>> * <http://www.aolplatforms.com>*
>>>>>>
>>>>>
>>>>>
>>>>
>>>
>>>
>>> --
>>>
>>>
>>> *Gna Phetsarath*System Architect // AOL Platforms // Data Services //
>>> Applied Research Chapter
>>> 770 Broadway, 5th Floor, New York, NY 10003
>>> o: 212.402.4871 // m: 917.373.7363
>>> vvmr: 8890237 aim: sphetsarath20 t: @sourigna
>>>
>>> * <http://www.aolplatforms.com>*
>>>
>>
>>
>
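
Till's TypeSerializerOutputFormat snippet above covers the write side. A minimal sketch of the read side, plus scoring a single feature vector (the original question), might look like the following, assuming Flink ML 1.0; the path and the example feature values are placeholders:

    import org.apache.flink.api.java.io.TypeSerializerInputFormat
    import org.apache.flink.api.scala._
    import org.apache.flink.ml.common.WeightVector
    import org.apache.flink.ml.math.DenseVector
    import org.apache.flink.ml.regression.MultipleLinearRegression

    val env = ExecutionEnvironment.getExecutionEnvironment

    // Read the serialized weights back with the input format matching the output format above.
    val weightVectorTypeInfo = createTypeInformation[WeightVector]
    val model = env.readFile(new TypeSerializerInputFormat[WeightVector](weightVectorTypeInfo), "path")

    // weightsOption is an Option, so wrap the DataSet in Some(...).
    val mlr = MultipleLinearRegression()
    mlr.weightsOption = Some(model)

    // Score a single feature vector by wrapping it in a one-element DataSet (values are made up).
    val single = env.fromElements(DenseVector(1.0, 2.0, 3.0))
    mlr.predict(single).print()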

Re: Flink ML 1.0.0 - Saving and Loading Models to Score a Single Feature Vector

Posted by KirstiLaurila <ki...@rovio.com>.
Answering myself, in case someone is having similar problems. Already saved
matrices can be read and used in ALS like this:

    // Set up the ALS learner
    val als = ALS()

    val users = env.readFile(new TypeSerializerInputFormat[Factors](createTypeInformation[Factors]), "path")
    val items = env.readFile(new TypeSerializerInputFormat[Factors](createTypeInformation[Factors]), "path")

    als.factorsOption = Option((users, items))

After this, one can use als for prediction.
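
A small follow-up sketch of that prediction step, assuming the env, als, and paths from the snippet above; the (userID, itemID) pairs are invented:

    import org.apache.flink.api.scala._

    // ALS predictions are made on (userID, itemID) pairs.
    val userItemPairs: DataSet[(Int, Int)] = env.fromElements((1, 10), (1, 11), (2, 10))

    // Yields (userID, itemID, predictedRating) triples.
    val predictions = als.predict(userItemPairs)
    predictions.print()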






Re: Flink ML 1.0.0 - Saving and Loading Models to Score a Single Feature Vector

Posted by KirstiLaurila <ki...@rovio.com>.
Now I got this working in the cloud (not locally, but that's OK), so thanks a lot.
The next problem is how to read these written files back and hand them to the
ALS.

I guess it is something like

   val als = ALS()
   als.factorsOption = Option((users, items))

but I don't get how I could read in the data I have written with the
previous example. I tried with:

    val users  = env.readFile(new SerializedInputFormat[Factors], "path")

but I guess I need to use TypeSerializerInputFormat[Factors] somehow, but I
couldn't get this working.

Best,
Kirsti
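
A sketch of what appears to be the missing piece in the message above: constructing the matching input format with the Factors type information, mirroring the serializer used on the output side. The paths are placeholders, and als/env are the values from the surrounding code:

    import org.apache.flink.api.java.io.TypeSerializerInputFormat
    import org.apache.flink.api.scala._
    import org.apache.flink.ml.recommendation.ALS.Factors

    val factorsTypeInfo = createTypeInformation[Factors]
    val users = env.readFile(new TypeSerializerInputFormat[Factors](factorsTypeInfo), "user_path")
    val items = env.readFile(new TypeSerializerInputFormat[Factors](factorsTypeInfo), "item_path")

    als.factorsOption = Option((users, items))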




Re: Flink ML 1.0.0 - Saving and Loading Models to Score a Single Feature Vector

Posted by Till Rohrmann <tr...@apache.org>.
Sorry, I had a mistake in my example code. I thought the model would be
stored as a (Option[DataSet[Factors]], Option[DataSet[Factors]]) but
instead it’s stored as Option[(DataSet[Factors], DataSet[Factors])].

So the code should be

val als = ALS()

als.fit(input)

val alsModelOpt = als.factorsOption

val factorsTypeInfo = TypeInformation.of(classOf[Factors])
val factorsSerializer = factorsTypeInfo.createSerializer(new ExecutionConfig())
val outputFormat = new TypeSerializerOutputFormat[Factors]
outputFormat.setSerializer(factorsSerializer)

alsModelOpt match {
    case Some((userFactors, itemFactors)) =>
        userFactors.write(outputFormat, "user_path")
        itemFactors.write(outputFormat, "item_path")
    case None =>
}

if I’m not mistaken.

If you don’t see any output, then it might be the case that your model is
empty. Could you check that? You could for example simply call print on the
model DataSet.

Do you call env.execute at the end of your program? If you don’t do that,
then the job is not executed.

Cheers,
Till
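
A compact sketch of those two checks — verifying the model is not empty and making sure the job actually runs — under the assumption that alsModelOpt, outputFormat, and env come from the code above:

    alsModelOpt match {
      case Some((userFactors, itemFactors)) =>
        userFactors.first(5).print()                  // quick sanity check that the model is not empty
        userFactors.write(outputFormat, "user_path")  // declares sinks only...
        itemFactors.write(outputFormat, "item_path")
      case None =>
        println("ALS produced no factors - was fit() called before accessing factorsOption?")
    }

    env.execute("Persist ALS factors")                // ...the writes only happen when the job runs here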
​

On Tue, Apr 12, 2016 at 1:25 PM, KirstiLaurila <ki...@rovio.com>
wrote:

> Hi,
>
> those parts were examples of what I had tried. I tried with your suggestions,
> but still no success. Additionally,
> there were some problems:
>
>
> val (userFactorsOpt, itemFactorsOpt) = als.factorsOption
>
> If I had just this, userFactorsOpt and itemFactorsOpt did not have a write
> method. So I added get there, i.e.
>
> val (userFactorsOpt, itemFactorsOpt) = als.factorsOption.get
>
>
> val factorsTypeInfo = TypeInformation.of(classOf[Factors])
> val factorsSerializer = factorsTypeInfo.createSerializer(new
> ExecutionConfig())
> val outputFormat = new TypeSerializerOutputFormat[Factors]
>
>
> Here, the factorsSerializer was not used at all, so I guess this line was
> missing:
>
>     outputFormat.setSerializer(factorsSerializer)
>
>
> userFactorsOpt match {
>     case Some(userFactors) => userFactors.write(outputFormat, "user_path")
>     case None =>
> }
>
>
> This doesn't run because of the error message:
>
> Error:(71, 12) constructor cannot be instantiated to expected type;
>  found   : Some[A]
>  required:
>
> org.apache.flink.api.scala.DataSet[org.apache.flink.ml.recommendation.ALS.Factors]
>       case Some(userFactors) => userFactorsOpt.write(outputFormat,
> "path_to_my_file")
>
> However, I also tried without the match case, i.e.
>
>     userFactorsOpt.write(outputFormat, "path")
>
> but nothing was written anywhere.
>
>
>
>
>
>

Re: Flink ML 1.0.0 - Saving and Loading Models to Score a Single Feature Vector

Posted by KirstiLaurila <ki...@rovio.com>.
Hi, 

those parts were examples of what I had tried. I tried with your suggestions,
but still no success. Additionally,
there were some problems:


val (userFactorsOpt, itemFactorsOpt) = als.factorsOption 

If I had just this, userFactorsOpt and itemFactorsOpt did not have a write
method. So I added get there, i.e.

val (userFactorsOpt, itemFactorsOpt) = als.factorsOption.get 


val factorsTypeInfo = TypeInformation.of(classOf[Factors])
val factorsSerializer = factorsTypeInfo.createSerializer(new
ExecutionConfig())
val outputFormat = new TypeSerializerOutputFormat[Factors]


Here, the factorsSerializer was not used at all, so I guess this line was
missing:

    outputFormat.setSerializer(factorsSerializer)


userFactorsOpt match {
    case Some(userFactors) => userFactors.write(outputFormat, "user_path")
    case None =>
}


This doesn't run because of the error message:

Error:(71, 12) constructor cannot be instantiated to expected type;
 found   : Some[A]
 required:
org.apache.flink.api.scala.DataSet[org.apache.flink.ml.recommendation.ALS.Factors]
      case Some(userFactors) => userFactorsOpt.write(outputFormat,
"path_to_my_file")

However, I also tried without the match case, i.e.

    userFactorsOpt.write(outputFormat, "path")
    
but nothing was written anywhere.
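
For what it's worth, the match fails because after .get the values are already DataSets, so there is no Option left to pattern-match on. A hedged sketch of one way to combine the pieces, assuming als, factorsSerializer, outputFormat, and env from the surrounding messages:

    als.factorsOption match {
      case Some((userFactors, itemFactors)) =>        // match on the Option itself, not on the DataSets
        outputFormat.setSerializer(factorsSerializer) // the line noted as missing above
        userFactors.write(outputFormat, "user_path")
        itemFactors.write(outputFormat, "item_path")
      case None =>
    }

    env.execute()  // write() only declares sinks; without execute() nothing is written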






Re: Flink ML 1.0.0 - Saving and Loading Models to Score a Single Feature Vector

Posted by Till Rohrmann <tr...@apache.org>.
Hi Kirsti,

I think you attached some images to your message which show the code.
Unfortunately, attachments are not supported by the mailing list, so maybe you
could resend what you've already tried.

In order to access the ALS model, you can do the following:

val als = ALS()

als.fit(input)

val (userFactorsOpt, itemFactorsOpt) = als.factorsOption

val factorsTypeInfo = TypeInformation.of(classOf[Factors])
val factorsSerializer = factorsTypeInfo.createSerializer(new ExecutionConfig())
val outputFormat = new TypeSerializerOutputFormat[Factors]

userFactorsOpt match {
    case Some(userFactors) => userFactors.write(outputFormat, "user_path")
    case None =>
}

itemFactorsOpt match {
    case Some(itemFactors) => itemFactors.write(outputFormat, "item_path")
    case None =>
}

Cheers,
Till
​

On Tue, Apr 12, 2016 at 10:29 AM, KirstiLaurila <ki...@rovio.com>
wrote:

> How should this be done for the recommendation engine (that is, ALS; example
> here:
> https://ci.apache.org/projects/flink/flink-docs-release-1.0/apis/batch/libs/ml/als.html
> ).
>
>  I am able to run the example with my example data but cannot get anything
> written to any file (user or item matrices).
>
> Basically, I have tried something like this
>
>
>
>
> I also tried to apply a similar approach to this
>
>
>
> but with no success. Could someone help me with this to get my model saved?
>
>
> Best,
> Kirsti
>
>
>

Re: Flink ML 1.0.0 - Saving and Loading Models to Score a Single Feature Vector

Posted by KirstiLaurila <ki...@rovio.com>.
How should this be done for the recommendation engine (that is, ALS; example
here:
https://ci.apache.org/projects/flink/flink-docs-release-1.0/apis/batch/libs/ml/als.html
).

 I am able to run the example with my example data but cannot get anything
written to any file (user or item matrices). 

Basically, I have tried something like this


    

I also tried to apply a similar approach to this



but with no success. Could someone help me with this to get my model saved?


Best,
Kirsti




Re: Flink ML 1.0.0 - Saving and Loading Models to Score a Single Feature Vector

Posted by Trevor Grant <tr...@gmail.com>.
I'm just about to open an issue / PR solution for 'warm-starts'.

Once this is in, we could just add a setter for the weight vector (and
whatever iteration you're on if you're going to do more partial fits).

Then all you need to save is your weight vector (and iteration number).



Trevor Grant
Data Scientist
https://github.com/rawkintrevo
http://stackexchange.com/users/3002022/rawkintrevo
http://trevorgrant.org

*"Fortunate is he, who is able to know the causes of things."  -Virgil*


On Fri, Apr 8, 2016 at 9:04 AM, Behrouz Derakhshan <
behrouz.derakhshan@gmail.com> wrote:

> Is there a reason the Predictor or Estimator classes don't have read and
> write methods for saving and retrieving the model? I couldn't find Jira
> issues for it. Does it make sense to create one?
>
> BR,
> Behrouz
>
> On Wed, Mar 30, 2016 at 4:40 PM, Till Rohrmann <tr...@apache.org>
> wrote:
>
>> Yes, Suneel is completely right. If the data does not implement
>> IOReadableWritable, it is probably easier to use the
>> TypeSerializerOutputFormat. What you need here to serialize the data is a
>> TypeSerializer. You can obtain it the following way:
>>
>> val model = mlr.weightsOption.get
>>
>> val weightVectorTypeInfo = TypeInformation.of(classOf[WeightVector])
>> val weightVectorSerializer = weightVectorTypeInfo.createSerializer(new ExecutionConfig())
>> val outputFormat = new TypeSerializerOutputFormat[WeightVector]
>> outputFormat.setSerializer(weightVectorSerializer)
>>
>> model.write(outputFormat, "path")
>>
>> Cheers,
>> Till
>> ​
>>
>> On Tue, Mar 29, 2016 at 8:22 PM, Suneel Marthi <sm...@apache.org>
>> wrote:
>>
>>> You may want to use the FlinkMLTools.persist() methods, which use the
>>> TypeSerializerOutputFormat and don't enforce IOReadableWritable.
>>>
>>>
>>>
>>> On Tue, Mar 29, 2016 at 2:12 PM, Sourigna Phetsarath <
>>> gna.phetsarath@teamaol.com> wrote:
>>>
>>>> Till,
>>>>
>>>> Thank you for your reply.
>>>>
>>>> Having this issue though: WeightVector does not extend IOReadableWritable:
>>>>
>>>> public class SerializedOutputFormat<T extends IOReadableWritable>
>>>>
>>>> case class WeightVector(weights: Vector, intercept: Double) extends Serializable {}
>>>>
>>>>
>>>> However, I will use the approach to write out the weights as text.
>>>>
>>>>
>>>> On Tue, Mar 29, 2016 at 5:01 AM, Till Rohrmann <tr...@apache.org>
>>>> wrote:
>>>>
>>>>> Hi Gna,
>>>>>
>>>>> there are no utilities yet to do that but you can do it manually. In
>>>>> the end, a model is simply a Flink DataSet which you can serialize to
>>>>> some file. Upon reading this DataSet you simply have to give it to
>>>>> your algorithm to be used as the model. The following code snippet
>>>>> illustrates this approach:
>>>>>
>>>>> mlr.fit(inputDS, parameters)
>>>>>
>>>>> // write model to disk using the SerializedOutputFormat
>>>>> mlr.weightsOption.get.write(new SerializedOutputFormat[WeightVector], "path")
>>>>>
>>>>> // read the serialized model from disk
>>>>> val model = env.readFile(new SerializedInputFormat[WeightVector], "path")
>>>>>
>>>>> // set the read model for the MLR algorithm
>>>>> mlr.weightsOption = model
>>>>>
>>>>> Cheers,
>>>>> Till
>>>>> ​
>>>>>
>>>>> On Tue, Mar 29, 2016 at 10:46 AM, Simone Robutti <
>>>>> simone.robutti@radicalbit.io> wrote:
>>>>>
>>>>>> To my knowledge there is nothing like that. PMML is not supported in
>>>>>> any form and there's no custom saving format yet. If you really need a
>>>>>> quick and dirty solution, it's not that hard to serialize the model into a
>>>>>> file.
>>>>>>
>>>>>> 2016-03-28 17:59 GMT+02:00 Sourigna Phetsarath <
>>>>>> gna.phetsarath@teamaol.com>:
>>>>>>
>>>>>>> Flinksters,
>>>>>>>
>>>>>>> Is there an example of saving a Trained Model, loading a Trained
>>>>>>> Model and then scoring one or more feature vectors using Flink ML?
>>>>>>>
>>>>>>> All of the examples I've seen have shown only sequential fit and
>>>>>>> predict.
>>>>>>>
>>>>>>> Thank you.
>>>>>>>
>>>>>>> -Gna
>>>>>>> --
>>>>>>>
>>>>>>>
>>>>>>> *Gna Phetsarath*System Architect // AOL Platforms // Data Services
>>>>>>> // Applied Research Chapter
>>>>>>> 770 Broadway, 5th Floor, New York, NY 10003
>>>>>>> o: 212.402.4871 // m: 917.373.7363
>>>>>>> vvmr: 8890237 aim: sphetsarath20 t: @sourigna
>>>>>>>
>>>>>>> * <http://www.aolplatforms.com>*
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>>
>>>> --
>>>>
>>>>
>>>> *Gna Phetsarath*System Architect // AOL Platforms // Data Services //
>>>> Applied Research Chapter
>>>> 770 Broadway, 5th Floor, New York, NY 10003
>>>> o: 212.402.4871 // m: 917.373.7363
>>>> vvmr: 8890237 aim: sphetsarath20 t: @sourigna
>>>>
>>>> * <http://www.aolplatforms.com>*
>>>>
>>>
>>>
>>
>