Posted to issues@spark.apache.org by "Anant Daksh Asthana (JIRA)" <ji...@apache.org> on 2014/11/01 07:57:33 UTC

[jira] [Commented] (SPARK-4127) Streaming Linear Regression- Python bindings

    [ https://issues.apache.org/jira/browse/SPARK-4127?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14192996#comment-14192996 ] 

Anant Daksh Asthana commented on SPARK-4127:
--------------------------------------------

[~mengxr][~freeman-lab] I am running into some issues. Wondering if you could help.
I have pushed some changes to my branch https://github.com/anantasty/spark/tree/SPARK-4127
I added functions to PythonMLLibAPI.scala and to python/pyspark/mllib/regression.py.

I added an example similar to the Scala one.

When I run it, I get the error "java.lang.ClassCastException: [B cannot be cast to org.apache.spark.mllib.linalg.Vector", which I am not sure how to work with.
There are plenty of examples where Python SparseVectors and DenseVectors are passed over in RDDs and work just fine. The training data is also sent as a (Double, Vector) pair and works fine.
But on the test data (model.predictOn) it throws the exception.
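For context, `[B` is the JVM's class name for `byte[]`, so the exception suggests the Python vectors are reaching the Scala side still pickled: the receiving DStream contains raw byte arrays rather than deserialized Vector objects. A minimal pure-Python sketch of that mismatch, using the standard pickle module as a stand-in for PySpark's serializer (an assumption for illustration only, not the actual PySpark code path):

```python
import pickle

# Stand-in for a Python-side vector (PySpark would use
# pyspark.mllib.linalg.DenseVector; a plain list suffices here).
vec = [1.0, 2.0, 3.0]

# PySpark ships Python objects to the JVM as pickled byte strings.
payload = pickle.dumps(vec)

# On the JVM side this arrives as byte[] ("[B" in JVM notation).
# Casting it directly to Vector fails; it must be deserialized first.
print(type(payload))            # bytes, not a vector
restored = pickle.loads(payload)
print(restored == vec)          # True once deserialized
```

If this is what is happening, the fix would be on the Scala side of the binding: deserialize the incoming byte arrays into Vectors (as the existing training-data path evidently does) before handing them to predictOn.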


> Streaming Linear Regression- Python bindings
> --------------------------------------------
>
>                 Key: SPARK-4127
>                 URL: https://issues.apache.org/jira/browse/SPARK-4127
>             Project: Spark
>          Issue Type: Improvement
>          Components: MLlib, PySpark
>            Reporter: Anant Daksh Asthana
>            Priority: Minor
>
> Create Python bindings for Streaming Linear Regression (MLlib).
> The MLlib example relevant to this issue can be found at: https://github.com/apache/spark/blob/master/examples/src/main/scala/org/apache/spark/examples/mllib/StreamingLinearRegression.scala



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org