Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2015/06/23 17:14:25 UTC

[jira] [Commented] (SPARK-8525) Bug in Streaming k-means documentation

    [ https://issues.apache.org/jira/browse/SPARK-8525?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14597789#comment-14597789 ] 

Apache Spark commented on SPARK-8525:
-------------------------------------

User 'fe2s' has created a pull request for this issue:
https://github.com/apache/spark/pull/6954

> Bug in Streaming k-means documentation
> --------------------------------------
>
>                 Key: SPARK-8525
>                 URL: https://issues.apache.org/jira/browse/SPARK-8525
>             Project: Spark
>          Issue Type: Bug
>          Components: Documentation, MLlib
>    Affects Versions: 1.4.0
>            Reporter: Oleksiy Dyagilev
>            Priority: Minor
>
> The expected input format shown in the Streaming K-means documentation is wrong:
> https://spark.apache.org/docs/latest/mllib-clustering.html#streaming-k-means
> (It might instead be a bug in the implementation; I'm not sure.)
> There should not be any spaces in the test data points, i.e. instead of
> (y, [x1, x2, x3]) it should be
> (y,[x1,x2,x3])
> The exception thrown:
> org.apache.spark.SparkException: Cannot parse a double from:  
> 	at org.apache.spark.mllib.util.NumericParser$.parseDouble(NumericParser.scala:118)
> 	at org.apache.spark.mllib.util.NumericParser$.parseTuple(NumericParser.scala:103)
> 	at org.apache.spark.mllib.util.NumericParser$.parse(NumericParser.scala:41)
> 	at org.apache.spark.mllib.regression.LabeledPoint$.parse(LabeledPoint.scala:49)
> I would also improve the documentation to state explicitly that the expected data type for both 'x' and 'y' is Double. At the moment this is not obvious, especially for 'y'.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org