Posted to issues@spark.apache.org by "Andrey Vykhodtsev (JIRA)" <ji...@apache.org> on 2015/07/25 13:46:04 UTC

[jira] [Commented] (SPARK-9277) SparseVector constructor must throw an error when declared number of elements less than array length

    [ https://issues.apache.org/jira/browse/SPARK-9277?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14641555#comment-14641555 ] 

Andrey Vykhodtsev commented on SPARK-9277:
------------------------------------------

Hi Joseph,

Will it be too expensive performance-wise to add the following check:

max index in the array < size?

From the correctness perspective it is the better thing to do.
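
For example, a minimal sketch of such a check in PySpark (the helper name
make_sparse_vector is made up for illustration; this is not the actual MLlib
implementation, which would presumably do the validation in the Scala
constructor):

    from pyspark.mllib.linalg import SparseVector

    def make_sparse_vector(size, entries):
        # entries is a dict of {index: value}; fail fast when any index
        # falls outside [0, size) instead of hitting an
        # ArrayIndexOutOfBoundsException much later during training.
        if entries and (min(entries) < 0 or max(entries) >= size):
            raise ValueError("index out of range for declared size %d: %s"
                             % (size, sorted(entries)))
        return SparseVector(size, entries)

    # The inconsistent vector from the test case below would then fail
    # immediately:
    #   make_sparse_vector(2, {1: 1, 2: 2, 3: 3, 4: 4, 5: 5})
    # raises ValueError, instead of a job failure inside
    # LogisticRegressionWithSGD.train.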


> SparseVector constructor must throw an error when declared number of elements less than array length
> ----------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-9277
>                 URL: https://issues.apache.org/jira/browse/SPARK-9277
>             Project: Spark
>          Issue Type: Bug
>          Components: MLlib
>    Affects Versions: 1.3.1
>            Reporter: Andrey Vykhodtsev
>            Priority: Minor
>              Labels: starter
>         Attachments: SparseVector test.html, SparseVector test.ipynb
>
>
> I found that one can create a SparseVector inconsistently, and it will lead to a Java error at runtime, for example when training LogisticRegressionWithSGD.
> Here is the test case:
> In [2]:
> sc.version
> Out[2]:
> u'1.3.1'
> In [13]:
> from pyspark.mllib.linalg import SparseVector
> from pyspark.mllib.regression import LabeledPoint
> from pyspark.mllib.classification import LogisticRegressionWithSGD
> In [3]:
> x =  SparseVector(2, {1:1, 2:2, 3:3, 4:4, 5:5})
> In [10]:
> l = LabeledPoint(0, x)
> In [12]:
> r = sc.parallelize([l])
> In [14]:
> m = LogisticRegressionWithSGD.train(r)
> Error:
> Py4JJavaError: An error occurred while calling o86.trainLogisticRegressionModelWithSGD.
> : org.apache.spark.SparkException: Job aborted due to stage failure: Task 7 in stage 11.0 failed 1 times, most recent failure: Lost task 7.0 in stage 11.0 (TID 47, localhost): java.lang.ArrayIndexOutOfBoundsException: 2
> Attached is the notebook with the scenario and the full message



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
