Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2017/05/09 08:49:05 UTC
[jira] [Resolved] (SPARK-20615) SparseVector.argmax throws IndexOutOfBoundsException when the sparse vector has a size greater than zero but no elements defined.
[ https://issues.apache.org/jira/browse/SPARK-20615?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen resolved SPARK-20615.
-------------------------------
Resolution: Fixed
Fix Version/s: 2.1.2
2.2.1
Issue resolved by pull request 17877
[https://github.com/apache/spark/pull/17877]
> SparseVector.argmax throws IndexOutOfBoundsException when the sparse vector has a size greater than zero but no elements defined.
> ---------------------------------------------------------------------------------------------------------------------------------
>
> Key: SPARK-20615
> URL: https://issues.apache.org/jira/browse/SPARK-20615
> Project: Spark
> Issue Type: Bug
> Components: ML, MLlib
> Affects Versions: 2.1.0
> Reporter: Jon McLean
> Priority: Minor
> Fix For: 2.2.1, 2.1.2
>
>
> org.apache.spark.ml.linalg.SparseVector.argmax throws an IndexOutOfBoundsException when the vector size is greater than zero and no values are defined. The toString() representation of such a vector is "(100000,[],[])". This is because the argmax function reads the value at indices(0) without first checking whether the indices array is empty.
> Code inspection reveals that the mllib version of SparseVector should have the same issue.
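> A minimal sketch of the guard the fix needs, outside of Spark (the `indices`/`values` parameters mirror SparseVector's internals; this simplified version ignores the implicit-zero-beats-negative-actives case and is not the actual patched Spark code):
>
> {code}
> object ArgmaxSketch {
>   def sparseArgmax(size: Int, indices: Array[Int], values: Array[Double]): Int = {
>     if (size == 0) {
>       -1                 // zero-size vector: no argmax
>     } else if (indices.isEmpty) {
>       0                  // every entry is an implicit 0.0; indexing indices(0)
>                          // here is what threw before the fix
>     } else {
>       var maxIdx = indices(0)
>       var maxValue = values(0)
>       var i = 1
>       while (i < indices.length) {
>         if (values(i) > maxValue) { maxIdx = indices(i); maxValue = values(i) }
>         i += 1
>       }
>       maxIdx
>     }
>   }
>
>   def main(args: Array[String]): Unit = {
>     // The reported crash case: size 100000, no elements -> "(100000,[],[])"
>     println(sparseArgmax(100000, Array.empty[Int], Array.empty[Double])) // 0, no exception
>     println(sparseArgmax(5, Array(1, 3), Array(2.0, 7.0)))              // 3
>   }
> }
> {code}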
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org