Posted to issues@spark.apache.org by "Xiao Li (JIRA)" <ji...@apache.org> on 2017/05/11 20:01:04 UTC
[jira] [Updated] (SPARK-20615) SparseVector.argmax throws IndexOutOfBoundsException when the sparse vector has a size greater than zero but no elements defined.
[ https://issues.apache.org/jira/browse/SPARK-20615?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Xiao Li updated SPARK-20615:
----------------------------
Fix Version/s: (was: 2.2.1)
2.2.0
> SparseVector.argmax throws IndexOutOfBoundsException when the sparse vector has a size greater than zero but no elements defined.
> ---------------------------------------------------------------------------------------------------------------------------------
>
> Key: SPARK-20615
> URL: https://issues.apache.org/jira/browse/SPARK-20615
> Project: Spark
> Issue Type: Bug
> Components: ML, MLlib
> Affects Versions: 2.1.0
> Reporter: Jon McLean
> Assignee: Jon McLean
> Priority: Minor
> Fix For: 2.1.2, 2.2.0
>
>
> org.apache.spark.ml.linalg.SparseVector.argmax throws an IndexOutOfBoundsException when the vector size is greater than zero and no values are defined. The toString() representation of such a vector is "(100000,[],[])". This happens because the argmax function reads the value at indices(0) without first checking whether the array is empty.
> Code inspection reveals that the mllib version of SparseVector has the same issue.
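The failure mode and the missing guard can be sketched as follows. This is an illustrative Python rendering of the argmax logic for a sparse vector, not Spark's actual Scala implementation; the function name and semantics for the all-implicit-zeros case are assumptions.

```python
def sparse_argmax(size, indices, values):
    """Index of the largest element of a sparse vector.

    `size` is the logical length; `indices`/`values` hold only the
    explicitly stored entries. Illustrative sketch, not Spark code.
    """
    if size == 0:
        return -1  # empty vector: no argmax exists

    # Guard for the reported bug: size > 0 but no stored entries,
    # e.g. the vector "(100000,[],[])". Without this check, reading
    # indices[0] raises an IndexError (the Python analogue of the
    # IndexOutOfBoundsException in the ticket).
    if not indices:
        return 0  # every element is an implicit 0.0; first index wins

    # Scan the stored entries for the maximum.
    best_idx, best_val = indices[0], values[0]
    for i, v in zip(indices, values):
        if v > best_val:
            best_idx, best_val = i, v

    # If the best stored value is negative and some positions are
    # implicit zeros, a zero element is larger: return the first
    # index that is not explicitly stored.
    if best_val < 0 and len(indices) < size:
        stored = set(indices)
        for i in range(size):
            if i not in stored:
                return i
    return best_idx
```

With the guard in place, the vector from the ticket no longer throws: `sparse_argmax(100000, [], [])` returns 0 instead of raising.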
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)