Posted to issues@spark.apache.org by "Xiangrui Meng (JIRA)" <ji...@apache.org> on 2014/11/22 03:16:33 UTC
[jira] [Resolved] (SPARK-4431) Implement efficient activeIterator for dense and sparse vector
[ https://issues.apache.org/jira/browse/SPARK-4431?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Xiangrui Meng resolved SPARK-4431.
----------------------------------
Resolution: Fixed
Fix Version/s: 1.2.0
Issue resolved by pull request 3288
[https://github.com/apache/spark/pull/3288]
> Implement efficient activeIterator for dense and sparse vector
> --------------------------------------------------------------
>
> Key: SPARK-4431
> URL: https://issues.apache.org/jira/browse/SPARK-4431
> Project: Spark
> Issue Type: Improvement
> Components: MLlib
> Reporter: DB Tsai
> Assignee: DB Tsai
> Fix For: 1.2.0
>
>
> Previously, we were using Breeze's activeIterator to access the non-zero elements
> in dense/sparse vectors. Due to its overhead, we switched back to a native while loop
> in SPARK-4129.
> However, SPARK-4129 requires de-referencing dv.values/sv.values on
> each access to a value, which is very expensive. Also, in MultivariateOnlineSummarizer,
> we're using Breeze's dense vector to store the partial stats, which is very expensive compared
> with using a primitive Scala array.
> In this PR, an efficient foreachActive is implemented to unify the code path for dense and sparse
> vector operations, which makes the codebase easier to maintain. The Breeze dense vector is replaced
> by a primitive array to reduce the overhead further.
> Benchmarked with the mnist8m dataset in a single JVM,
> with the first 200 samples loaded in memory and the run repeated 5000 times.
> Before change:
> Sparse Vector - 30.02
> Dense Vector - 38.27
> With this PR:
> Sparse Vector - 6.29
> Dense Vector - 11.72
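
The foreachActive pattern described above can be sketched as follows. This is a minimal, standalone illustration, not Spark's actual implementation; the names (DenseVector, SparseVector, foreachActive) merely mirror the MLlib Vector API, and the local-variable hoisting mimics the de-reference optimization the description mentions.

```scala
// Illustrative sketch: a unified foreachActive over dense and sparse
// vectors, modeled on the approach described in the issue.
sealed trait Vector {
  // Apply f(index, value) to every active (explicitly stored) entry.
  def foreachActive(f: (Int, Double) => Unit): Unit
}

case class DenseVector(values: Array[Double]) extends Vector {
  override def foreachActive(f: (Int, Double) => Unit): Unit = {
    // Localize the array reference once so the hot loop avoids
    // repeated field de-references (the cost called out in the issue).
    val localValues = values
    var i = 0
    while (i < localValues.length) {
      f(i, localValues(i))
      i += 1
    }
  }
}

case class SparseVector(size: Int,
                        indices: Array[Int],
                        values: Array[Double]) extends Vector {
  override def foreachActive(f: (Int, Double) => Unit): Unit = {
    val localIndices = indices
    val localValues = values
    var i = 0
    while (i < localIndices.length) {
      f(localIndices(i), localValues(i))
      i += 1
    }
  }
}

// Callers sum the non-zeros without branching on the vector type:
def sumActive(v: Vector): Double = {
  var sum = 0.0
  v.foreachActive((_, value) => sum += value)
  sum
}
```

A summarizer in this style would keep its partial statistics in primitive Array[Double] buffers and update them inside the foreachActive callback, which is the second change the description credits for the speedup.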
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org