Posted to issues@spark.apache.org by "Xi Shen (JIRA)" <ji...@apache.org> on 2015/04/04 11:22:33 UTC

[jira] [Issue Comment Deleted] (SPARK-6706) kmeans|| hangs for a long time if both k and vector dimension are large

     [ https://issues.apache.org/jira/browse/SPARK-6706?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Xi Shen updated SPARK-6706:
---------------------------
    Comment: was deleted

(was: Yes, the {{collect()}} jobs finish, then it hangs at the driver. Your wording is more accurate.

But I don't observe this behavior with the *random initialization* of k-means. I think it is because the *kmeans||* algorithm has a more complex initialization step.)

> kmeans|| hangs for a long time if both k and vector dimension are large
> -----------------------------------------------------------------------
>
>                 Key: SPARK-6706
>                 URL: https://issues.apache.org/jira/browse/SPARK-6706
>             Project: Spark
>          Issue Type: Bug
>          Components: MLlib
>    Affects Versions: 1.2.1, 1.3.0
>         Environment: Windows 64bit, Linux 64bit
>            Reporter: Xi Shen
>            Assignee: Xiangrui Meng
>              Labels: performance
>         Attachments: kmeans-debug.7z
>
>
> When doing k-means clustering with the "kmeans||" algorithm (the default), the algorithm finishes some {{collect()}} jobs, then the *driver* hangs for a long time.
> Settings:
> - k above 100
> - feature dimension about 360
> - total data size is about 100 MB
> The issue was first noticed with Spark 1.2.1, in both local and cluster mode. On Spark 1.3.0, I can also reproduce the issue in local mode. **However, I do not have a 1.3.0 cluster environment to test.**
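[Editor's note: a plausible explanation for the symptom above (distributed {{collect()}} jobs finish, then the driver stalls) is that k-means|| ends with a driver-local step that reduces the collected candidate centers down to k final centers, and that local step grows with both k and the vector dimension. The sketch below is a back-of-envelope cost model in plain Python, not Spark code; the candidate count formula (rounds x oversampling x k) and the per-pass cost are illustrative assumptions, not measurements of MLlib internals.]

```python
def local_init_cost(k, dim, rounds=5, oversampling=2):
    """Rough count of distance-computation work for the driver-local
    reduction step, ASSUMING k-means|| has gathered about
    rounds * oversampling * k candidate centers and that one local
    pass compares every candidate against every tentative center."""
    candidates = rounds * oversampling * k
    # one local pass over the candidates: candidates x k distance
    # computations, each O(dim)
    return candidates * k * dim

# Small problem vs. the reported settings (k > 100, dim ~ 360):
small = local_init_cost(10, 10)      # 100 candidates * 10 * 10  = 10_000
large = local_init_cost(100, 360)    # 1000 candidates * 100 * 360 = 36_000_000
print(small, large, large // small)  # the large case is 3600x more work
```

Even with this crude model, the driver-side work grows roughly with k^2 * dim, which is consistent with the hang only appearing when both k and the dimension are large, and with random initialization (which skips this step) not showing the behavior.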



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org