Posted to issues@spark.apache.org by "Dongjoon Hyun (JIRA)" <ji...@apache.org> on 2016/11/02 22:58:58 UTC
[jira] [Comment Edited] (SPARK-18200) GraphX Invalid initial capacity when running triangleCount
[ https://issues.apache.org/jira/browse/SPARK-18200?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15630821#comment-15630821 ]
Dongjoon Hyun edited comment on SPARK-18200 at 11/2/16 10:58 PM:
-----------------------------------------------------------------
Actually, there is a node that doesn't have any neighbors. So, it requested the creation of a `VertexSet` with an initial capacity of zero.
was (Author: dongjoon):
Actually, there is a node whose don't have neighbor. So, it requested to create `VertexSet` with zero initial capacity.
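The cause described in the comment above can be sketched with a toy stand-in (this is not Spark's actual code; the class name, precondition message, and helper below are hypothetical, patterned after the reported "Invalid initial capacity" failure): a hash set whose constructor rejects a non-positive initial capacity, sized from a vertex's neighbor count, fails as soon as one vertex has degree zero.

```python
class VertexSet:
    """Toy stand-in for GraphX's VertexSet (a hash set of vertex ids)."""

    def __init__(self, initial_capacity):
        # Mirrors a precondition of the form
        # `require(initialCapacity > 0, "Invalid initial capacity")`;
        # the exact check inside Spark may differ.
        if initial_capacity <= 0:
            raise ValueError("Invalid initial capacity")
        self.capacity = initial_capacity
        self.items = set()


def neighbor_set(adjacency, vertex):
    """Build the neighbor set for `vertex`, sized by its degree."""
    neighbors = adjacency.get(vertex, [])
    # An isolated vertex has degree 0, so this asks for capacity 0.
    vs = VertexSet(len(neighbors))
    vs.items.update(neighbors)
    return vs


# Vertex 4 is isolated (no neighbors), like the node described above.
adjacency = {1: [2, 3], 2: [1], 3: [1], 4: []}

neighbor_set(adjacency, 1)  # fine: capacity 2
try:
    neighbor_set(adjacency, 4)  # isolated vertex triggers the failure
except ValueError as e:
    print(e)  # prints "Invalid initial capacity"
```

Under this reading, graphs without isolated vertices never hit the precondition, which is why the error only appears on some datasets.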
> GraphX Invalid initial capacity when running triangleCount
> ----------------------------------------------------------
>
> Key: SPARK-18200
> URL: https://issues.apache.org/jira/browse/SPARK-18200
> Project: Spark
> Issue Type: Bug
> Components: GraphX
> Affects Versions: 2.0.0, 2.0.1, 2.0.2
> Environment: Databricks, Ubuntu 16.04, macOS Sierra
> Reporter: Denny Lee
> Labels: graph, graphx
>
> Running GraphX triangle count on a large-ish file results in the "Invalid initial capacity" error when running on Spark 2.0 (tested on Spark 2.0, 2.0.1, and 2.0.2). You can see the results at: http://bit.ly/2eQKWDN
> Running the same code on Spark 1.6, the query completes without any problems: http://bit.ly/2fATO1M
> The GraphFrames version of this code also runs without issue (Spark 2.0, GraphFrames 0.2): http://bit.ly/2fAS8W8
> Reference Stackoverflow question:
> Spark GraphX: requirement failed: Invalid initial capacity (http://stackoverflow.com/questions/40337366/spark-graphx-requirement-failed-invalid-initial-capacity)
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org