Posted to dev@spark.apache.org by Michael Malak <mi...@yahoo.com.INVALID> on 2015/01/19 05:34:02 UTC

GraphX doc: triangleCount() requirement overstatement?

According to:
https://spark.apache.org/docs/1.2.0/graphx-programming-guide.html#triangle-counting 

"Note that TriangleCount requires the edges to be in canonical orientation (srcId < dstId)"

But isn't this overstating the requirement? Isn't the actual requirement that IF there are duplicate edges between two vertices, THEN those edges must all point in the same direction, so that the groupEdges() call at the beginning of triangleCount() produces the intermediate results triangleCount() expects?

If so, should I enter a JIRA ticket to clarify the documentation?

Or is it the case that https://issues.apache.org/jira/browse/SPARK-3650 will make it into Spark 1.3 anyway?
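For concreteness, here is a minimal spark-shell sketch of what the documented requirement looks like in practice. The edge list is made up for illustration, and sc is assumed to be the SparkContext that spark-shell provides; this only shows the canonical-orientation preprocessing, not a ruling on whether the stricter wording is necessary.

import org.apache.spark.graphx.{Edge, Graph, PartitionStrategy}

// Hypothetical edge list; the (3L, 2L) edge is stored "backwards"
// relative to the others.
val edges = sc.parallelize(Seq(
  Edge(1L, 2L, 1), Edge(2L, 3L, 1), Edge(3L, 1L, 1), Edge(3L, 2L, 1)))

// Flip each edge so that srcId < dstId (canonical orientation).
val canonical = edges.map { e =>
  if (e.srcId < e.dstId) e else Edge(e.dstId, e.srcId, e.attr)
}

// Build the graph; the 1.2 docs also ask for the graph to be
// partitioned via partitionBy before triangle counting.
val graph = Graph.fromEdges(canonical, defaultValue = 0)
  .partitionBy(PartitionStrategy.RandomVertexCut)

// Per-vertex triangle counts.
val triangleCounts = graph.triangleCount().vertices
triangleCounts.collect().foreach(println)

After the flip, the duplicate edges between vertices 2 and 3 end up pointing in the same direction, which is exactly the condition the groupEdges() step relies on in the question above.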



Re: GraphX doc: triangleCount() requirement overstatement?

Posted by Reynold Xin <rx...@databricks.com>.
We will merge https://issues.apache.org/jira/browse/SPARK-3650 for 1.3.
Thanks for the reminder!

