Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2016/01/04 15:46:39 UTC

[jira] [Resolved] (SPARK-3313) Add to GraphX library: transitive closure of a graph

     [ https://issues.apache.org/jira/browse/SPARK-3313?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-3313.
------------------------------
    Resolution: Won't Fix

> Add to GraphX library: transitive closure of a graph
> ----------------------------------------------------
>
>                 Key: SPARK-3313
>                 URL: https://issues.apache.org/jira/browse/SPARK-3313
>             Project: Spark
>          Issue Type: New Feature
>          Components: GraphX
>    Affects Versions: 1.3.0
>            Reporter: Rajiv Abraham
>            Priority: Trivial
>
> Hi,
> Would it be useful if I implemented a library function to compute the transitive closure of a graph?
> Design: it would be based on the algorithms described in the book Mining Massive Datasets (http://infolab.stanford.edu/~ullman/mmds/book.pdf). The book gives the code in SQL, which can be translated. There are two proposals, outlined below.
> 1) Smart transitive closure (section 10.8.5)
> This technique removes the redundancy of the recursive-doubling technique (which doubles the length of path that can be found in each iteration, for all nodes). The smart transitive closure technique avoids discovering the same path more than once.
> 2) Transitive closure by graph reduction (section 10.8.6)
> Reduce the graph to its strongly connected components (SCCs); based on which SCC each original node belongs to, we can recover the transitive closure of the original graph.
> Smart transitive closure seems the more interesting, but I will investigate it further (and wait for your feedback). This is a high-level proposal; if you think such a function would be useful, I will provide more detail.
> Best,
> Rajiv
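The first technique the proposal names can be illustrated outside GraphX. Below is a minimal single-machine Python sketch (not GraphX/Scala code, and not the book's SQL) of the smart-transitive-closure idea: keep P, all reachable pairs found so far, and Q, only the pairs first discovered in the previous round, and join Q with P each round so no path is rediscovered. Edges are assumed to be a set of (src, dst) pairs.

```python
def smart_transitive_closure(edges):
    """Smart transitive closure, in the spirit of MMDS section 10.8.5.

    P holds every reachable pair found so far; Q holds only the pairs
    first discovered in the previous round. Joining Q with P (instead
    of P with P, as naive recursive doubling does) means each pair is
    derived only once: any shortest path of length d splits into a
    prefix of length ceil(d/2), which is new this round, and a suffix
    of length floor(d/2), which is already in P.
    """
    P = set(edges)   # all (u, w) pairs known to be reachable
    Q = set(edges)   # pairs discovered in the previous round
    while Q:
        # Join: new pair (u, w) from a new prefix (u, v) in Q
        # and a known suffix (v, w) in P.
        new = {(u, w) for (u, v) in Q for (x, w) in P if v == x}
        Q = new - P  # keep only genuinely new pairs
        P |= Q
    return P
```

A distributed version would replace the in-memory join with a join over edge/pair RDDs, but the round structure is the same.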
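The second technique can also be sketched in the same single-machine style (again, not GraphX code): collapse each strongly connected component to one node, compute reachability on the resulting DAG, which is typically much smaller, then expand each SCC-level pair back to pairs of original nodes. Kosaraju's algorithm is used here for the SCCs; the function name and representation are illustrative, not from the book.

```python
from collections import defaultdict

def transitive_closure_by_reduction(edges):
    """Transitive closure via graph reduction, per MMDS section 10.8.6.

    Collapse each SCC to a single node, compute reachability on the
    condensed DAG, then expand: u reaches w iff u's SCC reaches w's SCC.
    (u, u) is included only when u lies on a cycle or has a self-loop.
    """
    nodes = {n for e in edges for n in e}
    adj, radj = defaultdict(list), defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
        radj[v].append(u)

    # Kosaraju pass 1: DFS finish order on the original graph.
    order, seen = [], set()
    def dfs(u):
        seen.add(u)
        for v in adj[u]:
            if v not in seen:
                dfs(v)
        order.append(u)
    for u in nodes:
        if u not in seen:
            dfs(u)

    # Kosaraju pass 2: sweep the reversed graph in reverse finish order.
    comp = {}
    for root in reversed(order):
        if root not in comp:
            comp[root], stack = root, [root]
            while stack:
                x = stack.pop()
                for y in radj[x]:
                    if y not in comp:
                        comp[y] = root
                        stack.append(y)

    # Reachability between SCCs on the condensed DAG (memoized DFS).
    cadj = defaultdict(set)
    for u, v in edges:
        if comp[u] != comp[v]:
            cadj[comp[u]].add(comp[v])
    reach = {}
    def creach(c):
        if c not in reach:
            reach[c] = {c}
            for d in cadj[c]:
                reach[c] = reach[c] | creach(d)
        return reach[c]

    members = defaultdict(set)
    for u in nodes:
        members[comp[u]].add(u)

    # Expand SCC-level reachability back to original node pairs.
    closure, edge_set = set(), set(edges)
    for u in nodes:
        for c in creach(comp[u]):
            for w in members[c]:
                if u != w:
                    closure.add((u, w))
                elif len(members[comp[u]]) > 1 or (u, u) in edge_set:
                    closure.add((u, u))  # u is on a cycle
    return closure
```

The payoff is that the expensive closure computation runs on the condensation, which on graphs with large SCCs can be far smaller than the original; GraphX's stronglyConnectedComponents would supply the reduction step in a distributed implementation.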



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org