Posted to issues@spark.apache.org by "Dongjoon Hyun (Jira)" <ji...@apache.org> on 2020/01/10 03:58:00 UTC
[jira] [Commented] (SPARK-30441) Improve the memory usage in StronglyConnectedComponents
[ https://issues.apache.org/jira/browse/SPARK-30441?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17012432#comment-17012432 ]
Dongjoon Hyun commented on SPARK-30441:
---------------------------------------
Hi, [~jmzhou]. Please don't set `Fixed Version`. We use that when the committers merge the PRs.
- https://spark.apache.org/contributing.html
Also, `New Feature` and `Improvement` issues should target the version of the `master` branch, because the Apache Spark community backports only bug fixes.
> Improve the memory usage in StronglyConnectedComponents
> -------------------------------------------------------
>
> Key: SPARK-30441
> URL: https://issues.apache.org/jira/browse/SPARK-30441
> Project: Spark
> Issue Type: Improvement
> Components: GraphX
> Affects Versions: 3.0.0
> Reporter: jiamuzhou
> Priority: Major
> Attachments: figure1.png, figure2.png
>
>
> Using StronglyConnectedComponents consumes a great deal of memory (see figure1.png), because the intermediate Graph/RDDs are not marked as non-persistent in a timely way during the iterative process. On a large graph this can lead to job failure.
> To improve memory usage, it is very important to mark the Graph/RDDs as non-persistent promptly. The current code unpersists only the 'sccGraph', but not the 'sccWorkGraph' produced in the degree step and the Pregel step.
> I have prepared an optimized code proposal (see my fork: [https://github.com/jmzhoulab/spark/blob/master/graphx/src/main/scala/org/apache/spark/graphx/lib/StronglyConnectedComponents.scala])
> The storage usage after optimization is shown in figure2.png.
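
For readers unfamiliar with the pattern the issue describes, the following is a minimal, hypothetical Scala sketch (not the actual patch) of how intermediate graphs are typically freed in an iterative GraphX algorithm. The generic `step` function and the name `sccWorkGraph` are illustrative, following the issue description; the key point is that each iteration's previous graph must be unpersisted once the new one is materialized.

```scala
import org.apache.spark.graphx.Graph

// Sketch: run `step` up to maxIter times, caching the new graph and
// unpersisting the previous one so only one intermediate graph stays
// in storage at a time.
def iterate[VD, ED](initial: Graph[VD, ED], maxIter: Int)
                   (step: Graph[VD, ED] => Graph[VD, ED]): Graph[VD, ED] = {
  var sccWorkGraph = initial.cache()
  for (_ <- 1 to maxIter) {
    val prev = sccWorkGraph
    sccWorkGraph = step(prev).cache()
    // Materialize the new graph before dropping the old one; otherwise
    // unpersisting `prev` could force recomputation of the lineage.
    sccWorkGraph.vertices.count()
    sccWorkGraph.edges.count()
    // This unpersist is the step the issue reports as missing for
    // 'sccWorkGraph' in the degree and Pregel steps.
    prev.unpersist(blocking = false)
  }
  sccWorkGraph
}
```

Without the `prev.unpersist(...)` call, every iteration's cached vertices and edges remain in the block manager until the job ends, which matches the growth shown in figure1.png.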
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org