Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2014/10/02 17:25:35 UTC
[jira] [Commented] (SPARK-3623) Graph should support the checkpoint operation
[ https://issues.apache.org/jira/browse/SPARK-3623?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14156664#comment-14156664 ]
Apache Spark commented on SPARK-3623:
-------------------------------------
User 'witgo' has created a pull request for this issue:
https://github.com/apache/spark/pull/2631
> Graph should support the checkpoint operation
> ---------------------------------------------
>
> Key: SPARK-3623
> URL: https://issues.apache.org/jira/browse/SPARK-3623
> Project: Spark
> Issue Type: Improvement
> Components: GraphX
> Affects Versions: 1.0.2, 1.1.0
> Reporter: Guoqiang Li
> Priority: Critical
>
> Consider the following code:
> {code}
> for (i <- 0 until totalIter) {
>   val previousCorpus = corpus
>   logInfo("Start Gibbs sampling (Iteration %d/%d)".format(i, totalIter))
>   val corpusTopicDist = collectTermTopicDist(corpus, globalTopicCounter, sumTerms,
>     numTerms, numTopics, alpha, beta).persist(storageLevel)
>   val corpusSampleTopics = sampleTopics(corpusTopicDist, globalTopicCounter, sumTerms,
>     numTerms, numTopics, alpha, beta).persist(storageLevel)
>   corpus = updateCounter(corpusSampleTopics, numTopics).persist(storageLevel)
>   globalTopicCounter = collectGlobalCounter(corpus, numTopics)
>   assert(bsum(globalTopicCounter) == sumTerms)
>   previousCorpus.unpersistVertices()
>   corpusTopicDist.unpersistVertices()
>   corpusSampleTopics.unpersistVertices()
> }
> {code}
> Without a checkpoint operation, the following problems arise:
> 1. The lineage of the corpus RDD grows too deep.
> 2. The shuffle files grow too large.
> 3. Any server crash forces the algorithm to recompute from scratch.
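The lineage problem above can be sketched without a Spark cluster. In the analogy below, the `Lineage` class is hypothetical and stands in for an RDD: every transformation adds one link to the dependency chain, and a checkpoint materializes the current value so the chain can be cut. A `Graph.checkpoint` in GraphX would truncate the underlying RDD lineage in the same way; this is only an illustration of the idea, not the proposed API.

```scala
// Hypothetical stand-in for an RDD: tracks how deep the dependency chain is.
case class Lineage(depth: Int, value: Int) {
  // Each transformation adds one dependency to the lineage.
  def map(f: Int => Int): Lineage = Lineage(depth + 1, f(value))
  // Checkpointing materializes the value and resets the lineage to zero.
  def checkpoint: Lineage = Lineage(0, value)
}

var data = Lineage(0, 1)
for (i <- 1 to 100) {
  data = data.map(_ + 1)
  if (i % 25 == 0) data = data.checkpoint // periodic checkpoint
}
println(data.depth) // 0, instead of 100 without checkpointing
println(data.value) // 101
```

Without the periodic checkpoint, `depth` would reach 100 after the loop; with it, recomputation after a failure only needs to replay the work since the last checkpoint.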
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org