Posted to issues@spark.apache.org by "Perinkulam I Ganesh (JIRA)" <ji...@apache.org> on 2015/06/26 23:27:05 UTC

[jira] [Commented] (SPARK-6830) Memoize frequently queried vals in RDD, such as numPartitions, count etc.

    [ https://issues.apache.org/jira/browse/SPARK-6830?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14603619#comment-14603619 ] 

Perinkulam I Ganesh commented on SPARK-6830:
--------------------------------------------

If we cache it locally within the RDD, it can be done as follows:

// Local cache for memoized values, keyed by name so other vals can share it.
private val mycache = scala.collection.mutable.Map.empty[String, Long]

// Runs the counting job only on the first call; later calls return the cached value.
def newcount(): Long =
  mycache.getOrElseUpdate("count", sc.runJob(this, Utils.getIteratorSize _).sum)
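For illustration only, here is a standalone sketch of the same memoization pattern outside of RDD (ExpensiveCollection and computeCount are made-up names, and the sleep just stands in for the Spark job):

import scala.collection.mutable

class ExpensiveCollection(data: Seq[Int]) {
  // Cache for memoized values, keyed by name.
  private val memo = mutable.Map.empty[String, Long]

  // Stand-in for a costly action such as running a distributed job.
  private def computeCount(): Long = {
    Thread.sleep(1000)
    data.size.toLong
  }

  // getOrElseUpdate evaluates computeCount() only on a cache miss.
  def count(): Long = memo.getOrElseUpdate("count", computeCount())
}

// Usage: the first call is slow, the second returns the cached value immediately.
val c = new ExpensiveCollection(1 to 10)
c.count()
c.count()

One thing to keep in mind if this goes into RDD itself: a plain mutable.Map is not thread-safe, so the real change would probably need a synchronized map or similar.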

Or do we need to modify the CacheManager code to cache these results along with the others?

thanks

> Memoize frequently queried vals in RDD, such as numPartitions, count etc.
> -------------------------------------------------------------------------
>
>                 Key: SPARK-6830
>                 URL: https://issues.apache.org/jira/browse/SPARK-6830
>             Project: Spark
>          Issue Type: Improvement
>          Components: SparkR
>            Reporter: Shivaram Venkataraman
>            Priority: Minor
>              Labels: Starter
>
> We should memoize frequently queried vals in RDD, such as numPartitions, count etc.
> While using SparkR in RStudio, the `count` function seems to be called frequently by the IDE, presumably to show some stats about the variables in the workspace. This is not great in SparkR because we trigger a job every time count is called.
> Memoization would help in this case, but we should also see if there is some better way to interact with RStudio.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org