Posted to commits@spark.apache.org by sr...@apache.org on 2016/04/24 11:36:40 UTC

spark git commit: [DOCS][MINOR] Screenshot + minor fixes to improve reading for accumulators

Repository: spark
Updated Branches:
  refs/heads/master db7113b1d -> 8df8a8182


[DOCS][MINOR] Screenshot + minor fixes to improve reading for accumulators

## What changes were proposed in this pull request?

Added screenshot + minor fixes to improve reading

## How was this patch tested?

Manual

Author: Jacek Laskowski <ja...@japila.pl>

Closes #12569 from jaceklaskowski/docs-accumulators.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/8df8a818
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/8df8a818
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/8df8a818

Branch: refs/heads/master
Commit: 8df8a81825709dbefe5aecd7642748c1b3a38e99
Parents: db7113b
Author: Jacek Laskowski <ja...@japila.pl>
Authored: Sun Apr 24 10:36:33 2016 +0100
Committer: Sean Owen <so...@cloudera.com>
Committed: Sun Apr 24 10:36:33 2016 +0100

----------------------------------------------------------------------
 docs/img/spark-webui-accumulators.png | Bin 0 -> 231065 bytes
 docs/programming-guide.md             |  18 ++++++++++++------
 2 files changed, 12 insertions(+), 6 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/8df8a818/docs/img/spark-webui-accumulators.png
----------------------------------------------------------------------
diff --git a/docs/img/spark-webui-accumulators.png b/docs/img/spark-webui-accumulators.png
new file mode 100644
index 0000000..237052d
Binary files /dev/null and b/docs/img/spark-webui-accumulators.png differ

http://git-wip-us.apache.org/repos/asf/spark/blob/8df8a818/docs/programming-guide.md
----------------------------------------------------------------------
diff --git a/docs/programming-guide.md b/docs/programming-guide.md
index 2f0ed5e..f398e38 100644
--- a/docs/programming-guide.md
+++ b/docs/programming-guide.md
@@ -1328,12 +1328,18 @@ value of the broadcast variable (e.g. if the variable is shipped to a new node l
 Accumulators are variables that are only "added" to through an associative and commutative operation and can
 therefore be efficiently supported in parallel. They can be used to implement counters (as in
 MapReduce) or sums. Spark natively supports accumulators of numeric types, and programmers
-can add support for new types. If accumulators are created with a name, they will be
+can add support for new types.
+
+If accumulators are created with a name, they will be
 displayed in Spark's UI. This can be useful for understanding the progress of
 running stages (NOTE: this is not yet supported in Python).
 
+<p style="text-align: center;">
+  <img src="img/spark-webui-accumulators.png" title="Accumulators in the Spark UI" alt="Accumulators in the Spark UI" />
+</p>
+
 An accumulator is created from an initial value `v` by calling `SparkContext.accumulator(v)`. Tasks
-running on the cluster can then add to it using the `add` method or the `+=` operator (in Scala and Python).
+running on a cluster can then add to it using the `add` method or the `+=` operator (in Scala and Python).
 However, they cannot read its value.
 Only the driver program can read the accumulator's value, using its `value` method.
 
@@ -1345,7 +1351,7 @@ The code below shows an accumulator being used to add up the elements of an arra
 
 {% highlight scala %}
 scala> val accum = sc.accumulator(0, "My Accumulator")
-accum: spark.Accumulator[Int] = 0
+accum: org.apache.spark.Accumulator[Int] = 0
 
 scala> sc.parallelize(Array(1, 2, 3, 4)).foreach(x => accum += x)
 ...
@@ -1466,11 +1472,11 @@ Accumulators do not change the lazy evaluation model of Spark. If they are being
 
 <div class="codetabs">
 
-<div data-lang="scala"  markdown="1">
+<div data-lang="scala" markdown="1">
 {% highlight scala %}
 val accum = sc.accumulator(0)
-data.map { x => accum += x; f(x) }
-// Here, accum is still 0 because no actions have caused the <code>map</code> to be computed.
+data.map { x => accum += x; x }
+// Here, accum is still 0 because no actions have caused the map operation to be computed.
 {% endhighlight %}
 </div>
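
The documentation changes above describe the accumulator contract: a value that is only "added" to through an associative and commutative operation, where tasks may add but only the driver reads the result. A minimal, Spark-free sketch of that contract is below; the `SimpleAccumulator` class is hypothetical, for illustration only, and is not Spark's `Accumulator` API:

```scala
// Hypothetical stand-in illustrating accumulator semantics (not Spark's API).
// The merge operation must be associative and commutative so that partial
// results arriving from tasks can be combined in any order.
class SimpleAccumulator[T](initial: T)(merge: (T, T) => T) {
  private var current: T = initial

  // Tasks may only add to the accumulator; they cannot read it.
  def add(term: T): Unit = synchronized { current = merge(current, term) }
  def +=(term: T): Unit = add(term)

  // Only the "driver" side reads the final value.
  def value: T = current
}

object AccumulatorDemo {
  def main(args: Array[String]): Unit = {
    val accum = new SimpleAccumulator(0)(_ + _)
    // Simulate tasks each adding an element, as in the guide's example.
    Array(1, 2, 3, 4).foreach(x => accum += x)
    println(accum.value) // 10
  }
}
```

Note that in real Spark, as the patched doc text stresses, the `map` example stays at 0 until an action runs: `data.map { x => accum += x; x }` builds a lazy transformation, and the adds only happen once something like `count()` forces computation.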
 

