Posted to commits@beam.apache.org by gi...@apache.org on 2021/03/13 18:04:24 UTC

[beam] branch asf-site updated: Publishing website 2021/03/13 18:03:37 at commit 153876f

This is an automated email from the ASF dual-hosted git repository.

git-site-role pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/beam.git


The following commit(s) were added to refs/heads/asf-site by this push:
     new 5f1c9c1  Publishing website 2021/03/13 18:03:37 at commit 153876f
5f1c9c1 is described below

commit 5f1c9c14fc67ee826b46c02ed2336aae1f64b9b3
Author: jenkins <bu...@apache.org>
AuthorDate: Sat Mar 13 18:03:37 2021 +0000

    Publishing website 2021/03/13 18:03:37 at commit 153876f
---
 website/generated-content/contribute/index.xml                      | 2 +-
 website/generated-content/contribute/release-guide/index.html       | 4 ++--
 website/generated-content/documentation/runners/spark/index.html    | 6 +++---
 .../documentation/sdks/java/testing/nexmark/index.html              | 4 ++--
 website/generated-content/sitemap.xml                               | 2 +-
 5 files changed, 9 insertions(+), 9 deletions(-)

diff --git a/website/generated-content/contribute/index.xml b/website/generated-content/contribute/index.xml
index e2cf503..762515e 100644
--- a/website/generated-content/contribute/index.xml
+++ b/website/generated-content/contribute/index.xml
@@ -1231,7 +1231,7 @@ In case of script failure, you can still run all of them manually.&lt;/p>
 -Prepourl=https://repository.apache.org/content/repositories/orgapachebeam-${KEY} \
 -Pver=${RELEASE_VERSION}
 &lt;/code>&lt;/pre>&lt;p>&lt;strong>Spark Local Runner&lt;/strong>&lt;/p>
-&lt;pre>&lt;code>./gradlew :runners:spark:runQuickstartJavaSpark \
+&lt;pre>&lt;code>./gradlew :runners:spark:2:runQuickstartJavaSpark \
 -Prepourl=https://repository.apache.org/content/repositories/orgapachebeam-${KEY} \
 -Pver=${RELEASE_VERSION}
 &lt;/code>&lt;/pre>&lt;p>&lt;strong>Dataflow Runner&lt;/strong>&lt;/p>
diff --git a/website/generated-content/contribute/release-guide/index.html b/website/generated-content/contribute/release-guide/index.html
index 7591415..c80ee86 100644
--- a/website/generated-content/contribute/release-guide/index.html
+++ b/website/generated-content/contribute/release-guide/index.html
@@ -309,7 +309,7 @@ In case of script failure, you can still run all of them manually.</p><h4 id=run
 </code></pre><p><strong>Flink Local Runner</strong></p><pre><code>./gradlew :runners:flink:1.10:runQuickstartJavaFlinkLocal \
 -Prepourl=https://repository.apache.org/content/repositories/orgapachebeam-${KEY} \
 -Pver=${RELEASE_VERSION}
-</code></pre><p><strong>Spark Local Runner</strong></p><pre><code>./gradlew :runners:spark:runQuickstartJavaSpark \
+</code></pre><p><strong>Spark Local Runner</strong></p><pre><code>./gradlew :runners:spark:2:runQuickstartJavaSpark \
 -Prepourl=https://repository.apache.org/content/repositories/orgapachebeam-${KEY} \
 -Pver=${RELEASE_VERSION}
 </code></pre><p><strong>Dataflow Runner</strong></p><pre><code>./gradlew :runners:google-cloud-dataflow-java:runQuickstartJavaDataflow \
@@ -425,7 +425,7 @@ If you end up getting permissions errors ask on the mailing list for assistance.
 Ask other contributors to do the same.</p><p>Also, update <a href=https://en.wikipedia.org/wiki/Apache_Beam>the Wikipedia article on Apache Beam</a>.</p><h3 id=checklist-to-declare-the-process-completed>Checklist to declare the process completed</h3><ol><li>Release announced on the user@ mailing list.</li><li>Blog post published, if applicable.</li><li>Release recorded in reporter.apache.org.</li><li>Release announced on social media.</li><li>Completion declared on the dev@ mailing list. [...]
 Once you’ve finished the release, please take a step back and look at which areas of this process can be improved. Perhaps some part of the process can be simplified.
 Perhaps parts of this guide can be clarified.</p><p>If we have specific ideas, please start a discussion on the dev@ mailing list and/or propose a pull request to update this guide.
-Thanks!</p><div class=feedback><p class=update>Last updated on 2021/02/02</p><h3>Have you found everything you were looking for?</h3><p class=description>Was it all useful and clear? Is there anything that you would like to change? Let us know!</p><button class=load-button><a href="mailto:dev@beam.apache.org?subject=Beam Website Feedback">SEND FEEDBACK</a></button></div></div></div><footer class=footer><div class=footer__contained><div class=footer__cols><div class="footer__cols__col foo [...]
+Thanks!</p><div class=feedback><p class=update>Last updated on 2021/01/22</p><h3>Have you found everything you were looking for?</h3><p class=description>Was it all useful and clear? Is there anything that you would like to change? Let us know!</p><button class=load-button><a href="mailto:dev@beam.apache.org?subject=Beam Website Feedback">SEND FEEDBACK</a></button></div></div></div><footer class=footer><div class=footer__contained><div class=footer__cols><div class="footer__cols__col foo [...]
 <a href=http://www.apache.org>The Apache Software Foundation</a>
 | <a href=/privacy_policy>Privacy Policy</a>
 | <a href=/feed.xml>RSS Feed</a><br><br>Apache Beam, Apache, Beam, the Beam logo, and the Apache feather logo are either registered trademarks or trademarks of The Apache Software Foundation. All other products or name brands are trademarks of their respective holders, including The Apache Software Foundation.</div></div></div></div></footer></body></html>
\ No newline at end of file
diff --git a/website/generated-content/documentation/runners/spark/index.html b/website/generated-content/documentation/runners/spark/index.html
index a396364..0ecd032 100644
--- a/website/generated-content/documentation/runners/spark/index.html
+++ b/website/generated-content/documentation/runners/spark/index.html
@@ -77,7 +77,7 @@ the portable Runner. For more information on portability, please visit the
 Apache Beam with Python you have to install the Apache Beam Python SDK: <code>pip install apache_beam</code>. Please refer to the <a href=/documentation/sdks/python/>Python documentation</a>
 on how to create a Python pipeline.</p><div class="language-py snippet"><div class="notebook-skip code-snippet"><a class=copy type=button data-bs-toggle=tooltip data-bs-placement=bottom title="Copy to clipboard"><img src=/images/copy-icon.svg></a><div class=highlight><pre class=chroma><code class=language-py data-lang=py><span class=n>pip</span> <span class=n>install</span> <span class=n>apache_beam</span></code></pre></div></div></div><p class=language-py>Starting from Beam 2.20.0, pre- [...]
 <a href=https://hub.docker.com/r/apache/beam_spark_job_server>Docker Hub</a>.</p><p class=language-py>For older Beam versions, you will need a copy of Apache Beam&rsquo;s source code. You can
-download it on the <a href=/get-started/downloads/>Downloads page</a>.</p><p class=language-py><ol><li>Start the JobService endpoint:<ul><li>with Docker (preferred): <code>docker run --net=host apache/beam_spark_job_server:latest</code></li><li>or from Beam source code: <code>./gradlew :runners:spark:job-server:runShadow</code></li></ul></li></ol></p><p class=language-py>The JobService is the central instance where you submit your Beam pipeline.
+download it on the <a href=/get-started/downloads/>Downloads page</a>.</p><p class=language-py><ol><li>Start the JobService endpoint:<ul><li>with Docker (preferred): <code>docker run --net=host apache/beam_spark_job_server:latest</code></li><li>or from Beam source code: <code>./gradlew :runners:spark:2:job-server:runShadow</code></li></ul></li></ol></p><p class=language-py>The JobService is the central instance where you submit your Beam pipeline.
 The JobService will create a Spark job for the pipeline and execute the
 job. To execute the job on a Spark cluster, the Beam JobService needs to be
 provided with the Spark master address.</p><p class=language-py><ol start=2><li>Submit the Python pipeline to the above endpoint by using the <code>PortableRunner</code>, <code>job_endpoint</code> set to <code>localhost:8099</code> (this is the default address of the JobService), and <code>environment_type</code> set to <code>LOOPBACK</code>. For example:</li></ol></p><div class="language-py snippet"><div class="notebook-skip code-snippet"><a class=copy type=button data-bs-toggle=tooltip [...]
@@ -90,7 +90,7 @@ provided with the Spark master address.</p><p class=language-py><ol start=2><li>
 <span class=p>])</span>
 <span class=k>with</span> <span class=n>beam</span><span class=o>.</span><span class=n>Pipeline</span><span class=p>(</span><span class=n>options</span><span class=p>)</span> <span class=k>as</span> <span class=n>p</span><span class=p>:</span>
     <span class=o>...</span></code></pre></div></div></div><h3 id=running-on-a-pre-deployed-spark-cluster>Running on a pre-deployed Spark cluster</h3><p>Deploying your Beam pipeline on a cluster that already has a Spark deployment (Spark classes are available in container classpath) does not require any additional dependencies.
-For more details on the different deployment modes see: <a href=https://spark.apache.org/docs/latest/spark-standalone.html>Standalone</a>, <a href=https://spark.apache.org/docs/latest/running-on-yarn.html>YARN</a>, or <a href=https://spark.apache.org/docs/latest/running-on-mesos.html>Mesos</a>.</p><p class=language-py><ol><li>Start a Spark cluster which exposes the master on port 7077 by default.</li></ol></p><p class=language-py><ol start=2><li>Start JobService that will connect with th [...]
+For more details on the different deployment modes see: <a href=https://spark.apache.org/docs/latest/spark-standalone.html>Standalone</a>, <a href=https://spark.apache.org/docs/latest/running-on-yarn.html>YARN</a>, or <a href=https://spark.apache.org/docs/latest/running-on-mesos.html>Mesos</a>.</p><p class=language-py><ol><li>Start a Spark cluster which exposes the master on port 7077 by default.</li></ol></p><p class=language-py><ol start=2><li>Start JobService that will connect with th [...]
 Note however that <code>environment_type=LOOPBACK</code> is only intended for local testing.
 See <a href=/roadmap/portability/#sdk-harness-config>here</a> for details.</li></ol></p><p class=language-py>(Note that, depending on your cluster setup, you may need to change the <code>environment_type</code> option.
 See <a href=/roadmap/portability/#sdk-harness-config>here</a> for details.)</p><h2 id=pipeline-options-for-the-spark-runner>Pipeline options for the Spark Runner</h2><p>When executing your pipeline with the Spark Runner, you should consider the following pipeline options.</p><p class=language-java><br><b>For RDD/DStream based runner:</b><br></p><table class="language-java table table-bordered"><tr><th>Field</th><th>Description</th><th>Default Value</th></tr><tr><td><code>runner</code></t [...]
@@ -99,7 +99,7 @@ Passing any of the above mentioned options could be done as one of the <code>app
 For more on how to generally use <code>spark-submit</code> checkout Spark <a href=https://spark.apache.org/docs/latest/submitting-applications.html#launching-applications-with-spark-submit>documentation</a>.</p><h3 id=monitoring-your-job>Monitoring your job</h3><p>You can monitor a running Spark job using the Spark <a href=https://spark.apache.org/docs/latest/monitoring.html#web-interfaces>Web Interfaces</a>. By default, this is available at port <code>4040</code> on the driver node. If  [...]
 Spark also has a history server to <a href=https://spark.apache.org/docs/latest/monitoring.html#viewing-after-the-fact>view after the fact</a>.<p class=language-java>Metrics are also available via <a href=https://spark.apache.org/docs/latest/monitoring.html#rest-api>REST API</a>.
 Spark provides a <a href=https://spark.apache.org/docs/latest/monitoring.html#metrics>metrics system</a> that allows reporting Spark metrics to a variety of Sinks. The Spark runner reports user-defined Beam Aggregators using this same metrics system and currently supports <code>GraphiteSink</code> and <code>CSVSink</code>, and providing support for additional Sinks supported by Spark is easy and straight-forward.</p><p class=language-py>Spark metrics are not yet supported on the portable [...]
-Instead, you should use <code>SparkContextOptions</code> which can only be used programmatically and is not a common <code>PipelineOptions</code> implementation.<br><br><b>For Structured Streaming based runner:</b><br>Provided SparkSession and StreamingListeners are not supported on the Spark Structured Streaming runner</p><p class=language-py>Provided SparkContext and StreamingListeners are not supported on the Spark portable runner.</p><div class=feedback><p class=update>Last updated o [...]
+Instead, you should use <code>SparkContextOptions</code> which can only be used programmatically and is not a common <code>PipelineOptions</code> implementation.<br><br><b>For Structured Streaming based runner:</b><br>Provided SparkSession and StreamingListeners are not supported on the Spark Structured Streaming runner</p><p class=language-py>Provided SparkContext and StreamingListeners are not supported on the Spark portable runner.</p><div class=feedback><p class=update>Last updated o [...]
 <a href=http://www.apache.org>The Apache Software Foundation</a>
 | <a href=/privacy_policy>Privacy Policy</a>
 | <a href=/feed.xml>RSS Feed</a><br><br>Apache Beam, Apache, Beam, the Beam logo, and the Apache feather logo are either registered trademarks or trademarks of The Apache Software Foundation. All other products or name brands are trademarks of their respective holders, including The Apache Software Foundation.</div></div></div></div></footer></body></html>
\ No newline at end of file
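The Spark runner page changed above documents starting the JobService (from source now via `./gradlew :runners:spark:2:job-server:runShadow`) and then submitting a Python pipeline with the `PortableRunner`, `job_endpoint` set to `localhost:8099`, and `environment_type` set to `LOOPBACK`. A minimal sketch of the option list those instructions describe, kept as plain strings so it stands alone without an Apache Beam install (the option names come from the page; the helper function is hypothetical):

```python
# Sketch of the PortableRunner options the Spark runner page describes.
# The option names (--runner, --job_endpoint, --environment_type) are from
# the documentation in the diff; the helper itself is illustrative only.
def portable_runner_args(job_endpoint: str = "localhost:8099") -> list[str]:
    return [
        "--runner=PortableRunner",
        f"--job_endpoint={job_endpoint}",  # default address of the JobService
        "--environment_type=LOOPBACK",     # intended for local testing only
    ]

args = portable_runner_args()
```

In an actual pipeline these strings would be passed to `PipelineOptions` and then to `beam.Pipeline(options)`, as the rendered snippet in the page shows; note the page's caveat that `LOOPBACK` is only intended for local testing.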
diff --git a/website/generated-content/documentation/sdks/java/testing/nexmark/index.html b/website/generated-content/documentation/sdks/java/testing/nexmark/index.html
index 389b7c6..5ad2d77 100644
--- a/website/generated-content/documentation/sdks/java/testing/nexmark/index.html
+++ b/website/generated-content/documentation/sdks/java/testing/nexmark/index.html
@@ -125,7 +125,7 @@ SMOKE suite can make sure there is nothing broken in the Nexmark suite.</p><p>Ba
 </code></pre><h3 id=running-smoke-suite-on-the-sparkrunner-local>Running SMOKE suite on the SparkRunner (local)</h3><p>The SparkRunner is special-cased in the Nexmark gradle launch. The task will
 provide the version of Spark that the SparkRunner is built against, and
 configure logging.</p><p>Batch Mode:</p><pre><code>./gradlew :sdks:java:testing:nexmark:run \
-    -Pnexmark.runner=&quot;:runners:spark&quot; \
+    -Pnexmark.runner=&quot;:runners:spark:2&quot; \
     -Pnexmark.args=&quot;
         --runner=SparkRunner
         --suite=SMOKE
@@ -134,7 +134,7 @@ configure logging.</p><p>Batch Mode:</p><pre><code>./gradlew :sdks:java:testing:
         --manageResources=false
         --monitorJobs=true&quot;
 </code></pre><p>Streaming Mode:</p><pre><code>./gradlew :sdks:java:testing:nexmark:run \
-    -Pnexmark.runner=&quot;:runners:spark&quot; \
+    -Pnexmark.runner=&quot;:runners:spark:2&quot; \
     -Pnexmark.args=&quot;
         --runner=SparkRunner
         --suite=SMOKE
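The Nexmark hunks make the same path update: `-Pnexmark.runner` now points at `:runners:spark:2` for both the batch and streaming SMOKE-suite runs. A sketch of how the invocation fits together (the helper is hypothetical; the `-Pnexmark.runner`/`-Pnexmark.args` properties and the SMOKE-suite flags are from the diff, while the `--streaming` flag for the streaming variant is an assumption, since that part of the args is not shown above):

```python
# Assembles the Nexmark SMOKE-suite invocation from the diff for the
# SparkRunner. The -Pnexmark.runner / -Pnexmark.args properties and the
# visible flags come from the source; the helper, its defaults, and the
# --streaming flag are illustrative assumptions.
def nexmark_smoke_cmd(streaming: bool = False, spark_major: int = 2) -> list[str]:
    nexmark_args = " ".join([
        "--runner=SparkRunner",
        "--suite=SMOKE",
        f"--streaming={'true' if streaming else 'false'}",  # assumed flag
        "--manageResources=false",
        "--monitorJobs=true",
    ])
    return [
        "./gradlew", ":sdks:java:testing:nexmark:run",
        f'-Pnexmark.runner=":runners:spark:{spark_major}"',
        f'-Pnexmark.args="{nexmark_args}"',
    ]

batch = nexmark_smoke_cmd(streaming=False)
```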
diff --git a/website/generated-content/sitemap.xml b/website/generated-content/sitemap.xml
index 2e8815d..2bc720f 100644
--- a/website/generated-content/sitemap.xml
+++ b/website/generated-content/sitemap.xml
@@ -1 +1 @@
-<?xml version="1.0" encoding="utf-8" standalone="yes"?><urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" xmlns:xhtml="http://www.w3.org/1999/xhtml"><url><loc>/blog/beam-2.28.0/</loc><lastmod>2021-02-22T11:40:20-08:00</lastmod></url><url><loc>/categories/blog/</loc><lastmod>2021-02-22T11:40:20-08:00</lastmod></url><url><loc>/blog/</loc><lastmod>2021-02-22T11:40:20-08:00</lastmod></url><url><loc>/categories/</loc><lastmod>2021-02-23T13:40:55+01:00</lastmod></url><url><loc>/blog/k [...]
\ No newline at end of file
+<?xml version="1.0" encoding="utf-8" standalone="yes"?><urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" xmlns:xhtml="http://www.w3.org/1999/xhtml"><url><loc>/blog/beam-2.28.0/</loc><lastmod>2021-02-22T11:40:20-08:00</lastmod></url><url><loc>/categories/blog/</loc><lastmod>2021-02-22T11:40:20-08:00</lastmod></url><url><loc>/blog/</loc><lastmod>2021-02-22T11:40:20-08:00</lastmod></url><url><loc>/categories/</loc><lastmod>2021-02-23T13:40:55+01:00</lastmod></url><url><loc>/blog/k [...]
\ No newline at end of file