Posted to commits@flink.apache.org by uc...@apache.org on 2017/02/20 14:44:42 UTC

[1/4] flink-web git commit: Use global variable on links to documentation

Repository: flink-web
Updated Branches:
  refs/heads/asf-site 2764a5116 -> 5926684a3


Use global variable on links to documentation


Project: http://git-wip-us.apache.org/repos/asf/flink-web/repo
Commit: http://git-wip-us.apache.org/repos/asf/flink-web/commit/346e66eb
Tree: http://git-wip-us.apache.org/repos/asf/flink-web/tree/346e66eb
Diff: http://git-wip-us.apache.org/repos/asf/flink-web/diff/346e66eb

Branch: refs/heads/asf-site
Commit: 346e66eb159c2f463ac1e6f2f93ce5a9d000a31c
Parents: 6b8c017
Author: wints <mw...@gmail.com>
Authored: Tue Feb 14 13:58:44 2017 +0100
Committer: Ufuk Celebi <uc...@apache.org>
Committed: Mon Feb 20 15:44:16 2017 +0100

----------------------------------------------------------------------
 ecosystem.md    | 18 +++++++++---------
 introduction.md | 14 +++++++-------
 2 files changed, 16 insertions(+), 16 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/flink-web/blob/346e66eb/ecosystem.md
----------------------------------------------------------------------
diff --git a/ecosystem.md b/ecosystem.md
index ce21fd2..19069fd 100644
--- a/ecosystem.md
+++ b/ecosystem.md
@@ -14,15 +14,15 @@ many other data processing projects and frameworks.
 <p>Currently these systems are supported:</p>
 
 <ul>
-  <li><a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/connectors/kafka.html" target="_blank">Apache Kafka</a> (sink/source)</li>
-  <li><a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/connectors/elasticsearch.html" target="_blank">Elasticsearch</a> (sink)</li>
-  <li><a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/connectors/elasticsearch2.html" target="_blank">Elasticsearch 2.x</a> (sink)</li>
-  <li><a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/connectors/filesystem_sink.html" target="_blank">HDFS</a> (sink)</li>
-  <li><a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/connectors/rabbitmq.html" target="_blank">RabbitMQ</a> (sink/source)</li>
-  <li><a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/connectors/kinesis.html" target="_blank">Amazon Kinesis Streams</a> (sink/source)</li>
-  <li><a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/connectors/twitter.html" target="_blank">Twitter</a> (source)</li>
-  <li><a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/connectors/nifi.html" target="_blank">Apache NiFi</a> (sink/source)</li>
-  <li><a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/connectors/cassandra.html" target="_blank">Apache Cassandra</a> (sink)</li>
+  <li><a href="{{site.docs-stable}}/dev/connectors/kafka.html" target="_blank">Apache Kafka</a> (sink/source)</li>
+  <li><a href="{{site.docs-stable}}/dev/connectors/elasticsearch.html" target="_blank">Elasticsearch</a> (sink)</li>
+  <li><a href="{{site.docs-stable}}/dev/connectors/elasticsearch2.html" target="_blank">Elasticsearch 2.x</a> (sink)</li>
+  <li><a href="{{site.docs-stable}}/dev/connectors/filesystem_sink.html" target="_blank">HDFS</a> (sink)</li>
+  <li><a href="{{site.docs-stable}}/dev/connectors/rabbitmq.html" target="_blank">RabbitMQ</a> (sink/source)</li>
+  <li><a href="{{site.docs-stable}}/dev/connectors/kinesis.html" target="_blank">Amazon Kinesis Streams</a> (sink/source)</li>
+  <li><a href="{{site.docs-stable}}/dev/connectors/twitter.html" target="_blank">Twitter</a> (source)</li>
+  <li><a href="{{site.docs-stable}}/dev/connectors/nifi.html" target="_blank">Apache NiFi</a> (sink/source)</li>
+  <li><a href="{{site.docs-stable}}/dev/connectors/cassandra.html" target="_blank">Apache Cassandra</a> (sink)</li>
   <li><a href="https://github.com/apache/bahir-flink" target="_blank">Redis, Flume, and ActiveMQ (via Apache Bahir)</a> (sink)</li>
 </ul>
 

http://git-wip-us.apache.org/repos/asf/flink-web/blob/346e66eb/introduction.md
----------------------------------------------------------------------
diff --git a/introduction.md b/introduction.md
index 2011f87..9ff8235 100644
--- a/introduction.md
+++ b/introduction.md
@@ -2,7 +2,7 @@
 title: "Introduction to Apache Flink�"
 ---
 <br>
-Below is a high-level overview of Apache Flink and stream processing. For a more technical introduction, we recommend the <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/concepts/programming-model.html" target="_blank">"Concepts" page</a> in the Flink documentation.
+Below is a high-level overview of Apache Flink and stream processing. For a more technical introduction, we recommend the <a href="{{site.docs-stable}}/concepts/programming-model.html" target="_blank">"Concepts" page</a> in the Flink documentation.
 <br>
 {% toc %}
 
@@ -96,13 +96,13 @@ Flink’s core is a distributed streaming dataflow engine, meaning that data is
 
 ### APIs
 
-+ Flink’s <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/datastream_api.html" target="_blank">DataStream API</a> is for programs that implement transformations on data streams (e.g., filtering, updating state, defining windows, aggregating).
-+ The <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/batch/index.html" target="_blank">DataSet API</a> is for programs that implement transformations on data sets (e.g., filtering, mapping, joining, grouping).
-+ The <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/table_api.html#table-api" target="_blank">Table API</a> is a SQL-like expression language for relational stream and batch processing that can be easily embedded in Flink\u2019s DataSet and DataStream APIs (Java and Scala).
-+ <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/table_api.html#sql" target="_blank">Streaming SQL</a> enables SQL queries to be executed on streaming and batch tables. The syntax is based on <a href="https://calcite.apache.org/docs/stream.html" target="_blank">Apache Calcite\u2122</a>.
++ Flink\u2019s <a href="{{site.docs-stable}}/dev/datastream_api.html" target="_blank">DataStream API</a> is for programs that implement transformations on data streams (e.g., filtering, updating state, defining windows, aggregating).
++ The <a href="{{site.docs-stable}}/dev/batch/index.html" target="_blank">DataSet API</a> is for programs that implement transformations on data sets (e.g., filtering, mapping, joining, grouping).
++ The <a href="{{site.docs-stable}}/dev/table_api.html#table-api" target="_blank">Table API</a> is a SQL-like expression language for relational stream and batch processing that can be easily embedded in Flink\u2019s DataSet and DataStream APIs (Java and Scala).
++ <a href="{{site.docs-stable}}/dev/table_api.html#sql" target="_blank">Streaming SQL</a> enables SQL queries to be executed on streaming and batch tables. The syntax is based on <a href="https://calcite.apache.org/docs/stream.html" target="_blank">Apache Calcite\u2122</a>.
 
 ### Libraries
-Flink also includes special-purpose libraries for <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/libs/cep.html" target="_blank">complex event processing</a>, <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/libs/ml/index.html" target="_blank">machine learning</a>, <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/libs/gelly/index.html" target="_blank">graph processing</a>, and <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/libs/storm_compatibility.html" target="_blank">Apache Storm compatibility</a>.
+Flink also includes special-purpose libraries for <a href="{{site.docs-stable}}/dev/libs/cep.html" target="_blank">complex event processing</a>, <a href="{{site.docs-stable}}/dev/libs/ml/index.html" target="_blank">machine learning</a>, <a href="{{site.docs-stable}}/dev/libs/gelly/index.html" target="_blank">graph processing</a>, and <a href="{{site.docs-stable}}/dev/libs/storm_compatibility.html" target="_blank">Apache Storm compatibility</a>.
 
 ## Flink and other frameworks
 
@@ -120,6 +120,6 @@ If you’re interested in learning more, we’ve collected [information about th
 
 ## Key Takeaways and Next Steps
 
-In summary, Apache Flink is an open-source stream processing framework that eliminates the "performance vs. reliability" tradeoff often associated with open-source streaming engines and performs consistently in both categories. Following this introduction, we recommend you try our <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/quickstart/setup_quickstart.html" target="_blank">quickstart</a>, [download]({{ site.baseurl }}/downloads.html) the most recent stable version of Flink, or review the <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/index.html" target="_blank">documentation</a>.
+In summary, Apache Flink is an open-source stream processing framework that eliminates the "performance vs. reliability" tradeoff often associated with open-source streaming engines and performs consistently in both categories. Following this introduction, we recommend you try our <a href="{{site.docs-stable}}/quickstart/setup_quickstart.html" target="_blank">quickstart</a>, [download]({{ site.baseurl }}/downloads.html) the most recent stable version of Flink, or review the <a href="{{site.docs-stable}}/index.html" target="_blank">documentation</a>.
 
 And we encourage you to join the Flink user mailing list and to share your questions with the community. We’re here to help you get the most out of Flink.
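
A note on the change above: {{site.docs-stable}} is a Jekyll/Liquid site variable, so the stable documentation base URL is defined in one place instead of being hardcoded in every link. A minimal sketch of how such a variable could be declared in the site's _config.yml (the key name matches the usage above, but the exact value shown here is an assumption for illustration, not part of this commit):

    # _config.yml (hypothetical excerpt): single source of truth for the stable docs URL
    docs-stable: "https://ci.apache.org/projects/flink/flink-docs-release-1.2"

With that in place, a link such as <a href="{{site.docs-stable}}/dev/connectors/kafka.html"> renders against the release-1.2 docs, and pointing the site at a newer stable release only requires updating this single value.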


[2/4] flink-web git commit: Update Connectors section on 'Ecosystem' page to point to 1.2 docs

Posted by uc...@apache.org.
Update Connectors section on 'Ecosystem' page to point to 1.2 docs


Project: http://git-wip-us.apache.org/repos/asf/flink-web/repo
Commit: http://git-wip-us.apache.org/repos/asf/flink-web/commit/6b8c0172
Tree: http://git-wip-us.apache.org/repos/asf/flink-web/tree/6b8c0172
Diff: http://git-wip-us.apache.org/repos/asf/flink-web/diff/6b8c0172

Branch: refs/heads/asf-site
Commit: 6b8c01728f7806f85d6ba6c20043b5ca9d3f20b0
Parents: 6ea1a54
Author: wints <mw...@gmail.com>
Authored: Fri Feb 10 15:34:00 2017 +0100
Committer: Ufuk Celebi <uc...@apache.org>
Committed: Mon Feb 20 15:44:16 2017 +0100

----------------------------------------------------------------------
 ecosystem.md | 22 +++++++++++-----------
 1 file changed, 11 insertions(+), 11 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/flink-web/blob/6b8c0172/ecosystem.md
----------------------------------------------------------------------
diff --git a/ecosystem.md b/ecosystem.md
index e29d3f2..ce21fd2 100644
--- a/ecosystem.md
+++ b/ecosystem.md
@@ -14,16 +14,16 @@ many other data processing projects and frameworks.
 <p>Currently these systems are supported:</p>
 
 <ul>
-  <li><a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/streaming/connectors/kafka.html" target="_blank">Apache Kafka</a> (sink/source)</li>
-  <li><a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/streaming/connectors/elasticsearch.html" target="_blank">Elasticsearch</a> (sink)</li>
-  <li><a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/streaming/connectors/elasticsearch2.html" target="_blank">Elasticsearch 2x</a> (sink)</li>
-  <li><a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/streaming/connectors/filesystem_sink.html" target="_blank">HDFS</a> (sink)</li>
-  <li><a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/streaming/connectors/rabbitmq.html" target="_blank">RabbitMQ</a> (sink/source)</li>
-  <li><a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/streaming/connectors/kinesis.html" target="_blank">Amazon Kinesis Streams</a> (sink/source)</li>
-  <li><a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/streaming/connectors/twitter.html" target="_blank">Twitter Streaming API</a> (source)</li>
-  <li><a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/streaming/connectors/nifi.html" target="_blank">Apache NiFi</a> (sink/source)</li>
-  <li><a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/streaming/connectors/cassandra.html" target="_blank">Apache Cassandra</a> (sink)</li>
-  <li><a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/streaming/connectors/redis.html" target="_blank">Redis</a> (sink)</li>
+  <li><a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/connectors/kafka.html" target="_blank">Apache Kafka</a> (sink/source)</li>
+  <li><a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/connectors/elasticsearch.html" target="_blank">Elasticsearch</a> (sink)</li>
+  <li><a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/connectors/elasticsearch2.html" target="_blank">Elasticsearch 2.x</a> (sink)</li>
+  <li><a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/connectors/filesystem_sink.html" target="_blank">HDFS</a> (sink)</li>
+  <li><a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/connectors/rabbitmq.html" target="_blank">RabbitMQ</a> (sink/source)</li>
+  <li><a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/connectors/kinesis.html" target="_blank">Amazon Kinesis Streams</a> (sink/source)</li>
+  <li><a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/connectors/twitter.html" target="_blank">Twitter</a> (source)</li>
+  <li><a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/connectors/nifi.html" target="_blank">Apache NiFi</a> (sink/source)</li>
+  <li><a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/connectors/cassandra.html" target="_blank">Apache Cassandra</a> (sink)</li>
+  <li><a href="https://github.com/apache/bahir-flink" target="_blank">Redis, Flume, and ActiveMQ (via Apache Bahir)</a> (sink)</li>
 </ul>
 
 To run an application using one of these connectors, additional third party
@@ -42,7 +42,7 @@ Please let us know on the [user/dev mailing list](#mailing-lists).
 
 **Apache Zeppelin**
 
-[Apache Zeppelin (incubator)](https://zeppelin.incubator.apache.org/) is a web-based notebook that enables interactive data analytics and can be used with
+[Apache Zeppelin](https://zeppelin.incubator.apache.org/) is a web-based notebook that enables interactive data analytics and can be used with
 [Flink as an execution engine](https://zeppelin.incubator.apache.org/docs/interpreter/flink.html) (next to others engines).
 See also Jim Dowling's [Flink Forward talk](http://www.slideshare.net/FlinkForward/jim-dowling-interactive-flink-analytics-with-hopsworks-and-zeppelin) about Zeppelin on Flink.
 


[3/4] flink-web git commit: Update links in 'Introduction' page and add 1.2.0 to All Releases on 'Downloads' page

Posted by uc...@apache.org.
Update links in 'Introduction' page  and add 1.2.0 to All Releases on 'Downloads' page


Project: http://git-wip-us.apache.org/repos/asf/flink-web/repo
Commit: http://git-wip-us.apache.org/repos/asf/flink-web/commit/6ea1a54a
Tree: http://git-wip-us.apache.org/repos/asf/flink-web/tree/6ea1a54a
Diff: http://git-wip-us.apache.org/repos/asf/flink-web/diff/6ea1a54a

Branch: refs/heads/asf-site
Commit: 6ea1a54ab07f17cfd686ce47fcde560feebb023d
Parents: 2764a51
Author: wints <mw...@gmail.com>
Authored: Fri Feb 10 15:02:20 2017 +0100
Committer: Ufuk Celebi <uc...@apache.org>
Committed: Mon Feb 20 15:44:16 2017 +0100

----------------------------------------------------------------------
 downloads.md    |  1 +
 introduction.md | 14 +++++++-------
 2 files changed, 8 insertions(+), 7 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/flink-web/blob/6ea1a54a/downloads.md
----------------------------------------------------------------------
diff --git a/downloads.md b/downloads.md
index e47107e..0e6cce7 100644
--- a/downloads.md
+++ b/downloads.md
@@ -99,6 +99,7 @@ You can add the following dependencies to your `pom.xml` to include Apache Flink
 
 ## All releases
 
+- Flink 1.2.0 - 2017-02-06 ([Source](http://www.apache.org/dyn/closer.lua/flink/flink-1.2.0/flink-1.2.0-src.tgz), [Binaries](http://archive.apache.org/dist/flink/flink-1.2.0/), [Docs]({{site.DOCS_BASE_URL}}flink-docs-release-1.2/), [Javadocs]({{site.DOCS_BASE_URL}}flink-docs-release-1.2/api/java), [ScalaDocs]({{site.DOCS_BASE_URL}}flink-docs-release-1.2/api/scala/index.html))
 - Flink 1.1.4 - 2016-12-21 ([Source](http://archive.apache.org/dist/flink/flink-1.1.4/flink-1.1.4-src.tgz), [Binaries](http://archive.apache.org/dist/flink/flink-1.1.4/), [Docs]({{site.DOCS_BASE_URL}}flink-docs-release-1.1/), [Javadocs]({{site.DOCS_BASE_URL}}flink-docs-release-1.1/api/java), [ScalaDocs]({{site.DOCS_BASE_URL}}flink-docs-release-1.1/api/scala/index.html))
 - Flink 1.1.3 - 2016-10-13 ([Source](http://archive.apache.org/dist/flink/flink-1.1.3/flink-1.1.3-src.tgz), [Binaries](http://archive.apache.org/dist/flink/flink-1.1.3/), [Docs]({{site.DOCS_BASE_URL}}flink-docs-release-1.1/), [Javadocs]({{site.DOCS_BASE_URL}}flink-docs-release-1.1/api/java), [ScalaDocs]({{site.DOCS_BASE_URL}}flink-docs-release-1.1/api/scala/index.html))
 - Flink 1.1.2 - 2016-09-05 ([Source](http://archive.apache.org/dist/flink/flink-1.1.2/flink-1.1.2-src.tgz), [Binaries](http://archive.apache.org/dist/flink/flink-1.1.2/))
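
The {{site.DOCS_BASE_URL}} references in the new 1.2.0 entry above follow the same pattern as {{site.docs-stable}} in the first commit: a Jekyll site variable holding the shared documentation prefix. A hedged sketch of what the definition could look like (the value is an assumption for illustration; note the trailing slash, since the templates append e.g. flink-docs-release-1.2/ directly):

    # _config.yml (hypothetical excerpt)
    DOCS_BASE_URL: "http://ci.apache.org/projects/flink/"

With that, [Docs]({{site.DOCS_BASE_URL}}flink-docs-release-1.2/) expands to http://ci.apache.org/projects/flink/flink-docs-release-1.2/, matching the rebuilt HTML further below.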

http://git-wip-us.apache.org/repos/asf/flink-web/blob/6ea1a54a/introduction.md
----------------------------------------------------------------------
diff --git a/introduction.md b/introduction.md
index a1094ca..2011f87 100644
--- a/introduction.md
+++ b/introduction.md
@@ -2,7 +2,7 @@
 title: "Introduction to Apache Flink�"
 ---
 <br>
-Below is a high-level overview of Apache Flink and stream processing. For a more technical introduction, we recommend the <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/concepts/concepts.html" target="_blank">"Concepts" page</a> in the Flink documentation.
+Below is a high-level overview of Apache Flink and stream processing. For a more technical introduction, we recommend the <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/concepts/programming-model.html" target="_blank">"Concepts" page</a> in the Flink documentation.
 <br>
 {% toc %}
 
@@ -96,13 +96,13 @@ Flink’s core is a distributed streaming dataflow engine, meaning that data is
 
 ### APIs
 
-+ Flink’s <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/streaming/index.html" target="_blank">DataStream API</a> is for programs that implement transformations on data streams (e.g., filtering, updating state, defining windows, aggregating).
-+ The <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/batch/index.html" target="_blank">DataSet API</a> is for programs that implement transformations on data sets (e.g., filtering, mapping, joining, grouping).
-+ The <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/table.html" target="_blank">Table API</a> is a SQL-like expression language for relational stream and batch processing that can be easily embedded in Flink\u2019s DataSet and DataStream APIs (Java and Scala).
-+ <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/table.html#sql" target="_blank">Streaming SQL</a> enables SQL queries to be executed on streaming and batch tables. The syntax is based on <a href="https://calcite.apache.org/docs/stream.html" target="_blank">Apache Calcite\u2122</a>.
++ Flink\u2019s <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/datastream_api.html" target="_blank">DataStream API</a> is for programs that implement transformations on data streams (e.g., filtering, updating state, defining windows, aggregating).
++ The <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/batch/index.html" target="_blank">DataSet API</a> is for programs that implement transformations on data sets (e.g., filtering, mapping, joining, grouping).
++ The <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/table_api.html#table-api" target="_blank">Table API</a> is a SQL-like expression language for relational stream and batch processing that can be easily embedded in Flink\u2019s DataSet and DataStream APIs (Java and Scala).
++ <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/table_api.html#sql" target="_blank">Streaming SQL</a> enables SQL queries to be executed on streaming and batch tables. The syntax is based on <a href="https://calcite.apache.org/docs/stream.html" target="_blank">Apache Calcite\u2122</a>.
 
 ### Libraries
-Flink also includes special-purpose libraries for <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/streaming/libs/cep.html" target="_blank">complex event processing</a>, <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/batch/libs/ml/index.html" target="_blank">machine learning</a>, <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/batch/libs/gelly.html" target="_blank">graph processing</a>, and <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/libs/storm_compatibility.html" target="_blank">Apache Storm compatibility</a>.
+Flink also includes special-purpose libraries for <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/libs/cep.html" target="_blank">complex event processing</a>, <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/libs/ml/index.html" target="_blank">machine learning</a>, <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/libs/gelly/index.html" target="_blank">graph processing</a>, and <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/libs/storm_compatibility.html" target="_blank">Apache Storm compatibility</a>.
 
 ## Flink and other frameworks
 
@@ -120,6 +120,6 @@ If you’re interested in learning more, we’ve collected [information about th
 
 ## Key Takeaways and Next Steps
 
-In summary, Apache Flink is an open-source stream processing framework that eliminates the "performance vs. reliability" tradeoff often associated with open-source streaming engines and performs consistently in both categories. Following this introduction, we recommend you try our <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/quickstart/setup_quickstart.html" target="_blank">quickstart</a>, [download]({{ site.baseurl }}/downloads.html) the most recent stable version of Flink, or review the <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/" target="_blank">documentation</a>.
+In summary, Apache Flink is an open-source stream processing framework that eliminates the "performance vs. reliability" tradeoff often associated with open-source streaming engines and performs consistently in both categories. Following this introduction, we recommend you try our <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/quickstart/setup_quickstart.html" target="_blank">quickstart</a>, [download]({{ site.baseurl }}/downloads.html) the most recent stable version of Flink, or review the <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/index.html" target="_blank">documentation</a>.
 
 And we encourage you to join the Flink user mailing list and to share your questions with the community. We’re here to help you get the most out of Flink.


[4/4] flink-web git commit: Rebuild website

Posted by uc...@apache.org.
Rebuild website

This closes #46.


Project: http://git-wip-us.apache.org/repos/asf/flink-web/repo
Commit: http://git-wip-us.apache.org/repos/asf/flink-web/commit/5926684a
Tree: http://git-wip-us.apache.org/repos/asf/flink-web/tree/5926684a
Diff: http://git-wip-us.apache.org/repos/asf/flink-web/diff/5926684a

Branch: refs/heads/asf-site
Commit: 5926684a38dfc7bb412aea5dfc575d26ae2a481b
Parents: 346e66e
Author: Ufuk Celebi <uc...@apache.org>
Authored: Mon Feb 20 15:43:43 2017 +0100
Committer: Ufuk Celebi <uc...@apache.org>
Committed: Mon Feb 20 15:44:33 2017 +0100

----------------------------------------------------------------------
 content/blog/feed.xml     | 82 +++++++++++++++++++++---------------------
 content/downloads.html    |  1 +
 content/ecosystem.html    | 22 ++++++------
 content/introduction.html | 14 ++++----
 4 files changed, 60 insertions(+), 59 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/flink-web/blob/5926684a/content/blog/feed.xml
----------------------------------------------------------------------
diff --git a/content/blog/feed.xml b/content/blog/feed.xml
index e73452a..bb4de75 100644
--- a/content/blog/feed.xml
+++ b/content/blog/feed.xml
@@ -275,7 +275,7 @@ If you have, for example, a flatMap() operator that keeps a running aggregate pe
   &lt;li&gt;魏偉哲&lt;/li&gt;
 &lt;/ul&gt;
 </description>
-<pubDate>Mon, 06 Feb 2017 20:00:00 +0800</pubDate>
+<pubDate>Mon, 06 Feb 2017 13:00:00 +0100</pubDate>
 <link>http://flink.apache.org/news/2017/02/06/release-1.2.0.html</link>
 <guid isPermaLink="true">/news/2017/02/06/release-1.2.0.html</guid>
 </item>
@@ -490,7 +490,7 @@ If you have, for example, a flatMap() operator that keeps a running aggregate pe
 &lt;/ul&gt;
 
 </description>
-<pubDate>Wed, 21 Dec 2016 17:00:00 +0800</pubDate>
+<pubDate>Wed, 21 Dec 2016 10:00:00 +0100</pubDate>
 <link>http://flink.apache.org/news/2016/12/21/release-1.1.4.html</link>
 <guid isPermaLink="true">/news/2016/12/21/release-1.1.4.html</guid>
 </item>
@@ -684,7 +684,7 @@ enable the joining of a main, high-throughput stream with one more more inputs w
 
 &lt;p&gt;Lastly, we’d like to extend a sincere thank you to all of the Flink community for making 2016 a great year!&lt;/p&gt;
 </description>
-<pubDate>Mon, 19 Dec 2016 17:00:00 +0800</pubDate>
+<pubDate>Mon, 19 Dec 2016 10:00:00 +0100</pubDate>
 <link>http://flink.apache.org/news/2016/12/19/2016-year-in-review.html</link>
 <guid isPermaLink="true">/news/2016/12/19/2016-year-in-review.html</guid>
 </item>
@@ -788,7 +788,7 @@ enable the joining of a main, high-throughput stream with one more more inputs w
 &lt;/li&gt;
 &lt;/ul&gt;
 </description>
-<pubDate>Wed, 12 Oct 2016 17:00:00 +0800</pubDate>
+<pubDate>Wed, 12 Oct 2016 11:00:00 +0200</pubDate>
 <link>http://flink.apache.org/news/2016/10/12/release-1.1.3.html</link>
 <guid isPermaLink="true">/news/2016/10/12/release-1.1.3.html</guid>
 </item>
@@ -862,7 +862,7 @@ enable the joining of a main, high-throughput stream with one more more inputs w
 &lt;/ul&gt;
 
 </description>
-<pubDate>Mon, 05 Sep 2016 17:00:00 +0800</pubDate>
+<pubDate>Mon, 05 Sep 2016 11:00:00 +0200</pubDate>
 <link>http://flink.apache.org/news/2016/09/05/release-1.1.2.html</link>
 <guid isPermaLink="true">/news/2016/09/05/release-1.1.2.html</guid>
 </item>
@@ -882,7 +882,7 @@ enable the joining of a main, high-throughput stream with one more more inputs w
 &lt;p&gt;We hope to see many community members at Flink Forward 2016. Registration is available online: &lt;a href=&quot;http://flink-forward.org/registration/&quot;&gt;flink-forward.org/registration&lt;/a&gt;
 &lt;/p&gt;
 </description>
-<pubDate>Wed, 24 Aug 2016 17:00:00 +0800</pubDate>
+<pubDate>Wed, 24 Aug 2016 11:00:00 +0200</pubDate>
 <link>http://flink.apache.org/news/2016/08/24/ff16-keynotes-panels.html</link>
 <guid isPermaLink="true">/news/2016/08/24/ff16-keynotes-panels.html</guid>
 </item>
@@ -913,7 +913,7 @@ enable the joining of a main, high-throughput stream with one more more inputs w
 
 &lt;p&gt;You can find the binaries on the updated &lt;a href=&quot;http://flink.apache.org/downloads.html&quot;&gt;Downloads page&lt;/a&gt;.&lt;/p&gt;
 </description>
-<pubDate>Thu, 11 Aug 2016 17:00:00 +0800</pubDate>
+<pubDate>Thu, 11 Aug 2016 11:00:00 +0200</pubDate>
 <link>http://flink.apache.org/news/2016/08/11/release-1.1.1.html</link>
 <guid isPermaLink="true">/news/2016/08/11/release-1.1.1.html</guid>
 </item>
@@ -1135,7 +1135,7 @@ enable the joining of a main, high-throughput stream with one more more inputs w
   &lt;li&gt;卫乐&lt;/li&gt;
 &lt;/ul&gt;
 </description>
-<pubDate>Mon, 08 Aug 2016 21:00:00 +0800</pubDate>
+<pubDate>Mon, 08 Aug 2016 15:00:00 +0200</pubDate>
 <link>http://flink.apache.org/news/2016/08/08/release-1.1.0.html</link>
 <guid isPermaLink="true">/news/2016/08/08/release-1.1.0.html</guid>
 </item>
@@ -1264,7 +1264,7 @@ enable the joining of a main, high-throughput stream with one more more inputs w
 
 &lt;p&gt;If this post made you curious and you want to try out Flink’s SQL interface and the new Table API, we encourage you to do so! Simply clone the SNAPSHOT &lt;a href=&quot;https://github.com/apache/flink/tree/master&quot;&gt;master branch&lt;/a&gt; and check out the &lt;a href=&quot;https://ci.apache.org/projects/flink/flink-docs-master/apis/table.html&quot;&gt;Table API documentation for the SNAPSHOT version&lt;/a&gt;. Please note that the branch is under heavy development, and hence some code examples in this blog post might not work. We are looking forward to your feedback and welcome contributions.&lt;/p&gt;
 </description>
-<pubDate>Tue, 24 May 2016 18:00:00 +0800</pubDate>
+<pubDate>Tue, 24 May 2016 12:00:00 +0200</pubDate>
 <link>http://flink.apache.org/news/2016/05/24/stream-sql.html</link>
 <guid isPermaLink="true">/news/2016/05/24/stream-sql.html</guid>
 </item>
@@ -1308,7 +1308,7 @@ enable the joining of a main, high-throughput stream with one more more inputs w
   &lt;li&gt;[streaming-contrib] Fix port clash in DbStateBackend tests&lt;/li&gt;
 &lt;/ul&gt;
 </description>
-<pubDate>Wed, 11 May 2016 16:00:00 +0800</pubDate>
+<pubDate>Wed, 11 May 2016 10:00:00 +0200</pubDate>
 <link>http://flink.apache.org/news/2016/05/11/release-1.0.3.html</link>
 <guid isPermaLink="true">/news/2016/05/11/release-1.0.3.html</guid>
 </item>
@@ -1354,7 +1354,7 @@ enable the joining of a main, high-throughput stream with one more more inputs w
   &lt;li&gt;[&lt;a href=&quot;https://issues.apache.org/jira/browse/FLINK-3716&quot;&gt;FLINK-3716&lt;/a&gt;] [kafka consumer] Decreasing socket timeout so testFailOnNoBroker() will pass before JUnit timeout&lt;/li&gt;
 &lt;/ul&gt;
 </description>
-<pubDate>Fri, 22 Apr 2016 16:00:00 +0800</pubDate>
+<pubDate>Fri, 22 Apr 2016 10:00:00 +0200</pubDate>
 <link>http://flink.apache.org/news/2016/04/22/release-1.0.2.html</link>
 <guid isPermaLink="true">/news/2016/04/22/release-1.0.2.html</guid>
 </item>
@@ -1367,7 +1367,7 @@ enable the joining of a main, high-throughput stream with one more more inputs w
 
 &lt;p&gt;Read more &lt;a href=&quot;http://flink-forward.org/&quot;&gt;here&lt;/a&gt;.&lt;/p&gt;
 </description>
-<pubDate>Thu, 14 Apr 2016 18:00:00 +0800</pubDate>
+<pubDate>Thu, 14 Apr 2016 12:00:00 +0200</pubDate>
 <link>http://flink.apache.org/news/2016/04/14/flink-forward-announce.html</link>
 <guid isPermaLink="true">/news/2016/04/14/flink-forward-announce.html</guid>
 </item>
@@ -1560,7 +1560,7 @@ This feature will allow to prune unpromising event sequences early.&lt;/p&gt;
 &lt;p&gt;&lt;em&gt;Note:&lt;/em&gt; The example code requires Flink 1.0.1 or higher.&lt;/p&gt;
 
 </description>
-<pubDate>Wed, 06 Apr 2016 18:00:00 +0800</pubDate>
+<pubDate>Wed, 06 Apr 2016 12:00:00 +0200</pubDate>
 <link>http://flink.apache.org/news/2016/04/06/cep-monitoring.html</link>
 <guid isPermaLink="true">/news/2016/04/06/cep-monitoring.html</guid>
 </item>
@@ -1631,7 +1631,7 @@ This feature will allow to prune unpromising event sequences early.&lt;/p&gt;
 &lt;/li&gt;
 &lt;/ul&gt;
 </description>
-<pubDate>Wed, 06 Apr 2016 16:00:00 +0800</pubDate>
+<pubDate>Wed, 06 Apr 2016 10:00:00 +0200</pubDate>
 <link>http://flink.apache.org/news/2016/04/06/release-1.0.1.html</link>
 <guid isPermaLink="true">/news/2016/04/06/release-1.0.1.html</guid>
 </item>
@@ -1758,7 +1758,7 @@ When using this backend, active state in streaming programs can grow well beyond
   &lt;li&gt;zhangminglei&lt;/li&gt;
 &lt;/ul&gt;
 </description>
-<pubDate>Tue, 08 Mar 2016 21:00:00 +0800</pubDate>
+<pubDate>Tue, 08 Mar 2016 14:00:00 +0100</pubDate>
 <link>http://flink.apache.org/news/2016/03/08/release-1.0.0.html</link>
 <guid isPermaLink="true">/news/2016/03/08/release-1.0.0.html</guid>
 </item>
@@ -1795,7 +1795,7 @@ When using this backend, active state in streaming programs can grow well beyond
   &lt;li&gt;&lt;a href=&quot;https://issues.apache.org/jira/browse/FLINK-3020&quot;&gt;FLINK-3020&lt;/a&gt;: Set number of task slots to maximum parallelism in local execution&lt;/li&gt;
 &lt;/ul&gt;
 </description>
-<pubDate>Thu, 11 Feb 2016 16:00:00 +0800</pubDate>
+<pubDate>Thu, 11 Feb 2016 09:00:00 +0100</pubDate>
 <link>http://flink.apache.org/news/2016/02/11/release-0.10.2.html</link>
 <guid isPermaLink="true">/news/2016/02/11/release-0.10.2.html</guid>
 </item>
@@ -2019,7 +2019,7 @@ discussion&lt;/a&gt;
 on the Flink mailing lists.&lt;/p&gt;
 
 </description>
-<pubDate>Fri, 18 Dec 2015 18:00:00 +0800</pubDate>
+<pubDate>Fri, 18 Dec 2015 11:00:00 +0100</pubDate>
 <link>http://flink.apache.org/news/2015/12/18/a-year-in-review.html</link>
 <guid isPermaLink="true">/news/2015/12/18/a-year-in-review.html</guid>
 </item>
@@ -2166,7 +2166,7 @@ While you can embed Spouts/Bolts in a Flink program and mix-and-match them with
 &lt;p&gt;&lt;sup id=&quot;fn1&quot;&gt;1. We confess, there are three lines changed compared to a Storm project &lt;img class=&quot;emoji&quot; style=&quot;width:16px;height:16px;align:absmiddle&quot; src=&quot;/img/blog/smirk.png&quot; /&gt;—because the example covers local &lt;em&gt;and&lt;/em&gt; remote execution. &lt;a href=&quot;#ref1&quot; title=&quot;Back to text.&quot;&gt;↩&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;
 
 </description>
-<pubDate>Fri, 11 Dec 2015 18:00:00 +0800</pubDate>
+<pubDate>Fri, 11 Dec 2015 11:00:00 +0100</pubDate>
 <link>http://flink.apache.org/news/2015/12/11/storm-compatibility.html</link>
 <guid isPermaLink="true">/news/2015/12/11/storm-compatibility.html</guid>
 </item>
@@ -2323,7 +2323,7 @@ While you can embed Spouts/Bolts in a Flink program and mix-and-match them with
 
 &lt;p&gt;Support for various types of windows over continuous data streams is a must-have for modern stream processors. Apache Flink is a stream processor with a very strong feature set, including a very flexible mechanism to build and evaluate windows over continuous data streams. Flink provides pre-defined window operators for common uses cases as well as a toolbox that allows to define very custom windowing logic. The Flink community will add more pre-defined window operators as we learn the requirements from our users.&lt;/p&gt;
 </description>
-<pubDate>Fri, 04 Dec 2015 18:00:00 +0800</pubDate>
+<pubDate>Fri, 04 Dec 2015 11:00:00 +0100</pubDate>
 <link>http://flink.apache.org/news/2015/12/04/Introducing-windows.html</link>
 <guid isPermaLink="true">/news/2015/12/04/Introducing-windows.html</guid>
 </item>
@@ -2382,7 +2382,7 @@ While you can embed Spouts/Bolts in a Flink program and mix-and-match them with
 &lt;/ul&gt;
 
 </description>
-<pubDate>Fri, 27 Nov 2015 16:00:00 +0800</pubDate>
+<pubDate>Fri, 27 Nov 2015 09:00:00 +0100</pubDate>
 <link>http://flink.apache.org/news/2015/11/27/release-0.10.1.html</link>
 <guid isPermaLink="true">/news/2015/11/27/release-0.10.1.html</guid>
 </item>
@@ -2557,7 +2557,7 @@ Also note that some methods in the DataStream API had to be renamed as part of t
 &lt;/ul&gt;
 
 </description>
-<pubDate>Mon, 16 Nov 2015 16:00:00 +0800</pubDate>
+<pubDate>Mon, 16 Nov 2015 09:00:00 +0100</pubDate>
 <link>http://flink.apache.org/news/2015/11/16/release-0.10.0.html</link>
 <guid isPermaLink="true">/news/2015/11/16/release-0.10.0.html</guid>
 </item>
@@ -3448,7 +3448,7 @@ Either &lt;code&gt;0 + absolutePointer&lt;/code&gt; or &lt;code&gt;objectRefAddr
 &lt;/div&gt;
 
 </description>
-<pubDate>Wed, 16 Sep 2015 16:00:00 +0800</pubDate>
+<pubDate>Wed, 16 Sep 2015 10:00:00 +0200</pubDate>
 <link>http://flink.apache.org/news/2015/09/16/off-heap-memory.html</link>
 <guid isPermaLink="true">/news/2015/09/16/off-heap-memory.html</guid>
 </item>
@@ -3500,7 +3500,7 @@ fault tolerance, the internal runtime architecture, and others.&lt;/p&gt;
 register for the conference.&lt;/p&gt;
 
 </description>
-<pubDate>Thu, 03 Sep 2015 16:00:00 +0800</pubDate>
+<pubDate>Thu, 03 Sep 2015 10:00:00 +0200</pubDate>
 <link>http://flink.apache.org/news/2015/09/03/flink-forward.html</link>
 <guid isPermaLink="true">/news/2015/09/03/flink-forward.html</guid>
 </item>
@@ -3561,7 +3561,7 @@ for this release:&lt;/p&gt;
   &lt;li&gt;&lt;a href=&quot;https://issues.apache.org/jira/browse/FLINK-2584&quot;&gt;FLINK-2584&lt;/a&gt; ASM dependency is not shaded away&lt;/li&gt;
 &lt;/ul&gt;
 </description>
-<pubDate>Tue, 01 Sep 2015 16:00:00 +0800</pubDate>
+<pubDate>Tue, 01 Sep 2015 10:00:00 +0200</pubDate>
 <link>http://flink.apache.org/news/2015/09/01/release-0.9.1.html</link>
 <guid isPermaLink="true">/news/2015/09/01/release-0.9.1.html</guid>
 </item>
@@ -4018,7 +4018,7 @@ tools, graph database systems and sampling techniques.&lt;/p&gt;
 &lt;h2 id=&quot;links&quot;&gt;Links&lt;/h2&gt;
 &lt;p&gt;&lt;a href=&quot;https://ci.apache.org/projects/flink/flink-docs-master/libs/gelly_guide.html&quot;&gt;Gelly Documentation&lt;/a&gt;&lt;/p&gt;
 </description>
-<pubDate>Mon, 24 Aug 2015 00:00:00 +0800</pubDate>
+<pubDate>Mon, 24 Aug 2015 00:00:00 +0200</pubDate>
 <link>http://flink.apache.org/news/2015/08/24/introducing-flink-gelly.html</link>
 <guid isPermaLink="true">/news/2015/08/24/introducing-flink-gelly.html</guid>
 </item>
@@ -4257,7 +4257,7 @@ tools, graph database systems and sampling techniques.&lt;/p&gt;
 
 &lt;p&gt;Flink will require at least Java 7 in major releases after 0.9.0.&lt;/p&gt;
 </description>
-<pubDate>Wed, 24 Jun 2015 22:00:00 +0800</pubDate>
+<pubDate>Wed, 24 Jun 2015 16:00:00 +0200</pubDate>
 <link>http://flink.apache.org/news/2015/06/24/announcing-apache-flink-0.9.0-release.html</link>
 <guid isPermaLink="true">/news/2015/06/24/announcing-apache-flink-0.9.0-release.html</guid>
 </item>
@@ -4296,7 +4296,7 @@ including Apache Flink.&lt;/p&gt;
 
 &lt;p&gt;Stay tuned for a wealth of upcoming events! Two Flink talsk will be presented at &lt;a href=&quot;http://berlinbuzzwords.de/15/sessions&quot;&gt;Berlin Buzzwords&lt;/a&gt;, Flink will be presented at the &lt;a href=&quot;http://2015.hadoopsummit.org/san-jose/&quot;&gt;Hadoop Summit in San Jose&lt;/a&gt;. A &lt;a href=&quot;http://www.meetup.com/Apache-Flink-Meetup/events/220557545/&quot;&gt;training workshop on Apache Flink&lt;/a&gt; is being organized in Berlin. Finally, &lt;a href=&quot;http://2015.flink-forward.org/&quot;&gt;Flink Forward&lt;/a&gt;, the first conference to bring together the whole Flink community is taking place in Berlin in October 2015.&lt;/p&gt;
 </description>
-<pubDate>Thu, 14 May 2015 18:00:00 +0800</pubDate>
+<pubDate>Thu, 14 May 2015 12:00:00 +0200</pubDate>
 <link>http://flink.apache.org/news/2015/05/14/Community-update-April.html</link>
 <guid isPermaLink="true">/news/2015/05/14/Community-update-April.html</guid>
 </item>
@@ -4487,7 +4487,7 @@ The following figure shows how two objects are compared.&lt;/p&gt;
   &lt;li&gt;Flink’s DBMS-style operators operate natively on binary data yielding high performance in-memory and destage gracefully to disk if necessary.&lt;/li&gt;
 &lt;/ul&gt;
 </description>
-<pubDate>Mon, 11 May 2015 18:00:00 +0800</pubDate>
+<pubDate>Mon, 11 May 2015 12:00:00 +0200</pubDate>
 <link>http://flink.apache.org/news/2015/05/11/Juggling-with-Bits-and-Bytes.html</link>
 <guid isPermaLink="true">/news/2015/05/11/Juggling-with-Bits-and-Bytes.html</guid>
 </item>
@@ -4739,7 +4739,7 @@ Improve usability of command line interface&lt;/p&gt;
   &lt;/li&gt;
 &lt;/ul&gt;
 </description>
-<pubDate>Mon, 13 Apr 2015 18:00:00 +0800</pubDate>
+<pubDate>Mon, 13 Apr 2015 12:00:00 +0200</pubDate>
 <link>http://flink.apache.org/news/2015/04/13/release-0.9.0-milestone1.html</link>
 <guid isPermaLink="true">/news/2015/04/13/release-0.9.0-milestone1.html</guid>
 </item>
@@ -4806,7 +4806,7 @@ limited in that it does not yet handle large state and iterative
 programs.&lt;/p&gt;
 
 </description>
-<pubDate>Tue, 07 Apr 2015 18:00:00 +0800</pubDate>
+<pubDate>Tue, 07 Apr 2015 12:00:00 +0200</pubDate>
 <link>http://flink.apache.org/news/2015/04/07/march-in-flink.html</link>
 <guid isPermaLink="true">/news/2015/04/07/march-in-flink.html</guid>
 </item>
@@ -4993,7 +4993,7 @@ programs.&lt;/p&gt;
 [4] &lt;a href=&quot;https://ci.apache.org/projects/flink/flink-docs-release-1.0/apis/batch/index.html#semantic-annotations&quot;&gt;Flink 1.0 documentation: Semantic annotations&lt;/a&gt; &lt;br /&gt;
 [5] &lt;a href=&quot;https://ci.apache.org/projects/flink/flink-docs-release-1.0/apis/batch/dataset_transformations.html#join-algorithm-hints&quot;&gt;Flink 1.0 documentation: Optimizer join hints&lt;/a&gt; &lt;br /&gt;&lt;/p&gt;
 </description>
-<pubDate>Fri, 13 Mar 2015 18:00:00 +0800</pubDate>
+<pubDate>Fri, 13 Mar 2015 11:00:00 +0100</pubDate>
 <link>http://flink.apache.org/news/2015/03/13/peeking-into-Apache-Flinks-Engine-Room.html</link>
 <guid isPermaLink="true">/news/2015/03/13/peeking-into-Apache-Flinks-Engine-Room.html</guid>
 </item>
@@ -5109,7 +5109,7 @@ Hadoop clusters.  Also, basic support for accessing secured HDFS with
 a standalone Flink setup is now available.&lt;/p&gt;
 
 </description>
-<pubDate>Mon, 02 Mar 2015 18:00:00 +0800</pubDate>
+<pubDate>Mon, 02 Mar 2015 11:00:00 +0100</pubDate>
 <link>http://flink.apache.org/news/2015/03/02/february-2015-in-flink.html</link>
 <guid isPermaLink="true">/news/2015/03/02/february-2015-in-flink.html</guid>
 </item>
@@ -5754,7 +5754,7 @@ internally, fault tolerance, and performance measurements!&lt;/p&gt;
 
 &lt;p&gt;&lt;a href=&quot;#top&quot;&gt;Back to top&lt;/a&gt;&lt;/p&gt;
 </description>
-<pubDate>Mon, 09 Feb 2015 20:00:00 +0800</pubDate>
+<pubDate>Mon, 09 Feb 2015 13:00:00 +0100</pubDate>
 <link>http://flink.apache.org/news/2015/02/09/streaming-example.html</link>
 <guid isPermaLink="true">/news/2015/02/09/streaming-example.html</guid>
 </item>
@@ -5803,7 +5803,7 @@ internally, fault tolerance, and performance measurements!&lt;/p&gt;
 
 &lt;p&gt;The improved YARN client of Flink now allows users to deploy Flink on YARN for executing a single job. Older versions only supported a long-running YARN session. The code of the YARN client has been refactored to provide an (internal) Java API for controlling YARN clusters more easily.&lt;/p&gt;
 </description>
-<pubDate>Wed, 04 Feb 2015 18:00:00 +0800</pubDate>
+<pubDate>Wed, 04 Feb 2015 11:00:00 +0100</pubDate>
 <link>http://flink.apache.org/news/2015/02/04/january-in-flink.html</link>
 <guid isPermaLink="true">/news/2015/02/04/january-in-flink.html</guid>
 </item>
@@ -5889,7 +5889,7 @@ internally, fault tolerance, and performance measurements!&lt;/p&gt;
   &lt;li&gt;Chen Xu&lt;/li&gt;
 &lt;/ul&gt;
 </description>
-<pubDate>Wed, 21 Jan 2015 18:00:00 +0800</pubDate>
+<pubDate>Wed, 21 Jan 2015 11:00:00 +0100</pubDate>
 <link>http://flink.apache.org/news/2015/01/21/release-0.8.html</link>
 <guid isPermaLink="true">/news/2015/01/21/release-0.8.html</guid>
 </item>
@@ -5951,7 +5951,7 @@ Flink serialization system improved a lot over time and by now surpasses the cap
 
 &lt;p&gt;The community is working hard together with the Apache infra team to migrate the Flink infrastructure to a top-level project. At the same time, the Flink community is working on the Flink 0.8.0 release which should be out very soon.&lt;/p&gt;
 </description>
-<pubDate>Tue, 06 Jan 2015 18:00:00 +0800</pubDate>
+<pubDate>Tue, 06 Jan 2015 11:00:00 +0100</pubDate>
 <link>http://flink.apache.org/news/2015/01/06/december-in-flink.html</link>
 <guid isPermaLink="true">/news/2015/01/06/december-in-flink.html</guid>
 </item>
@@ -6038,7 +6038,7 @@ Flink serialization system improved a lot over time and by now surpasses the cap
 
 &lt;p&gt;If you want to use Flink’s Hadoop compatibility package checkout our &lt;a href=&quot;https://ci.apache.org/projects/flink/flink-docs-master/apis/batch/hadoop_compatibility.html&quot;&gt;documentation&lt;/a&gt;.&lt;/p&gt;
 </description>
-<pubDate>Tue, 18 Nov 2014 18:00:00 +0800</pubDate>
+<pubDate>Tue, 18 Nov 2014 11:00:00 +0100</pubDate>
 <link>http://flink.apache.org/news/2014/11/18/hadoop-compatibility.html</link>
 <guid isPermaLink="true">/news/2014/11/18/hadoop-compatibility.html</guid>
 </item>
@@ -6110,7 +6110,7 @@ Flink serialization system improved a lot over time and by now surpasses the cap
   &lt;li&gt;Yingjun Wu&lt;/li&gt;
 &lt;/ul&gt;
 </description>
-<pubDate>Tue, 04 Nov 2014 18:00:00 +0800</pubDate>
+<pubDate>Tue, 04 Nov 2014 11:00:00 +0100</pubDate>
 <link>http://flink.apache.org/news/2014/11/04/release-0.7.0.html</link>
 <guid isPermaLink="true">/news/2014/11/04/release-0.7.0.html</guid>
 </item>
@@ -6209,7 +6209,7 @@ properties, some algorithms)&lt;/p&gt;
 &lt;p&gt;http://www.meetup.com/HandsOnProgrammingEvents/events/210504392/&lt;/p&gt;
 
 </description>
-<pubDate>Fri, 03 Oct 2014 18:00:00 +0800</pubDate>
+<pubDate>Fri, 03 Oct 2014 12:00:00 +0200</pubDate>
 <link>http://flink.apache.org/news/2014/10/03/upcoming_events.html</link>
 <guid isPermaLink="true">/news/2014/10/03/upcoming_events.html</guid>
 </item>
@@ -6223,7 +6223,7 @@ of the system. We suggest all users of Flink to work with this newest version.&l
 
 &lt;p&gt;&lt;a href=&quot;/downloads.html&quot;&gt;Download&lt;/a&gt; the release today.&lt;/p&gt;
 </description>
-<pubDate>Fri, 26 Sep 2014 18:00:00 +0800</pubDate>
+<pubDate>Fri, 26 Sep 2014 12:00:00 +0200</pubDate>
 <link>http://flink.apache.org/news/2014/09/26/release-0.6.1.html</link>
 <guid isPermaLink="true">/news/2014/09/26/release-0.6.1.html</guid>
 </item>
@@ -6306,7 +6306,7 @@ robust, as well as breaking API changes.&lt;/p&gt;
   &lt;li&gt;Tobias Wiens&lt;/li&gt;
 &lt;/ul&gt;
 </description>
-<pubDate>Tue, 26 Aug 2014 18:00:00 +0800</pubDate>
+<pubDate>Tue, 26 Aug 2014 12:00:00 +0200</pubDate>
 <link>http://flink.apache.org/news/2014/08/26/release-0.6.html</link>
 <guid isPermaLink="true">/news/2014/08/26/release-0.6.html</guid>
 </item>
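
A note on the many pubDate changes above: they all appear to be a side effect of regenerating the feed on a machine in a different local timezone, and each old/new pair denotes the same instant. For example, Tue, 26 Aug 2014 18:00:00 +0800 and Tue, 26 Aug 2014 12:00:00 +0200 are both 10:00:00 UTC (18:00 - 8:00 = 12:00 - 2:00 = 10:00).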

http://git-wip-us.apache.org/repos/asf/flink-web/blob/5926684a/content/downloads.html
----------------------------------------------------------------------
diff --git a/content/downloads.html b/content/downloads.html
index 76c309b..515e452 100644
--- a/content/downloads.html
+++ b/content/downloads.html
@@ -251,6 +251,7 @@ pick the Hadoop 1 version.</p>
 <h2 id="all-releases">All releases</h2>
 
 <ul>
+  <li>Flink 1.2.0 - 2017-02-06 (<a href="http://www.apache.org/dyn/closer.lua/flink/flink-1.2.0/flink-1.2.0-src.tgz">Source</a>, <a href="http://archive.apache.org/dist/flink/flink-1.2.0/">Binaries</a>, <a href="http://ci.apache.org/projects/flink/flink-docs-release-1.2/">Docs</a>, <a href="http://ci.apache.org/projects/flink/flink-docs-release-1.2/api/java">Javadocs</a>, <a href="http://ci.apache.org/projects/flink/flink-docs-release-1.2/api/scala/index.html">ScalaDocs</a>)</li>
   <li>Flink 1.1.4 - 2016-12-21 (<a href="http://archive.apache.org/dist/flink/flink-1.1.4/flink-1.1.4-src.tgz">Source</a>, <a href="http://archive.apache.org/dist/flink/flink-1.1.4/">Binaries</a>, <a href="http://ci.apache.org/projects/flink/flink-docs-release-1.1/">Docs</a>, <a href="http://ci.apache.org/projects/flink/flink-docs-release-1.1/api/java">Javadocs</a>, <a href="http://ci.apache.org/projects/flink/flink-docs-release-1.1/api/scala/index.html">ScalaDocs</a>)</li>
   <li>Flink 1.1.3 - 2016-10-13 (<a href="http://archive.apache.org/dist/flink/flink-1.1.3/flink-1.1.3-src.tgz">Source</a>, <a href="http://archive.apache.org/dist/flink/flink-1.1.3/">Binaries</a>, <a href="http://ci.apache.org/projects/flink/flink-docs-release-1.1/">Docs</a>, <a href="http://ci.apache.org/projects/flink/flink-docs-release-1.1/api/java">Javadocs</a>, <a href="http://ci.apache.org/projects/flink/flink-docs-release-1.1/api/scala/index.html">ScalaDocs</a>)</li>
   <li>Flink 1.1.2 - 2016-09-05 (<a href="http://archive.apache.org/dist/flink/flink-1.1.2/flink-1.1.2-src.tgz">Source</a>, <a href="http://archive.apache.org/dist/flink/flink-1.1.2/">Binaries</a>)</li>

http://git-wip-us.apache.org/repos/asf/flink-web/blob/5926684a/content/ecosystem.html
----------------------------------------------------------------------
diff --git a/content/ecosystem.html b/content/ecosystem.html
index 3c08cb4..577191f 100644
--- a/content/ecosystem.html
+++ b/content/ecosystem.html
@@ -161,16 +161,16 @@ many other data processing projects and frameworks.
 <p>Currently these systems are supported:</p>
 
 <ul>
-  <li><a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/streaming/connectors/kafka.html" target="_blank">Apache Kafka</a> (sink/source)</li>
-  <li><a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/streaming/connectors/elasticsearch.html" target="_blank">Elasticsearch</a> (sink)</li>
-  <li><a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/streaming/connectors/elasticsearch2.html" target="_blank">Elasticsearch 2x</a> (sink)</li>
-  <li><a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/streaming/connectors/filesystem_sink.html" target="_blank">HDFS</a> (sink)</li>
-  <li><a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/streaming/connectors/rabbitmq.html" target="_blank">RabbitMQ</a> (sink/source)</li>
-  <li><a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/streaming/connectors/kinesis.html" target="_blank">Amazon Kinesis Streams</a> (sink/source)</li>
-  <li><a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/streaming/connectors/twitter.html" target="_blank">Twitter Streaming API</a> (source)</li>
-  <li><a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/streaming/connectors/nifi.html" target="_blank">Apache NiFi</a> (sink/source)</li>
-  <li><a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/streaming/connectors/cassandra.html" target="_blank">Apache Cassandra</a> (sink)</li>
-  <li><a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/streaming/connectors/redis.html" target="_blank">Redis</a> (sink)</li>
+  <li><a href="http://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/connectors/kafka.html" target="_blank">Apache Kafka</a> (sink/source)</li>
+  <li><a href="http://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/connectors/elasticsearch.html" target="_blank">Elasticsearch</a> (sink)</li>
+  <li><a href="http://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/connectors/elasticsearch2.html" target="_blank">Elasticsearch 2.x</a> (sink)</li>
+  <li><a href="http://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/connectors/filesystem_sink.html" target="_blank">HDFS</a> (sink)</li>
+  <li><a href="http://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/connectors/rabbitmq.html" target="_blank">RabbitMQ</a> (sink/source)</li>
+  <li><a href="http://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/connectors/kinesis.html" target="_blank">Amazon Kinesis Streams</a> (sink/source)</li>
+  <li><a href="http://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/connectors/twitter.html" target="_blank">Twitter</a> (source)</li>
+  <li><a href="http://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/connectors/nifi.html" target="_blank">Apache NiFi</a> (sink/source)</li>
+  <li><a href="http://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/connectors/cassandra.html" target="_blank">Apache Cassandra</a> (sink)</li>
+  <li><a href="https://github.com/apache/bahir-flink" target="_blank">Redis, Flume, and ActiveMQ (via Apache Bahir)</a> (sink)</li>
 </ul>
 
 <p>To run an application using one of these connectors, additional third party
@@ -188,7 +188,7 @@ Please let us know on the <a href="#mailing-lists">user/dev mailing list</a>.</p
 
 <p><strong>Apache Zeppelin</strong></p>
 
-<p><a href="https://zeppelin.incubator.apache.org/">Apache Zeppelin (incubator)</a> is a web-based notebook that enables interactive data analytics and can be used with
+<p><a href="https://zeppelin.incubator.apache.org/">Apache Zeppelin</a> is a web-based notebook that enables interactive data analytics and can be used with
 <a href="https://zeppelin.incubator.apache.org/docs/interpreter/flink.html">Flink as an execution engine</a> (next to others engines).
 See also Jim Dowling’s <a href="http://www.slideshare.net/FlinkForward/jim-dowling-interactive-flink-analytics-with-hopsworks-and-zeppelin">Flink Forward talk</a> about Zeppelin on Flink.</p>
 

http://git-wip-us.apache.org/repos/asf/flink-web/blob/5926684a/content/introduction.html
----------------------------------------------------------------------
diff --git a/content/introduction.html b/content/introduction.html
index cdb82d2..925e9db 100644
--- a/content/introduction.html
+++ b/content/introduction.html
@@ -143,7 +143,7 @@
     <h1>Introduction to Apache Flink®</h1>
 
 	<p><br />
-Below is a high-level overview of Apache Flink and stream processing. For a more technical introduction, we recommend the <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/concepts/concepts.html" target="_blank">“Concepts” page</a> in the Flink documentation.
+Below is a high-level overview of Apache Flink and stream processing. For a more technical introduction, we recommend the <a href="http://ci.apache.org/projects/flink/flink-docs-release-1.2/concepts/programming-model.html" target="_blank">“Concepts” page</a> in the Flink documentation.
 <br /></p>
 <div class="page-toc">
 <ul id="markdown-toc">
@@ -274,14 +274,14 @@ Below is a high-level overview of Apache Flink and stream processing. For a more
 <h3 id="apis">APIs</h3>
 
 <ul>
-  <li>Flink’s <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/streaming/index.html" target="_blank">DataStream API</a> is for programs that implement transformations on data streams (e.g., filtering, updating state, defining windows, aggregating).</li>
-  <li>The <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/batch/index.html" target="_blank">DataSet API</a> is for programs that implement transformations on data sets (e.g., filtering, mapping, joining, grouping).</li>
-  <li>The <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/table.html" target="_blank">Table API</a> is a SQL-like expression language for relational stream and batch processing that can be easily embedded in Flink\u2019s DataSet and DataStream APIs (Java and Scala).</li>
-  <li><a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/table.html#sql" target="_blank">Streaming SQL</a> enables SQL queries to be executed on streaming and batch tables. The syntax is based on <a href="https://calcite.apache.org/docs/stream.html" target="_blank">Apache Calcite\u2122</a>.</li>
+  <li>Flink\u2019s <a href="http://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/datastream_api.html" target="_blank">DataStream API</a> is for programs that implement transformations on data streams (e.g., filtering, updating state, defining windows, aggregating).</li>
+  <li>The <a href="http://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/batch/index.html" target="_blank">DataSet API</a> is for programs that implement transformations on data sets (e.g., filtering, mapping, joining, grouping).</li>
+  <li>The <a href="http://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/table_api.html#table-api" target="_blank">Table API</a> is a SQL-like expression language for relational stream and batch processing that can be easily embedded in Flink\u2019s DataSet and DataStream APIs (Java and Scala).</li>
+  <li><a href="http://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/table_api.html#sql" target="_blank">Streaming SQL</a> enables SQL queries to be executed on streaming and batch tables. The syntax is based on <a href="https://calcite.apache.org/docs/stream.html" target="_blank">Apache Calcite\u2122</a>.</li>
 </ul>
 
 <h3 id="libraries">Libraries</h3>
-<p>Flink also includes special-purpose libraries for <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/streaming/libs/cep.html" target="_blank">complex event processing</a>, <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/batch/libs/ml/index.html" target="_blank">machine learning</a>, <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/batch/libs/gelly.html" target="_blank">graph processing</a>, and <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/libs/storm_compatibility.html" target="_blank">Apache Storm compatibility</a>.</p>
+<p>Flink also includes special-purpose libraries for <a href="http://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/libs/cep.html" target="_blank">complex event processing</a>, <a href="http://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/libs/ml/index.html" target="_blank">machine learning</a>, <a href="http://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/libs/gelly/index.html" target="_blank">graph processing</a>, and <a href="http://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/libs/storm_compatibility.html" target="_blank">Apache Storm compatibility</a>.</p>
 
 <h2 id="flink-and-other-frameworks">Flink and other frameworks</h2>
 
@@ -301,7 +301,7 @@ Below is a high-level overview of Apache Flink and stream processing. For a more
 
 <h2 id="key-takeaways-and-next-steps">Key Takeaways and Next Steps</h2>
 
-<p>In summary, Apache Flink is an open-source stream processing framework that eliminates the “performance vs. reliability” tradeoff often associated with open-source streaming engines and performs consistently in both categories. Following this introduction, we recommend you try our <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/quickstart/setup_quickstart.html" target="_blank">quickstart</a>, <a href="/downloads.html">download</a> the most recent stable version of Flink, or review the <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.1/" target="_blank">documentation</a>.</p>
+<p>In summary, Apache Flink is an open-source stream processing framework that eliminates the “performance vs. reliability” tradeoff often associated with open-source streaming engines and performs consistently in both categories. Following this introduction, we recommend you try our <a href="http://ci.apache.org/projects/flink/flink-docs-release-1.2/quickstart/setup_quickstart.html" target="_blank">quickstart</a>, <a href="/downloads.html">download</a> the most recent stable version of Flink, or review the <a href="http://ci.apache.org/projects/flink/flink-docs-release-1.2/index.html" target="_blank">documentation</a>.</p>
 
 <p>And we encourage you to join the Flink user mailing list and to share your questions with the community. We’re here to help you get the most out of Flink.</p>