Posted to commits@flink.apache.org by al...@apache.org on 2017/12/12 11:40:22 UTC

[6/6] flink-web git commit: Rebuild website

Rebuild website


Project: http://git-wip-us.apache.org/repos/asf/flink-web/repo
Commit: http://git-wip-us.apache.org/repos/asf/flink-web/commit/dd8fd7e1
Tree: http://git-wip-us.apache.org/repos/asf/flink-web/tree/dd8fd7e1
Diff: http://git-wip-us.apache.org/repos/asf/flink-web/diff/dd8fd7e1

Branch: refs/heads/asf-site
Commit: dd8fd7e14e954f2a8e47040f186748a1d2460fe0
Parents: 2bbed93
Author: Aljoscha Krettek <al...@gmail.com>
Authored: Mon Dec 11 16:24:02 2017 +0100
Committer: Aljoscha Krettek <al...@gmail.com>
Committed: Tue Dec 12 12:38:35 2017 +0100

----------------------------------------------------------------------
 content/blog/feed.xml                      | 218 +++++++++++++
 content/blog/index.html                    |  42 ++-
 content/blog/page2/index.html              |  40 ++-
 content/blog/page3/index.html              |  40 ++-
 content/blog/page4/index.html              |  40 ++-
 content/blog/page5/index.html              |  40 ++-
 content/blog/page6/index.html              |  25 ++
 content/index.html                         |  12 +-
 content/news/2017/12/12/release-1.4.0.html | 405 ++++++++++++++++++++++++
 9 files changed, 782 insertions(+), 80 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/flink-web/blob/dd8fd7e1/content/blog/feed.xml
----------------------------------------------------------------------
diff --git a/content/blog/feed.xml b/content/blog/feed.xml
index fbd994d..07bb0c3 100644
--- a/content/blog/feed.xml
+++ b/content/blog/feed.xml
@@ -7,6 +7,224 @@
 <atom:link href="http://flink.apache.org/blog/feed.xml" rel="self" type="application/rss+xml" />
 
 <item>
+<title>Apache Flink 1.4.0 Release Announcement</title>
+<description>&lt;p&gt;The Apache Flink community is pleased to announce the 1.4.0 release. Over the past 5 months, the
+Flink community has been working hard to resolve more than 900 issues. See the &lt;a href=&quot;https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12315522&amp;amp;version=12340533&quot;&gt;complete changelog&lt;/a&gt;
+for more detail.&lt;/p&gt;
+
+&lt;p&gt;This is the fifth major release in the 1.x.y series. It is API-compatible with the other 1.x.y
+releases for APIs annotated with the @Public annotation.&lt;/p&gt;
+
+&lt;p&gt;We encourage everyone to download the release and check out the &lt;a href=&quot;https://ci.apache.org/projects/flink/flink-docs-release-1.4/&quot;&gt;documentation&lt;/a&gt;.&lt;/p&gt;
+
+&lt;p&gt;Feedback through the &lt;a href=&quot;http://flink.apache.org/community.html#mailing-lists&quot;&gt;Flink mailing lists&lt;/a&gt; is, as always, gladly encouraged!&lt;/p&gt;
+
+&lt;p&gt;You can find the binaries on the updated &lt;a href=&quot;http://flink.apache.org/downloads.html&quot;&gt;Downloads&lt;/a&gt; page on the Flink project site.&lt;/p&gt;
+
+&lt;p&gt;The release includes improvements to many different aspects of Flink, including:&lt;/p&gt;
+
+&lt;ul&gt;
+  &lt;li&gt;The ability to build end-to-end exactly-once applications with Flink and popular data sources and sinks such as Apache Kafka.&lt;/li&gt;
+  &lt;li&gt;A more developer-friendly dependency structure as well as Hadoop-free Flink for Flink users who do not have Hadoop dependencies.&lt;/li&gt;
+  &lt;li&gt;Support for JOIN and for new sources and sinks in Table API and SQL, expanding the range of logic that can be expressed with these APIs.&lt;/li&gt;
+&lt;/ul&gt;
+
+&lt;p&gt;A summary of some of the features in the release is available below.&lt;/p&gt;
+
+&lt;p&gt;For more background on the Flink 1.4.0 release and the work planned for the Flink 1.5.0 release, please refer to &lt;a href=&quot;http://flink.apache.org/news/2017/11/22/release-1.4-and-1.5-timeline.html&quot;&gt;this blog post&lt;/a&gt; on the Apache Flink blog.&lt;/p&gt;
+
+&lt;h2 id=&quot;new-features-and-improvements&quot;&gt;New Features and Improvements&lt;/h2&gt;
+
+&lt;h3 id=&quot;end-to-end-exactly-once-applications-with-apache-flink-and-apache-kafka-and-twophasecommitsinkfunction&quot;&gt;End-to-end Exactly Once Applications with Apache Flink and Apache Kafka and TwoPhaseCommitSinkFunction&lt;/h3&gt;
+
+&lt;p&gt;Flink 1.4 includes a first version of an exactly-once producer for Apache Kafka 0.11. This producer
+enables developers who build Flink applications with Kafka as a data source and sink to compute
+exactly-once results not just within the Flink program, but truly “end-to-end” in the application.&lt;/p&gt;
+
+&lt;p&gt;The common pattern used for exactly-once applications in Kafka and in other sinks–the two-phase
+commit algorithm–has been extracted in Flink 1.4.0 into a common class, the
+TwoPhaseCommitSinkFunction (&lt;a href=&quot;https://issues.apache.org/jira/browse/FLINK-7210&quot;&gt;FLINK-7210&lt;/a&gt;). This
+will make it easier for users to create their own exactly-once data sinks in the future.&lt;/p&gt;
+
+&lt;h3 id=&quot;table-api-and-streaming-sql-enhancements&quot;&gt;Table API and Streaming SQL Enhancements&lt;/h3&gt;
+
+&lt;p&gt;Flink SQL now supports windowed joins based on processing time and event time
+(&lt;a href=&quot;https://issues.apache.org/jira/browse/FLINK-5725&quot;&gt;FLINK-5725&lt;/a&gt;). Users can now execute a
+join between two streaming tables and compute windowed results according to these two notions of
+time. The syntax and semantics are consistent with standard SQL JOIN and with Flink’s streaming SQL
+more broadly.&lt;/p&gt;
+
+&lt;p&gt;Flink SQL also now supports “INSERT INTO SELECT” queries, which makes it possible to write results
+from SQL directly into a data sink (an external system that receives data from a Flink application).
+This improves operability and ease-of-use of Flink SQL.&lt;/p&gt;
+
+&lt;p&gt;The Table API now supports aggregations on streaming tables; previously, the only supported
+operations on streaming tables were projection, selection, and union
+(&lt;a href=&quot;https://issues.apache.org/jira/browse/FLINK-4557&quot;&gt;FLINK-4557&lt;/a&gt;). This feature was initially discussed in Flink
+Improvement Proposal 11: &lt;a href=&quot;https://cwiki.apache.org/confluence/display/FLINK/FLIP-11%3A+Table+API+Stream+Aggregations&quot;&gt;FLIP-11&lt;/a&gt;.&lt;/p&gt;
+
+&lt;p&gt;The release also adds support for new Table API and SQL sources and sinks, including a Kafka 0.11
+source and JDBC sink.&lt;/p&gt;
+
+&lt;p&gt;Lastly, Flink SQL now uses Apache Calcite 1.14, which was just released in October 2017
+(&lt;a href=&quot;https://issues.apache.org/jira/browse/FLINK-7051&quot;&gt;FLINK-7051&lt;/a&gt;).&lt;/p&gt;
+
+&lt;h3 id=&quot;a-significantly-improved-dependency-structure-and-reversed-class-loading&quot;&gt;A Significantly-Improved Dependency Structure and Reversed Class Loading&lt;/h3&gt;
+
+&lt;p&gt;Flink 1.4.0 shades a number of dependencies that previously caused subtle runtime conflicts, including:&lt;/p&gt;
+
+&lt;ul&gt;
+  &lt;li&gt;ASM&lt;/li&gt;
+  &lt;li&gt;Guava&lt;/li&gt;
+  &lt;li&gt;Jackson&lt;/li&gt;
+  &lt;li&gt;Netty&lt;/li&gt;
+  &lt;li&gt;Apache ZooKeeper&lt;/li&gt;
+&lt;/ul&gt;
+
+&lt;p&gt;These changes improve Flink’s overall stability and remove friction when embedding Flink or calling
+Flink “library style”.&lt;/p&gt;
+
+&lt;p&gt;The release also introduces default reversed (child-first) class loading for dynamically-loaded user
+code, allowing for different dependencies than those included in the core framework.&lt;/p&gt;
+
+&lt;p&gt;For details on those changes please check out the relevant Jira issues:&lt;/p&gt;
+
+&lt;ul&gt;
+  &lt;li&gt;&lt;a href=&quot;https://issues.apache.org/jira/browse/FLINK-7442&quot;&gt;FLINK-7442&lt;/a&gt;&lt;/li&gt;
+  &lt;li&gt;&lt;a href=&quot;https://issues.apache.org/jira/browse/FLINK-6529&quot;&gt;FLINK-6529&lt;/a&gt;&lt;/li&gt;
+&lt;/ul&gt;
+
+&lt;h3 id=&quot;hadoop-free-flink&quot;&gt;Hadoop-free Flink&lt;/h3&gt;
+
+&lt;p&gt;Apache Flink users without any Apache Hadoop dependencies can now run Flink without Hadoop. Flink
+programs that do not rely on Hadoop components can now be much smaller, which is a particular benefit
+in container-based setups and results in less network traffic and better performance.&lt;/p&gt;
+
+&lt;p&gt;This includes the addition of Flink’s own Amazon S3 filesystem implementations based on Hadoop’s S3a
+and Presto’s S3 file system with properly shaded dependencies (&lt;a href=&quot;https://issues.apache.org/jira/browse/FLINK-5706&quot;&gt;FLINK-5706&lt;/a&gt;).&lt;/p&gt;
+
+&lt;p&gt;The details of these changes regarding Hadoop-free Flink are available in the Jira issue:
+&lt;a href=&quot;https://issues.apache.org/jira/browse/FLINK-2268&quot;&gt;FLINK-2268&lt;/a&gt;.&lt;/p&gt;
+
+&lt;h3 id=&quot;improvements-to-flink-internals&quot;&gt;Improvements to Flink Internals&lt;/h3&gt;
+
+&lt;p&gt;Flink 1.4.0 introduces a new blob storage architecture that was first discussed in
+&lt;a href=&quot;https://cwiki.apache.org/confluence/display/FLINK/FLIP-19%3A+Improved+BLOB+storage+architecture&quot;&gt;Flink Improvement Proposal 19&lt;/a&gt; (&lt;a href=&quot;https://issues.apache.org/jira/browse/FLINK-6916&quot;&gt;FLINK-6916&lt;/a&gt;).&lt;/p&gt;
+
+&lt;p&gt;This will enable easier integration with both the work being done in &lt;a href=&quot;https://cwiki.apache.org/confluence/pages/viewpage.action?pageId=65147077&quot;&gt;Flink Improvement Proposal 6&lt;/a&gt; in
+the future and with other improvements in the 1.4.0 release, such as support for messages larger
+than the maximum Akka Framesize (&lt;a href=&quot;https://issues.apache.org/jira/browse/FLINK-6046&quot;&gt;FLINK-6046&lt;/a&gt;).&lt;/p&gt;
+
+&lt;p&gt;The improvement also enables Flink to leverage distributed file systems in high availability
+settings for optimized distribution of deployment data to TaskManagers.&lt;/p&gt;
+
+&lt;h3 id=&quot;improvements-to-the-queryable-state-client&quot;&gt;Improvements to the Queryable State Client&lt;/h3&gt;
+
+&lt;p&gt;Flink’s &lt;a href=&quot;https://ci.apache.org/projects/flink/flink-docs-release-1.4/dev/stream/state/queryable_state.html&quot;&gt;queryable state&lt;/a&gt; makes it possible for users to access application state directly in Flink
+before the state has been sent to an external database or key-value store.&lt;/p&gt;
+
+&lt;p&gt;Flink 1.4.0 introduces a range of improvements to the queryable state client, including a more
+container-friendly architecture, a more user-friendly API that hides configuration parameters, and
+the groundwork to be able to expose window state (the state of an in-flight window) in the future.&lt;/p&gt;
+
+&lt;p&gt;For details about the changes to queryable state please refer to the umbrella Jira issue:
+&lt;a href=&quot;https://issues.apache.org/jira/browse/FLINK-5675&quot;&gt;FLINK-5675&lt;/a&gt;.&lt;/p&gt;
+
+&lt;h3 id=&quot;metrics-and-monitoring&quot;&gt;Metrics and Monitoring&lt;/h3&gt;
+
+&lt;p&gt;Flink’s metrics system now also includes support for Prometheus, an increasingly-popular metrics and
+reporting system within the Flink community (&lt;a href=&quot;https://issues.apache.org/jira/browse/FLINK-6221&quot;&gt;FLINK-6221&lt;/a&gt;).&lt;/p&gt;
+
+&lt;p&gt;And the Apache Kafka connector in Flink now exposes metrics for failed and successful offset commits
+in the Kafka consumer callback (&lt;a href=&quot;https://issues.apache.org/jira/browse/FLINK-6998&quot;&gt;FLINK-6998&lt;/a&gt;).&lt;/p&gt;
+
+&lt;h3 id=&quot;connector-improvements-and-fixes&quot;&gt;Connector improvements and fixes&lt;/h3&gt;
+
+&lt;p&gt;Flink 1.4.0 introduces an Apache Kafka 0.11 connector and, as described above, support for an
+exactly-once producer for Kafka 0.11 (&lt;a href=&quot;https://issues.apache.org/jira/browse/FLINK-6988&quot;&gt;FLINK-6988&lt;/a&gt;).&lt;/p&gt;
+
+&lt;p&gt;Additionally, the Flink-Kafka consumer now supports dynamic partition discovery &amp;amp; topic discovery
+based on regex. This means that the Flink-Kafka consumer can pick up new Kafka partitions without
+needing to restart the job and while maintaining exactly-once guarantees
+(&lt;a href=&quot;https://issues.apache.org/jira/browse/FLINK-4022&quot;&gt;FLINK-4022&lt;/a&gt;).&lt;/p&gt;
+
+&lt;p&gt;Flink’s Apache Kinesis connector now uses an updated version of the Kinesis Consumer Library and
+the Kinesis Producer Library. This introduces improved retry logic to the connector and should
+significantly reduce the number of failures caused by Flink writing too quickly to Kinesis
+(&lt;a href=&quot;https://issues.apache.org/jira/browse/FLINK-7366&quot;&gt;FLINK-7366&lt;/a&gt;).&lt;/p&gt;
+
+&lt;p&gt;Flink’s Apache Cassandra connector now supports Scala tuples–previously, only streams of Java
+tuples were supported (&lt;a href=&quot;https://issues.apache.org/jira/browse/FLINK-4497&quot;&gt;FLINK-4497&lt;/a&gt;). Also, a bug was fixed in
+the Cassandra connector that caused messages to be lost in certain instances
+(&lt;a href=&quot;https://issues.apache.org/jira/browse/FLINK-4500&quot;&gt;FLINK-4500&lt;/a&gt;).&lt;/p&gt;
+
+&lt;h2 id=&quot;release-notes---please-read&quot;&gt;Release Notes - Please Read&lt;/h2&gt;
+
+&lt;p&gt;Some of these changes will require updating the configuration or Maven dependencies for existing
+programs. Please read below to see if you might be affected.&lt;/p&gt;
+
+&lt;h3 id=&quot;changes-to-dynamic-class-loading-of-user-code&quot;&gt;Changes to dynamic class loading of user code&lt;/h3&gt;
+
+&lt;p&gt;As mentioned above, we changed the way Flink loads user code from the previous default of
+&lt;em&gt;parent-first class loading&lt;/em&gt; (the default for Java) to &lt;em&gt;child-first classloading&lt;/em&gt;, which is a common
+practice in Java Application Servers, where this is also referred to as inverted or reversed class
+loading.&lt;/p&gt;
+
+&lt;p&gt;This should not affect regular user code but will enable programs to use a different version of
+dependencies that come with Flink – for example Akka, Netty, or Jackson. If you want to change back
+to the previous default, you can use the configuration setting &lt;code&gt;classloader.resolve-order: parent-first&lt;/code&gt;,
+the new default being &lt;code&gt;child-first&lt;/code&gt;.&lt;/p&gt;
+
+&lt;h3 id=&quot;no-more-avro-dependency-included-by-default&quot;&gt;No more Avro dependency included by default&lt;/h3&gt;
+
+&lt;p&gt;Flink previously included Avro by default so user programs could simply use Avro and not worry about
+adding any dependencies. This behavior was changed in Flink 1.4 because it can lead to dependency
+clashes.&lt;/p&gt;
+
+&lt;p&gt;You now must manually include the Avro dependency (&lt;code&gt;flink-avro&lt;/code&gt;) with your program jar (or add it to
+the Flink lib folder) if you want to use Avro.&lt;/p&gt;
+
+&lt;h3 id=&quot;hadoop-free-flink-1&quot;&gt;Hadoop-free Flink&lt;/h3&gt;
+
+&lt;p&gt;Starting with version 1.4, Flink can run without any Hadoop dependencies present in the Classpath.
+Along with simply running without Hadoop, this enables Flink to dynamically use whatever Hadoop
+version is available in the classpath.&lt;/p&gt;
+
+&lt;p&gt;You could, for example, download the Hadoop-free release of Flink but use that to run on any
+supported version of YARN, and Flink would dynamically use the Hadoop dependencies from YARN.&lt;/p&gt;
+
+&lt;p&gt;This also means that in cases where you used connectors to HDFS, such as the &lt;code&gt;BucketingSink&lt;/code&gt; or
+&lt;code&gt;RollingSink&lt;/code&gt;, you now have to ensure that you either use a Flink distribution with bundled Hadoop
+dependencies or make sure to include Hadoop dependencies when building a jar file for your
+application.&lt;/p&gt;
+
+&lt;h2 id=&quot;list-of-contributors&quot;&gt;List of Contributors&lt;/h2&gt;
+
+&lt;p&gt;According to git shortlog, the following 106 people contributed to the 1.4.0 release. Thank you to
+all contributors!&lt;/p&gt;
+
+&lt;p&gt;Ajay Tripathy, Alejandro Alcalde, Aljoscha Krettek, Bang, Phiradet, Bowen Li, Chris Ward, Cristian,
+Dan Kelley, David Anderson, Dawid Wysakowicz, Dian Fu, Dmitrii Kniazev, DmytroShkvyra, Fabian
+Hueske, FlorianFan, Fokko Driesprong, Gabor Gevay, Gary Yao, Greg Hogan, Haohui Mai, Hequn Cheng,
+James Lafa, Jark Wu, Jie Shen, Jing Fan, JingsongLi, Joerg Schad, Juan Paulo Gutierrez, Ken Geis,
+Kent Murra, Kurt Young, Lim Chee Hau, Maximilian Bode, Michael Fong, Mike Kobit, Mikhail Lipkovich,
+Nico Kruber, Novotnik, Petr, Nycholas de Oliveira e Oliveira, Patrick Lucas, Piotr Nowojski, Robert
+Metzger, Rodrigo Bonifacio, Rong Rong, Scott Kidder, Sebastian Klemke, Shuyi Chen, Stefan Richter,
+Stephan Ewen, Svend Vanderveken, Till Rohrmann, Tony Wei, Tzu-Li (Gordon) Tai, Ufuk Celebi, Usman
+Younas, Vetriselvan1187, Vishnu Viswanath, Wright, Eron, Xingcan Cui, Xpray, Yestin, Yonatan Most,
+Zhenzhong Xu, Zhijiang, adebski, asdf2014, bbayani, biao.liub, cactuslrd.lird, dawidwys, desktop,
+fengyelei, godfreyhe, gosubpl, gyao, hongyuhong, huafengw, kkloudas, kl0u, lincoln-lil,
+lingjinjiang, mengji.fy, minwenjun, mtunique, p1tz, paul, rtudoran, shaoxuan-wang, sirko
+bretschneider, sunjincheng121, tedyu, twalthr, uybhatti, wangmiao1981, yew1eb, z00376786, zentol,
+zhangminglei, zhe li, zhouhai02, zjureel, 付典, 军长, 宝牛, 淘江, 金竹&lt;/p&gt;
+
+</description>
+<pubDate>Tue, 12 Dec 2017 11:00:00 +0100</pubDate>
+<link>http://flink.apache.org/news/2017/12/12/release-1.4.0.html</link>
+<guid isPermaLink="true">/news/2017/12/12/release-1.4.0.html</guid>
+</item>
+
+<item>
 <title>Looking Ahead to Apache Flink 1.4.0 and 1.5.0</title>
 <description>&lt;p&gt;The Apache Flink 1.4.0 release is on track to happen in the next couple of weeks, and for all of the
 readers out there who haven’t been following the release discussion on &lt;a href=&quot;http://flink.apache.org/community.html#mailing-lists&quot;&gt;Flink’s developer mailing

http://git-wip-us.apache.org/repos/asf/flink-web/blob/dd8fd7e1/content/blog/index.html
----------------------------------------------------------------------
diff --git a/content/blog/index.html b/content/blog/index.html
index cecb183..1862ae1 100644
--- a/content/blog/index.html
+++ b/content/blog/index.html
@@ -142,6 +142,23 @@
     <!-- Blog posts -->
     
     <article>
+      <h2 class="blog-title"><a href="/news/2017/12/12/release-1.4.0.html">Apache Flink 1.4.0 Release Announcement</a></h2>
+
+      <p>12 Dec 2017
+       Aljoscha Krettek (<a href="https://twitter.com/aljoscha">@aljoscha</a>) &amp; Mike Winters (<a href="https://twitter.com/wints">@wints</a>)</p>
+
+      <p><p>The Apache Flink community is pleased to announce the 1.4.0 release. Over the past 5 months, the
+Flink community has been working hard to resolve more than 900 issues. See the <a href="https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12315522&amp;version=12340533">complete changelog</a>
+for more detail.</p>
+
+</p>
+
+      <p><a href="/news/2017/12/12/release-1.4.0.html">Continue reading &raquo;</a></p>
+    </article>
+
+    <hr>
+    
+    <article>
       <h2 class="blog-title"><a href="/news/2017/11/22/release-1.4-and-1.5-timeline.html">Looking Ahead to Apache Flink 1.4.0 and 1.5.0</a></h2>
 
       <p>22 Nov 2017
@@ -274,21 +291,6 @@ what’s coming in Flink 1.4.0 as well as a preview of what the Flink community
 
     <hr>
     
-    <article>
-      <h2 class="blog-title"><a href="/news/2017/03/23/release-1.1.5.html">Apache Flink 1.1.5 Released</a></h2>
-
-      <p>23 Mar 2017
-      </p>
-
-      <p><p>The Apache Flink community released the next bugfix version of the Apache Flink 1.1 series.</p>
-
-</p>
-
-      <p><a href="/news/2017/03/23/release-1.1.5.html">Continue reading &raquo;</a></p>
-    </article>
-
-    <hr>
-    
 
     <!-- Pagination links -->
     
@@ -321,6 +323,16 @@ what’s coming in Flink 1.4.0 as well as a preview of what the Flink community
 
     <ul id="markdown-toc">
       
+      <li><a href="/news/2017/12/12/release-1.4.0.html">Apache Flink 1.4.0 Release Announcement</a></li>
+
+      
+        
+      
+    
+      
+      
+
+      
       <li><a href="/news/2017/11/22/release-1.4-and-1.5-timeline.html">Looking Ahead to Apache Flink 1.4.0 and 1.5.0</a></li>
 
       

http://git-wip-us.apache.org/repos/asf/flink-web/blob/dd8fd7e1/content/blog/page2/index.html
----------------------------------------------------------------------
diff --git a/content/blog/page2/index.html b/content/blog/page2/index.html
index 598b607..2a57c9d 100644
--- a/content/blog/page2/index.html
+++ b/content/blog/page2/index.html
@@ -142,6 +142,21 @@
     <!-- Blog posts -->
     
     <article>
+      <h2 class="blog-title"><a href="/news/2017/03/23/release-1.1.5.html">Apache Flink 1.1.5 Released</a></h2>
+
+      <p>23 Mar 2017
+      </p>
+
+      <p><p>The Apache Flink community released the next bugfix version of the Apache Flink 1.1 series.</p>
+
+</p>
+
+      <p><a href="/news/2017/03/23/release-1.1.5.html">Continue reading &raquo;</a></p>
+    </article>
+
+    <hr>
+    
+    <article>
       <h2 class="blog-title"><a href="/news/2017/02/06/release-1.2.0.html">Announcing Apache Flink 1.2.0</a></h2>
 
       <p>06 Feb 2017 by Robert Metzger
@@ -271,21 +286,6 @@
 
     <hr>
     
-    <article>
-      <h2 class="blog-title"><a href="/news/2016/05/11/release-1.0.3.html">Flink 1.0.3 Released</a></h2>
-
-      <p>11 May 2016
-      </p>
-
-      <p><p>Today, the Flink community released Flink version <strong>1.0.3</strong>, the third bugfix release of the 1.0 series.</p>
-
-</p>
-
-      <p><a href="/news/2016/05/11/release-1.0.3.html">Continue reading &raquo;</a></p>
-    </article>
-
-    <hr>
-    
 
     <!-- Pagination links -->
     
@@ -318,6 +318,16 @@
 
     <ul id="markdown-toc">
       
+      <li><a href="/news/2017/12/12/release-1.4.0.html">Apache Flink 1.4.0 Release Announcement</a></li>
+
+      
+        
+      
+    
+      
+      
+
+      
       <li><a href="/news/2017/11/22/release-1.4-and-1.5-timeline.html">Looking Ahead to Apache Flink 1.4.0 and 1.5.0</a></li>
 
       

http://git-wip-us.apache.org/repos/asf/flink-web/blob/dd8fd7e1/content/blog/page3/index.html
----------------------------------------------------------------------
diff --git a/content/blog/page3/index.html b/content/blog/page3/index.html
index 0594a1f..1531870 100644
--- a/content/blog/page3/index.html
+++ b/content/blog/page3/index.html
@@ -142,6 +142,21 @@
     <!-- Blog posts -->
     
     <article>
+      <h2 class="blog-title"><a href="/news/2016/05/11/release-1.0.3.html">Flink 1.0.3 Released</a></h2>
+
+      <p>11 May 2016
+      </p>
+
+      <p><p>Today, the Flink community released Flink version <strong>1.0.3</strong>, the third bugfix release of the 1.0 series.</p>
+
+</p>
+
+      <p><a href="/news/2016/05/11/release-1.0.3.html">Continue reading &raquo;</a></p>
+    </article>
+
+    <hr>
+    
+    <article>
       <h2 class="blog-title"><a href="/news/2016/04/22/release-1.0.2.html">Flink 1.0.2 Released</a></h2>
 
       <p>22 Apr 2016
@@ -269,21 +284,6 @@
 
     <hr>
     
-    <article>
-      <h2 class="blog-title"><a href="/news/2015/11/27/release-0.10.1.html">Flink 0.10.1 released</a></h2>
-
-      <p>27 Nov 2015
-      </p>
-
-      <p><p>Today, the Flink community released the first bugfix release of the 0.10 series of Flink.</p>
-
-</p>
-
-      <p><a href="/news/2015/11/27/release-0.10.1.html">Continue reading &raquo;</a></p>
-    </article>
-
-    <hr>
-    
 
     <!-- Pagination links -->
     
@@ -316,6 +316,16 @@
 
     <ul id="markdown-toc">
       
+      <li><a href="/news/2017/12/12/release-1.4.0.html">Apache Flink 1.4.0 Release Announcement</a></li>
+
+      
+        
+      
+    
+      
+      
+
+      
       <li><a href="/news/2017/11/22/release-1.4-and-1.5-timeline.html">Looking Ahead to Apache Flink 1.4.0 and 1.5.0</a></li>
 
       

http://git-wip-us.apache.org/repos/asf/flink-web/blob/dd8fd7e1/content/blog/page4/index.html
----------------------------------------------------------------------
diff --git a/content/blog/page4/index.html b/content/blog/page4/index.html
index 29b340e..75422de 100644
--- a/content/blog/page4/index.html
+++ b/content/blog/page4/index.html
@@ -142,6 +142,21 @@
     <!-- Blog posts -->
     
     <article>
+      <h2 class="blog-title"><a href="/news/2015/11/27/release-0.10.1.html">Flink 0.10.1 released</a></h2>
+
+      <p>27 Nov 2015
+      </p>
+
+      <p><p>Today, the Flink community released the first bugfix release of the 0.10 series of Flink.</p>
+
+</p>
+
+      <p><a href="/news/2015/11/27/release-0.10.1.html">Continue reading &raquo;</a></p>
+    </article>
+
+    <hr>
+    
+    <article>
       <h2 class="blog-title"><a href="/news/2015/11/16/release-0.10.0.html">Announcing Apache Flink 0.10.0</a></h2>
 
       <p>16 Nov 2015
@@ -284,21 +299,6 @@ release is a preview release that contains known issues.</p>
 
     <hr>
     
-    <article>
-      <h2 class="blog-title"><a href="/news/2015/04/07/march-in-flink.html">March 2015 in the Flink community</a></h2>
-
-      <p>07 Apr 2015
-      </p>
-
-      <p><p>March has been a busy month in the Flink community.</p>
-
-</p>
-
-      <p><a href="/news/2015/04/07/march-in-flink.html">Continue reading &raquo;</a></p>
-    </article>
-
-    <hr>
-    
 
     <!-- Pagination links -->
     
@@ -331,6 +331,16 @@ release is a preview release that contains known issues.</p>
 
     <ul id="markdown-toc">
       
+      <li><a href="/news/2017/12/12/release-1.4.0.html">Apache Flink 1.4.0 Release Announcement</a></li>
+
+      
+        
+      
+    
+      
+      
+
+      
       <li><a href="/news/2017/11/22/release-1.4-and-1.5-timeline.html">Looking Ahead to Apache Flink 1.4.0 and 1.5.0</a></li>
 
       

http://git-wip-us.apache.org/repos/asf/flink-web/blob/dd8fd7e1/content/blog/page5/index.html
----------------------------------------------------------------------
diff --git a/content/blog/page5/index.html b/content/blog/page5/index.html
index c240d99..39d9872 100644
--- a/content/blog/page5/index.html
+++ b/content/blog/page5/index.html
@@ -142,6 +142,21 @@
     <!-- Blog posts -->
     
     <article>
+      <h2 class="blog-title"><a href="/news/2015/04/07/march-in-flink.html">March 2015 in the Flink community</a></h2>
+
+      <p>07 Apr 2015
+      </p>
+
+      <p><p>March has been a busy month in the Flink community.</p>
+
+</p>
+
+      <p><a href="/news/2015/04/07/march-in-flink.html">Continue reading &raquo;</a></p>
+    </article>
+
+    <hr>
+    
+    <article>
       <h2 class="blog-title"><a href="/news/2015/03/13/peeking-into-Apache-Flinks-Engine-Room.html">Peeking into Apache Flink's Engine Room</a></h2>
 
       <p>13 Mar 2015 by Fabian Hüske (<a href="https://twitter.com/">@fhueske</a>)
@@ -281,21 +296,6 @@ and offers a new API including definition of flexible windows.</p>
 
     <hr>
     
-    <article>
-      <h2 class="blog-title"><a href="/news/2014/09/26/release-0.6.1.html">Apache Flink 0.6.1 available</a></h2>
-
-      <p>26 Sep 2014
-      </p>
-
-      <p><p>We are happy to announce the availability of Flink 0.6.1.</p>
-
-</p>
-
-      <p><a href="/news/2014/09/26/release-0.6.1.html">Continue reading &raquo;</a></p>
-    </article>
-
-    <hr>
-    
 
     <!-- Pagination links -->
     
@@ -328,6 +328,16 @@ and offers a new API including definition of flexible windows.</p>
 
     <ul id="markdown-toc">
       
+      <li><a href="/news/2017/12/12/release-1.4.0.html">Apache Flink 1.4.0 Release Announcement</a></li>
+
+      
+        
+      
+    
+      
+      
+
+      
       <li><a href="/news/2017/11/22/release-1.4-and-1.5-timeline.html">Looking Ahead to Apache Flink 1.4.0 and 1.5.0</a></li>
 
       

http://git-wip-us.apache.org/repos/asf/flink-web/blob/dd8fd7e1/content/blog/page6/index.html
----------------------------------------------------------------------
diff --git a/content/blog/page6/index.html b/content/blog/page6/index.html
index f40addc..e5a1288 100644
--- a/content/blog/page6/index.html
+++ b/content/blog/page6/index.html
@@ -142,6 +142,21 @@
     <!-- Blog posts -->
     
     <article>
+      <h2 class="blog-title"><a href="/news/2014/09/26/release-0.6.1.html">Apache Flink 0.6.1 available</a></h2>
+
+      <p>26 Sep 2014
+      </p>
+
+      <p><p>We are happy to announce the availability of Flink 0.6.1.</p>
+
+</p>
+
+      <p><a href="/news/2014/09/26/release-0.6.1.html">Continue reading &raquo;</a></p>
+    </article>
+
+    <hr>
+    
+    <article>
       <h2 class="blog-title"><a href="/news/2014/08/26/release-0.6.html">Apache Flink 0.6 available</a></h2>
 
       <p>26 Aug 2014
@@ -191,6 +206,16 @@ academic and open source project that Flink originates from.</p>
 
     <ul id="markdown-toc">
       
+      <li><a href="/news/2017/12/12/release-1.4.0.html">Apache Flink 1.4.0 Release Announcement</a></li>
+
+      
+        
+      
+    
+      
+      
+
+      
       <li><a href="/news/2017/11/22/release-1.4-and-1.5-timeline.html">Looking Ahead to Apache Flink 1.4.0 and 1.5.0</a></li>
 
       

http://git-wip-us.apache.org/repos/asf/flink-web/blob/dd8fd7e1/content/index.html
----------------------------------------------------------------------
diff --git a/content/index.html b/content/index.html
index b152f36..c71ccc2 100644
--- a/content/index.html
+++ b/content/index.html
@@ -168,6 +168,13 @@
 
   <dl>
       
+        <dt> <a href="/news/2017/12/12/release-1.4.0.html">Apache Flink 1.4.0 Release Announcement</a></dt>
+        <dd><p>The Apache Flink community is pleased to announce the 1.4.0 release. Over the past 5 months, the
+Flink community has been working hard to resolve more than 900 issues. See the <a href="https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12315522&amp;version=12340533">complete changelog</a>
+for more detail.</p>
+
+</dd>
+      
         <dt> <a href="/news/2017/11/22/release-1.4-and-1.5-timeline.html">Looking Ahead to Apache Flink 1.4.0 and 1.5.0</a></dt>
         <dd><p>The Apache Flink 1.4.0 release is on track to happen in the next couple of weeks, and for all of the
 readers out there who haven’t been following the release discussion on <a href="http://flink.apache.org/community.html#mailing-lists">Flink’s developer mailing
@@ -188,11 +195,6 @@ what’s coming in Flink 1.4.0 as well as a preview of what the Flink community
         <dd><p>The Apache Flink community released the first bugfix version of the Apache Flink 1.3 series.</p>
 
 </dd>
-      
-        <dt> <a href="/news/2017/06/01/release-1.3.0.html">Apache Flink 1.3.0 Release Announcement</a></dt>
-        <dd><p>The Apache Flink community is pleased to announce the 1.3.0 release. Over the past 4 months, the Flink community has been working hard to resolve more than 680 issues. See the <a href="/blog/release_1.3.0-changelog.html">complete changelog</a> for more detail.</p>
-
-</dd>
     
   </dl>
 

http://git-wip-us.apache.org/repos/asf/flink-web/blob/dd8fd7e1/content/news/2017/12/12/release-1.4.0.html
----------------------------------------------------------------------
diff --git a/content/news/2017/12/12/release-1.4.0.html b/content/news/2017/12/12/release-1.4.0.html
new file mode 100644
index 0000000..9628054
--- /dev/null
+++ b/content/news/2017/12/12/release-1.4.0.html
@@ -0,0 +1,405 @@
+<!DOCTYPE html>
+<html lang="en">
+  <head>
+    <meta charset="utf-8">
+    <meta http-equiv="X-UA-Compatible" content="IE=edge">
+    <meta name="viewport" content="width=device-width, initial-scale=1">
+    <!-- The above 3 meta tags *must* come first in the head; any other head content must come *after* these tags -->
+    <title>Apache Flink: Apache Flink 1.4.0 Release Announcement</title>
+    <link rel="shortcut icon" href="/favicon.ico" type="image/x-icon">
+    <link rel="icon" href="/favicon.ico" type="image/x-icon">
+
+    <!-- Bootstrap -->
+    <link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.4/css/bootstrap.min.css">
+    <link rel="stylesheet" href="/css/flink.css">
+    <link rel="stylesheet" href="/css/syntax.css">
+
+    <!-- Blog RSS feed -->
+    <link href="/blog/feed.xml" rel="alternate" type="application/rss+xml" title="Apache Flink Blog: RSS feed" />
+
+    <!-- jQuery (necessary for Bootstrap's JavaScript plugins) -->
+    <!-- We need to load Jquery in the header for custom google analytics event tracking-->
+    <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.11.2/jquery.min.js"></script>
+
+    <!-- HTML5 shim and Respond.js for IE8 support of HTML5 elements and media queries -->
+    <!-- WARNING: Respond.js doesn't work if you view the page via file:// -->
+    <!--[if lt IE 9]>
+      <script src="https://oss.maxcdn.com/html5shiv/3.7.2/html5shiv.min.js"></script>
+      <script src="https://oss.maxcdn.com/respond/1.4.2/respond.min.js"></script>
+    <![endif]-->
+  </head>
+  <body>  
+    
+
+    <!-- Main content. -->
+    <div class="container">
+    <div class="row">
+
+      
+     <div id="sidebar" class="col-sm-3">
+          <!-- Top navbar. -->
+    <nav class="navbar navbar-default">
+        <!-- The logo. -->
+        <div class="navbar-header">
+          <button type="button" class="navbar-toggle collapsed" data-toggle="collapse" data-target="#bs-example-navbar-collapse-1">
+            <span class="icon-bar"></span>
+            <span class="icon-bar"></span>
+            <span class="icon-bar"></span>
+          </button>
+          <div class="navbar-logo">
+            <a href="/">
+              <img alt="Apache Flink" src="/img/flink-header-logo.svg" width="147px" height="73px">
+            </a>
+          </div>
+        </div><!-- /.navbar-header -->
+
+        <!-- The navigation links. -->
+        <div class="collapse navbar-collapse" id="bs-example-navbar-collapse-1">
+          <ul class="nav navbar-nav navbar-main">
+
+            <!-- Downloads -->
+            <li class=""><a class="btn btn-info" href="/downloads.html">Download Flink</a></li>
+
+            <!-- Overview -->
+            <li><a href="/index.html">Home</a></li>
+
+            <!-- Intro -->
+            <li><a href="/introduction.html">Introduction to Flink</a></li>
+
+            <!-- Use cases -->
+            <li><a href="/usecases.html">Flink Use Cases</a></li>
+
+            <!-- Powered by -->
+            <li><a href="/poweredby.html">Powered by Flink</a></li>
+
+            <!-- Ecosystem -->
+            <li><a href="/ecosystem.html">Ecosystem</a></li>
+
+            <!-- Community -->
+            <li><a href="/community.html">Community &amp; Project Info</a></li>
+
+            <!-- Contribute -->
+            <li><a href="/how-to-contribute.html">How to Contribute</a></li>
+
+            <!-- Blog -->
+            <li class=" active hidden-md hidden-sm"><a href="/blog/"><b>Flink Blog</b></a></li>
+
+            <hr />
+
+
+
+            <!-- Documentation -->
+            <!-- <li>
+              <a href="http://ci.apache.org/projects/flink/flink-docs-release-1.4" target="_blank">Documentation <small><span class="glyphicon glyphicon-new-window"></span></small></a>
+            </li> -->
+            <li class="dropdown">
+              <a class="dropdown-toggle" data-toggle="dropdown" href="#">Documentation
+                <span class="caret"></span></a>
+                <ul class="dropdown-menu">
+                  <li><a href="http://ci.apache.org/projects/flink/flink-docs-release-1.4" target="_blank">1.4 (Latest stable release) <small><span class="glyphicon glyphicon-new-window"></span></small></a></li>
+                  <li><a href="http://ci.apache.org/projects/flink/flink-docs-master" target="_blank">1.5 (Snapshot) <small><span class="glyphicon glyphicon-new-window"></span></small></a></li>
+                </ul>
+              </li>
+
+            <!-- Quickstart -->
+            <li>
+              <a href="http://ci.apache.org/projects/flink/flink-docs-release-1.4/quickstart/setup_quickstart.html" target="_blank">Quickstart <small><span class="glyphicon glyphicon-new-window"></span></small></a>
+            </li>
+
+            <!-- GitHub -->
+            <li>
+              <a href="https://github.com/apache/flink" target="_blank">Flink on GitHub <small><span class="glyphicon glyphicon-new-window"></span></small></a>
+            </li>
+
+          </ul>
+
+
+
+          <ul class="nav navbar-nav navbar-bottom">
+          <hr />
+
+            <!-- FAQ -->
+            <li ><a href="/faq.html">Project FAQ</a></li>
+
+            <!-- Twitter -->
+            <li><a href="https://twitter.com/apacheflink" target="_blank">@ApacheFlink <small><span class="glyphicon glyphicon-new-window"></span></small></a></li>
+
+            <!-- Visualizer -->
+            <li class=" hidden-md hidden-sm"><a href="/visualizer/" target="_blank">Plan Visualizer <small><span class="glyphicon glyphicon-new-window"></span></small></a></li>
+
+          </ul>
+        </div><!-- /.navbar-collapse -->
+    </nav>
+
+      </div>
+      <div class="col-sm-9">
+      <div class="row-fluid">
+  <div class="col-sm-12">
+    <div class="row">
+      <h1>Apache Flink 1.4.0 Release Announcement</h1>
+
+      <article>
+        <p>12 Dec 2017 Aljoscha Krettek (<a href="https://twitter.com/aljoscha">@aljoscha</a>) &amp; Mike Winters (<a href="https://twitter.com/wints">@wints</a>)</p>
+
+<p>The Apache Flink community is pleased to announce the 1.4.0 release. Over the past 5 months, the
+Flink community has been working hard to resolve more than 900 issues. See the <a href="https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12315522&amp;version=12340533">complete changelog</a>
+for more detail.</p>
+
+<p>This is the fifth major release in the 1.x.y series. It is API-compatible with the other 1.x.y
+releases for APIs annotated with the @Public annotation.</p>
+
+<p>We encourage everyone to download the release and check out the <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.4/">documentation</a>.</p>
+
+<p>Feedback through the <a href="http://flink.apache.org/community.html#mailing-lists">Flink mailing lists</a> is, as always, gladly encouraged!</p>
+
+<p>You can find the binaries on the updated <a href="http://flink.apache.org/downloads.html">Downloads</a> page on the Flink project site.</p>
+
+<p>The release includes improvements to many different aspects of Flink, including:</p>
+
+<ul>
+  <li>The ability to build end-to-end exactly-once applications with Flink and popular data sources and sinks such as Apache Kafka.</li>
+  <li>A more developer-friendly dependency structure as well as Hadoop-free Flink for Flink users who do not have Hadoop dependencies.</li>
+  <li>Support for JOIN and for new sources and sinks in Table API and SQL, expanding the range of logic that can be expressed with these APIs.</li>
+</ul>
+
+<p>A summary of some of the features in the release is available below.</p>
+
+<p>For more background on the Flink 1.4.0 release and the work planned for the Flink 1.5.0 release, please refer to <a href="http://flink.apache.org/news/2017/11/22/release-1.4-and-1.5-timeline.html">this blog post</a> on the Apache Flink blog.</p>
+
+<h2 id="new-features-and-improvements">New Features and Improvements</h2>
+
+<h3 id="end-to-end-exactly-once-applications-with-apache-flink-and-apache-kafka-and-twophasecommitsinkfunction">End-to-end Exactly Once Applications with Apache Flink and Apache Kafka and TwoPhaseCommitSinkFunction</h3>
+
+<p>Flink 1.4 includes a first version of an exactly-once producer for Apache Kafka 0.11. This producer
+enables developers who build Flink applications with Kafka as a data source and sink to compute
+exactly-once results not just within the Flink program, but truly “end-to-end” in the application.</p>
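+
+<p>As a rough illustration (not part of the official release notes: the topic name, schema, broker
+address, and the exact constructor variant below are placeholders that should be checked against the
+1.4 Kafka connector documentation), wiring up the new transactional producer might look like this:</p>
+
+<pre><code class="language-java">// Sketch: write a stream to Kafka 0.11 with exactly-once semantics.
+// Exactly-once sinks require checkpointing to be enabled.
+StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
+env.enableCheckpointing(5000);
+
+DataStream&lt;String&gt; events = env.fromElements("a", "b", "c");
+
+Properties producerProps = new Properties();
+producerProps.setProperty("bootstrap.servers", "localhost:9092");   // placeholder broker
+
+events.addSink(new FlinkKafkaProducer011&lt;&gt;(
+    "output-topic",                                                  // placeholder topic
+    new KeyedSerializationSchemaWrapper&lt;&gt;(new SimpleStringSchema()),
+    producerProps,
+    FlinkKafkaProducer011.Semantic.EXACTLY_ONCE));                   // transactional mode
+
+env.execute("exactly-once-kafka-sketch");
+</code></pre>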
+
+<p>The common pattern used for exactly-once applications in Kafka and in other sinks–the two-phase
+commit algorithm–has been extracted in Flink 1.4.0 into a common class, the
+TwoPhaseCommitSinkFunction (<a href="https://issues.apache.org/jira/browse/FLINK-7210">FLINK-7210</a>). This
+will make it easier for users to create their own exactly-once data sinks in the future.</p>
+
+<h3 id="table-api-and-streaming-sql-enhancements">Table API and Streaming SQL Enhancements</h3>
+
+<p>Flink SQL now supports windowed joins based on processing time and event time
+(<a href="https://issues.apache.org/jira/browse/FLINK-5725">FLINK-5725</a>). Users will be able to execute a
+join between 2 streaming tables and compute windowed results according to these 2 different concepts
+of time. The syntax and semantics in Flink are the same as standard SQL with JOIN and with Flink’s
+streaming SQL more broadly.</p>
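+
+<p>As a sketch only (assuming a <code>StreamTableEnvironment</code> named <code>tableEnv</code> with two
+registered streaming tables; the table names, fields, and the four-hour bound are hypothetical), such a
+windowed join could be expressed roughly like this:</p>
+
+<pre><code class="language-java">// Sketch: time-windowed join between two registered streaming tables.
+Table result = tableEnv.sqlQuery(
+    "SELECT o.orderId, s.shipTime " +
+    "FROM Orders o, Shipments s " +
+    "WHERE o.orderId = s.orderId " +
+    "  AND o.orderTime BETWEEN s.shipTime - INTERVAL '4' HOUR AND s.shipTime");
+</code></pre>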
+
+<p>Flink SQL also now supports “INSERT INTO SELECT” queries, which makes it possible to write results
+from SQL directly into a data sink (an external system that receives data from a Flink application).
+This improves operability and ease-of-use of Flink SQL.</p>
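+
+<p>A minimal sketch, assuming a table sink has already been registered under the hypothetical name
+<code>OutputSink</code> and a streaming table under <code>Clicks</code>:</p>
+
+<pre><code class="language-java">// Sketch: write query results straight into a registered table sink.
+tableEnv.sqlUpdate(
+    "INSERT INTO OutputSink " +
+    "SELECT userId, eventTime FROM Clicks WHERE userId IS NOT NULL");
+</code></pre>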
+
+<p>The Table API now supports aggregations on streaming tables; previously, the only supported
+operations on streaming tables were projection, selection, and union
+(<a href="https://issues.apache.org/jira/browse/FLINK-4557">FLINK-4557</a>). This feature was initially discussed in Flink
+Improvement Proposal 11: <a href="https://cwiki.apache.org/confluence/display/FLINK/FLIP-11%3A+Table+API+Stream+Aggregations">FLIP-11</a>.</p>
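+
+<p>For example, a simple non-windowed aggregation on a streaming table can now be written with the
+Table API roughly as follows (the table and field names here are placeholders):</p>
+
+<pre><code class="language-java">// Sketch: continuous aggregation on a streaming table with the Table API.
+Table clicks = tableEnv.scan("Clicks");               // previously registered streaming table
+Table counts = clicks
+    .groupBy("userId")
+    .select("userId, clickCount.sum as totalClicks"); // updated as new rows arrive
+</code></pre>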
+
+<p>The release also adds support for new Table API and SQL sources and sinks, including a Kafka 0.11
+source and JDBC sink.</p>
+
+<p>Lastly, Flink SQL now uses Apache Calcite 1.14, which was just released in October 2017
+(<a href="https://issues.apache.org/jira/browse/FLINK-7051">FLINK-7051</a>).</p>
+
+<h3 id="a-significantly-improved-dependency-structure-and-reversed-class-loading">A Significantly-Improved Dependency Structure and Reversed Class Loading</h3>
+
+<p>Flink 1.4.0 shades a number of dependencies that previously caused subtle runtime conflicts, including:</p>
+
+<ul>
+  <li>ASM</li>
+  <li>Guava</li>
+  <li>Jackson</li>
+  <li>Netty</li>
+  <li>Apache ZooKeeper</li>
+</ul>
+
+<p>These changes improve Flink’s overall stability and remove friction when embedding Flink or calling
+Flink “library style”.</p>
+
+<p>The release also introduces default reversed (child-first) class loading for dynamically-loaded user
+code, allowing for different dependencies than those included in the core framework.</p>
+
+<p>For details on those changes please check out the relevant Jira issues:</p>
+
+<ul>
+  <li><a href="https://issues.apache.org/jira/browse/FLINK-7442">FLINK-7442</a></li>
+  <li><a href="https://issues.apache.org/jira/browse/FLINK-6529">FLINK-6529</a></li>
+</ul>
+
+<h3 id="hadoop-free-flink">Hadoop-free Flink</h3>
+
+<p>Apache Flink users without any Apache Hadoop dependencies can now run Flink without Hadoop. Flink
+programs that do not rely on Hadoop components can now be much smaller, which is a particular benefit
+in container-based setups and results in less network traffic and better performance.</p>
+
+<p>This includes the addition of Flink’s own Amazon S3 filesystem implementations based on Hadoop’s S3a
+and Presto’s S3 file system with properly shaded dependencies (<a href="https://issues.apache.org/jira/browse/FLINK-5706">FLINK-5706</a>).</p>
+
+<p>The details of these changes regarding Hadoop-free Flink are available in the Jira issue:
+<a href="https://issues.apache.org/jira/browse/FLINK-2268">FLINK-2268</a>.</p>
+
+<h3 id="improvements-to-flink-internals">Improvements to Flink Internals</h3>
+
+<p>Flink 1.4.0 introduces a new blob storage architecture that was first discussed in
+<a href="https://cwiki.apache.org/confluence/display/FLINK/FLIP-19%3A+Improved+BLOB+storage+architecture">Flink Improvement Proposal 19</a> (<a href="https://issues.apache.org/jira/browse/FLINK-6916">FLINK-6916</a>).</p>
+
+<p>This will enable easier integration with both the work being done in <a href="https://cwiki.apache.org/confluence/pages/viewpage.action?pageId=65147077">Flink Improvement Proposal 6</a> in
+the future and with other improvements in the 1.4.0 release, such as support for messages larger
+than the maximum Akka Framesize (<a href="https://issues.apache.org/jira/browse/FLINK-6046">FLINK-6046</a>).</p>
+
+<p>The improvement also enables Flink to leverage distributed file systems in high availability
+settings for optimized distribution of deployment data to TaskManagers.</p>
+
+<h3 id="improvements-to-the-queryable-state-client">Improvements to the Queryable State Client</h3>
+
+<p>Flink’s <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.4/dev/stream/state/queryable_state.html">queryable state</a> makes it possible for users to access application state directly in Flink
+before the state has been sent to an external database or key-value store.</p>
+
+<p>Flink 1.4.0 introduces a range of improvements to the queryable state client, including a more
+container-friendly architecture, a more user-friendly API that hides configuration parameters, and
+the groundwork to be able to expose window state (the state of an in-flight window) in the future.</p>
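+
+<p>As a rough sketch of how the reworked client is used (host, port, job id, state name, and key below
+are placeholders; please consult the queryable state documentation for the exact API):</p>
+
+<pre><code class="language-java">// Sketch: query a value state that a job exposes under the name "query-name".
+// Only the state proxy's host and port are needed to construct the client.
+QueryableStateClient client = new QueryableStateClient("proxy-host", 9069);
+
+JobID jobId = JobID.fromHexString("00000000000000000000000000000000");  // placeholder job id
+ValueStateDescriptor&lt;Long&gt; descriptor =
+    new ValueStateDescriptor&lt;&gt;("count", BasicTypeInfo.LONG_TYPE_INFO);
+
+CompletableFuture&lt;ValueState&lt;Long&gt;&gt; future = client.getKvState(
+    jobId, "query-name", "some-key", BasicTypeInfo.STRING_TYPE_INFO, descriptor);
+
+future.thenAccept(state -&gt; {
+    try {
+        System.out.println(state.value());
+    } catch (Exception e) {
+        e.printStackTrace();
+    }
+});
+</code></pre>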
+
+<p>For details about the changes to queryable state please refer to the umbrella Jira issue:
+<a href="https://issues.apache.org/jira/browse/FLINK-5675">FLINK-5675</a>.</p>
+
+<h3 id="metrics-and-monitoring">Metrics and Monitoring</h3>
+
+<p>Flink’s metrics system now also includes support for Prometheus, an increasingly-popular metrics and
+reporting system within the Flink community (<a href="https://issues.apache.org/jira/browse/FLINK-6221">FLINK-6221</a>).</p>
+
+<p>And the Apache Kafka connector in Flink now exposes metrics for failed and successful offset commits
+in the Kafka consumer callback (<a href="https://issues.apache.org/jira/browse/FLINK-6998">FLINK-6998</a>).</p>
+
+<h3 id="connector-improvements-and-fixes">Connector improvements and fixes</h3>
+
+<p>Flink 1.4.0 introduces an Apache Kafka 0.11 connector and, as described above, support for an
+exactly-once producer for Kafka 0.11 (<a href="https://issues.apache.org/jira/browse/FLINK-6988">FLINK-6988</a>).</p>
+
+<p>Additionally, the Flink-Kafka consumer now supports dynamic partition discovery &amp; topic discovery
+based on regex. This means that the Flink-Kafka consumer can pick up new Kafka partitions without
+needing to restart the job and while maintaining exactly-once guarantees
+(<a href="https://issues.apache.org/jira/browse/FLINK-4022">FLINK-4022</a>).</p>
+
+<p>Flink’s Apache Kinesis connector now uses an updated version of the Kinesis Consumer Library and
+the Kinesis Producer Library. This introduces improved retry logic to the connector and should
+significantly reduce the number of failures caused by Flink writing too quickly to Kinesis
+(<a href="https://issues.apache.org/jira/browse/FLINK-7366">FLINK-7366</a>).</p>
+
+<p>Flink’s Apache Cassandra connector now supports Scala tuples–previously, only streams of Java
+tuples were supported (<a href="https://issues.apache.org/jira/browse/FLINK-4497">FLINK-4497</a>). Also, a bug was fixed in
+the Cassandra connector that caused messages to be lost in certain instances
+(<a href="https://issues.apache.org/jira/browse/FLINK-4500">FLINK-4500</a>).</p>
+
+<h2 id="release-notes---please-read">Release Notes - Please Read</h2>
+
+<p>Some of these changes will require updating the configuration or Maven dependencies for existing
+programs. Please read below to see if you might be affected.</p>
+
+<h3 id="changes-to-dynamic-class-loading-of-user-code">Changes to dynamic class loading of user code</h3>
+
+<p>As mentioned above, we changed the way Flink loads user code from the previous default of
+<em>parent-first class loading</em> (the default for Java) to <em>child-first classloading</em>, which is a common
+practice in Java Application Servers, where this is also referred to as inverted or reversed class
+loading.</p>
+
+<p>This should not affect regular user code but will enable programs to use a different version of
+dependencies that come with Flink – for example Akka, Netty, or Jackson. If you want to change back
+to the previous default, you can use the configuration setting <code>classloader.resolve-order: parent-first</code>,
+the new default being <code>child-first</code>.</p>
+
+<h3 id="no-more-avro-dependency-included-by-default">No more Avro dependency included by default</h3>
+
+<p>Flink previously included Avro by default so user programs could simply use Avro and not worry about
+adding any dependencies. This behavior was changed in Flink 1.4 because it can lead to dependency
+clashes.</p>
+
+<p>You now must manually include the Avro dependency (<code>flink-avro</code>) with your program jar (or add it to
+the Flink lib folder) if you want to use Avro.</p>
+
+<h3 id="hadoop-free-flink-1">Hadoop-free Flink</h3>
+
+<p>Starting with version 1.4, Flink can run without any Hadoop dependencies present in the Classpath.
+Along with simply running without Hadoop, this enables Flink to dynamically use whatever Hadoop
+version is available in the classpath.</p>
+
+<p>You could, for example, download the Hadoop-free release of Flink but use that to run on any
+supported version of YARN, and Flink would dynamically use the Hadoop dependencies from YARN.</p>
+
+<p>This also means that in cases where you used connectors to HDFS, such as the <code>BucketingSink</code> or
+<code>RollingSink</code>, you now have to ensure that you either use a Flink distribution with bundled Hadoop
+dependencies or make sure to include Hadoop dependencies when building a jar file for your
+application.</p>
+
+<h2 id="list-of-contributors">List of Contributors</h2>
+
+<p>According to git shortlog, the following 106 people contributed to the 1.4.0 release. Thank you to
+all contributors!</p>
+
+<p>Ajay Tripathy, Alejandro Alcalde, Aljoscha Krettek, Bang, Phiradet, Bowen Li, Chris Ward, Cristian,
+Dan Kelley, David Anderson, Dawid Wysakowicz, Dian Fu, Dmitrii Kniazev, DmytroShkvyra, Fabian
+Hueske, FlorianFan, Fokko Driesprong, Gabor Gevay, Gary Yao, Greg Hogan, Haohui Mai, Hequn Cheng,
+James Lafa, Jark Wu, Jie Shen, Jing Fan, JingsongLi, Joerg Schad, Juan Paulo Gutierrez, Ken Geis,
+Kent Murra, Kurt Young, Lim Chee Hau, Maximilian Bode, Michael Fong, Mike Kobit, Mikhail Lipkovich,
+Nico Kruber, Novotnik, Petr, Nycholas de Oliveira e Oliveira, Patrick Lucas, Piotr Nowojski, Robert
+Metzger, Rodrigo Bonifacio, Rong Rong, Scott Kidder, Sebastian Klemke, Shuyi Chen, Stefan Richter,
+Stephan Ewen, Svend Vanderveken, Till Rohrmann, Tony Wei, Tzu-Li (Gordon) Tai, Ufuk Celebi, Usman
+Younas, Vetriselvan1187, Vishnu Viswanath, Wright, Eron, Xingcan Cui, Xpray, Yestin, Yonatan Most,
+Zhenzhong Xu, Zhijiang, adebski, asdf2014, bbayani, biao.liub, cactuslrd.lird, dawidwys, desktop,
+fengyelei, godfreyhe, gosubpl, gyao, hongyuhong, huafengw, kkloudas, kl0u, lincoln-lil,
+lingjinjiang, mengji.fy, minwenjun, mtunique, p1tz, paul, rtudoran, shaoxuan-wang, sirko
+bretschneider, sunjincheng121, tedyu, twalthr, uybhatti, wangmiao1981, yew1eb, z00376786, zentol,
+zhangminglei, zhe li, zhouhai02, zjureel, 付典, 军长, 宝牛, 淘江, 金竹</p>
+
+
+      </article>
+    </div>
+
+    <div class="row">
+      <div id="disqus_thread"></div>
+      <script type="text/javascript">
+        /* * * CONFIGURATION VARIABLES: EDIT BEFORE PASTING INTO YOUR WEBPAGE * * */
+        var disqus_shortname = 'stratosphere-eu'; // required: replace example with your forum shortname
+
+        /* * * DON'T EDIT BELOW THIS LINE * * */
+        (function() {
+            var dsq = document.createElement('script'); dsq.type = 'text/javascript'; dsq.async = true;
+            dsq.src = '//' + disqus_shortname + '.disqus.com/embed.js';
+             (document.getElementsByTagName('head')[0] || document.getElementsByTagName('body')[0]).appendChild(dsq);
+        })();
+      </script>
+    </div>
+  </div>
+</div>
+      </div>
+    </div>
+
+    <hr />
+
+    <div class="row">
+      <div class="footer text-center col-sm-12">
+        <p>Copyright © 2014-2017 <a href="http://apache.org">The Apache Software Foundation</a>. All Rights Reserved.</p>
+        <p>Apache Flink, Flink®, Apache®, the squirrel logo, and the Apache feather logo are either registered trademarks or trademarks of The Apache Software Foundation.</p>
+        <p><a href="/privacy-policy.html">Privacy Policy</a> &middot; <a href="/blog/feed.xml">RSS feed</a></p>
+      </div>
+    </div>
+    </div><!-- /.container -->
+
+    <!-- Include all compiled plugins (below), or include individual files as needed -->
+    <script src="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.4/js/bootstrap.min.js"></script>
+    <script src="/js/codetabs.js"></script>
+    <script src="/js/stickysidebar.js"></script>
+
+
+    <!-- Google Analytics -->
+    <script>
+      (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
+      (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
+      m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
+      })(window,document,'script','//www.google-analytics.com/analytics.js','ga');
+
+      ga('create', 'UA-52545728-1', 'auto');
+      ga('send', 'pageview');
+    </script>
+  </body>
+</html>