Posted to commits@flink.apache.org by mb...@apache.org on 2015/06/15 11:32:58 UTC
[08/27] flink git commit: [storm-compat] Added README files to Storm compatibility modules
Project: http://git-wip-us.apache.org/repos/asf/flink/repo
Commit: http://git-wip-us.apache.org/repos/asf/flink/commit/9ff3cf06
Tree: http://git-wip-us.apache.org/repos/asf/flink/tree/9ff3cf06
Diff: http://git-wip-us.apache.org/repos/asf/flink/diff/9ff3cf06
Branch: refs/heads/master
Commit: 9ff3cf06c187207248db2a8a5a19e2b9c39f3549
Parents: fa13e49
Author: mjsax <mj...@informatik.hu-berlin.de>
Authored: Thu May 14 13:04:01 2015 +0200
Committer: mbalassi <mb...@apache.org>
Committed: Sun Jun 14 22:59:46 2015 +0200
----------------------------------------------------------------------
.../flink-storm-compatibility/README.md | 15 +++++++++++++++
.../flink-streaming/flink-storm-examples/README.md | 15 +++++++++++++++
2 files changed, 30 insertions(+)
----------------------------------------------------------------------
http://git-wip-us.apache.org/repos/asf/flink/blob/9ff3cf06/flink-staging/flink-streaming/flink-storm-compatibility/README.md
----------------------------------------------------------------------
diff --git a/flink-staging/flink-streaming/flink-storm-compatibility/README.md b/flink-staging/flink-streaming/flink-storm-compatibility/README.md
new file mode 100644
index 0000000..0d490a3
--- /dev/null
+++ b/flink-staging/flink-streaming/flink-storm-compatibility/README.md
@@ -0,0 +1,15 @@
+# flink-storm-compatibility
+
+The Storm compatibility layer allows you to embed Storm spouts or bolts unmodified within a regular Flink streaming program (`StormSpoutWrapper` and `StormBoltWrapper`). Additionally, a whole Storm topology can be submitted to Flink (see `FlinkTopologyBuilder`, `FlinkLocalCluster`, and `FlinkSubmitter`). Only a few minor changes to the original submitting code are required; the code that builds the topology itself can be reused unmodified. See `flink-storm-examples` for a simple word-count example.
+
+The following Storm features are not (yet/fully) supported by the compatibility layer:
+* spout/bolt configuration within `open()`/`prepare()` is not yet supported (i.e., the `Map conf` parameter)
+* topology and tuple meta information (i.e., `TopologyContext` is not fully supported)
+* access to tuple attributes (i.e., fields) is only possible by index (access by name is coming)
+* only the default stream is currently supported (i.e., only a single output stream)
+* no fault-tolerance guarantees (i.e., calls to `ack()`/`fail()` and anchoring are ignored)
+* for whole Storm topologies, the following is not supported by Flink:
+ * direct emit connection pattern
+ * activating/deactivating and rebalancing of topologies
+ * task hooks
+ * custom metrics
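The embedding described in the README above could look roughly like the following sketch. This is a minimal illustration, not code from the commit: `MySentenceSpout` is a hypothetical Storm `IRichSpout` implementation, and the exact `StormSpoutWrapper` constructor and package are assumptions based on the class names the README mentions.

```java
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.stormcompatibility.wrappers.StormSpoutWrapper;

public class EmbeddedSpoutSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.getExecutionEnvironment();

        // Wrap an unmodified Storm spout so it acts as a Flink source.
        // MySentenceSpout is a placeholder for any IRichSpout.
        DataStream<String> sentences = env.addSource(
                new StormSpoutWrapper<String>(new MySentenceSpout()));

        // From here on, regular Flink streaming operators apply.
        sentences.print();
        env.execute("Embedded Storm spout sketch");
    }
}
```

The same pattern applies to bolts: a `StormBoltWrapper` would plug an unmodified bolt into a Flink `DataStream` transformation.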
http://git-wip-us.apache.org/repos/asf/flink/blob/9ff3cf06/flink-staging/flink-streaming/flink-storm-examples/README.md
----------------------------------------------------------------------
diff --git a/flink-staging/flink-streaming/flink-storm-examples/README.md b/flink-staging/flink-streaming/flink-storm-examples/README.md
new file mode 100644
index 0000000..a4d8885
--- /dev/null
+++ b/flink-staging/flink-streaming/flink-storm-examples/README.md
@@ -0,0 +1,15 @@
+# flink-storm-examples
+
+This module contains three versions of a simple word-count example to illustrate the usage of the compatibility layer:
+* the usage of spouts or bolts within a regular Flink streaming program (i.e., embedded spouts or bolts)
+ 1. `SpoutSourceWordCount` uses a spout as data source within a Flink streaming program
+ 2. `BoltTokenizerWordCount` uses a bolt to split sentences into words within a Flink streaming program
+* how to submit a whole Storm topology to Flink
+ 3. `WordCountTopology` plugs a Storm topology together
+ * `StormWordCountLocal` submits the topology to a local Flink cluster (similar to a `LocalCluster` in Storm)
+ * `StormWordCountRemoteByClient` submits the topology to a remote Flink cluster (similar to the usage of `NimbusClient` in Storm)
+ * `StormWordCountRemoteBySubmitter` submits the topology to a remote Flink cluster (similar to the usage of `StormSubmitter` in Storm)
+
+Additionally, this module packages the three example word-count programs as JAR files that can be submitted to a Flink cluster via `bin/flink run example.jar`.
+
+The package `org.apache.flink.stormcompatibility.stormoperators` contains the original Storm spouts and bolts that can be used unmodified within Storm or Flink.
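The local-submission variant listed above (`StormWordCountLocal`) follows a pattern like the sketch below. Only the `FlinkTopologyBuilder` and `FlinkLocalCluster` class names come from the README; the spout/bolt classes are placeholders, and the builder/cluster methods are assumed to mirror Storm's `TopologyBuilder` and `LocalCluster` API, as the README says the topology-building code can be reused unmodified.

```java
import org.apache.flink.stormcompatibility.api.FlinkLocalCluster;
import org.apache.flink.stormcompatibility.api.FlinkTopologyBuilder;

public class LocalTopologySketch {
    public static void main(String[] args) throws Exception {
        // Build the topology exactly as in Storm; only the builder class differs.
        FlinkTopologyBuilder builder = new FlinkTopologyBuilder();
        builder.setSpout("source", new MySentenceSpout());
        builder.setBolt("tokenizer", new MyTokenizerBolt())
               .shuffleGrouping("source");

        // Submit to an in-process Flink cluster, Storm's LocalCluster counterpart.
        FlinkLocalCluster cluster = FlinkLocalCluster.getLocalCluster();
        cluster.submitTopology("word-count", null, builder.createTopology());
    }
}
```

For remote submission, the README's `StormWordCountRemoteBySubmitter` would replace the local cluster with `FlinkSubmitter`, keeping the topology-building code unchanged.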