Posted to commits@bahir.apache.org by lr...@apache.org on 2017/07/17 17:20:00 UTC

[4/4] bahir-website git commit: Update Bahir Flink extension documentation

Update Bahir Flink extension documentation

Add documentation for release 1.0
Update the current documentation with the latest content


Project: http://git-wip-us.apache.org/repos/asf/bahir-website/repo
Commit: http://git-wip-us.apache.org/repos/asf/bahir-website/commit/6d16a5c3
Tree: http://git-wip-us.apache.org/repos/asf/bahir-website/tree/6d16a5c3
Diff: http://git-wip-us.apache.org/repos/asf/bahir-website/diff/6d16a5c3

Branch: refs/heads/master
Commit: 6d16a5c37065c70c06953838f0d76469926967a0
Parents: 21c1d62
Author: Luciano Resende <lr...@apache.org>
Authored: Mon Jul 17 10:18:45 2017 -0700
Committer: Luciano Resende <lr...@apache.org>
Committed: Mon Jul 17 10:18:45 2017 -0700

----------------------------------------------------------------------
 site/docs/flink/1.0/documentation.md            |  42 +++++
 site/docs/flink/1.0/flink-streaming-activemq.md |  44 +++++
 site/docs/flink/1.0/flink-streaming-akka.md     |  66 +++++++
 site/docs/flink/1.0/flink-streaming-flume.md    |  46 +++++
 site/docs/flink/1.0/flink-streaming-netty.md    |  94 ++++++++++
 site/docs/flink/1.0/flink-streaming-redis.md    | 176 +++++++++++++++++++
 .../docs/flink/current/flink-streaming-redis.md |   5 +-
 site/docs/flink/overview.md                     |   2 +-
 8 files changed, 473 insertions(+), 2 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/bahir-website/blob/6d16a5c3/site/docs/flink/1.0/documentation.md
----------------------------------------------------------------------
diff --git a/site/docs/flink/1.0/documentation.md b/site/docs/flink/1.0/documentation.md
new file mode 100644
index 0000000..55b557f
--- /dev/null
+++ b/site/docs/flink/1.0/documentation.md
@@ -0,0 +1,42 @@
+---
+layout: page
+title: Extensions for Apache Flink (1.0)
+description: Extensions for Apache Flink (1.0)
+group: nav-right
+---
+<!--
+{% comment %}
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements.  See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to you under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License.  You may obtain a copy of the License at
+
+http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+{% endcomment %}
+-->
+
+{% include JB/setup %}
+
+### Apache Bahir Extensions for Apache Flink
+
+<br/>
+
+#### Streaming Connectors
+
+[ActiveMQ connector](../flink-streaming-activemq)
+
+[Akka connector](../flink-streaming-akka)
+
+[Flume connector](../flink-streaming-flume)
+
+[Netty connector](../flink-streaming-netty)
+
+[Redis connector](../flink-streaming-redis)

http://git-wip-us.apache.org/repos/asf/bahir-website/blob/6d16a5c3/site/docs/flink/1.0/flink-streaming-activemq.md
----------------------------------------------------------------------
diff --git a/site/docs/flink/1.0/flink-streaming-activemq.md b/site/docs/flink/1.0/flink-streaming-activemq.md
new file mode 100644
index 0000000..47e69b1
--- /dev/null
+++ b/site/docs/flink/1.0/flink-streaming-activemq.md
@@ -0,0 +1,44 @@
+---
+layout: page
+title: Apache Flink Streaming Connector for ActiveMQ
+description: Apache Flink Streaming Connector for ActiveMQ
+group: nav-right
+---
+<!--
+{% comment %}
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements.  See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to you under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License.  You may obtain a copy of the License at
+
+http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+{% endcomment %}
+-->
+
+{% include JB/setup %}
+
+# Flink ActiveMQ Connector
+
+This connector provides a source and sink to [Apache ActiveMQ](http://activemq.apache.org/)™.
+To use this connector, add the following dependency to your project:
+
+    <dependency>
+      <groupId>org.apache.bahir</groupId>
+      <artifactId>flink-connector-activemq_2.11</artifactId>
+      <version>1.0</version>
+    </dependency>
+
+*Version Compatibility*: This module is compatible with ActiveMQ 5.14.0.
+
+Note that the streaming connectors are not part of the binary distribution of Flink. You need to link them into your job jar for cluster execution.
+See how to link with them for cluster execution [here](https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/linking.html).
+
+The source class is called `AMQSource`, and the sink is `AMQSink`.
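+
+A minimal usage sketch is shown below. The builder classes and setter names
+(`AMQSourceConfig.AMQSourceConfigBuilder`, `AMQSinkConfig.AMQSinkConfigBuilder`, and their methods)
+are assumptions rather than confirmed API, so verify them against the connector's Javadoc before use:
+
+    // Sketch only: the config builders below are assumed, not verified API.
+    ActiveMQConnectionFactory connectionFactory =
+        new ActiveMQConnectionFactory("tcp://localhost:61616");
+
+    AMQSourceConfig<String> sourceConfig = new AMQSourceConfig.AMQSourceConfigBuilder<String>()
+        .setConnectionFactory(connectionFactory)
+        .setDestinationName("source-queue")
+        .setDeserializationSchema(new SimpleStringSchema())
+        .build();
+
+    StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
+    DataStream<String> stream = env.addSource(new AMQSource<>(sourceConfig));
+
+    AMQSinkConfig<String> sinkConfig = new AMQSinkConfig.AMQSinkConfigBuilder<String>()
+        .setConnectionFactory(connectionFactory)
+        .setDestinationName("sink-queue")
+        .setSerializationSchema(new SimpleStringSchema())
+        .build();
+
+    stream.addSink(new AMQSink<>(sinkConfig));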

http://git-wip-us.apache.org/repos/asf/bahir-website/blob/6d16a5c3/site/docs/flink/1.0/flink-streaming-akka.md
----------------------------------------------------------------------
diff --git a/site/docs/flink/1.0/flink-streaming-akka.md b/site/docs/flink/1.0/flink-streaming-akka.md
new file mode 100644
index 0000000..93dea74
--- /dev/null
+++ b/site/docs/flink/1.0/flink-streaming-akka.md
@@ -0,0 +1,66 @@
+---
+layout: page
+title: Apache Flink Streaming Connector for Akka
+description: Apache Flink Streaming Connector for Akka
+group: nav-right
+---
+<!--
+{% comment %}
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements.  See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to you under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License.  You may obtain a copy of the License at
+
+http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+{% endcomment %}
+-->
+
+{% include JB/setup %}
+
+# Flink Akka Connector
+
+This connector provides a source that receives data from [Akka](http://akka.io/) feeder (source) actors in an ActorSystem.
+To use this connector, add the following dependency to your project:
+
+    <dependency>
+      <groupId>org.apache.bahir</groupId>
+      <artifactId>flink-connector-akka_2.11</artifactId>
+      <version>1.0</version>
+    </dependency>
+
+*Version Compatibility*: This module is compatible with Akka 2.0+.
+
+Note that the streaming connectors are not part of the binary distribution of Flink. You need to link them into your job jar for cluster execution.
+See how to link with them for cluster execution [here](https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/linking.html).
+
+## Configuration
+
+The configuration for the receiver actor system in the Flink Akka connector is created using the standard Typesafe `Config` (`com.typesafe.config.Config`) object.
+
+To enable acknowledgements, the custom configuration `akka.remote.auto-ack` can be used.
+
+The user can set any of the default configurations allowed by Akka as well as custom configurations allowed by the connector.
+
+A sample configuration can be defined as follows:
+
+    String configFile = getClass().getClassLoader()
+          .getResource("feeder_actor.conf").getFile();
+    Config config = ConfigFactory.parseFile(new File(configFile));    
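+
+The resulting `Config` can then be passed to the connector's source when it is added to a job.
+The sketch below assumes an `AkkaSource(receiverActorName, feederActorUrl, config)` constructor;
+treat the argument list as an assumption and check the connector's Javadoc:
+
+    // Sketch only: the AkkaSource constructor arguments are assumed, not verified API.
+    StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
+
+    DataStream<Object> stream = env.addSource(
+        new AkkaSource("receiverActor",
+            "akka.tcp://feederActorSystem@127.0.0.1:5150/user/feederActor",
+            config));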
+
+## Message Types
+
+There are three different kinds of messages that the receiver actor in the Flink Akka connector can receive:
+
+- message containing `Iterable<Object>` data
+
+- message containing generic `Object` data
+
+- message containing generic `Object` data and a `Timestamp` value passed as `Tuple2<Object, Long>`.

http://git-wip-us.apache.org/repos/asf/bahir-website/blob/6d16a5c3/site/docs/flink/1.0/flink-streaming-flume.md
----------------------------------------------------------------------
diff --git a/site/docs/flink/1.0/flink-streaming-flume.md b/site/docs/flink/1.0/flink-streaming-flume.md
new file mode 100644
index 0000000..3f4b471
--- /dev/null
+++ b/site/docs/flink/1.0/flink-streaming-flume.md
@@ -0,0 +1,46 @@
+---
+layout: page
+title: Apache Flink Streaming Connector for Apache Flume
+description: Apache Flink Streaming Connector for Apache Flume
+group: nav-right
+---
+<!--
+{% comment %}
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements.  See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to you under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License.  You may obtain a copy of the License at
+
+http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+{% endcomment %}
+-->
+
+{% include JB/setup %}
+
+# Flink Flume Connector
+
+This connector provides a sink that can send data to [Apache Flume](https://flume.apache.org/)™. To use this connector, add the
+following dependency to your project:
+
+    <dependency>
+      <groupId>org.apache.bahir</groupId>
+      <artifactId>flink-connector-flume_2.11</artifactId>
+      <version>1.0</version>
+    </dependency>
+
+*Version Compatibility*: This module is compatible with Flume 1.5.0.
+
+Note that the streaming connectors are not part of the binary distribution of Flink. You need to link them into your job jar for cluster execution.
+See how to link with them for cluster execution [here](https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/linking.html).
+
+To create a `FlumeSink`, instantiate the following constructor:
+
+    FlumeSink(String host, int port, SerializationSchema<IN> schema)
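+
+For example, a sink built with this constructor can be attached to a stream as follows
+(the host, port, and use of `SimpleStringSchema` are illustrative placeholders):
+
+    StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
+
+    DataStream<String> stream = env.fromElements("event-1", "event-2", "event-3");
+
+    // "localhost" and 4141 are placeholder values for a Flume Avro source;
+    // SimpleStringSchema implements SerializationSchema<String>.
+    stream.addSink(new FlumeSink<>("localhost", 4141, new SimpleStringSchema()));
+
+    env.execute("Flume sink example");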

http://git-wip-us.apache.org/repos/asf/bahir-website/blob/6d16a5c3/site/docs/flink/1.0/flink-streaming-netty.md
----------------------------------------------------------------------
diff --git a/site/docs/flink/1.0/flink-streaming-netty.md b/site/docs/flink/1.0/flink-streaming-netty.md
new file mode 100644
index 0000000..1942159
--- /dev/null
+++ b/site/docs/flink/1.0/flink-streaming-netty.md
@@ -0,0 +1,94 @@
+---
+layout: page
+title: Apache Flink Streaming Connector for Netty
+description: Apache Flink Streaming Connector for Netty
+group: nav-right
+---
+<!--
+{% comment %}
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements.  See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to you under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License.  You may obtain a copy of the License at
+
+http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+{% endcomment %}
+-->
+
+{% include JB/setup %}
+
+# Flink Netty Connector
+
+This connector provides a TCP source and an HTTP source for receiving push data, implemented with [Netty](http://netty.io).
+
+Note that the streaming connectors are not part of the binary distribution of Flink. You need to link them into your job jar for cluster execution.
+See how to link with them for cluster execution [here](https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/linking.html).
+
+## Data Flow
+
+```
++-------------+      (2)    +------------------------+
+| user system |    <-----   | Third Register Service |           
++-------------+             +------------------------+
+       |                                ^
+       | (3)                            |
+       |                                |
+       V                                |
++--------------------+                  |
+| Flink Netty Source |  ----------------+
++--------------------+         (1)
+```
+
+There are three components:
+
+*   User System - where the data stream comes from
+*   Third Register Service - receives `Flink Netty Source`'s registration request (ip and port)
+*   Flink Netty Source - a Netty server that receives streaming data pushed from `User System`
+
+
+## Maven Dependency
+To use this connector, add the following dependency to your project:
+
+```
+<dependency>
+  <groupId>org.apache.bahir</groupId>
+  <artifactId>flink-connector-netty_2.11</artifactId>
+  <version>1.0</version>
+</dependency>
+```
+
+## Usage
+
+*TCP Source:*
+
+```
+val env = StreamExecutionEnvironment.getExecutionEnvironment
+env.addSource(new TcpReceiverSource(7070, Some("http://localhost:9090/cb")))
+```
+>tryPort:   try to use this port, if this port is already in use then try a new port
+>callbackUrl:   register the connector's ip and port with a `Third Register Service`
+
+*HTTP Source:*
+
+```
+val env = StreamExecutionEnvironment.getExecutionEnvironment
+env.addSource(new HttpReceiverSource("msg", 7070, Some("http://localhost:9090/cb")))
+```
+>paramKey:   the http query param key
+>tryPort:   try to use this port, if this port is already in use then try a new port
+>callbackUrl:   register the connector's ip and port with a `Third Register Service`
+
+## Full Example
+
+There are two examples to get started:
+
+*   [StreamSqlExample](https://github.com/apache/bahir-flink/blob/master/flink-connector-netty/src/test/scala/org/apache/flink/streaming/connectors/netty/example/StreamSqlExample.scala)
+*   [TcpSourceExample](https://github.com/apache/bahir-flink/blob/master/flink-connector-netty/src/test/scala/org/apache/flink/streaming/connectors/netty/example/TcpSourceExample.scala)

http://git-wip-us.apache.org/repos/asf/bahir-website/blob/6d16a5c3/site/docs/flink/1.0/flink-streaming-redis.md
----------------------------------------------------------------------
diff --git a/site/docs/flink/1.0/flink-streaming-redis.md b/site/docs/flink/1.0/flink-streaming-redis.md
new file mode 100644
index 0000000..5b551e2
--- /dev/null
+++ b/site/docs/flink/1.0/flink-streaming-redis.md
@@ -0,0 +1,176 @@
+---
+layout: page
+title: Apache Flink Streaming Connector for Redis
+description: Apache Flink Streaming Connector for Redis
+group: nav-right
+---
+<!--
+{% comment %}
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements.  See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to you under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License.  You may obtain a copy of the License at
+
+http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+{% endcomment %}
+-->
+
+{% include JB/setup %}
+
+# Flink Redis Connector
+
+This connector provides a sink that can write to [Redis](http://redis.io/) and can also publish data
+to [Redis PubSub](http://redis.io/topics/pubsub). To use this connector, add the
+following dependency to your project:
+
+    <dependency>
+      <groupId>org.apache.bahir</groupId>
+      <artifactId>flink-connector-redis_2.11</artifactId>
+      <version>1.0</version>
+    </dependency>
+
+*Version Compatibility*: This module is compatible with Redis 2.8.5.
+
+Note that the streaming connectors are not part of the binary distribution of Flink. You need to link them into your job jar for cluster execution.
+See how to link with them for cluster execution [here](https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/linking.html).
+
+## Installing Redis
+
+Follow the instructions from the [Redis download page](http://redis.io/download).
+
+
+## Redis Sink
+
+The Redis sink class provides an interface for sending data to Redis.
+The sink can use three different methods for communicating with different types of Redis environments:
+
+1. Single Redis Server
+2. Redis Cluster
+3. Redis Sentinel
+
+This code shows how to create a sink that communicates with a single Redis server:
+
+**Java:**
+
+
+    public static class RedisExampleMapper implements RedisMapper<Tuple2<String, String>>{
+
+        @Override
+        public RedisCommandDescription getCommandDescription() {
+            return new RedisCommandDescription(RedisCommand.HSET, "HASH_NAME");
+        }
+
+        @Override
+        public String getKeyFromData(Tuple2<String, String> data) {
+            return data.f0;
+        }
+
+        @Override
+        public String getValueFromData(Tuple2<String, String> data) {
+            return data.f1;
+        }
+    }
+    FlinkJedisPoolConfig conf = new FlinkJedisPoolConfig.Builder().setHost("127.0.0.1").build();
+
+    DataStream<Tuple2<String, String>> stream = ...;
+    stream.addSink(new RedisSink<Tuple2<String, String>>(conf, new RedisExampleMapper()));
+
+
+
+**Scala:**
+
+    class RedisExampleMapper extends RedisMapper[(String, String)]{
+      override def getCommandDescription: RedisCommandDescription = {
+        new RedisCommandDescription(RedisCommand.HSET, "HASH_NAME")
+      }
+
+      override def getKeyFromData(data: (String, String)): String = data._1
+
+      override def getValueFromData(data: (String, String)): String = data._2
+    }
+    val conf = new FlinkJedisPoolConfig.Builder().setHost("127.0.0.1").build()
+    stream.addSink(new RedisSink[(String, String)](conf, new RedisExampleMapper))
+
+
+
+This example code does the same, but for Redis Cluster:
+
+**Java:**
+
+    FlinkJedisClusterConfig conf = new FlinkJedisClusterConfig.Builder()
+        .setNodes(new HashSet<InetSocketAddress>(Arrays.asList(new InetSocketAddress(5601)))).build();
+
+    DataStream<Tuple2<String, String>> stream = ...;
+    stream.addSink(new RedisSink<Tuple2<String, String>>(conf, new RedisExampleMapper()));
+
+**Scala:**
+
+
+    val conf = new FlinkJedisClusterConfig.Builder().setNodes(...).build()
+    stream.addSink(new RedisSink[(String, String)](conf, new RedisExampleMapper))
+
+
+This example shows how to configure the sink when the Redis environment uses Sentinels:
+
+**Java:**
+
+    FlinkJedisSentinelConfig conf = new FlinkJedisSentinelConfig.Builder()
+        .setMasterName("master").setSentinels(...).build();
+
+    DataStream<Tuple2<String, String>> stream = ...;
+    stream.addSink(new RedisSink<Tuple2<String, String>>(conf, new RedisExampleMapper()));
+
+
+**Scala:**
+
+    val conf = new FlinkJedisSentinelConfig.Builder().setMasterName("master").setSentinels(...).build()
+    stream.addSink(new RedisSink[(String, String)](conf, new RedisExampleMapper))
+
+
+This section describes the available data types and the Redis command used for each.
+
+<table class="table table-bordered" style="width: 75%">
+    <thead>
+        <tr>
+          <th class="text-center" style="width: 20%">Data Type</th>
+          <th class="text-center" style="width: 25%">Redis Command [Sink]</th>
+        </tr>
+      </thead>
+      <tbody>
+        <tr>
+            <td>HASH</td><td><a href="http://redis.io/commands/hset">HSET</a></td>
+        </tr>
+        <tr>
+            <td>LIST</td><td>
+                <a href="http://redis.io/commands/rpush">RPUSH</a>,
+                <a href="http://redis.io/commands/lpush">LPUSH</a>
+            </td>
+        </tr>
+        <tr>
+            <td>SET</td><td><a href="http://redis.io/commands/sadd">SADD</a></td>
+        </tr>
+        <tr>
+            <td>PUBSUB</td><td><a href="http://redis.io/commands/publish">PUBLISH</a></td>
+        </tr>
+        <tr>
+            <td>STRING</td><td><a href="http://redis.io/commands/set">SET</a></td>
+        </tr>
+        <tr>
+            <td>HYPER_LOG_LOG</td><td><a href="http://redis.io/commands/pfadd">PFADD</a></td>
+        </tr>
+        <tr>
+            <td>SORTED_SET</td><td><a href="http://redis.io/commands/zadd">ZADD</a></td>
+        </tr>
+        <tr>
+            <td>SORTED_SET</td><td><a href="http://redis.io/commands/zrem">ZREM</a></td>
+        </tr>
+      </tbody>
+</table>
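+
+For commands that do not require the additional key argument (for example `LPUSH` on a LIST),
+the single-argument `RedisCommandDescription` constructor can be used in the mapper. The mapper
+below is a hypothetical variant of the `RedisExampleMapper` above, shown as a sketch:
+
+    public static class RedisListMapper implements RedisMapper<Tuple2<String, String>> {
+
+        @Override
+        public RedisCommandDescription getCommandDescription() {
+            // LIST commands do not need an additional key
+            return new RedisCommandDescription(RedisCommand.LPUSH);
+        }
+
+        @Override
+        public String getKeyFromData(Tuple2<String, String> data) {
+            return data.f0;   // used as the Redis list key
+        }
+
+        @Override
+        public String getValueFromData(Tuple2<String, String> data) {
+            return data.f1;   // value pushed onto the list
+        }
+    }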

http://git-wip-us.apache.org/repos/asf/bahir-website/blob/6d16a5c3/site/docs/flink/current/flink-streaming-redis.md
----------------------------------------------------------------------
diff --git a/site/docs/flink/current/flink-streaming-redis.md b/site/docs/flink/current/flink-streaming-redis.md
index de306c5..01db8c9 100644
--- a/site/docs/flink/current/flink-streaming-redis.md
+++ b/site/docs/flink/current/flink-streaming-redis.md
@@ -168,6 +168,9 @@ This section gives a description of all the available data types and what Redis
         </tr>
         <tr>
             <td>SORTED_SET</td><td><a href="http://redis.io/commands/zadd">ZADD</a></td>
-        </tr>                
+        </tr>
+        <tr>
+            <td>SORTED_SET</td><td><a href="http://redis.io/commands/zrem">ZREM</a></td>
+        </tr>
       </tbody>
 </table>

http://git-wip-us.apache.org/repos/asf/bahir-website/blob/6d16a5c3/site/docs/flink/overview.md
----------------------------------------------------------------------
diff --git a/site/docs/flink/overview.md b/site/docs/flink/overview.md
index e42f12c..86a4b60 100644
--- a/site/docs/flink/overview.md
+++ b/site/docs/flink/overview.md
@@ -28,4 +28,4 @@ limitations under the License.
 ### Apache Bahir Extensions for Apache Flink
 
  - [Current - 1.0-SNAPSHOT](/docs/flink/current/documentation)
- 
\ No newline at end of file
+ - [1.0](/docs/flink/1.0/documentation)