Posted to commits@bahir.apache.org by rm...@apache.org on 2017/03/01 10:42:28 UTC

bahir-flink git commit: [BAHIR-94] Add link to "Linking with Optional Modules" in READMEs

Repository: bahir-flink
Updated Branches:
  refs/heads/master c06d9d76a -> 9f306889f


[BAHIR-94] Add link to "Linking with Optional Modules" in READMEs

This commit also attempts to improve the overall quality of the Flink
connector docs (typos, extra empty lines, etc.) for the upcoming 1.0
release.

This closes #12


Project: http://git-wip-us.apache.org/repos/asf/bahir-flink/repo
Commit: http://git-wip-us.apache.org/repos/asf/bahir-flink/commit/9f306889
Tree: http://git-wip-us.apache.org/repos/asf/bahir-flink/tree/9f306889
Diff: http://git-wip-us.apache.org/repos/asf/bahir-flink/diff/9f306889

Branch: refs/heads/master
Commit: 9f306889f3d02126aa89a5fdddf385a57adebe64
Parents: c06d9d7
Author: Tzu-Li (Gordon) Tai <tz...@apache.org>
Authored: Wed Mar 1 15:17:35 2017 +0800
Committer: Robert Metzger <rm...@apache.org>
Committed: Wed Mar 1 11:42:03 2017 +0100

----------------------------------------------------------------------
 README.md                          |  2 +-
 flink-connector-activemq/README.md |  8 +++-----
 flink-connector-akka/README.md     | 10 ++++++----
 flink-connector-flume/README.md    |  8 +++-----
 flink-connector-netty/README.md    | 28 +++++++++++++++-------------
 flink-connector-redis/README.md    |  6 ++----
 6 files changed, 30 insertions(+), 32 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/bahir-flink/blob/9f306889/README.md
----------------------------------------------------------------------
diff --git a/README.md b/README.md
index a3c5b8e..725c7b1 100644
--- a/README.md
+++ b/README.md
@@ -22,7 +22,7 @@ The community will review your changes, giving suggestions how to improve the co
 
 ## Building Bahir
 
-Bahir is built using [Apache Maven](http://maven.apache.org/).
+Bahir is built using [Apache Maven](http://maven.apache.org/)™.
 To build Bahir and its example programs, run:
 
     mvn -DskipTests clean install

http://git-wip-us.apache.org/repos/asf/bahir-flink/blob/9f306889/flink-connector-activemq/README.md
----------------------------------------------------------------------
diff --git a/flink-connector-activemq/README.md b/flink-connector-activemq/README.md
index 69f08ca..1266ad2 100644
--- a/flink-connector-activemq/README.md
+++ b/flink-connector-activemq/README.md
@@ -1,10 +1,8 @@
-# Flink ActiveMQ connector
-
+# Flink ActiveMQ Connector
 
 This connector provides a source and sink to [Apache ActiveMQ](http://activemq.apache.org/)™.
 To use this connector, add the following dependency to your project:
 
-
     <dependency>
       <groupId>org.apache.bahir</groupId>
       <artifactId>flink-connector-activemq_2.11</artifactId>
@@ -14,6 +12,6 @@ To use this connector, add the following dependency to your project:
 *Version Compatibility*: This module is compatible with ActiveMQ 5.14.0.
 
 Note that the streaming connectors are not part of the binary distribution of Flink. You need to link them into your job jar for cluster execution.
+See how to link with them for cluster execution [here](https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/linking.html).
 
-
-The source class is called `AMQSource`, the sink is `AMQSink`.
+The source class is called `AMQSource`, and the sink is `AMQSink`.

http://git-wip-us.apache.org/repos/asf/bahir-flink/blob/9f306889/flink-connector-akka/README.md
----------------------------------------------------------------------
diff --git a/flink-connector-akka/README.md b/flink-connector-akka/README.md
index a0bd7ee..cc45f51 100644
--- a/flink-connector-akka/README.md
+++ b/flink-connector-akka/README.md
@@ -1,9 +1,8 @@
-# Flink Akka connector
+# Flink Akka Connector
 
 This connector provides a sink to [Akka](http://akka.io/) source actors in an ActorSystem.
 To use this connector, add the following dependency to your project:
 
-
     <dependency>
       <groupId>org.apache.bahir</groupId>
       <artifactId>flink-connector-akka_2.11</artifactId>
@@ -11,6 +10,9 @@ To use this connector, add the following dependency to your project:
     </dependency>
     
 *Version Compatibility*: This module is compatible with Akka 2.0+.
+
+Note that the streaming connectors are not part of the binary distribution of Flink. You need to link them into your job jar for cluster execution.
+See how to link with them for cluster execution [here](https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/linking.html).
     
 ## Configuration
     
@@ -28,10 +30,10 @@ A sample configuration can be defined as follows:
     
 ## Message Types
     
-There are 3 different kind of message types which the receiver Actor in flink akka connector can receive.
+There are 3 different kinds of messages which the receiver Actor in the Flink Akka connector can receive.
     
 - message containing `Iterable<Object>` data
    
 - message containing generic `Object` data
    
-- message containing generic `Object` data and a `Timestamp` value passed as `Tuple2<Object, Long>`.   
\ No newline at end of file
+- message containing generic `Object` data and a `Timestamp` value passed as `Tuple2<Object, Long>`.

http://git-wip-us.apache.org/repos/asf/bahir-flink/blob/9f306889/flink-connector-flume/README.md
----------------------------------------------------------------------
diff --git a/flink-connector-flume/README.md b/flink-connector-flume/README.md
index 69688b2..69468ba 100644
--- a/flink-connector-flume/README.md
+++ b/flink-connector-flume/README.md
@@ -1,10 +1,8 @@
-# Flink Flume connector
+# Flink Flume Connector
 
-
-This connector provides a Sink that can send data to [Apache Flume](https://flume.apache.org/)™. To use this connector, add the
+This connector provides a sink that can send data to [Apache Flume](https://flume.apache.org/)™. To use this connector, add the
 following dependency to your project:
 
-
     <dependency>
       <groupId>org.apache.bahir</groupId>
       <artifactId>flink-connector-flume_2.11</artifactId>
@@ -14,7 +12,7 @@ following dependency to your project:
 *Version Compatibility*: This module is compatible with Flume 1.5.0.
 
 Note that the streaming connectors are not part of the binary distribution of Flink. You need to link them into your job jar for cluster execution.
-
+See how to link with them for cluster execution [here](https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/linking.html).
 
 To create a `FlumeSink` instantiate the following constructor:
 

http://git-wip-us.apache.org/repos/asf/bahir-flink/blob/9f306889/flink-connector-netty/README.md
----------------------------------------------------------------------
diff --git a/flink-connector-netty/README.md b/flink-connector-netty/README.md
index 38cf10d..eda5770 100644
--- a/flink-connector-netty/README.md
+++ b/flink-connector-netty/README.md
@@ -1,8 +1,11 @@
-#   Flink Netty Connector
+# Flink Netty Connector
 
-This connector provide tcp source and http source for receiving push data, implemented by [Netty](http://netty.io). 
+This connector provides a TCP source and an HTTP source for receiving push data, implemented with [Netty](http://netty.io).
 
-##  Data Flow
+Note that the streaming connectors are not part of the binary distribution of Flink. You need to link them into your job jar for cluster execution.
+See how to link with them for cluster execution [here](https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/linking.html).
+
+## Data Flow
 
 ```
 +-------------+      (2)    +------------------------+
@@ -17,14 +20,14 @@ This connector provide tcp source and http source for receiving push data, imple
 +--------------------+         (1)
 ```
 
-There are three component:
+There are three components:
 
-*   User System - where the data streaming come from
-*   Third Register Service - receive `Flink Netty Source`'s register request(ip and port)
+*   User System - where the data stream comes from
+*   Third Register Service - receives `Flink Netty Source`'s register request (IP and port)
 *   Flink Netty Source - a Netty server that receives pushed streaming data from `User System`
 
 
-##   Maven Dependency
+## Maven Dependency
 To use this connector, add the following dependency to your project:
 
 ```
@@ -35,7 +38,7 @@ To use this connector, add the following dependency to your project:
 </dependency>
 ```
 
-##  Usage
+## Usage
 
 *Tcp Source:*
 
@@ -43,7 +46,7 @@ To use this connector, add the following dependency to your project:
 val env = StreamExecutionEnvironment.getExecutionEnvironment
 env.addSource(new TcpReceiverSource("msg", 7070, Some("http://localhost:9090/cb")))
 ```
->paramKey:  the http query param key    
+>paramKey:  the HTTP query parameter key
 >tryPort:   try to use this port; if it is already in use, try a new port
 >callbackUrl:   register the connector's IP and port to a `Third Register Service`
 
@@ -53,13 +56,12 @@ env.addSource(new TcpReceiverSource("msg", 7070, Some("http://localhost:9090/cb"
 val env = StreamExecutionEnvironment.getExecutionEnvironment
 env.addSource(new TcpReceiverSource(7070, Some("http://localhost:9090/cb")))
 ```
->tryPort:   try to use this point, if this point is used then try a new port
+>tryPort:   try to use this port; if it is already in use, try a new port
 >callbackUrl:   register the connector's IP and port to a `Third Register Service`
 
-##  full example 
+## Full Example 
 
-There are two example for get start:
+There are two examples to get started:
 
 *   [StreamSqlExample](https://github.com/apache/bahir-flink/blob/master/flink-connector-netty/src/test/scala/org/apache/flink/streaming/connectors/netty/example/StreamSqlExample.scala)
 *   [TcpSourceExample](https://github.com/apache/bahir-flink/blob/master/flink-connector-netty/src/test/scala/org/apache/flink/streaming/connectors/netty/example/TcpSourceExample.scala)
-
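For context on the netty README's data flow: the `callbackUrl` passed to `TcpReceiverSource` is where the source reports its IP and port to a Third Register Service after binding (step (1) in the diagram). A minimal sketch of how such a register request URL could be assembled — the query parameter names `ip` and `port` here are illustrative assumptions, not the connector's actual wire format:

```java
// Sketch of the register request a Netty source could send to a
// "Third Register Service" after binding to a port. The parameter
// names "ip" and "port" are assumptions for illustration only.
public class RegisterUrl {
    public static String build(String callbackUrl, String ip, int port) {
        // Append the bound address as query parameters to the callback URL.
        return callbackUrl + "?ip=" + ip + "&port=" + port;
    }

    public static void main(String[] args) {
        // e.g. the callbackUrl from the README's TcpReceiverSource example
        System.out.println(build("http://localhost:9090/cb", "10.0.0.5", 7070));
        // → http://localhost:9090/cb?ip=10.0.0.5&port=7070
    }
}
```

The connector itself handles this registration internally; the sketch only illustrates the handshake the diagram describes.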

http://git-wip-us.apache.org/repos/asf/bahir-flink/blob/9f306889/flink-connector-redis/README.md
----------------------------------------------------------------------
diff --git a/flink-connector-redis/README.md b/flink-connector-redis/README.md
index cd4cf35..0748a92 100644
--- a/flink-connector-redis/README.md
+++ b/flink-connector-redis/README.md
@@ -1,11 +1,9 @@
-# Flink Redis connector
-
+# Flink Redis Connector
 
 This connector provides a Sink that can write to [Redis](http://redis.io/) and also can publish data 
 to [Redis PubSub](http://redis.io/topics/pubsub). To use this connector, add the
 following dependency to your project:
 
-
     <dependency>
       <groupId>org.apache.bahir</groupId>
       <artifactId>flink-connector-redis_2.11</artifactId>
@@ -15,7 +13,7 @@ following dependency to your project:
 *Version Compatibility*: This module is compatible with Redis 2.8.5.
 
 Note that the streaming connectors are not part of the binary distribution of Flink. You need to link them into your job jar for cluster execution.
-
+See how to link with them for cluster execution [here](https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/linking.html).
 
 ## Installing Redis