Posted to commits@bahir.apache.org by lr...@apache.org on 2016/08/03 19:12:33 UTC

[1/4] bahir git commit: [BAHIR-28] Add basic documentation for Akka connector

Repository: bahir
Updated Branches:
  refs/heads/master 5e07303c6 -> 29d8c7622


[BAHIR-28] Add basic documentation for Akka connector


Project: http://git-wip-us.apache.org/repos/asf/bahir/repo
Commit: http://git-wip-us.apache.org/repos/asf/bahir/commit/858ad27a
Tree: http://git-wip-us.apache.org/repos/asf/bahir/tree/858ad27a
Diff: http://git-wip-us.apache.org/repos/asf/bahir/diff/858ad27a

Branch: refs/heads/master
Commit: 858ad27ad2766ab85e8c41e0cfa45162e9f7e308
Parents: 5e07303
Author: Luciano Resende <lr...@apache.org>
Authored: Mon Aug 1 19:17:02 2016 +0300
Committer: Luciano Resende <lr...@apache.org>
Committed: Wed Aug 3 22:10:59 2016 +0300

----------------------------------------------------------------------
 streaming-akka/README.md | 73 +++++++++++++++++++++++++++++++++++++++++++
 1 file changed, 73 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/bahir/blob/858ad27a/streaming-akka/README.md
----------------------------------------------------------------------
diff --git a/streaming-akka/README.md b/streaming-akka/README.md
new file mode 100644
index 0000000..db85cdd
--- /dev/null
+++ b/streaming-akka/README.md
@@ -0,0 +1,73 @@
+
+A library for reading data from Akka Actors using Spark Streaming. 
+
+## Linking
+
+Using SBT:
+
+```
+libraryDependencies += "org.apache.bahir" %% "spark-streaming-akka" % "2.0.0"
+```
+
+Using Maven:
+
+```xml
+<dependency>
+    <groupId>org.apache.bahir</groupId>
+    <artifactId>spark-streaming-akka_2.11</artifactId>
+    <version>2.0.0</version>
+</dependency>
+```
+
+This library can also be added to Spark jobs launched through `spark-shell` or `spark-submit` by using the `--packages` command line option.
+For example, to include it when starting the spark shell:
+
+```
+$ bin/spark-shell --packages org.apache.bahir:spark-streaming-akka_2.11:2.0.0
+```
+
+Unlike using `--jars`, using `--packages` ensures that this library and its dependencies will be added to the classpath.
+The `--packages` argument can also be used with `bin/spark-submit`.
+
+This library is cross-published for Scala 2.10 and Scala 2.11, so users should substitute the appropriate Scala version (2.10 or 2.11) in the commands listed above.
+
+## Examples
+
+DStreams can be created from data streams received through Akka actors by using `AkkaUtils.createStream(ssc, actorProps, actorName)`.
+
+### Scala API
+
+You need to extend `ActorReceiver` so that received data can be stored into Spark using the `store(...)` methods. The supervisor strategy of
+this actor can be configured to handle failures, etc.
+
+```Scala
+class CustomActor extends ActorReceiver {
+  def receive = {
+    case data: String => store(data)
+  }
+}
+
+// A new input stream can be created with this custom actor as
+val ssc: StreamingContext = ...
+val lines = AkkaUtils.createStream[String](ssc, Props[CustomActor](), "CustomReceiver")
+```
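+
+The snippet above only defines the receiver and the stream. A minimal, hedged sketch of how the surrounding job might look (the application name and batch interval are illustrative choices, not part of the connector's API):
+
+```Scala
+import akka.actor.Props
+import org.apache.spark.SparkConf
+import org.apache.spark.streaming.{Seconds, StreamingContext}
+import org.apache.spark.streaming.akka.AkkaUtils
+
+// Assumes the CustomActor class defined above is on the classpath.
+val conf = new SparkConf().setAppName("AkkaReceiverExample")
+val ssc = new StreamingContext(conf, Seconds(2))
+
+// Every string the actor passes to store(...) becomes an element of this DStream.
+val lines = AkkaUtils.createStream[String](ssc, Props[CustomActor](), "CustomReceiver")
+lines.print()
+
+ssc.start()
+ssc.awaitTermination()
+```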
+
+### Java API
+
+You need to extend `JavaActorReceiver` so that received data can be stored into Spark using the `store(...)` methods. The supervisor strategy of
+this actor can be configured to handle failures, etc.
+
+```Java
+class CustomActor extends JavaActorReceiver {
+  @Override
+  public void onReceive(Object msg) throws Exception {
+    store((String) msg);
+  }
+}
+
+// A new input stream can be created with this custom actor as
+JavaStreamingContext jssc = ...;
+JavaDStream<String> lines = AkkaUtils.<String>createStream(jssc, Props.create(CustomActor.class), "CustomReceiver");
+```
+
+See end-to-end examples at [Akka Examples](https://github.com/apache/bahir/tree/master/streaming-akka/examples).


[4/4] bahir git commit: [BAHIR-31] Add basic documentation for ZeroMQ connector

Posted by lr...@apache.org.
[BAHIR-31] Add basic documentation for ZeroMQ connector


Project: http://git-wip-us.apache.org/repos/asf/bahir/repo
Commit: http://git-wip-us.apache.org/repos/asf/bahir/commit/29d8c762
Tree: http://git-wip-us.apache.org/repos/asf/bahir/tree/29d8c762
Diff: http://git-wip-us.apache.org/repos/asf/bahir/diff/29d8c762

Branch: refs/heads/master
Commit: 29d8c7622cf9663e295d7616ae1e1b089fe80da9
Parents: c78af70
Author: Luciano Resende <lr...@apache.org>
Authored: Mon Aug 1 19:21:24 2016 +0300
Committer: Luciano Resende <lr...@apache.org>
Committed: Wed Aug 3 22:11:57 2016 +0300

----------------------------------------------------------------------
 streaming-zeromq/README.md | 49 +++++++++++++++++++++++++++++++++++++++++
 1 file changed, 49 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/bahir/blob/29d8c762/streaming-zeromq/README.md
----------------------------------------------------------------------
diff --git a/streaming-zeromq/README.md b/streaming-zeromq/README.md
new file mode 100644
index 0000000..6a8a069
--- /dev/null
+++ b/streaming-zeromq/README.md
@@ -0,0 +1,49 @@
+
+A library for reading data from [ZeroMQ](http://zeromq.org/) using Spark Streaming. 
+
+## Linking
+
+Using SBT:
+
+```
+libraryDependencies += "org.apache.bahir" %% "spark-streaming-zeromq" % "2.0.0"
+```
+
+Using Maven:
+
+```xml
+<dependency>
+    <groupId>org.apache.bahir</groupId>
+    <artifactId>spark-streaming-zeromq_2.11</artifactId>
+    <version>2.0.0</version>
+</dependency>
+```
+
+This library can also be added to Spark jobs launched through `spark-shell` or `spark-submit` by using the `--packages` command line option.
+For example, to include it when starting the spark shell:
+
+```
+$ bin/spark-shell --packages org.apache.bahir:spark-streaming-zeromq_2.11:2.0.0
+```
+
+Unlike using `--jars`, using `--packages` ensures that this library and its dependencies will be added to the classpath.
+The `--packages` argument can also be used with `bin/spark-submit`.
+
+This library is cross-published for Scala 2.10 and Scala 2.11, so users should substitute the appropriate Scala version (2.10 or 2.11) in the commands listed above.
+
+## Examples
+
+
+### Scala API
+
+```Scala
+val lines = ZeroMQUtils.createStream(ssc, ...)
+```
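+
+The arguments above are intentionally elided. The following is a rough, hedged sketch only, modeled on the older Spark ZeroMQWordCount example; the exact parameter list and types, as well as the endpoint and topic values, are assumptions and may not match this release:
+
+```Scala
+import akka.util.ByteString
+import akka.zeromq.Subscribe
+import org.apache.spark.streaming.zeromq._
+
+// ssc is an existing StreamingContext; the endpoint and topic are placeholders.
+val publisherUrl = "tcp://localhost:5553"
+val topic = "foo"
+
+// Convert each multipart ZeroMQ message (a sequence of frames) into string records.
+def bytesToStringIterator(frames: Seq[ByteString]): Iterator[String] =
+  frames.map(_.utf8String).iterator
+
+val lines = ZeroMQUtils.createStream(ssc, publisherUrl, Subscribe(topic), bytesToStringIterator _)
+```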
+
+### Java API
+
+```Java
+JavaDStream<String> lines = ZeroMQUtils.createStream(jssc, ...);
+```
+
+See end-to-end examples at [ZeroMQ Examples](https://github.com/apache/bahir/tree/master/streaming-zeromq/examples).
\ No newline at end of file


[3/4] bahir git commit: [BAHIR-30] Add basic documentation for Twitter connector

Posted by lr...@apache.org.
[BAHIR-30] Add basic documentation for Twitter connector


Project: http://git-wip-us.apache.org/repos/asf/bahir/repo
Commit: http://git-wip-us.apache.org/repos/asf/bahir/commit/c78af705
Tree: http://git-wip-us.apache.org/repos/asf/bahir/tree/c78af705
Diff: http://git-wip-us.apache.org/repos/asf/bahir/diff/c78af705

Branch: refs/heads/master
Commit: c78af705f5697ab11d93f933d033d96cc48403a0
Parents: 619936d
Author: Luciano Resende <lr...@apache.org>
Authored: Mon Aug 1 19:20:20 2016 +0300
Committer: Luciano Resende <lr...@apache.org>
Committed: Wed Aug 3 22:11:43 2016 +0300

----------------------------------------------------------------------
 streaming-twitter/README.md | 58 ++++++++++++++++++++++++++++++++++++++++
 1 file changed, 58 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/bahir/blob/c78af705/streaming-twitter/README.md
----------------------------------------------------------------------
diff --git a/streaming-twitter/README.md b/streaming-twitter/README.md
new file mode 100644
index 0000000..6c16438
--- /dev/null
+++ b/streaming-twitter/README.md
@@ -0,0 +1,58 @@
+
+A library for reading social data from [twitter](http://twitter.com/) using Spark Streaming. 
+
+## Linking
+
+Using SBT:
+
+```
+libraryDependencies += "org.apache.bahir" %% "spark-streaming-twitter" % "2.0.0"
+```
+
+Using Maven:
+
+```xml
+<dependency>
+    <groupId>org.apache.bahir</groupId>
+    <artifactId>spark-streaming-twitter_2.11</artifactId>
+    <version>2.0.0</version>
+</dependency>
+```
+
+This library can also be added to Spark jobs launched through `spark-shell` or `spark-submit` by using the `--packages` command line option.
+For example, to include it when starting the spark shell:
+
+```
+$ bin/spark-shell --packages org.apache.bahir:spark-streaming-twitter_2.11:2.0.0
+```
+
+Unlike using `--jars`, using `--packages` ensures that this library and its dependencies will be added to the classpath.
+The `--packages` argument can also be used with `bin/spark-submit`.
+
+This library is cross-published for Scala 2.10 and Scala 2.11, so users should substitute the appropriate Scala version (2.10 or 2.11) in the commands listed above.
+
+
+## Examples
+
+`TwitterUtils` uses Twitter4J to get the public stream of tweets using [Twitter's Streaming API](https://dev.twitter.com/docs/streaming-apis). Authentication information
+can be provided by any of the [methods](http://twitter4j.org/en/configuration.html) supported by the Twitter4J library. You can import the `TwitterUtils` class and create a DStream with `TwitterUtils.createStream` as shown below.
+
+### Scala API
+
+```Scala
+import org.apache.spark.streaming.twitter._
+
+TwitterUtils.createStream(ssc, None)
+```
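+
+A slightly fuller, hedged sketch of the Scala usage. It relies on Twitter4J's system-property based OAuth configuration; the credential values, application name, and filter keywords below are placeholders:
+
+```Scala
+import org.apache.spark.SparkConf
+import org.apache.spark.streaming.{Seconds, StreamingContext}
+import org.apache.spark.streaming.twitter._
+
+// One of the configuration methods Twitter4J supports: OAuth credentials as system properties.
+System.setProperty("twitter4j.oauth.consumerKey", "YOUR_CONSUMER_KEY")
+System.setProperty("twitter4j.oauth.consumerSecret", "YOUR_CONSUMER_SECRET")
+System.setProperty("twitter4j.oauth.accessToken", "YOUR_ACCESS_TOKEN")
+System.setProperty("twitter4j.oauth.accessTokenSecret", "YOUR_ACCESS_TOKEN_SECRET")
+
+val conf = new SparkConf().setAppName("TwitterStreamExample")
+val ssc = new StreamingContext(conf, Seconds(2))
+
+// Passing None falls back to Twitter4J's own authorization; the keyword list filters the stream.
+val tweets = TwitterUtils.createStream(ssc, None, Seq("spark", "streaming"))
+tweets.map(_.getText).print()
+
+ssc.start()
+ssc.awaitTermination()
+```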
+
+### Java API
+
+```Java
+import org.apache.spark.streaming.twitter.*;
+
+TwitterUtils.createStream(jssc);
+```
+
+
+You can either get the full public stream or a stream filtered by keywords.
+See end-to-end examples at [Twitter Examples](https://github.com/apache/bahir/tree/master/streaming-twitter/examples).
\ No newline at end of file


[2/4] bahir git commit: [BAHIR-29] Add basic documentation for MQTT Connector

Posted by lr...@apache.org.
[BAHIR-29] Add basic documentation for MQTT Connector


Project: http://git-wip-us.apache.org/repos/asf/bahir/repo
Commit: http://git-wip-us.apache.org/repos/asf/bahir/commit/619936d3
Tree: http://git-wip-us.apache.org/repos/asf/bahir/tree/619936d3
Diff: http://git-wip-us.apache.org/repos/asf/bahir/diff/619936d3

Branch: refs/heads/master
Commit: 619936d39d18b7af45b7acec9af02b599a43b056
Parents: 858ad27
Author: Luciano Resende <lr...@apache.org>
Authored: Mon Aug 1 19:18:35 2016 +0300
Committer: Luciano Resende <lr...@apache.org>
Committed: Wed Aug 3 22:11:30 2016 +0300

----------------------------------------------------------------------
 streaming-mqtt/README.md | 54 +++++++++++++++++++++++++++++++++++++++++++
 1 file changed, 54 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/bahir/blob/619936d3/streaming-mqtt/README.md
----------------------------------------------------------------------
diff --git a/streaming-mqtt/README.md b/streaming-mqtt/README.md
new file mode 100644
index 0000000..9687dfe
--- /dev/null
+++ b/streaming-mqtt/README.md
@@ -0,0 +1,54 @@
+
+[MQTT](http://mqtt.org/) is a machine-to-machine (M2M) / "Internet of Things" connectivity protocol. It was designed as an extremely lightweight publish/subscribe messaging transport. It is useful for connections with remote locations where a small code footprint is required and/or network bandwidth is at a premium.
+
+## Linking
+
+Using SBT:
+
+```
+libraryDependencies += "org.apache.bahir" %% "spark-streaming-mqtt" % "2.0.0"
+```
+
+Using Maven:
+
+```xml
+<dependency>
+    <groupId>org.apache.bahir</groupId>
+    <artifactId>spark-streaming-mqtt_2.11</artifactId>
+    <version>2.0.0</version>
+</dependency>
+```
+
+This library can also be added to Spark jobs launched through `spark-shell` or `spark-submit` by using the `--packages` command line option.
+For example, to include it when starting the spark shell:
+
+```
+$ bin/spark-shell --packages org.apache.bahir:spark-streaming-mqtt_2.11:2.0.0
+```
+
+Unlike using `--jars`, using `--packages` ensures that this library and its dependencies will be added to the classpath.
+The `--packages` argument can also be used with `bin/spark-submit`.
+
+This library is cross-published for Scala 2.10 and Scala 2.11, so users should substitute the appropriate Scala version (2.10 or 2.11) in the commands listed above.
+
+## Examples
+
+### Scala API
+
+Create a DStream that receives messages from an MQTT broker by specifying the broker URL and the topic to subscribe to:
+
+```Scala
+val lines = MQTTUtils.createStream(ssc, brokerUrl, topic)
+```
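+
+A minimal, hedged end-to-end sketch; the broker URL, topic, application name, and batch interval below are hypothetical placeholders:
+
+```Scala
+import org.apache.spark.SparkConf
+import org.apache.spark.streaming.{Seconds, StreamingContext}
+import org.apache.spark.streaming.mqtt._
+
+// Hypothetical broker URL and topic, for illustration only.
+val brokerUrl = "tcp://localhost:1883"
+val topic = "sensors/readings"
+
+val conf = new SparkConf().setAppName("MQTTStreamExample")
+val ssc = new StreamingContext(conf, Seconds(2))
+
+// Each MQTT message payload arrives as one String element of the DStream.
+val lines = MQTTUtils.createStream(ssc, brokerUrl, topic)
+lines.print()
+
+ssc.start()
+ssc.awaitTermination()
+```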
+
+### Java API
+
+The Java API works the same way: pass the broker URL and the topic to subscribe to, and you get back a `JavaDStream<String>` of message payloads:
+
+```Java
+JavaDStream<String> lines = MQTTUtils.createStream(jssc, brokerUrl, topic);
+```
+
+See end-to-end examples at [MQTT Examples](https://github.com/apache/bahir/tree/master/streaming-mqtt/examples).
\ No newline at end of file