Posted to commits@bahir.apache.org by lr...@apache.org on 2018/11/30 14:29:59 UTC

bahir-website git commit: Publishing from 5b4ed3c1f453320d68d5750deffe71b285a00bdb

Repository: bahir-website
Updated Branches:
  refs/heads/asf-site c8b0d7a6f -> f7e05e112


Publishing from 5b4ed3c1f453320d68d5750deffe71b285a00bdb


Project: http://git-wip-us.apache.org/repos/asf/bahir-website/repo
Commit: http://git-wip-us.apache.org/repos/asf/bahir-website/commit/f7e05e11
Tree: http://git-wip-us.apache.org/repos/asf/bahir-website/tree/f7e05e11
Diff: http://git-wip-us.apache.org/repos/asf/bahir-website/diff/f7e05e11

Branch: refs/heads/asf-site
Commit: f7e05e112fb139bc993f70de46a3c5183e1facbf
Parents: c8b0d7a
Author: Luciano Resende <lr...@apache.org>
Authored: Fri Nov 30 15:29:43 2018 +0100
Committer: Luciano Resende <lr...@apache.org>
Committed: Fri Nov 30 15:29:43 2018 +0100

----------------------------------------------------------------------
 .../docs/spark/current/documentation/index.html |   6 +-
 .../spark/current/spark-sql-cloudant/index.html |   4 +-
 .../current/spark-sql-streaming-akka/index.html |   2 +-
 .../current/spark-sql-streaming-mqtt/index.html | 114 ++++++-
 .../current/spark-streaming-pubnub/index.html   | 338 +++++++++++++++++++
 .../current/spark-streaming-pubsub/index.html   |   6 +-
 .../current/spark-streaming-zeromq/index.html   |  21 +-
 content/feed.xml                                |  14 +-
 content/index.html                              |   5 +-
 .../news/2015/11/08/new-committers/index.html   |  11 -
 .../news/2015/11/09/new-committers/index.html   |  11 +
 11 files changed, 488 insertions(+), 44 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/bahir-website/blob/f7e05e11/content/docs/spark/current/documentation/index.html
----------------------------------------------------------------------
diff --git a/content/docs/spark/current/documentation/index.html b/content/docs/spark/current/documentation/index.html
index a470efa..1a22f35 100644
--- a/content/docs/spark/current/documentation/index.html
+++ b/content/docs/spark/current/documentation/index.html
@@ -209,7 +209,7 @@
 
 <p><a href="../spark-sql-streaming-akka">Akka data source</a></p>
 
-<p><a href="../spark-sql-streaming-mqtt">MQTT data source</a></p>
+<p><a href="../spark-sql-streaming-mqtt">MQTT data source</a> <img src="/assets/themes/apache-clean/img/new-black.png" alt="" height="36px" width="36px" /> (new Sink)</p>
 
 <p><br /></p>
 
@@ -221,11 +221,13 @@
 
 <p><a href="../spark-streaming-pubsub">Google Cloud Pub/Sub connector</a></p>
 
+<p><a href="../spark-streaming-pubnub">Cloud PubNub connector</a> <img src="/assets/themes/apache-clean/img/new-black.png" alt="" height="36px" width="36px" /></p>
+
 <p><a href="../spark-streaming-mqtt">MQTT connector</a></p>
 
 <p><a href="../spark-streaming-twitter">Twitter connector</a></p>
 
-<p><a href="../spark-streaming-zeromq">ZeroMQ connector</a></p>
+<p><a href="../spark-streaming-zeromq">ZeroMQ connector</a> <img src="/assets/themes/apache-clean/img/new-black.png" alt="" height="36px" width="36px" /> (Enhanced Implementation)</p>
 
   </div>
 </div>

http://git-wip-us.apache.org/repos/asf/bahir-website/blob/f7e05e11/content/docs/spark/current/spark-sql-cloudant/index.html
----------------------------------------------------------------------
diff --git a/content/docs/spark/current/spark-sql-cloudant/index.html b/content/docs/spark/current/spark-sql-cloudant/index.html
index 023bfad..78bb109 100644
--- a/content/docs/spark/current/spark-sql-cloudant/index.html
+++ b/content/docs/spark/current/spark-sql-cloudant/index.html
@@ -234,13 +234,13 @@ The <code class="highlighter-rouge">--packages</code> argument can also be used
 
 <p>Submit a job in Python:</p>
 
-<div class="highlighter-rouge"><pre class="highlight"><code>spark-submit  --master local[4] --packages org.apache.bahir:spark-sql-cloudant_2.11:2.3.0-SNAPSHOT  &lt;path to python script&gt;
+<div class="highlighter-rouge"><pre class="highlight"><code>spark-submit  --master local[4] --packages org.apache.bahir:spark-sql-cloudant__2.11:2.3.0-SNAPSHOT  &lt;path to python script&gt;
 </code></pre>
 </div>
 
 <p>Submit a job in Scala:</p>
 
-<div class="highlighter-rouge"><pre class="highlight"><code>spark-submit --class "&lt;your class&gt;" --master local[4] --packages org.apache.bahir:spark-sql-cloudant_2.11:2.3.0-SNAPSHOT &lt;path to spark-sql-cloudant jar&gt;
+<div class="highlighter-rouge"><pre class="highlight"><code>spark-submit --class "&lt;your class&gt;" --master local[4] --packages org.apache.bahir:spark-sql-cloudant__2.11:2.3.0-SNAPSHOT &lt;path to spark-sql-cloudant jar&gt;
 </code></pre>
 </div>
 

http://git-wip-us.apache.org/repos/asf/bahir-website/blob/f7e05e11/content/docs/spark/current/spark-sql-streaming-akka/index.html
----------------------------------------------------------------------
diff --git a/content/docs/spark/current/spark-sql-streaming-akka/index.html b/content/docs/spark/current/spark-sql-streaming-akka/index.html
index d11bdfb..f04ce0c 100644
--- a/content/docs/spark/current/spark-sql-streaming-akka/index.html
+++ b/content/docs/spark/current/spark-sql-streaming-akka/index.html
@@ -252,7 +252,7 @@ The <code class="highlighter-rouge">--packages</code> argument can also be used
 
 <h2 id="configuration-options">Configuration options.</h2>
 
-<p>This source uses <a href="http://doc.akka.io/api/akka/2.4/akka/actor/Actor.html">Akka Actor api</a>.</p>
+<p>This source uses the <a href="http://doc.akka.io/api/akka/2.5/akka/actor/Actor.html">Akka Actor API</a>.</p>
 
 <ul>
   <li><code class="highlighter-rouge">urlOfPublisher</code> The url of Publisher or Feeder actor that the Receiver actor connects to. Set this as the tcp url of the Publisher or Feeder actor.</li>

http://git-wip-us.apache.org/repos/asf/bahir-website/blob/f7e05e11/content/docs/spark/current/spark-sql-streaming-mqtt/index.html
----------------------------------------------------------------------
diff --git a/content/docs/spark/current/spark-sql-streaming-mqtt/index.html b/content/docs/spark/current/spark-sql-streaming-mqtt/index.html
index 5d2f547..bb7fa85 100644
--- a/content/docs/spark/current/spark-sql-streaming-mqtt/index.html
+++ b/content/docs/spark/current/spark-sql-streaming-mqtt/index.html
@@ -195,7 +195,7 @@
 
 -->
 
-<p>A library for reading data from MQTT Servers using Spark SQL Streaming ( or Structured streaming.).</p>
+<p>A library for writing and reading data from MQTT servers using Spark SQL Streaming (or Structured Streaming).</p>
 
 <h2 id="linking">Linking</h2>
 
@@ -229,7 +229,7 @@ The <code class="highlighter-rouge">--packages</code> argument can also be used
 
 <h2 id="examples">Examples</h2>
 
-<p>A SQL Stream can be created with data streams received through MQTT Server using,</p>
+<p>A SQL stream can be created with data streams received through an MQTT server using:</p>
 
 <div class="highlighter-rouge"><pre class="highlight"><code>sqlContext.readStream
     .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
@@ -238,9 +238,20 @@ The <code class="highlighter-rouge">--packages</code> argument can also be used
 </code></pre>
 </div>
 
-<h2 id="enable-recovering-from-failures">Enable recovering from failures.</h2>
+<p>A SQL stream may also be written out as MQTT messages using:</p>
 
-<p>Setting values for option <code class="highlighter-rouge">localStorage</code> and <code class="highlighter-rouge">clientId</code> helps in recovering in case of a restart, by restoring the state where it left off before the shutdown.</p>
+<div class="highlighter-rouge"><pre class="highlight"><code>sqlContext.writeStream
+    .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSinkProvider")
+    .option("checkpointLocation", "/path/to/localdir")
+    .outputMode("complete")
+    .option("topic", "mytopic")
+    .load("tcp://localhost:1883")
+</code></pre>
+</div>
+
+<h2 id="source-recovering-from-failures">Source recovering from failures</h2>
+
+<p>Setting values for the options <code class="highlighter-rouge">localStorage</code> and <code class="highlighter-rouge">clientId</code> helps in recovering from a source restart, by restoring the state where it left off before the shutdown.</p>
 
 <div class="highlighter-rouge"><pre class="highlight"><code>sqlContext.readStream
     .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
@@ -251,15 +262,15 @@ The <code class="highlighter-rouge">--packages</code> argument can also be used
 </code></pre>
 </div>
 
-<h2 id="configuration-options">Configuration options.</h2>
+<h2 id="configuration-options">Configuration options</h2>
 
-<p>This source uses <a href="https://eclipse.org/paho/clients/java/">Eclipse Paho Java Client</a>. Client API documentation is located <a href="http://www.eclipse.org/paho/files/javadoc/index.html">here</a>.</p>
+<p>This connector uses <a href="https://eclipse.org/paho/clients/java/">Eclipse Paho Java Client</a>. Client API documentation is located <a href="http://www.eclipse.org/paho/files/javadoc/index.html">here</a>.</p>
 
 <ul>
-  <li><code class="highlighter-rouge">brokerUrl</code> A url MqttClient connects to. Set this or <code class="highlighter-rouge">path</code> as the url of the Mqtt Server. e.g. tcp://localhost:1883.</li>
+  <li><code class="highlighter-rouge">brokerUrl</code> An URL MqttClient connects to. Set this or <code class="highlighter-rouge">path</code> as the URL of the Mqtt Server. e.g. tcp://localhost:1883.</li>
   <li><code class="highlighter-rouge">persistence</code> By default it is used for storing incoming messages on disk. If <code class="highlighter-rouge">memory</code> is provided as value for this option, then recovery on restart is not supported.</li>
   <li><code class="highlighter-rouge">topic</code> Topic MqttClient subscribes to.</li>
-  <li><code class="highlighter-rouge">clientId</code> clientId, this client is assoicated with. Provide the same value to recover a stopped client.</li>
+  <li><code class="highlighter-rouge">clientId</code> clientId, this client is associated with. Provide the same value to recover a stopped source client. MQTT sink ignores client identifier, because Spark batch can be distributed across multiple workers whereas MQTT broker does not allow simultanous connections with same ID from multiple hosts.</li>
   <li><code class="highlighter-rouge">QoS</code> The maximum quality of service to subscribe each topic at. Messages published at a lower quality of service will be received at the published QoS. Messages published at a higher quality of service will be received using the QoS specified on the subscribe.</li>
   <li><code class="highlighter-rouge">username</code> Sets the user name to use for the connection to Mqtt Server. Do not set it, if server does not need this. Setting it empty will lead to errors.</li>
   <li><code class="highlighter-rouge">password</code> Sets the password to use for the connection.</li>
@@ -267,6 +278,20 @@ The <code class="highlighter-rouge">--packages</code> argument can also be used
   <li><code class="highlighter-rouge">connectionTimeout</code> Sets the connection timeout, a value of 0 is interpretted as wait until client connects. See <code class="highlighter-rouge">MqttConnectOptions.setConnectionTimeout</code> for more information.</li>
   <li><code class="highlighter-rouge">keepAlive</code> Same as <code class="highlighter-rouge">MqttConnectOptions.setKeepAliveInterval</code>.</li>
   <li><code class="highlighter-rouge">mqttVersion</code> Same as <code class="highlighter-rouge">MqttConnectOptions.setMqttVersion</code>.</li>
+  <li><code class="highlighter-rouge">maxInflight</code> Same as <code class="highlighter-rouge">MqttConnectOptions.setMaxInflight</code></li>
+  <li><code class="highlighter-rouge">autoReconnect</code> Same as <code class="highlighter-rouge">MqttConnectOptions.setAutomaticReconnect</code></li>
+</ul>
+
+<h2 id="environment-variables">Environment variables</h2>
+
+<p>Custom environment variables that manage the MQTT connectivity performed by the sink connector:</p>
+
+<ul>
+  <li><code class="highlighter-rouge">spark.mqtt.client.connect.attempts</code> Number of attempts sink will try to connect to MQTT broker before failing.</li>
+  <li><code class="highlighter-rouge">spark.mqtt.client.connect.backoff</code> Delay in milliseconds to wait before retrying connection to the server.</li>
+  <li><code class="highlighter-rouge">spark.mqtt.connection.cache.timeout</code> Sink connector caches MQTT connections. Idle connections will be closed after timeout milliseconds.</li>
+  <li><code class="highlighter-rouge">spark.mqtt.client.publish.attempts</code> Number of attempts to publish the message before failing the task.</li>
+  <li><code class="highlighter-rouge">spark.mqtt.client.publish.backoff</code> Delay in milliseconds to wait before retrying send operation.</li>
 </ul>
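+
+<p>A minimal sketch of supplying these settings (assuming they are picked up from the standard Spark configuration; the values below are illustrative):</p>
+
+<div class="highlighter-rouge"><pre class="highlight"><code>import org.apache.spark.sql.SparkSession
+
+// Illustrative tuning: retry the broker connection five times, one second
+// apart, and retry each publish three times before failing the task.
+val spark = SparkSession.builder()
+  .appName("MQTTSinkExample")
+  .config("spark.mqtt.client.connect.attempts", "5")
+  .config("spark.mqtt.client.connect.backoff", "1000")
+  .config("spark.mqtt.client.publish.attempts", "3")
+  .config("spark.mqtt.client.publish.backoff", "1000")
+  .getOrCreate()
+</code></pre>
+</div>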
 
 <h3 id="scala-api">Scala API</h3>
@@ -277,7 +302,7 @@ The <code class="highlighter-rouge">--packages</code> argument can also be used
 val lines = spark.readStream
   .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
   .option("topic", topic)
-  .load(brokerUrl).as[(String, Timestamp)]
+  .load(brokerUrl).selectExpr("CAST(payload AS STRING)").as[String]
 
 // Split the lines into words
 val words = lines.flatMap(_.split(" "))
@@ -295,7 +320,7 @@ query.awaitTermination()
 </code></pre>
 </div>
 
-<p>Please see <code class="highlighter-rouge">MQTTStreamWordCount.scala</code> for full example.</p>
+<p>Please see <code class="highlighter-rouge">MQTTStreamWordCount.scala</code> for full example. Review <code class="highlighter-rouge">MQTTSinkWordCount.scala</code>, if interested in publishing data to MQTT broker.</p>
 
 <h3 id="java-api">Java API</h3>
 
@@ -306,7 +331,8 @@ Dataset&lt;String&gt; lines = spark
         .readStream()
         .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
         .option("topic", topic)
-        .load(brokerUrl).select("value").as(Encoders.STRING());
+        .load(brokerUrl)
+        .selectExpr("CAST(payload AS STRING)").as(Encoders.STRING());
 
 // Split the lines into words
 Dataset&lt;String&gt; words = lines.flatMap(new FlatMapFunction&lt;String, String&gt;() {
@@ -329,7 +355,71 @@ query.awaitTermination();
 </code></pre>
 </div>
 
-<p>Please see <code class="highlighter-rouge">JavaMQTTStreamWordCount.java</code> for full example.</p>
+<p>Please see <code class="highlighter-rouge">JavaMQTTStreamWordCount.java</code> for full example. Review <code class="highlighter-rouge">JavaMQTTSinkWordCount.java</code>, if interested in publishing data to MQTT broker.</p>
+
+<h2 id="best-practices">Best Practices.</h2>
+
+<ol>
+  <li>Turn MQTT into a more reliable messaging service.</li>
+</ol>
+
+<blockquote>
+  <p><em>MQTT is a machine-to-machine (M2M)/"Internet of Things" connectivity protocol. It was designed as an extremely lightweight publish/subscribe messaging transport.</em></p>
+</blockquote>
+
+<p>The design of MQTT suits the purpose it serves well, but applications often place a premium on reliability. Because MQTT is not a distributed message queue, it does not offer the highest level of reliability features; to gain them, traffic should be redirected through a Kafka message queue. In fact, using a Kafka message queue offers many possibilities, including a single Kafka topic subscribed to several MQTT sources and even a single MQTT stream publishing to multiple Kafka topics. Kafka is a reliable and scalable message queue.</p>
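+
+<p>As an illustration, a minimal sketch of such a bridge (assuming the <code class="highlighter-rouge">spark-sql-kafka-0-10</code> package is on the classpath and a Kafka broker runs at <code class="highlighter-rouge">localhost:9092</code>; topic names are made up):</p>
+
+<div class="highlighter-rouge"><pre class="highlight"><code>// Read from MQTT and republish to Kafka for durable, distributed buffering.
+val mqttStream = spark.readStream
+  .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
+  .option("topic", "sensors")
+  .load("tcp://localhost:1883")
+
+mqttStream.selectExpr("CAST(payload AS STRING) AS value")
+  .writeStream
+  .format("kafka")
+  .option("kafka.bootstrap.servers", "localhost:9092")
+  .option("topic", "sensors-buffered")
+  .option("checkpointLocation", "/path/to/checkpoint")
+  .start()
+</code></pre>
+</div>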
+
+<ol start="2">
+  <li>Often the message payload is not of the default character encoding or contains binary data that needs to be parsed using a particular parser. In such cases, the Spark MQTT payload should be processed using an external parser. For example:</li>
+</ol>
+
+<ul>
+  <li>
+    <p>Scala API example:</p>
+
+    <div class="highlighter-rouge"><pre class="highlight"><code>// Create DataFrame representing the stream of binary messages
+val lines = spark.readStream
+  .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
+  .option("topic", topic)
+  .load(brokerUrl).select("payload").as[Array[Byte]].map(externalParser(_))
+</code></pre>
+    </div>
+  </li>
+  <li>
+    <p>Java API example:</p>
+
+    <div class="highlighter-rouge"><pre class="highlight"><code>// Create DataFrame representing the stream of binary messages
+Dataset&lt;byte[]&gt; lines = spark
+        .readStream()
+        .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
+        .option("topic", topic)
+        .load(brokerUrl).selectExpr("CAST(payload AS BINARY)").as(Encoders.BINARY());
+
+// Split the lines into words
+Dataset&lt;String&gt; words = lines.map(new MapFunction&lt;byte[], String&gt;() {
+    @Override
+    public String call(byte[] bytes) throws Exception {
+        return new String(bytes); // Plug in external parser here.
+    }
+}, Encoders.STRING()).flatMap(new FlatMapFunction&lt;String, String&gt;() {
+    @Override
+    public Iterator&lt;String&gt; call(String x) {
+        return Arrays.asList(x.split(" ")).iterator();
+    }
+}, Encoders.STRING());
+</code></pre>
+    </div>
+  </li>
+</ul>
+
+<ol start="3">
+  <li>What is the solution for a situation where there are a large number of varied MQTT sources, each with a different schema and throughput characteristics?</li>
+</ol>
+
+<p>Generally, one would create many streaming pipelines to solve this problem. This would either require a very sophisticated scheduling setup or waste a lot of resources, as it is not certain which stream is carrying more data.</p>
+
+<p>The general solution is both less optimal and more cumbersome to operate, and its multiple moving parts incur a high maintenance overhead. As an alternative, one can set up a single-topic Kafka-Spark stream, where the message from each of the varied sources carries a unique tag separating it from the other streams. This way, at the processing end, the messages can be distinguished from one another and the right kind of decoding and processing applied. Similarly, when storing, each message can be identified by its tag.</p>
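+
+<p>A minimal sketch of such tag-based dispatch (the tag format, column names and <code class="highlighter-rouge">kafkaStream</code> below are illustrative assumptions):</p>
+
+<div class="highlighter-rouge"><pre class="highlight"><code>import org.apache.spark.sql.functions.split
+import spark.implicits._
+
+// Each message arrives as "&lt;sourceTag&gt;|&lt;payload&gt;"; split it and dispatch on the tag.
+val tagged = kafkaStream.selectExpr("CAST(value AS STRING) AS raw")
+  .select(
+    split($"raw", "\\|").getItem(0).as("tag"),
+    split($"raw", "\\|").getItem(1).as("body"))
+
+val sensorA = tagged.filter($"tag" === "sensor-a") // decode with sensor A's schema
+val sensorB = tagged.filter($"tag" === "sensor-b") // decode with sensor B's schema
+</code></pre>
+</div>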
 
 
   </div>

http://git-wip-us.apache.org/repos/asf/bahir-website/blob/f7e05e11/content/docs/spark/current/spark-streaming-pubnub/index.html
----------------------------------------------------------------------
diff --git a/content/docs/spark/current/spark-streaming-pubnub/index.html b/content/docs/spark/current/spark-streaming-pubnub/index.html
new file mode 100644
index 0000000..37083fd
--- /dev/null
+++ b/content/docs/spark/current/spark-streaming-pubnub/index.html
@@ -0,0 +1,338 @@
+
+
+<!DOCTYPE html>
+<html lang="en">
+  <head>
+    <meta charset="utf-8">
+    <title>Spark Streaming PubNub</title>
+    <meta name="description" content="Spark Streaming PubNub">
+    <meta name="author" content="">
+
+    <!-- Enable responsive viewport -->
+    <meta name="viewport" content="width=device-width, initial-scale=1.0">
+
+    <!-- Le HTML5 shim, for IE6-8 support of HTML elements -->
+    <!--[if lt IE 9]>
+      <script src="http://html5shim.googlecode.com/svn/trunk/html5.js"></script>
+    <![endif]-->
+
+    <!-- Le styles -->
+    <link href="/assets/themes/apache-clean/bootstrap/css/bootstrap.css" rel="stylesheet">
+    <link href="/assets/themes/apache-clean/css/style.css?body=1" rel="stylesheet" type="text/css">
+    <link href="/assets/themes/apache-clean/css/syntax.css" rel="stylesheet"  type="text/css" media="screen" />
+    <!-- Le fav and touch icons -->
+    <!-- Update these with your own images
+    <link rel="shortcut icon" href="images/favicon.ico">
+    <link rel="apple-touch-icon" href="images/apple-touch-icon.png">
+    <link rel="apple-touch-icon" sizes="72x72" href="images/apple-touch-icon-72x72.png">
+    <link rel="apple-touch-icon" sizes="114x114" href="images/apple-touch-icon-114x114.png">
+  -->
+
+    <!-- make tables sortable by adding class tag "sortable" to table elements -->
+    <script src="http://www.kryogenix.org/code/browser/sorttable/sorttable.js"></script>
+
+
+  </head>
+
+  <body>
+
+    
+
+<!-- Navigation -->
+<div id="nav-bar">
+  <nav id="nav-container" class="navbar navbar-inverse " role="navigation">
+    <div class="container">
+      <!-- Brand and toggle get grouped for better mobile display -->
+
+      <div class="navbar-header page-scroll">
+        <button type="button" class="navbar-toggle" data-toggle="collapse" data-target=".navbar-collapse">
+          <span class="sr-only">Toggle navigation</span>
+          <span class="icon-bar"></span>
+          <span class="icon-bar"></span>
+          <span class="icon-bar"></span>
+        </button>
+        <a class="navbar-brand page-scroll" href="/#home">Home</a>
+      </div>
+      <!-- Collect the nav links, forms, and other content for toggling -->
+      <nav class="navbar-collapse collapse" role="navigation">
+        <ul class="nav navbar-nav">
+          
+          
+          
+          <li id="download">
+            
+            <a href="#" data-toggle="dropdown" class="dropdown-toggle">Download<b class="caret"></b></a>
+            <ul class="dropdown-menu dropdown-left">
+              
+              
+              <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
+              
+              
+              <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
+              
+            </ul>
+            
+          </li>
+          
+          
+          
+          
+          <li id="community">
+            
+            <a href="#" data-toggle="dropdown" class="dropdown-toggle">Community<b class="caret"></b></a>
+            <ul class="dropdown-menu dropdown-left">
+              
+              
+              <li><a href="/community" target="_self">Get Involved</a></li>
+              
+              
+              <li><a href="/contributing" target="_self">Contributing</a></li>
+              
+              
+              <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
+              
+              
+              <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
+              
+              
+              <li><a href="/community#source-code" target="_self">Source Code</a></li>
+              
+              
+              <li><a href="/community-members" target="_self">Project Committers</a></li>
+              
+            </ul>
+            
+          </li>
+          
+          
+          
+          
+          <li id="documentation">
+            
+            <a href="#" data-toggle="dropdown" class="dropdown-toggle">Documentation<b class="caret"></b></a>
+            <ul class="dropdown-menu dropdown-left">
+              
+              
+              <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
+              
+              
+              <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
+              
+            </ul>
+            
+          </li>
+          
+          
+          
+          
+          <li id="github">
+            
+            <a href="#" data-toggle="dropdown" class="dropdown-toggle">GitHub<b class="caret"></b></a>
+            <ul class="dropdown-menu dropdown-left">
+              
+              
+              <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
+              
+              
+              <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
+              
+              
+              <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
+              
+            </ul>
+            
+          </li>
+          
+          
+          
+          
+          <li id="apache">
+            
+            <a href="#" data-toggle="dropdown" class="dropdown-toggle">Apache<b class="caret"></b></a>
+            <ul class="dropdown-menu dropdown-left">
+              
+              
+              <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
+              
+              
+              <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
+              
+              
+              <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
+              
+              
+              <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
+              
+              
+              <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
+              
+            </ul>
+            
+          </li>
+          
+          
+        </ul>
+      </nav><!--/.navbar-collapse -->
+      <!-- /.navbar-collapse -->
+    </div>
+    <!-- /.container -->
+  </nav>
+</div>
+
+
+    <div class="container">
+
+      
+
+<!--<div class="hero-unit Spark Streaming Google Pub-Sub">
+  <h1></h1>
+</div>
+-->
+
+<div class="row">
+  <div class="col-md-12">
+    <!--
+
+-->
+
+<h1 id="spark-streaming-pubnub-connector">Spark Streaming PubNub Connector</h1>
+
+<p>Library for reading data from real-time messaging infrastructure <a href="https://www.pubnub.com/">PubNub</a> using Spark Streaming.</p>
+
+<h2 id="linking">Linking</h2>
+
+<p>Using SBT:</p>
+
+<div class="highlighter-rouge"><pre class="highlight"><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-pubnub" % "2.3.0-SNAPSHOT"
+</code></pre>
+</div>
+
+<p>Using Maven:</p>
+
+<div class="highlighter-rouge"><pre class="highlight"><code>&lt;dependency&gt;
+    &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
+    &lt;artifactId&gt;spark-streaming-pubnub_2.11&lt;/artifactId&gt;
+    &lt;version&gt;2.3.0-SNAPSHOT&lt;/version&gt;
+&lt;/dependency&gt;
+</code></pre>
+</div>
+
+<p>This library can also be added to Spark jobs launched through <code class="highlighter-rouge">spark-shell</code> or <code class="highlighter-rouge">spark-submit</code> by using the <code class="highlighter-rouge">--packages</code> command line option.
+For example, to include it when starting the spark shell:</p>
+
+<div class="highlighter-rouge"><pre class="highlight"><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-pubnub_2.11:2.3.0-SNAPSHOT
+</code></pre>
+</div>
+
+<p>Unlike using <code class="highlighter-rouge">--jars</code>, using <code class="highlighter-rouge">--packages</code> ensures that this library and its dependencies will be added to the classpath.
+The <code class="highlighter-rouge">--packages</code> argument can also be used with <code class="highlighter-rouge">bin/spark-submit</code>.</p>
+
+<h2 id="examples">Examples</h2>
+
+<p>The connector leverages the official Java client for the PubNub cloud infrastructure. You can import the <code class="highlighter-rouge">PubNubUtils</code>
+class and create an input stream by calling <code class="highlighter-rouge">PubNubUtils.createStream()</code> as shown below. Security and performance related
+features should be set up inside the standard <code class="highlighter-rouge">PNConfiguration</code> object. We advise configuring the reconnection policy so that
+temporary network outages do not interrupt the processing job. Users may subscribe to multiple channels and channel groups,
+as well as specify a time token to start receiving messages from a given point in time.</p>
+
+<p>For complete code examples, please review <em>examples</em> directory.</p>
+
+<h3 id="scala-api">Scala API</h3>
+
+<div class="highlighter-rouge"><pre class="highlight"><code>import com.pubnub.api.PNConfiguration
+import com.pubnub.api.enums.PNReconnectionPolicy
+
+import org.apache.spark.storage.StorageLevel
+import org.apache.spark.streaming.dstream.ReceiverInputDStream
+import org.apache.spark.streaming.pubnub.{PubNubUtils, SparkPubNubMessage}
+
+val config = new PNConfiguration
+config.setSubscribeKey(subscribeKey)
+config.setSecure(true)
+config.setReconnectionPolicy(PNReconnectionPolicy.LINEAR)
+val channel = "my-channel"
+
+val pubNubStream: ReceiverInputDStream[SparkPubNubMessage] = PubNubUtils.createStream(
+  ssc, config, Seq(channel), Seq(), None, StorageLevel.MEMORY_AND_DISK_SER_2
+)
+</code></pre>
+</div>
+
+<h3 id="java-api">Java API</h3>
+
+<div class="highlighter-rouge"><pre class="highlight"><code>import com.pubnub.api.PNConfiguration
+import com.pubnub.api.enums.PNReconnectionPolicy
+
+import org.apache.spark.streaming.pubnub.PubNubUtils
+import org.apache.spark.streaming.pubnub.SparkPubNubMessage
+
+PNConfiguration config = new PNConfiguration()
+config.setSubscribeKey(subscribeKey)
+config.setSecure(true)
+config.setReconnectionPolicy(PNReconnectionPolicy.LINEAR)
+Set&lt;String&gt; channels = new HashSet&lt;String&gt;() ;
+
+ReceiverInputDStream&lt;SparkPubNubMessage&gt; pubNubStream = PubNubUtils.createStream(
+  ssc, config, channels, Collections.EMPTY_SET, null,
+  StorageLevel.MEMORY_AND_DISK_SER_2()
+)
+</code></pre>
+</div>
+
+<h2 id="unit-test">Unit Test</h2>
+
+<p>Unit tests take advantage of the publicly available <em>demo</em> subscription and publish keys, which have a limited request rate.</p>
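+
+<p>For local experiments the same keys can be reused; a minimal sketch (assuming PubNub's public <em>demo</em> sandbox keys, which are rate limited):</p>
+
+<div class="highlighter-rouge"><pre class="highlight"><code>import com.pubnub.api.PNConfiguration
+
+// Public sandbox credentials: rate limited, suitable for experiments only.
+val config = new PNConfiguration
+config.setSubscribeKey("demo")
+config.setPublishKey("demo")
+</code></pre>
+</div>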
+
+  </div>
+</div>
+
+
+
+      <hr>
+
+      <!-- <p>&copy; 2018 </p>-->
+      <footer class="site-footer">
+    <div class="wrapper">
+        <div class="footer-col-wrapper">
+            
+            <div style="text-align:center;">
+                
+                <div>
+                    Copyright &copy; 2016-2017 <a href="http://www.apache.org">The Apache Software Foundation</a>.
+                    Licensed under the <a href="http://www.apache.org/licenses/LICENSE-2.0">Apache License, Version
+                    2.0</a>.
+                    <br>
+                    
+                    Apache and the Apache Feather logo are trademarks of The Apache Software Foundation.
+                    
+                </div>
+            </div>
+        </div>
+    </div>
+</footer>
+
+    </div>
+
+    
+
+
+  <script type="text/javascript">
+  (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
+  (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
+  m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
+  })(window,document,'script','//www.google-analytics.com/analytics.js','ga');
+
+  ga('create', 'UA-79140859-1', 'bahir.apache.org');
+  ga('require', 'linkid', 'linkid.js');
+  ga('send', 'pageview');
+
+</script>
+
+
+
+    <script src="/assets/themes/apache-clean/jquery/jquery-2.1.1.min.js"></script>
+
+    <script src="/assets/themes/apache-clean/bootstrap/js/bootstrap.min.js"></script>
+
+
+  </body>
+</html>
+

http://git-wip-us.apache.org/repos/asf/bahir-website/blob/f7e05e11/content/docs/spark/current/spark-streaming-pubsub/index.html
----------------------------------------------------------------------
diff --git a/content/docs/spark/current/spark-streaming-pubsub/index.html b/content/docs/spark/current/spark-streaming-pubsub/index.html
index a7088db..71667a8 100644
--- a/content/docs/spark/current/spark-streaming-pubsub/index.html
+++ b/content/docs/spark/current/spark-streaming-pubsub/index.html
@@ -4,8 +4,8 @@
 <html lang="en">
   <head>
     <meta charset="utf-8">
-    <title>Spark Streaming Google Pub-Sub</title>
-    <meta name="description" content="Spark Streaming Google Pub-Sub">
+    <title>Spark Streaming PubNub</title>
+    <meta name="description" content="Spark Streaming PubNub">
     <meta name="author" content="">
 
     <!-- Enable responsive viewport -->
@@ -184,7 +184,7 @@
 
       
 
-<!--<div class="hero-unit Spark Streaming Google Pub-Sub">
+<!--<div class="hero-unit Spark Streaming PubNub">
   <h1></h1>
 </div>
 -->

http://git-wip-us.apache.org/repos/asf/bahir-website/blob/f7e05e11/content/docs/spark/current/spark-streaming-zeromq/index.html
----------------------------------------------------------------------
diff --git a/content/docs/spark/current/spark-streaming-zeromq/index.html b/content/docs/spark/current/spark-streaming-zeromq/index.html
index b5638ab..d5215cc 100644
--- a/content/docs/spark/current/spark-streaming-zeromq/index.html
+++ b/content/docs/spark/current/spark-streaming-zeromq/index.html
@@ -195,6 +195,8 @@
 
 -->
 
+<h1 id="spark-streaming-zeromq-connector">Spark Streaming ZeroMQ Connector</h1>
+
 <p>A library for reading data from <a href="http://zeromq.org/">ZeroMQ</a> using Spark Streaming.</p>
 
 <h2 id="linking">Linking</h2>
@@ -229,20 +231,31 @@ The <code class="highlighter-rouge">--packages</code> argument can also be used
 
 <h2 id="examples">Examples</h2>
 
+<p>Review end-to-end examples at <a href="https://github.com/apache/bahir/tree/master/streaming-zeromq/examples">ZeroMQ Examples</a>.</p>
+
 <h3 id="scala-api">Scala API</h3>
 
-<div class="highlighter-rouge"><pre class="highlight"><code>val lines = ZeroMQUtils.createStream(ssc, ...)
+<div class="highlighter-rouge"><pre class="highlight"><code>import org.apache.spark.streaming.zeromq.ZeroMQUtils
+
+val lines = ZeroMQUtils.createTextStream(
+  ssc, "tcp://server:5555", true, Seq("my-topic".getBytes)
+)
 </code></pre>
 </div>
 
 <h3 id="java-api">Java API</h3>
 
-<div class="highlighter-rouge"><pre class="highlight"><code>JavaDStream&lt;String&gt; lines = ZeroMQUtils.createStream(jssc, ...);
+<div class="highlighter-rouge"><pre class="highlight"><code>import org.apache.spark.storage.StorageLevel;
+import org.apache.spark.streaming.api.java.JavaReceiverInputDStream;
+import org.apache.spark.streaming.zeromq.ZeroMQUtils;
+
+JavaReceiverInputDStream&lt;String&gt; test1 = ZeroMQUtils.createJavaStream(
+    ssc, "tcp://server:5555", true, Arrays.asList("my-topic.getBytes()),
+    StorageLevel.MEMORY_AND_DISK_SER_2()
+);
 </code></pre>
 </div>
 
-<p>See end-to-end examples at <a href="https://github.com/apache/bahir/tree/master/streaming-zeromq/examples">ZeroMQ Examples</a></p>
-
   </div>
 </div>
 

http://git-wip-us.apache.org/repos/asf/bahir-website/blob/f7e05e11/content/feed.xml
----------------------------------------------------------------------
diff --git a/content/feed.xml b/content/feed.xml
index 6bc788c..cd7f340 100644
--- a/content/feed.xml
+++ b/content/feed.xml
@@ -6,8 +6,8 @@
 </description>
     <link>http://bahir.apache.org/</link>
     <atom:link href="http://bahir.apache.org/feed.xml" rel="self" type="application/rss+xml"/>
-    <pubDate>Tue, 13 Nov 2018 12:56:20 -0800</pubDate>
-    <lastBuildDate>Tue, 13 Nov 2018 12:56:20 -0800</lastBuildDate>
+    <pubDate>Fri, 30 Nov 2018 15:29:37 +0100</pubDate>
+    <lastBuildDate>Fri, 30 Nov 2018 15:29:37 +0100</lastBuildDate>
     <generator>Jekyll v3.2.1</generator>
     
       <item>
@@ -23,7 +23,7 @@ it.&lt;/p&gt;
 this release includes faster-than-light travel and chewing gum that
 never loses its flavor.&lt;/p&gt;
 </description>
-        <pubDate>Tue, 10 Nov 2015 04:00:00 -0800</pubDate>
+        <pubDate>Tue, 10 Nov 2015 13:00:00 +0100</pubDate>
         <link>http://bahir.apache.org/news/2015/11/10/release-0.2.0/</link>
         <guid isPermaLink="true">http://bahir.apache.org/news/2015/11/10/release-0.2.0/</guid>
         
@@ -48,9 +48,9 @@ committers for their work on the project. Welcome!&lt;/p&gt;
   &lt;li&gt;Princess Leia&lt;/li&gt;
 &lt;/ul&gt;
 </description>
-        <pubDate>Sun, 08 Nov 2015 19:03:07 -0800</pubDate>
-        <link>http://bahir.apache.org/news/2015/11/08/new-committers/</link>
-        <guid isPermaLink="true">http://bahir.apache.org/news/2015/11/08/new-committers/</guid>
+        <pubDate>Mon, 09 Nov 2015 04:03:07 +0100</pubDate>
+        <link>http://bahir.apache.org/news/2015/11/09/new-committers/</link>
+        <guid isPermaLink="true">http://bahir.apache.org/news/2015/11/09/new-committers/</guid>
         
         
         <category>team</category>
@@ -67,7 +67,7 @@ committers for their work on the project. Welcome!&lt;/p&gt;
 
 &lt;p&gt;We’re so pleased to be in the Apache Incubator.&lt;/p&gt;
 </description>
-        <pubDate>Fri, 25 Sep 2015 05:00:00 -0700</pubDate>
+        <pubDate>Fri, 25 Sep 2015 14:00:00 +0200</pubDate>
         <link>http://bahir.apache.org/news/2015/09/25/release-0.1.0/</link>
         <guid isPermaLink="true">http://bahir.apache.org/news/2015/09/25/release-0.1.0/</guid>
         

http://git-wip-us.apache.org/repos/asf/bahir-website/blob/f7e05e11/content/index.html
----------------------------------------------------------------------
diff --git a/content/index.html b/content/index.html
index dc51e9e..710896f 100644
--- a/content/index.html
+++ b/content/index.html
@@ -215,9 +215,10 @@
   <li>Spark DStream connector for Apache CouchDB/Cloudant</li>
   <li>Spark DStream connector for Akka</li>
   <li>Spark DStream connector for Google Cloud Pub/Sub</li>
-  <li>Spark DStream connector for MQTT</li>
+  <li>Spark DStream connector for PubNub <img src="/assets/themes/apache-clean/img/new-black.png" alt="" height="36px" width="36px" /></li>
+  <li>Spark DStream connector for MQTT <img src="/assets/themes/apache-clean/img/new-black.png" alt="" height="36px" width="36px" /> (new Sink)</li>
   <li>Spark DStream connector for Twitter</li>
-  <li>Spark DStream connector for ZeroMQ</li>
+  <li>Spark DStream connector for ZeroMQ <img src="/assets/themes/apache-clean/img/new-black.png" alt="" height="36px" width="36px" /> (Enhanced Implementation)</li>
 </ul>
 
 <h2 id="apache-flink-extensions">Apache Flink extensions</h2>

http://git-wip-us.apache.org/repos/asf/bahir-website/blob/f7e05e11/content/news/2015/11/08/new-committers/index.html
----------------------------------------------------------------------
diff --git a/content/news/2015/11/08/new-committers/index.html b/content/news/2015/11/08/new-committers/index.html
deleted file mode 100644
index f0dd77d..0000000
--- a/content/news/2015/11/08/new-committers/index.html
+++ /dev/null
@@ -1,11 +0,0 @@
-<!--
-
--->
-
-<p>The Bahir project management committee today added two new
-committers for their work on the project. Welcome!</p>
-
-<ul>
-  <li>Darth Vader</li>
-  <li>Princess Leia</li>
-</ul>

http://git-wip-us.apache.org/repos/asf/bahir-website/blob/f7e05e11/content/news/2015/11/09/new-committers/index.html
----------------------------------------------------------------------
diff --git a/content/news/2015/11/09/new-committers/index.html b/content/news/2015/11/09/new-committers/index.html
new file mode 100644
index 0000000..f0dd77d
--- /dev/null
+++ b/content/news/2015/11/09/new-committers/index.html
@@ -0,0 +1,11 @@
+<!--
+
+-->
+
+<p>The Bahir project management committee today added two new
+committers for their work on the project. Welcome!</p>
+
+<ul>
+  <li>Darth Vader</li>
+  <li>Princess Leia</li>
+</ul>