Posted to commits@bahir.apache.org by lr...@apache.org on 2020/12/15 01:42:58 UTC

[bahir-website] branch asf-site updated: Publishing from 5c4a5a3be45891df6f386fc3d817aba1a032cb00

This is an automated email from the ASF dual-hosted git repository.

lresende pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/bahir-website.git


The following commit(s) were added to refs/heads/asf-site by this push:
     new 4859339  Publishing from 5c4a5a3be45891df6f386fc3d817aba1a032cb00
4859339 is described below

commit 485933919b3ae759195ed52c10d4dff740630cca
Author: Luciano Resende <lr...@apache.org>
AuthorDate: Mon Dec 14 17:42:45 2020 -0800

    Publishing from 5c4a5a3be45891df6f386fc3d817aba1a032cb00
---
 content/community-members/index.html               |  18 -
 content/community/index.html                       |  24 +-
 content/contributing-extensions/index.html         |  18 -
 content/contributing/index.html                    |  44 +-
 content/docs/flink/1.0/documentation/index.html    |  18 -
 .../flink/1.0/flink-streaming-activemq/index.html  |  24 +-
 .../docs/flink/1.0/flink-streaming-akka/index.html |  36 +-
 .../flink/1.0/flink-streaming-flume/index.html     |  28 +-
 .../flink/1.0/flink-streaming-netty/index.html     |  64 +-
 .../flink/1.0/flink-streaming-redis/index.html     |  46 +-
 .../docs/flink/current/documentation/index.html    |  18 -
 .../current/flink-streaming-activemq/index.html    |  24 +-
 .../flink/current/flink-streaming-akka/index.html  |  36 +-
 .../flink/current/flink-streaming-flume/index.html |  28 +-
 .../current/flink-streaming-influxdb/index.html    |  26 +-
 .../flink/current/flink-streaming-kudu/index.html  | 368 ++++++++---
 .../flink/current/flink-streaming-netty/index.html |  64 +-
 .../flink/current/flink-streaming-redis/index.html |  48 +-
 content/docs/flink/overview/index.html             |  18 -
 content/docs/spark/2.0.0/documentation/index.html  |  18 -
 .../2.0.0/spark-sql-streaming-mqtt/index.html      |  58 +-
 .../spark/2.0.0/spark-streaming-akka/index.html    |  50 +-
 .../spark/2.0.0/spark-streaming-mqtt/index.html    |  48 +-
 .../spark/2.0.0/spark-streaming-twitter/index.html |  48 +-
 .../spark/2.0.0/spark-streaming-zeromq/index.html  |  44 +-
 content/docs/spark/2.0.1/documentation/index.html  |  18 -
 .../2.0.1/spark-sql-streaming-mqtt/index.html      |  80 +--
 .../spark/2.0.1/spark-streaming-akka/index.html    |  50 +-
 .../spark/2.0.1/spark-streaming-mqtt/index.html    |  75 +--
 .../spark/2.0.1/spark-streaming-twitter/index.html |  48 +-
 .../spark/2.0.1/spark-streaming-zeromq/index.html  |  44 +-
 content/docs/spark/2.0.2/documentation/index.html  |  18 -
 .../2.0.2/spark-sql-streaming-mqtt/index.html      |  80 +--
 .../spark/2.0.2/spark-streaming-akka/index.html    |  50 +-
 .../spark/2.0.2/spark-streaming-mqtt/index.html    |  75 +--
 .../spark/2.0.2/spark-streaming-twitter/index.html |  48 +-
 .../spark/2.0.2/spark-streaming-zeromq/index.html  |  44 +-
 content/docs/spark/2.1.0/documentation/index.html  |  18 -
 .../2.1.0/spark-sql-streaming-mqtt/index.html      |  80 +--
 .../spark/2.1.0/spark-streaming-akka/index.html    |  50 +-
 .../spark/2.1.0/spark-streaming-mqtt/index.html    |  75 +--
 .../spark/2.1.0/spark-streaming-twitter/index.html |  48 +-
 .../spark/2.1.0/spark-streaming-zeromq/index.html  |  44 +-
 content/docs/spark/2.1.1/documentation/index.html  |  18 -
 .../docs/spark/2.1.1/spark-sql-cloudant/index.html | 313 ++++-----
 .../2.1.1/spark-sql-streaming-akka/index.html      |  62 +-
 .../2.1.1/spark-sql-streaming-mqtt/index.html      |  80 +--
 .../spark/2.1.1/spark-streaming-akka/index.html    |  50 +-
 .../spark/2.1.1/spark-streaming-mqtt/index.html    |  77 +--
 .../spark/2.1.1/spark-streaming-pubsub/index.html  |  64 +-
 .../spark/2.1.1/spark-streaming-twitter/index.html |  48 +-
 .../spark/2.1.1/spark-streaming-zeromq/index.html  |  44 +-
 content/docs/spark/2.1.2/documentation/index.html  |  18 -
 .../docs/spark/2.1.2/spark-sql-cloudant/index.html | 371 +++++------
 .../2.1.2/spark-sql-streaming-akka/index.html      |  62 +-
 .../2.1.2/spark-sql-streaming-mqtt/index.html      |  80 +--
 .../spark/2.1.2/spark-streaming-akka/index.html    |  50 +-
 .../spark/2.1.2/spark-streaming-mqtt/index.html    |  87 +--
 .../spark/2.1.2/spark-streaming-pubsub/index.html  |  85 +--
 .../spark/2.1.2/spark-streaming-twitter/index.html |  48 +-
 .../spark/2.1.2/spark-streaming-zeromq/index.html  |  44 +-
 content/docs/spark/2.1.3/documentation/index.html  |  18 -
 .../docs/spark/2.1.3/spark-sql-cloudant/index.html | 371 +++++------
 .../2.1.3/spark-sql-streaming-akka/index.html      |  62 +-
 .../2.1.3/spark-sql-streaming-mqtt/index.html      |  80 +--
 .../spark/2.1.3/spark-streaming-akka/index.html    |  50 +-
 .../spark/2.1.3/spark-streaming-mqtt/index.html    |  87 +--
 .../spark/2.1.3/spark-streaming-pubsub/index.html  |  85 +--
 .../spark/2.1.3/spark-streaming-twitter/index.html |  48 +-
 .../spark/2.1.3/spark-streaming-zeromq/index.html  |  44 +-
 content/docs/spark/2.2.0/documentation/index.html  |  18 -
 .../docs/spark/2.2.0/spark-sql-cloudant/index.html | 313 ++++-----
 .../2.2.0/spark-sql-streaming-akka/index.html      |  62 +-
 .../2.2.0/spark-sql-streaming-mqtt/index.html      |  80 +--
 .../spark/2.2.0/spark-streaming-akka/index.html    |  50 +-
 .../spark/2.2.0/spark-streaming-mqtt/index.html    |  77 +--
 .../spark/2.2.0/spark-streaming-pubsub/index.html  |  64 +-
 .../spark/2.2.0/spark-streaming-twitter/index.html |  48 +-
 .../spark/2.2.0/spark-streaming-zeromq/index.html  |  44 +-
 content/docs/spark/2.2.1/documentation/index.html  |  18 -
 .../docs/spark/2.2.1/spark-sql-cloudant/index.html | 371 +++++------
 .../2.2.1/spark-sql-streaming-akka/index.html      |  62 +-
 .../2.2.1/spark-sql-streaming-mqtt/index.html      |  80 +--
 .../spark/2.2.1/spark-streaming-akka/index.html    |  50 +-
 .../spark/2.2.1/spark-streaming-mqtt/index.html    |  87 +--
 .../spark/2.2.1/spark-streaming-pubsub/index.html  |  85 +--
 .../spark/2.2.1/spark-streaming-twitter/index.html |  48 +-
 .../spark/2.2.1/spark-streaming-zeromq/index.html  |  44 +-
 content/docs/spark/2.2.2/documentation/index.html  |  18 -
 .../docs/spark/2.2.2/spark-sql-cloudant/index.html | 371 +++++------
 .../2.2.2/spark-sql-streaming-akka/index.html      |  62 +-
 .../2.2.2/spark-sql-streaming-mqtt/index.html      |  80 +--
 .../spark/2.2.2/spark-streaming-akka/index.html    |  50 +-
 .../spark/2.2.2/spark-streaming-mqtt/index.html    |  87 +--
 .../spark/2.2.2/spark-streaming-pubsub/index.html  |  85 +--
 .../spark/2.2.2/spark-streaming-twitter/index.html |  48 +-
 .../spark/2.2.2/spark-streaming-zeromq/index.html  |  44 +-
 .../{2.2.1 => 2.2.3}/documentation/index.html      |  18 -
 .../docs/spark/2.2.3/spark-sql-cloudant/index.html | 717 ++++++++++++++++++++
 .../spark-sql-streaming-akka/index.html            |  64 +-
 .../spark-sql-streaming-mqtt/index.html            |  82 +--
 .../spark-streaming-akka/index.html                |  52 +-
 .../spark-streaming-mqtt/index.html                |  89 +--
 .../spark-streaming-pubsub/index.html              |  87 +--
 .../spark-streaming-twitter/index.html             |  50 +-
 .../spark-streaming-zeromq/index.html              |  46 +-
 content/docs/spark/2.3.0/documentation/index.html  |  18 -
 .../docs/spark/2.3.0/spark-sql-cloudant/index.html | 371 +++++------
 .../2.3.0/spark-sql-streaming-akka/index.html      |  62 +-
 .../2.3.0/spark-sql-streaming-mqtt/index.html      | 118 ++--
 .../spark/2.3.0/spark-streaming-akka/index.html    |  50 +-
 .../spark/2.3.0/spark-streaming-mqtt/index.html    |  87 +--
 .../spark/2.3.0/spark-streaming-pubnub/index.html  |  50 +-
 .../spark/2.3.0/spark-streaming-pubsub/index.html  |  85 +--
 .../spark/2.3.0/spark-streaming-twitter/index.html |  48 +-
 .../spark/2.3.0/spark-streaming-zeromq/index.html  |  44 +-
 content/docs/spark/2.3.1/documentation/index.html  |  18 -
 .../docs/spark/2.3.1/spark-sql-cloudant/index.html | 371 +++++------
 .../2.3.1/spark-sql-streaming-akka/index.html      |  62 +-
 .../2.3.1/spark-sql-streaming-mqtt/index.html      | 118 ++--
 .../spark/2.3.1/spark-streaming-akka/index.html    |  50 +-
 .../spark/2.3.1/spark-streaming-mqtt/index.html    |  87 +--
 .../spark/2.3.1/spark-streaming-pubnub/index.html  |  50 +-
 .../spark/2.3.1/spark-streaming-pubsub/index.html  |  85 +--
 .../spark/2.3.1/spark-streaming-twitter/index.html |  48 +-
 .../spark/2.3.1/spark-streaming-zeromq/index.html  |  44 +-
 content/docs/spark/2.3.2/documentation/index.html  |  18 -
 .../docs/spark/2.3.2/spark-sql-cloudant/index.html | 371 +++++------
 .../2.3.2/spark-sql-streaming-akka/index.html      |  62 +-
 .../2.3.2/spark-sql-streaming-mqtt/index.html      | 118 ++--
 .../spark/2.3.2/spark-streaming-akka/index.html    |  50 +-
 .../spark/2.3.2/spark-streaming-mqtt/index.html    |  87 +--
 .../spark/2.3.2/spark-streaming-pubnub/index.html  |  50 +-
 .../spark/2.3.2/spark-streaming-pubsub/index.html  |  85 +--
 .../spark/2.3.2/spark-streaming-twitter/index.html |  48 +-
 .../spark/2.3.2/spark-streaming-zeromq/index.html  |  44 +-
 .../{2.3.0 => 2.3.3}/documentation/index.html      |  18 -
 .../docs/spark/2.3.3/spark-sql-cloudant/index.html | 717 ++++++++++++++++++++
 .../spark-sql-streaming-akka/index.html            |  64 +-
 .../spark-sql-streaming-mqtt/index.html            | 120 ++--
 .../spark-streaming-akka/index.html                |  52 +-
 .../spark-streaming-mqtt/index.html                |  89 +--
 .../spark-streaming-pubnub/index.html              |  52 +-
 .../spark-streaming-pubsub/index.html              |  87 +--
 .../spark-streaming-twitter/index.html             |  50 +-
 .../spark-streaming-zeromq/index.html              |  46 +-
 .../{2.3.0 => 2.3.4}/documentation/index.html      |  18 -
 .../docs/spark/2.3.4/spark-sql-cloudant/index.html | 717 ++++++++++++++++++++
 .../spark-sql-streaming-akka/index.html            |  64 +-
 .../spark-sql-streaming-mqtt/index.html            | 120 ++--
 .../spark-streaming-akka/index.html                |  52 +-
 .../spark-streaming-mqtt/index.html                |  89 +--
 .../spark-streaming-pubnub/index.html              |  52 +-
 .../spark-streaming-pubsub/index.html              |  87 +--
 .../spark-streaming-twitter/index.html             |  50 +-
 .../spark-streaming-zeromq/index.html              |  46 +-
 .../{2.3.0 => 2.4.0}/documentation/index.html      |  18 -
 .../docs/spark/2.4.0/spark-sql-cloudant/index.html | 722 +++++++++++++++++++++
 .../spark-sql-streaming-akka/index.html            |  73 +--
 .../spark-sql-streaming-mqtt/index.html            | 243 ++++---
 .../spark-streaming-akka/index.html                |  59 +-
 .../spark-streaming-mqtt/index.html                |  96 ++-
 .../spark-streaming-pubnub/index.html              |  67 +-
 .../spark-streaming-pubsub/index.html              |  94 ++-
 .../spark-streaming-twitter/index.html             |  74 ++-
 .../spark-streaming-zeromq/index.html              |  51 +-
 .../docs/spark/current/documentation/index.html    |  18 -
 .../spark/current/spark-sql-cloudant/index.html    | 390 ++++++-----
 .../current/spark-sql-streaming-akka/index.html    |  71 +-
 .../current/spark-sql-streaming-mqtt/index.html    | 241 ++++---
 .../spark/current/spark-streaming-akka/index.html  |  57 +-
 .../spark/current/spark-streaming-mqtt/index.html  |  94 ++-
 .../current/spark-streaming-pubnub/index.html      |  65 +-
 .../current/spark-streaming-pubsub/index.html      |  92 ++-
 .../current/spark-streaming-twitter/index.html     |  74 ++-
 .../current/spark-streaming-zeromq/index.html      |  49 +-
 content/docs/spark/overview/index.html             |  22 +-
 content/downloads/flink/index.html                 |  18 -
 content/downloads/spark/index.html                 |  40 +-
 content/feed.xml                                   |   6 +-
 content/history/index.html                         |  22 +-
 content/index.html                                 |  22 +-
 content/privacy-policy/index.html                  |  22 +-
 .../releases/spark/2.0.0/release-notes/index.html  |  38 +-
 .../releases/spark/2.0.1/release-notes/index.html  |  30 +-
 .../releases/spark/2.0.2/release-notes/index.html  |  30 +-
 .../releases/spark/2.1.0/release-notes/index.html  |  22 +-
 .../releases/spark/2.3.3/release-notes/index.html  |  22 +-
 .../releases/spark/2.3.4/release-notes/index.html  |  22 +-
 189 files changed, 8310 insertions(+), 8301 deletions(-)

diff --git a/content/community-members/index.html b/content/community-members/index.html
index 0283691..ea294e4 100644
--- a/content/community-members/index.html
+++ b/content/community-members/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
diff --git a/content/community/index.html b/content/community/index.html
index 2042a15..3992e44 100644
--- a/content/community/index.html
+++ b/content/community/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -233,9 +215,9 @@
 <p>Get help using Bahir or contribute to the project on our mailing lists:</p>
 
 <ul>
-  <li><a href="&#109;&#097;&#105;&#108;&#116;&#111;:&#117;&#115;&#101;&#114;&#064;&#098;&#097;&#104;&#105;&#114;&#046;&#097;&#112;&#097;&#099;&#104;&#101;&#046;&#111;&#114;&#103;">&#117;&#115;&#101;&#114;&#064;&#098;&#097;&#104;&#105;&#114;&#046;&#097;&#112;&#097;&#099;&#104;&#101;&#046;&#111;&#114;&#103;</a> is for usage questions, help, and announcements. <a href="&#109;&#097;&#105;&#108;&#116;&#111;:&#117;&#115;&#101;&#114;&#045;&#115;&#117;&#098;&#115;&#099;&#114;&#105;&#098;&#101;&# [...]
-  <li><a href="&#109;&#097;&#105;&#108;&#116;&#111;:&#100;&#101;&#118;&#064;&#098;&#097;&#104;&#105;&#114;&#046;&#097;&#112;&#097;&#099;&#104;&#101;&#046;&#111;&#114;&#103;">&#100;&#101;&#118;&#064;&#098;&#097;&#104;&#105;&#114;&#046;&#097;&#112;&#097;&#099;&#104;&#101;&#046;&#111;&#114;&#103;</a> is for people who want to contribute code to Bahir. <a href="&#109;&#097;&#105;&#108;&#116;&#111;:&#100;&#101;&#118;&#045;&#115;&#117;&#098;&#115;&#099;&#114;&#105;&#098;&#101;&#064;&#098;&#097 [...]
-  <li><a href="&#109;&#097;&#105;&#108;&#116;&#111;:&#099;&#111;&#109;&#109;&#105;&#116;&#115;&#064;&#098;&#097;&#104;&#105;&#114;&#046;&#097;&#112;&#097;&#099;&#104;&#101;&#046;&#111;&#114;&#103;">&#099;&#111;&#109;&#109;&#105;&#116;&#115;&#064;&#098;&#097;&#104;&#105;&#114;&#046;&#097;&#112;&#097;&#099;&#104;&#101;&#046;&#111;&#114;&#103;</a> is for commit messages and patches to Bahir. <a href="&#109;&#097;&#105;&#108;&#116;&#111;:&#099;&#111;&#109;&#109;&#105;&#116;&#115;&#045;&#115; [...]
+  <li><a href="mailto:user@bahir.apache.org">user@bahir.apache.org</a> is for usage questions, help, and announcements. <a href="mailto:user-subscribe@bahir.apache.org?subject=send this email to subscribe">subscribe</a>,     <a href="mailto:dev-unsubscribe@bahir.apache.org?subject=send this email to unsubscribe">unsubscribe</a>, <a href="https://www.mail-archive.com/user@bahir.apache.org/">archives</a></li>
+  <li><a href="mailto:dev@bahir.apache.org">dev@bahir.apache.org</a> is for people who want to contribute code to Bahir. <a href="mailto:dev-subscribe@bahir.apache.org?subject=send this email to subscribe">subscribe</a>, <a href="mailto:dev-unsubscribe@bahir.apache.org?subject=send this email to unsubscribe">unsubscribe</a>, <a href="https://www.mail-archive.com/dev@bahir.apache.org/">archives</a></li>
+  <li><a href="mailto:commits@bahir.apache.org">commits@bahir.apache.org</a> is for commit messages and patches to Bahir. <a href="mailto:commits-subscribe@bahir.apache.org?subject=send this email to subscribe">subscribe</a>, <a href="mailto:commits-unsubscribe@bahir.apache.org?subject=send this email to unsubscribe">unsubscribe</a>, <a href="https://www.mail-archive.com/commits@bahir.apache.org/">archives</a></li>
 </ul>
 
 <h2 id="issue-tracker">Issue tracker</h2>
diff --git a/content/contributing-extensions/index.html b/content/contributing-extensions/index.html
index 77eab9f..c11867f 100644
--- a/content/contributing-extensions/index.html
+++ b/content/contributing-extensions/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
diff --git a/content/contributing/index.html b/content/contributing/index.html
index 24e0df2..8762424 100644
--- a/content/contributing/index.html
+++ b/content/contributing/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -264,6 +246,8 @@
       <li>Provide a descriptive Title. “Update web UI” or “Problem in scheduler” is not sufficient. “Kafka Streaming support fails to handle empty queue in YARN cluster mode” is good.</li>
       <li>Write a detailed Description. For bug reports, this should ideally include a short reproduction of the problem. For new features, it may include a design document.</li>
       <li>Set required fields:</li>
+    </ul>
+    <ul>
       <li><strong>Issue Type</strong>. Generally, Bug, Improvement and New Feature are the only types used in Bahir.</li>
       <li><strong>Priority</strong>. Set to Major or below; higher priorities are generally reserved for committers to set. JIRA tends to unfortunately conflate “size” and “importance” in its Priority field values. Their meaning is roughly:
         <ul>
@@ -275,11 +259,11 @@
         </ul>
       </li>
       <li><strong>Component</strong></li>
-      <li><strong>Affects Version</strong>. For Bugs, assign at least one version that is known to exhibit the problem or need the change</li>
-      <li>Do not set the following fields:</li>
+      <li><strong>Affects Version</strong>. For Bugs, assign at least one version that is known to exhibit the problem or need the change
+      * Do not set the following fields:</li>
       <li><strong>Fix Version</strong>. This is assigned by committers only when resolved.</li>
-      <li><strong>Target Version</strong>. This is assigned by committers to indicate a PR has been accepted for possible fix by the target version.</li>
-      <li>Do not include a patch file; pull requests are used to propose the actual change.</li>
+      <li><strong>Target Version</strong>. This is assigned by committers to indicate a PR has been accepted for possible fix by the target version.
+      * Do not include a patch file; pull requests are used to propose the actual change.</li>
     </ul>
   </li>
   <li>If the change is a large change, consider inviting discussion on the issue at dev@bahir.apache.org first before proceeding to implement the change.</li>
@@ -297,26 +281,26 @@
 
 <p>Make sure you do not have any uncommitted changes and rebase master with latest changes from upstream:</p>
 
-<pre><code>git fetch upstream
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>git fetch upstream
 git checkout master
 git rebase upstream/master
-</code></pre>
+</code></pre></div></div>
 
 <p>Now you should rebase your branch with master, to receive the upstream changes</p>
 
-<pre><code>git checkout branch
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>git checkout branch
 git rebase master
-</code></pre>
+</code></pre></div></div>
 
 <p>In both cases, you can have conflicts:</p>
 
-<pre><code>error: could not apply fa39187... something to add to patch A
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>error: could not apply fa39187... something to add to patch A
 
 When you have resolved this problem, run "git rebase --continue".
 If you prefer to skip this patch, run "git rebase --skip" instead.
 To check out the original branch and stop rebasing, run "git rebase --abort".
 Could not apply fa39187f3c3dfd2ab5faa38ac01cf3de7ce2e841... Change fake file
-</code></pre>
+</code></pre></div></div>
 
 <p>Here, Git is telling you which commit is causing the conflict (fa39187). You’re given three choices:</p>
 
@@ -354,11 +338,11 @@ Could not apply fa39187f3c3dfd2ab5faa38ac01cf3de7ce2e841... Change fake file
 
 <p>Below is an example of a good commit message</p>
 
-<pre><code>[BAHIR-130] Performance enhancements for decision tree
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>[BAHIR-130] Performance enhancements for decision tree
 
 Generate Matrix with random values through local memory
 if there is sufficient memory.
-</code></pre>
+</code></pre></div></div>
 
 <h3 id="code-review-criteria">Code Review Criteria</h3>
 <p>Before considering how to contribute code, it’s useful to understand how code is reviewed, and why changes may be rejected. Simply put, changes that have many or large positives, and few negative effects or risks, are much more likely to be merged, and merged quickly. Risky and less valuable changes are very unlikely to be merged, and may be rejected outright rather than receive iterations of review.</p>
diff --git a/content/docs/flink/1.0/documentation/index.html b/content/docs/flink/1.0/documentation/index.html
index 3657911..baee30a 100644
--- a/content/docs/flink/1.0/documentation/index.html
+++ b/content/docs/flink/1.0/documentation/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
diff --git a/content/docs/flink/1.0/flink-streaming-activemq/index.html b/content/docs/flink/1.0/flink-streaming-activemq/index.html
index 64decc1..368dd37 100644
--- a/content/docs/flink/1.0/flink-streaming-activemq/index.html
+++ b/content/docs/flink/1.0/flink-streaming-activemq/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -218,19 +200,19 @@
 <p>This connector provides a source and sink to <a href="http://activemq.apache.org/">Apache ActiveMQ</a>™
 To use this connector, add the following dependency to your project:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
   &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
   &lt;artifactId&gt;flink-connector-activemq_2.11&lt;/artifactId&gt;
   &lt;version&gt;1.0&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
 <p><em>Version Compatibility</em>: This module is compatible with ActiveMQ 5.14.0.</p>
 
 <p>Note that the streaming connectors are not part of the binary distribution of Flink. You need to link them into your job jar for cluster execution.
 See how to link with them for cluster execution <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/linking.html">here</a>.</p>
 
-<p>The source class is called <code>AMQSource</code>, and the sink is <code>AMQSink</code>.</p>
+<p>The source class is called <code class="language-plaintext highlighter-rouge">AMQSource</code>, and the sink is <code class="language-plaintext highlighter-rouge">AMQSink</code>.</p>
 
   </div>
 </div>
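
[Editor's note] The ActiveMQ page updated above names AMQSource and AMQSink as the connector's entry points. A minimal Scala sketch of wiring the source into a job follows; the AMQSourceConfig builder method names, import paths, and broker URL are assumptions based on the bahir-flink sources, not confirmed by this page.

    import org.apache.activemq.ActiveMQConnectionFactory
    import org.apache.flink.streaming.api.scala._
    import org.apache.flink.streaming.connectors.activemq.{AMQSource, AMQSourceConfig}
    import org.apache.flink.streaming.util.serialization.SimpleStringSchema

    val env = StreamExecutionEnvironment.getExecutionEnvironment
    // Placeholder broker URL; point this at a running ActiveMQ instance.
    val connectionFactory = new ActiveMQConnectionFactory("tcp://localhost:61616")
    // Builder-style config; method names assumed from the bahir-flink sources.
    val sourceConfig = new AMQSourceConfig.AMQSourceConfigBuilder[String]()
      .setConnectionFactory(connectionFactory)
      .setDestinationName("sourceQueue")
      .setDeserializationSchema(new SimpleStringSchema())
      .build()
    env.addSource(new AMQSource(sourceConfig)).print()
    env.execute("amq-source-example")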
diff --git a/content/docs/flink/1.0/flink-streaming-akka/index.html b/content/docs/flink/1.0/flink-streaming-akka/index.html
index 0ae47a7..d76a4f7 100644
--- a/content/docs/flink/1.0/flink-streaming-akka/index.html
+++ b/content/docs/flink/1.0/flink-streaming-akka/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -218,12 +200,12 @@
 <p>This connector provides a sink to <a href="http://akka.io/">Akka</a> source actors in an ActorSystem.
 To use this connector, add the following dependency to your project:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
   &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
   &lt;artifactId&gt;flink-connector-akka_2.11&lt;/artifactId&gt;
   &lt;version&gt;1.0&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
 <p><em>Version Compatibility</em>: This module is compatible with Akka 2.0+.</p>
 
@@ -232,18 +214,18 @@ See how to link with them for cluster execution <a href="https://ci.apache.org/p
 
 <h2 id="configuration">Configuration</h2>
 
-<p>The configurations for the Receiver Actor System in Flink Akka connector can be created using the standard typesafe <code>Config (com.typesafe.config.Config)</code> object.</p>
+<p>The configurations for the Receiver Actor System in Flink Akka connector can be created using the standard typesafe <code class="language-plaintext highlighter-rouge">Config (com.typesafe.config.Config)</code> object.</p>
 
-<p>To enable acknowledgements, the custom configuration <code>akka.remote.auto-ack</code> can be used.</p>
+<p>To enable acknowledgements, the custom configuration <code class="language-plaintext highlighter-rouge">akka.remote.auto-ack</code> can be used.</p>
 
 <p>The user can set any of the default configurations allowed by Akka as well as custom configurations allowed by the connector.</p>
 
 <p>A sample configuration can be defined as follows:</p>
 
-<pre><code>String configFile = getClass().getClassLoader()
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>String configFile = getClass().getClassLoader()
       .getResource("feeder_actor.conf").getFile();
 Config config = ConfigFactory.parseFile(new File(configFile));    
-</code></pre>
+</code></pre></div></div>
 
 <h2 id="message-types">Message Types</h2>
 
@@ -251,13 +233,13 @@ Config config = ConfigFactory.parseFile(new File(configFile));
 
 <ul>
   <li>
-    <p>message containing <code>Iterable&lt;Object&gt;</code> data</p>
+    <p>message containing <code class="language-plaintext highlighter-rouge">Iterable&lt;Object&gt;</code> data</p>
   </li>
   <li>
-    <p>message containing generic <code>Object</code> data</p>
+    <p>message containing generic <code class="language-plaintext highlighter-rouge">Object</code> data</p>
   </li>
   <li>
-    <p>message containing generic <code>Object</code> data and a <code>Timestamp</code> value passed as <code>Tuple2&lt;Object, Long&gt;</code>.</p>
+    <p>message containing generic <code class="language-plaintext highlighter-rouge">Object</code> data and a <code class="language-plaintext highlighter-rouge">Timestamp</code> value passed as <code class="language-plaintext highlighter-rouge">Tuple2&lt;Object, Long&gt;</code>.</p>
   </li>
 </ul>
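
[Editor's note] The Akka page updated above describes building the receiver actor system's settings with a Typesafe Config object and enabling acknowledgements through the custom akka.remote.auto-ack key. A minimal Scala sketch of that step follows; only the file name and the key come from the page itself, while the "on" value and the fallback layering are assumptions.

    import java.io.File
    import com.typesafe.config.{Config, ConfigFactory}

    // Parse the actor-system settings file named in the docs, then layer the
    // connector's acknowledgement flag on top (the flag value is an assumption).
    val configFile = new File(
      getClass.getClassLoader.getResource("feeder_actor.conf").getFile)
    val config: Config = ConfigFactory
      .parseString("akka.remote.auto-ack = on")
      .withFallback(ConfigFactory.parseFile(configFile))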
 
diff --git a/content/docs/flink/1.0/flink-streaming-flume/index.html b/content/docs/flink/1.0/flink-streaming-flume/index.html
index 1404374..1ac3bd1 100644
--- a/content/docs/flink/1.0/flink-streaming-flume/index.html
+++ b/content/docs/flink/1.0/flink-streaming-flume/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -218,22 +200,22 @@
 <p>This connector provides a sink that can send data to <a href="https://flume.apache.org/">Apache Flume</a>™. To use this connector, add the
 following dependency to your project:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
   &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
   &lt;artifactId&gt;flink-connector-flume_2.11&lt;/artifactId&gt;
   &lt;version&gt;1.0&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
 <p><em>Version Compatibility</em>: This module is compatible with Flume 1.5.0.</p>
 
 <p>Note that the streaming connectors are not part of the binary distribution of Flink. You need to link them into your job jar for cluster execution.
 See how to link with them for cluster execution <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/linking.html">here</a>.</p>
 
-<p>To create a <code>FlumeSink</code> instantiate the following constructor:</p>
+<p>To create a <code class="language-plaintext highlighter-rouge">FlumeSink</code> instantiate the following constructor:</p>
 
-<pre><code>FlumeSink(String host, int port, SerializationSchema&lt;IN&gt; schema)
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>FlumeSink(String host, int port, SerializationSchema&lt;IN&gt; schema)
+</code></pre></div></div>
 
   </div>
 </div>
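
[Editor's note] The Flume page updated above gives the sink constructor as FlumeSink(String host, int port, SerializationSchema<IN> schema). A short Scala sketch using it follows; the import path and the localhost:4141 endpoint (a typical Flume Avro source address) are assumptions.

    import org.apache.flink.streaming.api.scala._
    import org.apache.flink.streaming.connectors.flume.FlumeSink
    import org.apache.flink.streaming.util.serialization.SimpleStringSchema

    val env = StreamExecutionEnvironment.getExecutionEnvironment
    val lines = env.fromElements("event-1", "event-2")
    // Constructor signature taken from the page above: host, port, schema.
    lines.addSink(new FlumeSink[String]("localhost", 4141, new SimpleStringSchema()))
    env.execute("flume-sink-example")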
diff --git a/content/docs/flink/1.0/flink-streaming-netty/index.html b/content/docs/flink/1.0/flink-streaming-netty/index.html
index dd1bad6..2190c56 100644
--- a/content/docs/flink/1.0/flink-streaming-netty/index.html
+++ b/content/docs/flink/1.0/flink-streaming-netty/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -222,8 +204,7 @@ See how to link with them for cluster execution <a href="https://ci.apache.org/p
 
 <h2 id="data-flow">Data Flow</h2>
 
-<p><code>
-+-------------+      (2)    +------------------------+
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>+-------------+      (2)    +------------------------+
 | user system |    &lt;-----   | Third Register Service |           
 +-------------+             +------------------------+
        |                                ^
@@ -233,47 +214,48 @@ See how to link with them for cluster execution <a href="https://ci.apache.org/p
 +--------------------+                  |
 | Flink Netty Source |  ----------------+
 +--------------------+         (1)
-</code></p>
+</code></pre></div></div>
 
 <p>There are three components:</p>
 
 <ul>
   <li>User System - where the data stream is coming from</li>
-  <li>Third Register Service - receive <code>Flink Netty Source</code>’s register request (ip and port)</li>
-  <li>Flink Netty Source - Netty Server for receiving pushed streaming data from <code>User System</code></li>
+  <li>Third Register Service - receives <code class="language-plaintext highlighter-rouge">Flink Netty Source</code>’s registration request (IP and port)</li>
+  <li>Flink Netty Source - a Netty server that receives streaming data pushed from <code class="language-plaintext highlighter-rouge">User System</code></li>
 </ul>
 
 <h2 id="maven-dependency">Maven Dependency</h2>
 <p>To use this connector, add the following dependency to your project:</p>
 
-<p>```</p>
-<dependency>
-  <groupid>org.apache.bahir</groupid>
-  <artifactid>flink-connector-netty_2.11</artifactid>
-  <version>1.0</version>
-</dependency>
-<p>```</p>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
+  &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
+  &lt;artifactId&gt;flink-connector-netty_2.11&lt;/artifactId&gt;
+  &lt;version&gt;1.0&lt;/version&gt;
+&lt;/dependency&gt;
+</code></pre></div></div>
 
 <h2 id="usage">Usage</h2>
 
 <p><em>Tcp Source:</em></p>
 
-<p><code>
-val env = StreamExecutionEnvironment.getExecutionEnvironment
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>val env = StreamExecutionEnvironment.getExecutionEnvironment
 env.addSource(new TcpReceiverSource(7070, Some("http://localhost:9090/cb")))
-</code>
-&gt;paramKey:  the http query param key
-&gt;tryPort:   try to use this point, if this point is used then try a new port
-&gt;callbackUrl:   register connector’s ip and port to a <code>Third Register Service</code></p>
+</code></pre></div></div>
+<blockquote>
+  <p>tryPort:   try to use this port; if this port is already in use, try a new one
+callbackUrl:   register the connector’s IP and port with a <code class="language-plaintext highlighter-rouge">Third Register Service</code></p>
+</blockquote>
 
 <p><em>Http Source:</em></p>
 
-<p><code>
-val env = StreamExecutionEnvironment.getExecutionEnvironment
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>val env = StreamExecutionEnvironment.getExecutionEnvironment
 env.addSource(new HttpReceiverSource("msg", 7070, Some("http://localhost:9090/cb")))
-</code>
-&gt;tryPort:   try to use this port, if this point is used then try a new port
-&gt;callbackUrl:   register connector’s ip and port to a <code>Third Register Service</code></p>
+</code></pre></div></div>
+<blockquote>
+  <p>paramKey:  the HTTP query parameter key
+tryPort:   try to use this port; if this port is already in use, try a new one
+callbackUrl:   register the connector’s IP and port with a <code class="language-plaintext highlighter-rouge">Third Register Service</code></p>
+</blockquote>
 
 <h2 id="full-example">Full Example</h2>
 
diff --git a/content/docs/flink/1.0/flink-streaming-redis/index.html b/content/docs/flink/1.0/flink-streaming-redis/index.html
index 6e3d121..96f597a 100644
--- a/content/docs/flink/1.0/flink-streaming-redis/index.html
+++ b/content/docs/flink/1.0/flink-streaming-redis/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -219,12 +201,12 @@
 to <a href="http://redis.io/topics/pubsub">Redis PubSub</a>. To use this connector, add the
 following dependency to your project:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
   &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
   &lt;artifactId&gt;flink-connector-redis_2.11&lt;/artifactId&gt;
   &lt;version&gt;1.0&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
 <p><em>Version Compatibility</em>: This module is compatible with Redis 2.8.5.</p>
 
@@ -250,7 +232,7 @@ The sink can use three different methods for communicating with different type o
 
 <p><strong>Java:</strong></p>
 
-<pre><code>public static class RedisExampleMapper implements RedisMapper&lt;Tuple2&lt;String, String&gt;&gt;{
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>public static class RedisExampleMapper implements RedisMapper&lt;Tuple2&lt;String, String&gt;&gt;{
 
     @Override
     public RedisCommandDescription getCommandDescription() {
@@ -271,11 +253,11 @@ FlinkJedisPoolConfig conf = new FlinkJedisPoolConfig.Builder().setHost("127.0.0.
 
 DataStream&lt;String&gt; stream = ...;
 stream.addSink(new RedisSink&lt;Tuple2&lt;String, String&gt;&gt;(conf, new RedisExampleMapper()));
-</code></pre>
+</code></pre></div></div>
 
 <p><strong>Scala:</strong></p>
 
-<pre><code>class RedisExampleMapper extends RedisMapper[(String, String)]{
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>class RedisExampleMapper extends RedisMapper[(String, String)]{
   override def getCommandDescription: RedisCommandDescription = {
     new RedisCommandDescription(RedisCommand.HSET, "HASH_NAME")
   }
@@ -286,41 +268,41 @@ stream.addSink(new RedisSink&lt;Tuple2&lt;String, String&gt;&gt;(conf, new Redis
 }
 val conf = new FlinkJedisPoolConfig.Builder().setHost("127.0.0.1").build()
 stream.addSink(new RedisSink[(String, String)](conf, new RedisExampleMapper))
-</code></pre>
+</code></pre></div></div>
 
 <p>This example code does the same, but for Redis Cluster:</p>
 
 <p><strong>Java:</strong></p>
 
-<pre><code>FlinkJedisPoolConfig conf = new FlinkJedisPoolConfig.Builder()
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>FlinkJedisPoolConfig conf = new FlinkJedisPoolConfig.Builder()
     .setNodes(new HashSet&lt;InetSocketAddress&gt;(Arrays.asList(new InetSocketAddress(5601)))).build();
 
 DataStream&lt;String&gt; stream = ...;
 stream.addSink(new RedisSink&lt;Tuple2&lt;String, String&gt;&gt;(conf, new RedisExampleMapper()));
-</code></pre>
+</code></pre></div></div>
 
 <p><strong>Scala:</strong></p>
 
-<pre><code>val conf = new FlinkJedisPoolConfig.Builder().setNodes(...).build()
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>val conf = new FlinkJedisPoolConfig.Builder().setNodes(...).build()
 stream.addSink(new RedisSink[(String, String)](conf, new RedisExampleMapper))
-</code></pre>
+</code></pre></div></div>
 
 <p>This example shows the configuration to use when the Redis environment runs with Sentinels:</p>
 
 <p><strong>Java:</strong></p>
 
-<pre><code>FlinkJedisSentinelConfig conf = new FlinkJedisSentinelConfig.Builder()
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>FlinkJedisSentinelConfig conf = new FlinkJedisSentinelConfig.Builder()
     .setMasterName("master").setSentinels(...).build();
 
 DataStream&lt;String&gt; stream = ...;
 stream.addSink(new RedisSink&lt;Tuple2&lt;String, String&gt;&gt;(conf, new RedisExampleMapper()));
-</code></pre>
+</code></pre></div></div>
 
 <p><strong>Scala:</strong></p>
 
-<pre><code>val conf = new FlinkJedisSentinelConfig.Builder().setMasterName("master").setSentinels(...).build()
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>val conf = new FlinkJedisSentinelConfig.Builder().setMasterName("master").setSentinels(...).build()
 stream.addSink(new RedisSink[(String, String)](conf, new RedisExampleMapper))
-</code></pre>
+</code></pre></div></div>
 
 <p>This section describes all the available data types and which Redis command is used for each.</p>
 
diff --git a/content/docs/flink/current/documentation/index.html b/content/docs/flink/current/documentation/index.html
index 347525f..b3d0a5f 100644
--- a/content/docs/flink/current/documentation/index.html
+++ b/content/docs/flink/current/documentation/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
diff --git a/content/docs/flink/current/flink-streaming-activemq/index.html b/content/docs/flink/current/flink-streaming-activemq/index.html
index 5442d1b..4f489a9 100644
--- a/content/docs/flink/current/flink-streaming-activemq/index.html
+++ b/content/docs/flink/current/flink-streaming-activemq/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -218,19 +200,19 @@
 <p>This connector provides a source and sink to <a href="http://activemq.apache.org/">Apache ActiveMQ</a>™
 To use this connector, add the following dependency to your project:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
   &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
   &lt;artifactId&gt;flink-connector-activemq_2.11&lt;/artifactId&gt;
   &lt;version&gt;1.1-SNAPSHOT&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
 <p><em>Version Compatibility</em>: This module is compatible with ActiveMQ 5.14.0.</p>
 
 <p>Note that the streaming connectors are not part of the binary distribution of Flink. You need to link them into your job jar for cluster execution.
 See how to link with them for cluster execution <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/linking.html">here</a>.</p>
 
-<p>The source class is called <code>AMQSource</code>, and the sink is <code>AMQSink</code>.</p>
+<p>The source class is called <code class="language-plaintext highlighter-rouge">AMQSource</code>, and the sink is <code class="language-plaintext highlighter-rouge">AMQSink</code>.</p>
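+
+<p>As a minimal sketch of wiring up the source, the snippet below creates an <code class="language-plaintext highlighter-rouge">AMQSource</code> for a queue (the <code class="language-plaintext highlighter-rouge">AMQSourceConfig</code> builder method names, broker URL, and queue name are assumptions; verify them against the connector source):</p>
+
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
+
+// plain ActiveMQ client connection factory pointing at the broker
+ActiveMQConnectionFactory connectionFactory =
+    new ActiveMQConnectionFactory("tcp://localhost:61616");
+
+// assumed builder-style configuration for the source
+AMQSourceConfig&lt;String&gt; config = new AMQSourceConfig.AMQSourceConfigBuilder&lt;String&gt;()
+    .setConnectionFactory(connectionFactory)
+    .setDestinationName("queue")
+    .setDeserializationSchema(new SimpleStringSchema())
+    .build();
+
+DataStream&lt;String&gt; stream = env.addSource(new AMQSource&lt;&gt;(config));
+</code></pre></div></div>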
 
   </div>
 </div>
diff --git a/content/docs/flink/current/flink-streaming-akka/index.html b/content/docs/flink/current/flink-streaming-akka/index.html
index 42f4c7b..e30f503 100644
--- a/content/docs/flink/current/flink-streaming-akka/index.html
+++ b/content/docs/flink/current/flink-streaming-akka/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -218,12 +200,12 @@
 <p>This connector provides a sink to <a href="http://akka.io/">Akka</a> source actors in an ActorSystem.
 To use this connector, add the following dependency to your project:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
   &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
   &lt;artifactId&gt;flink-connector-akka_2.11&lt;/artifactId&gt;
   &lt;version&gt;1.1-SNAPSHOT&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
 <p><em>Version Compatibility</em>: This module is compatible with Akka 2.0+.</p>
 
@@ -232,18 +214,18 @@ See how to link with them for cluster execution <a href="https://ci.apache.org/p
 
 <h2 id="configuration">Configuration</h2>
 
-<p>The configurations for the Receiver Actor System in Flink Akka connector can be created using the standard typesafe <code>Config (com.typesafe.config.Config)</code> object.</p>
+<p>The configuration for the receiver actor system in the Flink Akka connector can be created using the standard Typesafe <code class="language-plaintext highlighter-rouge">Config</code> (<code class="language-plaintext highlighter-rouge">com.typesafe.config.Config</code>) object.</p>
 
-<p>To enable acknowledgements, the custom configuration <code>akka.remote.auto-ack</code> can be used.</p>
+<p>To enable acknowledgements, the custom configuration <code class="language-plaintext highlighter-rouge">akka.remote.auto-ack</code> can be used.</p>
 
 <p>The user can set any of the default configurations allowed by Akka as well as custom configurations allowed by the connector.</p>
 
 <p>A sample configuration can be defined as follows:</p>
 
-<pre><code>String configFile = getClass().getClassLoader()
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>String configFile = getClass().getClassLoader()
       .getResource("feeder_actor.conf").getFile();
 Config config = ConfigFactory.parseFile(new File(configFile));    
-</code></pre>
+</code></pre></div></div>
 
 <h2 id="message-types">Message Types</h2>
 
@@ -251,13 +233,13 @@ Config config = ConfigFactory.parseFile(new File(configFile));
 
 <ul>
   <li>
-    <p>message containing <code>Iterable&lt;Object&gt;</code> data</p>
+    <p>message containing <code class="language-plaintext highlighter-rouge">Iterable&lt;Object&gt;</code> data</p>
   </li>
   <li>
-    <p>message containing generic <code>Object</code> data</p>
+    <p>message containing generic <code class="language-plaintext highlighter-rouge">Object</code> data</p>
   </li>
   <li>
-    <p>message containing generic <code>Object</code> data and a <code>Timestamp</code> value passed as <code>Tuple2&lt;Object, Long&gt;</code>.</p>
+    <p>message containing generic <code class="language-plaintext highlighter-rouge">Object</code> data and a <code class="language-plaintext highlighter-rouge">Timestamp</code> value passed as <code class="language-plaintext highlighter-rouge">Tuple2&lt;Object, Long&gt;</code>.</p>
   </li>
 </ul>
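+
+<p>A minimal sketch of wiring up the source with the configuration created above (the <code class="language-plaintext highlighter-rouge">AkkaSource(actorName, publisherUrl, config)</code> constructor and the feeder actor URL are assumptions; verify them against your connector version):</p>
+
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
+
+// receiver actor name, URL of the remote feeder actor, and the Config created above
+DataStream&lt;Object&gt; stream = env.addSource(
+    new AkkaSource("receiverActor", "akka.tcp://feederActorSystem@localhost:5150/user/feederActor", config),
+    "AkkaSource");
+
+stream.print();
+env.execute();
+</code></pre></div></div>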
 
diff --git a/content/docs/flink/current/flink-streaming-flume/index.html b/content/docs/flink/current/flink-streaming-flume/index.html
index 3a1aac4..b94005d 100644
--- a/content/docs/flink/current/flink-streaming-flume/index.html
+++ b/content/docs/flink/current/flink-streaming-flume/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -218,22 +200,22 @@
 <p>This connector provides a sink that can send data to <a href="https://flume.apache.org/">Apache Flume</a>™. To use this connector, add the
 following dependency to your project:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
   &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
   &lt;artifactId&gt;flink-connector-flume_2.11&lt;/artifactId&gt;
   &lt;version&gt;1.1-SNAPSHOT&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
 <p><em>Version Compatibility</em>: This module is compatible with Flume 1.8.0.</p>
 
 <p>Note that the streaming connectors are not part of the binary distribution of Flink. You need to link them into your job jar for cluster execution.
 See how to link with them for cluster execution <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/linking.html">here</a>.</p>
 
-<p>To create a <code>FlumeSink</code> instantiate the following constructor:</p>
+<p>To create a <code class="language-plaintext highlighter-rouge">FlumeSink</code>, instantiate it using the following constructor:</p>
 
-<pre><code>FlumeSink(String host, int port, SerializationSchema&lt;IN&gt; schema)
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>FlumeSink(String host, int port, SerializationSchema&lt;IN&gt; schema)
+</code></pre></div></div>
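+
+<p>For example, a minimal sketch of wiring the sink into a job (the host, port, and use of Flink’s <code class="language-plaintext highlighter-rouge">SimpleStringSchema</code> are illustrative assumptions):</p>
+
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
+
+DataStream&lt;String&gt; stream = env.fromElements("event-1", "event-2");
+
+// send each record, serialized as a string, to a Flume Avro source listening on localhost:4141
+stream.addSink(new FlumeSink&lt;&gt;("localhost", 4141, new SimpleStringSchema()));
+
+env.execute();
+</code></pre></div></div>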
 
 
   </div>
diff --git a/content/docs/flink/current/flink-streaming-influxdb/index.html b/content/docs/flink/current/flink-streaming-influxdb/index.html
index 1f5d9ca..6183430 100644
--- a/content/docs/flink/current/flink-streaming-influxdb/index.html
+++ b/content/docs/flink/current/flink-streaming-influxdb/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -218,12 +200,12 @@
 <p>This connector provides a sink that can send data to <a href="https://www.influxdata.com/">InfluxDB</a>. To use this connector, add the
 following dependency to your project:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
   &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
   &lt;artifactId&gt;flink-connector-influxdb_2.11&lt;/artifactId&gt;
   &lt;version&gt;1.1-SNAPSHOT&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
 <p><em>Version Compatibility</em>: This module is compatible with InfluxDB 1.3.x <br />
 <em>Requirements</em>: Java 1.8+</p>
@@ -238,10 +220,10 @@ See how to link with them for cluster execution <a href="https://ci.apache.org/p
 
 <h3 id="java-api">JAVA API</h3>
 
-<pre><code>DataStream&lt;InfluxDBPoint&gt; dataStream = ...
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>DataStream&lt;InfluxDBPoint&gt; dataStream = ...
 InfluxDBConfig influxDBConfig = InfluxDBConfig.builder("http://localhost:8086", "admin", "password", "db_flink_test").build();
 dataStream.addSink(new InfluxDBSink(influxDBConfig));
-</code></pre>
+</code></pre></div></div>
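+
+<p>As a concrete sketch, records can be mapped into <code class="language-plaintext highlighter-rouge">InfluxDBPoint</code>s before adding the sink (the four-argument <code class="language-plaintext highlighter-rouge">InfluxDBPoint</code> constructor, the measurement name, the tag/field keys, and the upstream <code class="language-plaintext highlighter-rouge">sourceStream</code> are assumptions; check the connector source):</p>
+
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>DataStream&lt;InfluxDBPoint&gt; dataStream = sourceStream.map(value -&gt; {
+    Map&lt;String, String&gt; tags = new HashMap&lt;&gt;();   // java.util.* assumed imported
+    tags.put("host", "server01");
+
+    Map&lt;String, Object&gt; fields = new HashMap&lt;&gt;();
+    fields.put("value", value);
+
+    // measurement name, timestamp in milliseconds, tags, fields
+    return new InfluxDBPoint("cpu_load", System.currentTimeMillis(), tags, fields);
+});
+
+dataStream.addSink(new InfluxDBSink(influxDBConfig));
+</code></pre></div></div>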
 
 <p>See end-to-end examples at <a href="https://github.com/apache/bahir-flink/tree/master/flink-connector-influxdb/examples">InfluxDB Examples</a></p>
 
diff --git a/content/docs/flink/current/flink-streaming-kudu/index.html b/content/docs/flink/current/flink-streaming-kudu/index.html
index b951ac2..0bfd75b 100644
--- a/content/docs/flink/current/flink-streaming-kudu/index.html
+++ b/content/docs/flink/current/flink-streaming-kudu/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -215,103 +197,301 @@
 
 <h1 id="flink-kudu-connector">Flink Kudu Connector</h1>
 
-<p>This connector provides a source (<code>KuduInputFormat</code>) and a sink/output (<code>KuduSink</code> and <code>KuduOutputFormat</code>, respectively) that can read and write to <a href="https://kudu.apache.org/">Kudu</a>. To use this connector, add the
-following dependency to your project:</p>
+<p>This connector provides a source (<code class="language-plaintext highlighter-rouge">KuduInputFormat</code>), a sink/output
+(<code class="language-plaintext highlighter-rouge">KuduSink</code> and <code class="language-plaintext highlighter-rouge">KuduOutputFormat</code>, respectively),
+ as well as a table source (<code class="language-plaintext highlighter-rouge">KuduTableSource</code>), an upsert table sink (<code class="language-plaintext highlighter-rouge">KuduTableSink</code>), and a catalog (<code class="language-plaintext highlighter-rouge">KuduCatalog</code>),
+ to allow reading and writing to <a href="https://kudu.apache.org/">Kudu</a>.</p>
 
-<pre><code>&lt;dependency&gt;
+<p>To use this connector, add the following dependency to your project:</p>
+
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
   &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
   &lt;artifactId&gt;flink-connector-kudu_2.11&lt;/artifactId&gt;
   &lt;version&gt;1.1-SNAPSHOT&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
-<p><em>Version Compatibility</em>: This module is compatible with Apache Kudu <em>1.7.1</em> (last stable version).</p>
+<p><em>Version Compatibility</em>: This module is compatible with Apache Kudu <em>1.11.1</em> (last stable version) and Apache Flink 1.10.+.</p>
 
 <p>Note that the streaming connectors are not part of the binary distribution of Flink. You need to link them into your job jar for cluster execution.
-See how to link with them for cluster execution <a href="https://ci.apache.org/projects/flink/flink-docs-stable/start/dependencies.html">here</a>.</p>
+See how to link with them for cluster execution <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.10/dev/projectsetup/dependencies.html">here</a>.</p>
 
 <h2 id="installing-kudu">Installing Kudu</h2>
 
 <p>Follow the instructions from the <a href="https://kudu.apache.org/docs/installation.html">Kudu Installation Guide</a>.
 Optionally, you can use the Docker images provided in the dockers folder.</p>
 
-<h2 id="kuduinputformat">KuduInputFormat</h2>
-
-<p>```
-ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();</p>
-
-<p>env.setParallelism(PARALLELISM);</p>
-
-<p>// create a table info object
-KuduTableInfo tableInfo = KuduTableInfo.Builder
-        .create(“books”)
-        .addColumn(KuduColumnInfo.Builder.create(“id”, Type.INT32).key(true).hashKey(true).build())
-        .addColumn(KuduColumnInfo.Builder.create(“title”, Type.STRING).build())
-        .addColumn(KuduColumnInfo.Builder.create(“author”, Type.STRING).build())
-        .addColumn(KuduColumnInfo.Builder.create(“price”, Type.DOUBLE).build())
-        .addColumn(KuduColumnInfo.Builder.create(“quantity”, Type.INT32).build())
-        .build();</p>
-
-<p>// Pass the tableInfo to the KuduInputFormat and provide kuduMaster ips
-env.createInput(new KuduInputFormat&lt;&gt;(“172.25.0.6”, tableInfo))
-        .count();</p>
-
-<p>env.execute();
-```</p>
-
-<h2 id="kuduoutputformat">KuduOutputFormat</h2>
-
-<p>```
-ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();</p>
-
-<p>env.setParallelism(PARALLELISM);</p>
-
-<p>// create a table info object
-KuduTableInfo tableInfo = KuduTableInfo.Builder
-        .create(“books”)
-        .createIfNotExist(true)
-        .replicas(1)
-        .addColumn(KuduColumnInfo.Builder.create(“id”, Type.INT32).key(true).hashKey(true).build())
-        .addColumn(KuduColumnInfo.Builder.create(“title”, Type.STRING).build())
-        .addColumn(KuduColumnInfo.Builder.create(“author”, Type.STRING).build())
-        .addColumn(KuduColumnInfo.Builder.create(“price”, Type.DOUBLE).build())
-        .addColumn(KuduColumnInfo.Builder.create(“quantity”, Type.INT32).build())
-        .build();</p>
-
-<p>…</p>
-
-<p>env.fromCollection(books)
-        .output(new KuduOutputFormat&lt;&gt;(“172.25.0.6”, tableInfo));</p>
-
-<p>env.execute();
-```</p>
+<h2 id="sql-and-table-api">SQL and Table API</h2>
 
-<h2 id="kudusink">KuduSink</h2>
+<p>The Kudu connector is fully integrated with the Flink Table and SQL APIs. Once we configure the Kudu catalog (see next section),
+we can start querying or inserting into existing Kudu tables using Flink SQL or the Table API.</p>
 
-<p>```
-StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();</p>
+<p>For more information about the possible queries, please check the <a href="https://ci.apache.org/projects/flink/flink-docs-release-1.10/dev/table/sql/">official documentation</a>.</p>
+
+<h3 id="kudu-catalog">Kudu Catalog</h3>
+
+<p>The connector comes with a catalog implementation to handle metadata about your Kudu setup and perform table management.
+By using the Kudu catalog, you can access all the tables already created in Kudu from Flink SQL queries. The Kudu catalog only
+allows users to create new Kudu tables or access existing ones. Tables backed by other data sources must be defined in other catalogs, such as the
+in-memory catalog or the Hive catalog.</p>
+
+<p>When using the SQL CLI, you can easily add the Kudu catalog to your environment YAML file:</p>
+
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>catalogs:
+  - name: kudu
+    type: kudu
+    kudu.masters: &lt;host&gt;:7051
+</code></pre></div></div>
+
+<p>Once the SQL CLI is started, you can simply switch to the Kudu catalog by calling <code class="language-plaintext highlighter-rouge">USE CATALOG kudu;</code></p>
+
+<p>You can also create and use the KuduCatalog directly in the Table environment:</p>
+
+<div class="language-java highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="nc">String</span> <span class="no">KUDU_MASTERS</span><span class="o">=</span><span class="s">"host1:port1,host2:port2"</span>
+<span class="nc">KuduCatalog</span> <span class="n">catalog</span> <span class="o">=</span> <span class="k">new</span> <span class="nc">KuduCatalog</span><span class="o">(</span><span class="no">KUDU_MASTERS</span><span class="o">);</span>
+<span class="n">tableEnv</span><span class="o">.</span><span class="na">registerCatalog</span><span class="o">(</span><span class="s">"kudu"</span><span class="o">,</span> <span class="n">catalog</span><span class="o">);</span>
+<span class="n">tableEnv</span><span class="o">.</span><span class="na">useCatalog</span><span class="o">(</span><span class="s">"kudu"</span><span class="o">);</span>
+</code></pre></div></div>
+
+<h3 id="ddl-operations-using-sql">DDL operations using SQL</h3>
+
+<p>It is possible to manipulate Kudu tables using SQL DDL.</p>
+
+<p>When not using the Kudu catalog, the following additional properties must be specified in the <code class="language-plaintext highlighter-rouge">WITH</code> clause:</p>
+<ul>
+  <li><code class="language-plaintext highlighter-rouge">'connector.type'='kudu'</code></li>
+  <li><code class="language-plaintext highlighter-rouge">'kudu.masters'='host1:port1,host2:port2,...'</code>: comma-delimitered list of Kudu masters</li>
+  <li><code class="language-plaintext highlighter-rouge">'kudu.table'='...'</code>: The table’s name within the Kudu database.</li>
+</ul>
+
+<p>If you have registered and are using the Kudu catalog, these properties are handled automatically.</p>
+
+<p>To create a table, the additional properties <code class="language-plaintext highlighter-rouge">kudu.primary-key-columns</code> and <code class="language-plaintext highlighter-rouge">kudu.hash-columns</code> must be specified
+as comma-delimited lists. Optionally, you can set the <code class="language-plaintext highlighter-rouge">kudu.replicas</code> property (defaults to 1).
+Other properties, such as range partitioning, cannot be configured here - for more flexibility, please use
+<code class="language-plaintext highlighter-rouge">catalog.createTable</code> as described in <a href="#creating-a-kudutable-directly-with-kuducatalog">this</a> section or create the table directly in Kudu.</p>
+
+<p>The <code class="language-plaintext highlighter-rouge">NOT NULL</code> constraint can be added to any of the column definitions.
+By setting a column as a primary key, it will automatically by created with the <code class="language-plaintext highlighter-rouge">NOT NULL</code> constraint.
+Hash columns must be a subset of primary key columns.</p>
+
+<p>Kudu Catalog</p>
+
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>CREATE TABLE TestTable (
+  first STRING,
+  second STRING,
+  third INT NOT NULL
+) WITH (
+  'kudu.hash-columns' = 'first',
+  'kudu.primary-key-columns' = 'first,second'
+)
+</code></pre></div></div>
+
+<p>Other catalogs</p>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>CREATE TABLE TestTable (
+  first STRING,
+  second STRING,
+  third INT NOT NULL
+) WITH (
+  'connector.type' = 'kudu',
+  'kudu.masters' = '...',
+  'kudu.table' = 'TestTable',
+  'kudu.hash-columns' = 'first',
+  'kudu.primary-key-columns' = 'first,second'
+)
+</code></pre></div></div>
+
+<p>Renaming a table:</p>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>ALTER TABLE TestTable RENAME TO TestTableRen
+</code></pre></div></div>
+
+<p>Dropping a table:</p>
+<div class="language-sql highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="k">DROP</span> <span class="k">TABLE</span> <span class="n">TestTableRen</span>
+</code></pre></div></div>
+
+<h4 id="creating-a-kudutable-directly-with-kuducatalog">Creating a KuduTable directly with KuduCatalog</h4>
+
+<p>The KuduCatalog also exposes a simple <code class="language-plaintext highlighter-rouge">createTable</code> method that requires only a <code class="language-plaintext highlighter-rouge">KuduTableInfo</code> object,
+in which the table configuration, including schema, partitioning, replication, etc., can be specified.</p>
+
+<p>Use the <code class="language-plaintext highlighter-rouge">createTableIfNotExists</code> method, which takes a <code class="language-plaintext highlighter-rouge">ColumnSchemasFactory</code> and
+a <code class="language-plaintext highlighter-rouge">CreateTableOptionsFactory</code> parameter. These implement, respectively, <code class="language-plaintext highlighter-rouge">getColumnSchemas()</code>,
+returning a list of Kudu <a href="https://kudu.apache.org/apidocs/org/apache/kudu/ColumnSchema.html">ColumnSchema</a> objects,
+and <code class="language-plaintext highlighter-rouge">getCreateTableOptions()</code>, returning a
+<a href="https://kudu.apache.org/apidocs/org/apache/kudu/client/CreateTableOptions.html">CreateTableOptions</a> object.</p>
+
+<p>This example shows the creation of a table called <code class="language-plaintext highlighter-rouge">ExampleTable</code> with two columns,
+<code class="language-plaintext highlighter-rouge">first</code> being the primary key, with replication and hash partitioning configured.</p>
+
+<div class="language-java highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="nc">KuduTableInfo</span> <span class="n">tableInfo</span> <span class="o">=</span> <span class="nc">KuduTableInfo</span>
+    <span class="o">.</span><span class="na">forTable</span><span class="o">(</span><span class="s">"ExampleTable"</span><span class="o">)</span>
+    <span class="o">.</span><span class="na">createTableIfNotExists</span><span class="o">(</span>
+        <span class="o">()</span> <span class="o">-&gt;</span>
+            <span class="nc">Lists</span><span class="o">.</span><span class="na">newArrayList</span><span class="o">(</span>
+                <span class="k">new</span> <span class="nc">ColumnSchema</span>
+                    <span class="o">.</span><span class="na">ColumnSchemaBuilder</span><span class="o">(</span><span class="s">"first"</span><span class="o">,</span> <span class="nc">Type</span><span class="o">.</span><span class="na">INT32</span><span class="o">)</span>
+                    <span class="o">.</span><span class="na">key</span><span class="o">(</span><span class="kc">true</span><span class="o">)</span>
+                    <span class="o">.</span><span class="na">build</span><span class="o">(),</span>
+                <span class="k">new</span> <span class="nc">ColumnSchema</span>
+                    <span class="o">.</span><span class="na">ColumnSchemaBuilder</span><span class="o">(</span><span class="s">"second"</span><span class="o">,</span> <span class="nc">Type</span><span class="o">.</span><span class="na">STRING</span><span class="o">)</span>
+                    <span class="o">.</span><span class="na">build</span><span class="o">()</span>
+            <span class="o">),</span>
+        <span class="o">()</span> <span class="o">-&gt;</span> <span class="k">new</span> <span class="nc">CreateTableOptions</span><span class="o">()</span>
+            <span class="o">.</span><span class="na">setNumReplicas</span><span class="o">(</span><span class="mi">1</span><span class="o">)</span>
+            <span class="o">.</span><span class="na">addHashPartitions</span><span class="o">(</span><span class="nc">Lists</span><span class="o">.</span><span class="na">newArrayList</span><span class="o">(</span><span class="s">"first"</span><span class="o">),</span> <span class="mi">2</span><span class="o">));</span>
+
+<span class="n">catalog</span><span class="o">.</span><span class="na">createTable</span><span class="o">(</span><span class="n">tableInfo</span><span class="o">,</span> <span class="kc">false</span><span class="o">);</span>
+</code></pre></div></div>
+<p>The example uses lambda expressions to implement the functional interfaces.</p>
+
+<p>Read more about Kudu schema design in the <a href="https://kudu.apache.org/docs/schema_design.html">Kudu docs</a>.</p>
+
+<h3 id="supported-data-types">Supported data types</h3>
+<table>
+  <thead>
+    <tr><th>Flink/SQL</th><th>Kudu</th></tr>
+  </thead>
+  <tbody>
+    <tr><td>STRING</td><td>STRING</td></tr>
+    <tr><td>BOOLEAN</td><td>BOOL</td></tr>
+    <tr><td>TINYINT</td><td>INT8</td></tr>
+    <tr><td>SMALLINT</td><td>INT16</td></tr>
+    <tr><td>INT</td><td>INT32</td></tr>
+    <tr><td>BIGINT</td><td>INT64</td></tr>
+    <tr><td>FLOAT</td><td>FLOAT</td></tr>
+    <tr><td>DOUBLE</td><td>DOUBLE</td></tr>
+    <tr><td>BYTES</td><td>BINARY</td></tr>
+    <tr><td>TIMESTAMP(3)</td><td>UNIXTIME_MICROS</td></tr>
+  </tbody>
+</table>
+
+<p>Note:</p>
+<ul>
+  <li><code class="language-plaintext highlighter-rouge">TIMESTAMP</code>s are fixed to a precision of 3, and the corresponding Java conversion class is <code class="language-plaintext highlighter-rouge">java.sql.Timestamp</code></li>
+  <li><code class="language-plaintext highlighter-rouge">BINARY</code> and <code class="language-plaintext highlighter-rouge">VARBINARY</code> are not yet supported - use <code class="language-plaintext highlighter-rouge">BYTES</code>, which is a <code class="language-plaintext highlighter-rouge">VARBINARY(2147483647)</code></li>
+  <li><code class="language-plaintext highlighter-rouge">CHAR</code> and <code class="language-plaintext highlighter-rouge">VARCHAR</code> are not yet supported - use <code class="language-plaintext highlighter-rouge">STRING</code>, which is a <code class="language-plaintext highlighter-rouge">VARCHAR(2147483647)</code></li>
+  <li><code class="language-plaintext highlighter-rouge">DECIMAL</code> types are not yet supported</li>
+</ul>
+
+<h3 id="known-limitations">Known limitations</h3>
+<ul>
+  <li>Data type limitations (see above).</li>
+  <li>SQL Create table: primary keys can only be set by the <code class="language-plaintext highlighter-rouge">kudu.primary-key-columns</code> property; using the
+<code class="language-plaintext highlighter-rouge">PRIMARY KEY</code> constraint is not yet possible.</li>
+  <li>SQL Create table: range partitioning is not supported.</li>
+  <li>When getting a table through the catalog, NOT NULL and PRIMARY KEY constraints are ignored. All columns
+are described as nullable and as not being primary keys.</li>
+  <li>Kudu tables cannot be altered through the catalog other than by simple renaming.</li>
+</ul>
+
+<h2 id="datastream-api">DataStream API</h2>
+
+<p>It is also possible to use the Kudu connector directly from the DataStream API; however, we
+encourage all users to explore the Table API, as it provides a lot of useful tooling when working
+with Kudu data.</p>
+
+<h3 id="reading-tables-into-a-datastreams">Reading tables into a DataStreams</h3>
+
+<p>There are two main ways of reading a Kudu table into a DataStream:</p>
+<ol>
+  <li>Using the <code class="language-plaintext highlighter-rouge">KuduCatalog</code> and the Table API</li>
+  <li>Using the <code class="language-plaintext highlighter-rouge">KuduRowInputFormat</code> directly</li>
+</ol>
+
+<p>Using the <code class="language-plaintext highlighter-rouge">KuduCatalog</code> and Table API is the recommended way of reading tables as it automatically
+guarantees type safety and takes care of configuration of our readers.</p>
+
+<p>This is how it works in practice:</p>
+<div class="language-java highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="nc">StreamTableEnvironment</span> <span class="n">tableEnv</span> <span class="o">=</span> <span class="nc">StreamTableEnvironment</span><span class="o">.</span><span class="na">create</span><span class="o">(</span><span class="n">streamEnv</span><span class="o">,</span> <span class="n">tableSettings</span><span class="o">);</span>
+
+<span class="n">tableEnv</span><span class="o">.</span><span class="na">registerCatalog</span><span class="o">(</span><span class="s">"kudu"</span><span class="o">,</span> <span class="k">new</span> <span class="nc">KuduCatalog</span><span class="o">(</span><span class="s">"master:port"</span><span class="o">));</span>
+<span class="n">tableEnv</span><span class="o">.</span><span class="na">useCatalog</span><span class="o">(</span><span class="s">"kudu"</span><span class="o">);</span>
+
+<span class="nc">Table</span> <span class="n">table</span> <span class="o">=</span> <span class="n">tableEnv</span><span class="o">.</span><span class="na">sqlQuery</span><span class="o">(</span><span class="s">"SELECT * FROM MyKuduTable"</span><span class="o">);</span>
+<span class="nc">DataStream</span><span class="o">&lt;</span><span class="nc">Row</span><span class="o">&gt;</span> <span class="n">rows</span> <span class="o">=</span> <span class="n">tableEnv</span><span class="o">.</span><span class="na">toAppendStream</span><span class="o">(</span><span class="n">table</span><span class="o">,</span> <span class="nc">Row</span><span class="o">.</span><span class="na">class</span><span class="o">);</span>
+</code></pre></div></div>
+
+<p>The second way of achieving the same thing is to use the <code class="language-plaintext highlighter-rouge">KuduRowInputFormat</code> directly.
+In this case we have to provide all information about our table manually:</p>
+
+<div class="language-java highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="nc">KuduTableInfo</span> <span class="n">tableInfo</span> <span class="o">=</span> <span class="o">...</span>
+<span class="nc">KuduReaderConfig</span> <span class="n">readerConfig</span> <span class="o">=</span> <span class="o">...</span>
+<span class="nc">KuduRowInputFormat</span> <span class="n">inputFormat</span> <span class="o">=</span> <span class="k">new</span> <span class="nc">KuduRowInputFormat</span><span class="o">(</span><span class="n">readerConfig</span><span class="o">,</span> <span class="n">tableInfo</span><span class="o">);</span>
+
+<span class="nc">DataStream</span><span class="o">&lt;</span><span class="nc">Row</span><span class="o">&gt;</span> <span class="n">rowStream</span> <span class="o">=</span> <span class="n">env</span><span class="o">.</span><span class="na">createInput</span><span class="o">(</span><span class="n">inputFormat</span><span class="o">,</span> <span class="n">rowTypeInfo</span><span class="o">);</span>
+</code></pre></div></div>
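+
+<p>For illustration, a minimal sketch of that manual setup, assuming a table named <code class="language-plaintext highlighter-rouge">books</code> and assuming that <code class="language-plaintext highlighter-rouge">KuduReaderConfig.Builder</code> mirrors the writer-side builder shown later on this page:</p>
+
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>// Sketch only: "books" and "master:port" are placeholder values,
+// and the reader builder is assumed to mirror KuduWriterConfig.Builder.
+KuduTableInfo tableInfo = KuduTableInfo.forTable("books");
+KuduReaderConfig readerConfig = KuduReaderConfig.Builder
+    .setMasters("master:port")
+    .build();
+</code></pre></div></div>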
+
+<p>At the end of the day, the <code class="language-plaintext highlighter-rouge">KuduTableSource</code> is just a convenient wrapper around the <code class="language-plaintext highlighter-rouge">KuduRowInputFormat</code>.</p>
+
+<h3 id="kudu-sink">Kudu Sink</h3>
+<p>The connector provides a <code class="language-plaintext highlighter-rouge">KuduSink</code> class that can be used to consume DataStreams
+and write the results into a Kudu table.</p>
+
+<p>The constructor takes three or four arguments:</p>
+<ul>
+  <li><code class="language-plaintext highlighter-rouge">KuduWriterConfig</code> is used to specify the Kudu masters and the flush mode.</li>
+  <li><code class="language-plaintext highlighter-rouge">KuduTableInfo</code> identifies the table to be written to.</li>
+  <li><code class="language-plaintext highlighter-rouge">KuduOperationMapper</code> maps the records coming from the DataStream to a list of Kudu operations.</li>
+  <li><code class="language-plaintext highlighter-rouge">KuduFailureHandler</code> (optional): provides your own logic for handling write failures.</li>
+</ul>
+
+<p>The example below shows the creation of a sink for Row type records with 3 fields, upserting each record.
+It assumes that a Kudu table called <code class="language-plaintext highlighter-rouge">AlreadyExistingTable</code> with columns <code class="language-plaintext highlighter-rouge">col1, col2, col3</code> exists. Note that if this were not the case,
+we could pass a <code class="language-plaintext highlighter-rouge">KuduTableInfo</code> as described in the <a href="#creating-a-table">Catalog - Creating a table</a> section,
+and the sink would create the table with the provided configuration.</p>
+
+<div class="language-java highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="nc">KuduWriterConfig</span> <span class="n">writerConfig</span> <span class="o">=</span> <span class="nc">KuduWriterConfig</span><span class="o">.</span><span class="na">Builder</span><span class="o">.</span><span class="na">setMasters</span><span class="o">(</span><span class="no">KUDU_MASTERS</span><span class="o">).</span><span class="na">build</span><span class="o">();</span>
+
+<span class="nc">KuduSink</span><span class="o">&lt;</span><span class="nc">Row</span><span class="o">&gt;</span> <span class="n">sink</span> <span class="o">=</span> <span class="k">new</span> <span class="nc">KuduSink</span><span class="o">&lt;&gt;(</span>
+    <span class="n">writerConfig</span><span class="o">,</span>
+    <span class="nc">KuduTableInfo</span><span class="o">.</span><span class="na">forTable</span><span class="o">(</span><span class="s">"AlreadyExistingTable"</span><span class="o">),</span>
+    <span class="k">new</span> <span class="nc">RowOperationMapper</span><span class="o">&lt;&gt;(</span>
+            <span class="k">new</span> <span class="nc">String</span><span class="o">[]{</span><span class="s">"col1"</span><span class="o">,</span> <span class="s">"col2"</span><span class="o">,</span> <span class="s">"col3"</span><span class="o">},</span>
+            <span class="nc">AbstractSingleOperationMapper</span><span class="o">.</span><span class="na">KuduOperation</span><span class="o">.</span><span class="na">UPSERT</span><span class="o">)</span>
+<span class="o">)</span>
+</code></pre></div></div>
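+
+<p>A minimal sketch of wiring the sink into a job (standard DataStream API calls; <code class="language-plaintext highlighter-rouge">rows</code> is assumed to be an existing <code class="language-plaintext highlighter-rouge">DataStream&lt;Row&gt;</code> and <code class="language-plaintext highlighter-rouge">env</code> the execution environment):</p>
+
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>// Attach the sink defined above and run the pipeline.
+rows.addSink(sink);
+env.execute();
+</code></pre></div></div>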
+
+<h4 id="kuduoperationmapper">KuduOperationMapper</h4>
+
+<p>This section describes the Operation mapping logic in more detail.</p>
+
+<p>The connector supports insert, upsert, update, and delete operations.
+The operation to be performed can vary dynamically based on the record.
+To allow for more flexibility, it is also possible for one record to trigger
+0, 1, or more operations.
+For the highest level of control, implement the <code class="language-plaintext highlighter-rouge">KuduOperationMapper</code> interface.</p>
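+
+<p>As a sketch, here is a custom mapper that deletes or upserts depending on the record. This is illustrative only: it assumes the interface exposes a single <code class="language-plaintext highlighter-rouge">createOperations(input, table)</code> method that returns the operations for one record.</p>
+
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>// Sketch only: the method name and signature are assumptions, not taken from the text above.
+public class DeleteOrUpsertMapper implements KuduOperationMapper&lt;Tuple2&lt;Boolean, String&gt;&gt; {
+    @Override
+    public List&lt;Operation&gt; createOperations(Tuple2&lt;Boolean, String&gt; input, KuduTable table) {
+        // Field f0 decides the operation, field f1 carries the key column value.
+        Operation op = input.f0 ? table.newDelete() : table.newUpsert();
+        op.getRow().addString("col1", input.f1);
+        return Collections.singletonList(op);
+    }
+}
+</code></pre></div></div>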
+
+<p>If one record from the DataStream corresponds to one table operation,
+extend the <code class="language-plaintext highlighter-rouge">AbstractSingleOperationMapper</code> class. An array of column
+names must be provided. This must match the Kudu table’s schema.</p>
+
+<p>The <code class="language-plaintext highlighter-rouge">getField</code> method must be overridden, which extracts the value for the table column whose name is
+at the <code class="language-plaintext highlighter-rouge">i</code>th place in the <code class="language-plaintext highlighter-rouge">columnNames</code> array.
+If the operation is one of (<code class="language-plaintext highlighter-rouge">CREATE, UPSERT, UPDATE, DELETE</code>)
+and doesn’t depend on the input record (constant during the life of the sink), it can be set in the constructor
+of <code class="language-plaintext highlighter-rouge">AbstractSingleOperationMapper</code>.
+It is also possible to implement your own logic by overriding the
+<code class="language-plaintext highlighter-rouge">createBaseOperation</code> method that returns a Kudu <a href="https://kudu.apache.org/apidocs/org/apache/kudu/client/Operation.html">Operation</a>.</p>
+
+<p>There are pre-defined operation mappers for POJO, Flink Row, and Flink Tuple types for constant-operation, one-to-one sinks.</p>
+<ul>
+  <li><code class="language-plaintext highlighter-rouge">PojoOperationMapper</code>: Each table column must correspond to a POJO field
+with the same name. The  <code class="language-plaintext highlighter-rouge">columnNames</code> array should contain those fields of the POJO that
+are present as table columns (the POJO fields can be a superset of table columns).</li>
+  <li><code class="language-plaintext highlighter-rouge">RowOperationMapper</code> and <code class="language-plaintext highlighter-rouge">TupleOperationMapper</code>: the mapping is based on position. The
+<code class="language-plaintext highlighter-rouge">i</code>th field of the Row/Tuple corresponds to the column of the table at the <code class="language-plaintext highlighter-rouge">i</code>th
+position in the <code class="language-plaintext highlighter-rouge">columnNames</code> array.</li>
+</ul>
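+
+<p>For example, a positional mapper for two-field tuples might look like this (assuming <code class="language-plaintext highlighter-rouge">TupleOperationMapper</code> mirrors the <code class="language-plaintext highlighter-rouge">RowOperationMapper</code> constructor used in the sink example above):</p>
+
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>// Tuple field 0 is written to column "col1", field 1 to column "col2".
+TupleOperationMapper&lt;Tuple2&lt;Integer, String&gt;&gt; mapper =
+    new TupleOperationMapper&lt;&gt;(
+        new String[]{"col1", "col2"},
+        AbstractSingleOperationMapper.KuduOperation.UPSERT);
+</code></pre></div></div>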
 
-<p>env.setParallelism(PARALLELISM);</p>
+<h2 id="building-the-connector">Building the connector</h2>
 
-<p>// create a table info object
-KuduTableInfo tableInfo = KuduTableInfo.Builder
-        .create(“books”)
-        .createIfNotExist(true)
-        .replicas(1)
-        .addColumn(KuduColumnInfo.Builder.create(“id”, Type.INT32).key(true).hashKey(true).build())
-        .addColumn(KuduColumnInfo.Builder.create(“title”, Type.STRING).build())
-        .addColumn(KuduColumnInfo.Builder.create(“author”, Type.STRING).build())
-        .addColumn(KuduColumnInfo.Builder.create(“price”, Type.DOUBLE).build())
-        .addColumn(KuduColumnInfo.Builder.create(“quantity”, Type.INT32).build())
-        .build();</p>
+<p>The connector can be built with Maven:</p>
 
-<p>…</p>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>cd bahir-flink
+mvn clean install
+</code></pre></div></div>
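+
+<p>To build without running the tests, the standard Maven flag applies:</p>
+
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>mvn clean install -DskipTests
+</code></pre></div></div>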
 
-<p>env.fromCollection(books)
-    .addSink(new KuduSink&lt;&gt;(“172.25.0.6”, tableInfo));</p>
+<h3 id="running-the-tests">Running the tests</h3>
+
+<p>The integration tests rely on the Kudu test harness, which requires the current user to be able to ssh to localhost.</p>
 
-<p>env.execute();
-```</p>
+<p>This might not work out of the box on some operating systems (such as Mac OS X).
+To solve this problem, go to <em>System Preferences/Sharing</em> and enable Remote Login for your user.</p>
 
   </div>
 </div>
diff --git a/content/docs/flink/current/flink-streaming-netty/index.html b/content/docs/flink/current/flink-streaming-netty/index.html
index 11e280f..3091815 100644
--- a/content/docs/flink/current/flink-streaming-netty/index.html
+++ b/content/docs/flink/current/flink-streaming-netty/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -222,8 +204,7 @@ See how to link with them for cluster execution <a href="https://ci.apache.org/p
 
 <h2 id="data-flow">Data Flow</h2>
 
-<p><code>
-+-------------+      (2)    +------------------------+
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>+-------------+      (2)    +------------------------+
 | user system |    &lt;-----   | Third Register Service |           
 +-------------+             +------------------------+
        |                                ^
@@ -233,47 +214,48 @@ See how to link with them for cluster execution <a href="https://ci.apache.org/p
 +--------------------+                  |
 | Flink Netty Source |  ----------------+
 +--------------------+         (1)
-</code></p>
+</code></pre></div></div>
 
 <p>There are three components:</p>
 
 <ul>
   <li>User System - where the data stream is coming from</li>
-  <li>Third Register Service - receive <code>Flink Netty Source</code>’s register request (ip and port)</li>
-  <li>Flink Netty Source - Netty Server for receiving pushed streaming data from <code>User System</code></li>
+  <li>Third Register Service - receives <code class="language-plaintext highlighter-rouge">Flink Netty Source</code>’s registration request (IP and port)</li>
+  <li>Flink Netty Source - a Netty server that receives streaming data pushed from the <code class="language-plaintext highlighter-rouge">User System</code></li>
 </ul>
 
 <h2 id="maven-dependency">Maven Dependency</h2>
 <p>To use this connector, add the following dependency to your project:</p>
 
-<p>```</p>
-<dependency>
-  <groupid>org.apache.bahir</groupid>
-  <artifactid>flink-connector-netty_2.11</artifactid>
-  <version>1.1-SNAPSHOT</version>
-</dependency>
-<p>```</p>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
+  &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
+  &lt;artifactId&gt;flink-connector-netty_2.11&lt;/artifactId&gt;
+  &lt;version&gt;1.1-SNAPSHOT&lt;/version&gt;
+&lt;/dependency&gt;
+</code></pre></div></div>
 
 <h2 id="usage">Usage</h2>
 
 <p><em>Tcp Source:</em></p>
 
-<p><code>
-val env = StreamExecutionEnvironment.getExecutionEnvironment
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>val env = StreamExecutionEnvironment.getExecutionEnvironment
 env.addSource(new TcpReceiverSource("msg", 7070, Some("http://localhost:9090/cb")))
-</code>
-&gt;paramKey:  the http query param key
-&gt;tryPort:   try to use this point, if this point is used then try a new port
-&gt;callbackUrl:   register connector’s ip and port to a <code>Third Register Service</code></p>
+</code></pre></div></div>
+<blockquote>
+  <p>tryPort:   try to use this port, if this port is in use then try a new port
+callbackUrl:   register the connector’s ip and port to a <code class="language-plaintext highlighter-rouge">Third Register Service</code></p>
+</blockquote>
 
 <p><em>Http Source:</em></p>
 
-<p><code>
-val env = StreamExecutionEnvironment.getExecutionEnvironment
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>val env = StreamExecutionEnvironment.getExecutionEnvironment
 env.addSource(new HttpReceiverSource("msg", 7070, Some("http://localhost:9090/cb")))
-</code>
-&gt;tryPort:   try to use this port, if this point is used then try a new port
-&gt;callbackUrl:   register connector’s ip and port to a <code>Third Register Service</code></p>
+</code></pre></div></div>
+<blockquote>
+  <p>paramKey:  the http query param key
+tryPort:   try to use this port, if this port is in use then try a new port
+callbackUrl:   register the connector’s ip and port to a <code class="language-plaintext highlighter-rouge">Third Register Service</code></p>
+</blockquote>
 
 <h2 id="full-example">Full Example</h2>
 
diff --git a/content/docs/flink/current/flink-streaming-redis/index.html b/content/docs/flink/current/flink-streaming-redis/index.html
index df76a63..1a9feb9 100644
--- a/content/docs/flink/current/flink-streaming-redis/index.html
+++ b/content/docs/flink/current/flink-streaming-redis/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -219,12 +201,12 @@
 to <a href="http://redis.io/topics/pubsub">Redis PubSub</a>. To use this connector, add the
 following dependency to your project:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
   &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
   &lt;artifactId&gt;flink-connector-redis_2.11&lt;/artifactId&gt;
   &lt;version&gt;1.1-SNAPSHOT&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
 <p><em>Version Compatibility</em>: This module is compatible with Redis 2.8.5.</p>
 
@@ -250,7 +232,7 @@ The sink can use three different methods for communicating with different type o
 
 <p><strong>Java:</strong></p>
 
-<pre><code>public static class RedisExampleMapper implements RedisMapper&lt;Tuple2&lt;String, String&gt;&gt;{
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>public static class RedisExampleMapper implements RedisMapper&lt;Tuple2&lt;String, String&gt;&gt;{
 
     @Override
     public RedisCommandDescription getCommandDescription() {
@@ -271,11 +253,11 @@ FlinkJedisPoolConfig conf = new FlinkJedisPoolConfig.Builder().setHost("127.0.0.
 
 DataStream&lt;Tuple2&lt;String, String&gt;&gt; stream = ...;
 stream.addSink(new RedisSink&lt;Tuple2&lt;String, String&gt;&gt;(conf, new RedisExampleMapper()));
-</code></pre>
+</code></pre></div></div>
 
 <p><strong>Scala:</strong></p>
 
-<pre><code>class RedisExampleMapper extends RedisMapper[(String, String)]{
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>class RedisExampleMapper extends RedisMapper[(String, String)]{
   override def getCommandDescription: RedisCommandDescription = {
     new RedisCommandDescription(RedisCommand.HSET, "HASH_NAME")
   }
@@ -286,41 +268,41 @@ stream.addSink(new RedisSink&lt;Tuple2&lt;String, String&gt;&gt;(conf, new Redis
 }
 val conf = new FlinkJedisPoolConfig.Builder().setHost("127.0.0.1").build()
 stream.addSink(new RedisSink[(String, String)](conf, new RedisExampleMapper))
-</code></pre>
+</code></pre></div></div>
 
 <p>This example code does the same, but for Redis Cluster:</p>
 
 <p><strong>Java:</strong></p>
 
-<pre><code>FlinkJedisPoolConfig conf = new FlinkJedisPoolConfig.Builder()
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>FlinkJedisPoolConfig conf = new FlinkJedisClusterConfig.Builder()
     .setNodes(new HashSet&lt;InetSocketAddress&gt;(Arrays.asList(new InetSocketAddress(5601)))).build();
 
 DataStream&lt;Tuple2&lt;String, String&gt;&gt; stream = ...;
 stream.addSink(new RedisSink&lt;Tuple2&lt;String, String&gt;&gt;(conf, new RedisExampleMapper()));
-</code></pre>
+</code></pre></div></div>
 
 <p><strong>Scala:</strong></p>
 
-<pre><code>val conf = new FlinkJedisPoolConfig.Builder().setNodes(...).build()
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>val conf = new FlinkJedisClusterConfig.Builder().setNodes(...).build()
 stream.addSink(new RedisSink[(String, String)](conf, new RedisExampleMapper))
-</code></pre>
+</code></pre></div></div>
 
 <p>This example shows the configuration when the Redis environment uses Sentinels:</p>
 
 <p>Java:</p>
 
-<pre><code>FlinkJedisSentinelConfig conf = new FlinkJedisSentinelConfig.Builder()
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>FlinkJedisSentinelConfig conf = new FlinkJedisSentinelConfig.Builder()
     .setMasterName("master").setSentinels(...).build();
 
 DataStream&lt;Tuple2&lt;String, String&gt;&gt; stream = ...;
 stream.addSink(new RedisSink&lt;Tuple2&lt;String, String&gt;&gt;(conf, new RedisExampleMapper()));
-</code></pre>
+</code></pre></div></div>
 
 <p>Scala:</p>
 
-<pre><code>val conf = new FlinkJedisSentinelConfig.Builder().setMasterName("master").setSentinels(...).build()
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>val conf = new FlinkJedisSentinelConfig.Builder().setMasterName("master").setSentinels(...).build()
 stream.addSink(new RedisSink[(String, String)](conf, new RedisExampleMapper))
-</code></pre>
+</code></pre></div></div>
 
 <p>This section describes the available data types and the Redis command used for each.</p>
 
@@ -342,7 +324,7 @@ stream.addSink(new RedisSink[(String, String)](conf, new RedisExampleMapper))
             </td>
         </tr>
         <tr>
-            <td>SET</td><td><a href="http://redis.io/commands/rpush">SADD</a></td>
+            <td>SET</td><td><a href="http://redis.io/commands/sadd">SADD</a></td>
         </tr>
         <tr>
             <td>PUBSUB</td><td><a href="http://redis.io/commands/publish">PUBLISH</a></td>
diff --git a/content/docs/flink/overview/index.html b/content/docs/flink/overview/index.html
index ab83496..a1ff0c1 100644
--- a/content/docs/flink/overview/index.html
+++ b/content/docs/flink/overview/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
diff --git a/content/docs/spark/2.0.0/documentation/index.html b/content/docs/spark/2.0.0/documentation/index.html
index bfd7399..74a3c8c 100644
--- a/content/docs/spark/2.0.0/documentation/index.html
+++ b/content/docs/spark/2.0.0/documentation/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
diff --git a/content/docs/spark/2.0.0/spark-sql-streaming-mqtt/index.html b/content/docs/spark/2.0.0/spark-sql-streaming-mqtt/index.html
index 81ed176..73acc0f 100644
--- a/content/docs/spark/2.0.0/spark-sql-streaming-mqtt/index.html
+++ b/content/docs/spark/2.0.0/spark-sql-streaming-mqtt/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -219,26 +201,26 @@
 
 <p>Using SBT:</p>
 
-<pre><code>libraryDependencies += "org.apache.bahir" %% "spark-sql-streaming-mqtt" % "2.0.0"
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>libraryDependencies += "org.apache.bahir" %% "spark-sql-streaming-mqtt" % "2.0.0"
+</code></pre></div></div>
 
 <p>Using Maven:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
     &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
     &lt;artifactId&gt;spark-sql-streaming-mqtt_2.11&lt;/artifactId&gt;
     &lt;version&gt;2.0.0&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
-<p>This library can also be added to Spark jobs launched through <code>spark-shell</code> or <code>spark-submit</code> by using the <code>--packages</code> command line option.
+<p>This library can also be added to Spark jobs launched through <code class="language-plaintext highlighter-rouge">spark-shell</code> or <code class="language-plaintext highlighter-rouge">spark-submit</code> by using the <code class="language-plaintext highlighter-rouge">--packages</code> command line option.
 For example, to include it when starting the spark shell:</p>
 
-<pre><code>$ bin/spark-shell --packages org.apache.bahir:spark-sql-streaming-mqtt_2.11:2.0.0
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>$ bin/spark-shell --packages org.apache.bahir:spark-sql-streaming-mqtt_2.11:2.0.0
+</code></pre></div></div>
 
-<p>Unlike using <code>--jars</code>, using <code>--packages</code> ensures that this library and its dependencies will be added to the classpath.
-The <code>--packages</code> argument can also be used with <code>bin/spark-submit</code>.</p>
+<p>Unlike using <code class="language-plaintext highlighter-rouge">--jars</code>, using <code class="language-plaintext highlighter-rouge">--packages</code> ensures that this library and its dependencies will be added to the classpath.
+The <code class="language-plaintext highlighter-rouge">--packages</code> argument can also be used with <code class="language-plaintext highlighter-rouge">bin/spark-submit</code>.</p>
 
 <p>This library is compiled for Scala 2.11 only, and is intended to support Spark 2.0 onwards.</p>
 
@@ -246,29 +228,29 @@ The <code>--packages</code> argument can also be used with <code>bin/spark-submi
 
 <p>A SQL stream can be created from data received through an MQTT server using:</p>
 
-<pre><code>sqlContext.readStream
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>sqlContext.readStream
     .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
     .option("topic", "mytopic")
     .load("tcp://localhost:1883")
-</code></pre>
+</code></pre></div></div>
 
 <h2 id="enable-recovering-from-failures">Enable recovering from failures.</h2>
 
-<p>Setting values for option <code>localStorage</code> and <code>clientId</code> helps in recovering in case of a restart, by restoring the state where it left off before the shutdown.</p>
+<p>Setting the options <code class="language-plaintext highlighter-rouge">localStorage</code> and <code class="language-plaintext highlighter-rouge">clientId</code> enables recovery after a restart by restoring the state where it left off before the shutdown.</p>
 
-<pre><code>sqlContext.readStream
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>sqlContext.readStream
     .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
     .option("topic", "mytopic")
     .option("localStorage", "/path/to/localdir")
     .option("clientId", "some-client-id")
     .load("tcp://localhost:1883")
-</code></pre>
+</code></pre></div></div>
 
 <h3 id="scala-api">Scala API</h3>
 
 <p>An example using the Scala API to count words from an incoming message stream:</p>
 
-<pre><code>// Create DataFrame representing the stream of input lines from connection to mqtt server
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>// Create DataFrame representing the stream of input lines from connection to mqtt server
 val lines = spark.readStream
   .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
   .option("topic", topic)
@@ -287,15 +269,15 @@ val query = wordCounts.writeStream
   .start()
 
 query.awaitTermination()
-</code></pre>
+</code></pre></div></div>
 
-<p>Please see <code>MQTTStreamWordCount.scala</code> for full example.</p>
+<p>Please see <code class="language-plaintext highlighter-rouge">MQTTStreamWordCount.scala</code> for full example.</p>
 
 <h3 id="java-api">Java API</h3>
 
 <p>An example using the Java API to count words from an incoming message stream:</p>
 
-<pre><code>// Create DataFrame representing the stream of input lines from connection to mqtt server.
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>// Create DataFrame representing the stream of input lines from connection to mqtt server.
 Dataset&lt;String&gt; lines = spark
         .readStream()
         .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
@@ -320,9 +302,9 @@ StreamingQuery query = wordCounts.writeStream()
         .start();
 
 query.awaitTermination();
-</code></pre>
+</code></pre></div></div>
 
-<p>Please see <code>JavaMQTTStreamWordCount.java</code> for full example.</p>
+<p>Please see <code class="language-plaintext highlighter-rouge">JavaMQTTStreamWordCount.java</code> for full example.</p>
 
 
   </div>
diff --git a/content/docs/spark/2.0.0/spark-streaming-akka/index.html b/content/docs/spark/2.0.0/spark-streaming-akka/index.html
index 60419ce..e31f4f5 100644
--- a/content/docs/spark/2.0.0/spark-streaming-akka/index.html
+++ b/content/docs/spark/2.0.0/spark-streaming-akka/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -219,39 +201,39 @@
 
 <p>Using SBT:</p>
 
-<pre><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-akka" % "2.0.0"
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-akka" % "2.0.0"
+</code></pre></div></div>
 
 <p>Using Maven:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
     &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
     &lt;artifactId&gt;spark-streaming-akka_2.11&lt;/artifactId&gt;
     &lt;version&gt;2.0.0&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
-<p>This library can also be added to Spark jobs launched through <code>spark-shell</code> or <code>spark-submit</code> by using the <code>--packages</code> command line option.
+<p>This library can also be added to Spark jobs launched through <code class="language-plaintext highlighter-rouge">spark-shell</code> or <code class="language-plaintext highlighter-rouge">spark-submit</code> by using the <code class="language-plaintext highlighter-rouge">--packages</code> command line option.
 For example, to include it when starting the spark shell:</p>
 
-<pre><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-akka_2.11:2.0.0
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-akka_2.11:2.0.0
+</code></pre></div></div>
 
-<p>Unlike using <code>--jars</code>, using <code>--packages</code> ensures that this library and its dependencies will be added to the classpath.
-The <code>--packages</code> argument can also be used with <code>bin/spark-submit</code>.</p>
+<p>Unlike using <code class="language-plaintext highlighter-rouge">--jars</code>, using <code class="language-plaintext highlighter-rouge">--packages</code> ensures that this library and its dependencies will be added to the classpath.
+The <code class="language-plaintext highlighter-rouge">--packages</code> argument can also be used with <code class="language-plaintext highlighter-rouge">bin/spark-submit</code>.</p>
 
 <p>This library is cross-published for Scala 2.10 and Scala 2.11, so users should use the proper Scala version (2.10 or 2.11) in the commands listed above.</p>
 
 <h2 id="examples">Examples</h2>
 
-<p>DStreams can be created with data streams received through Akka actors by using <code>AkkaUtils.createStream(ssc, actorProps, actor-name)</code>.</p>
+<p>DStreams can be created from data streams received through Akka actors by using <code class="language-plaintext highlighter-rouge">AkkaUtils.createStream(ssc, actorProps, actor-name)</code>.</p>
 
 <h3 id="scala-api">Scala API</h3>
 
-<p>You need to extend <code>ActorReceiver</code> so as to store received data into Spark using <code>store(...)</code> methods. The supervisor strategy of
+<p>You need to extend <code class="language-plaintext highlighter-rouge">ActorReceiver</code> so as to store received data into Spark using <code class="language-plaintext highlighter-rouge">store(...)</code> methods. The supervisor strategy of
 this actor can be configured to handle failures, etc.</p>
 
-<pre><code>class CustomActor extends ActorReceiver {
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>class CustomActor extends ActorReceiver {
   def receive = {
     case data: String =&gt; store(data)
   }
@@ -260,14 +242,14 @@ this actor can be configured to handle failures, etc.</p>
 // A new input stream can be created with this custom actor as
 val ssc: StreamingContext = ...
 val lines = AkkaUtils.createStream[String](ssc, Props[CustomActor](), "CustomReceiver")
-</code></pre>
+</code></pre></div></div>
 
 <h3 id="java-api">Java API</h3>
 
-<p>You need to extend <code>JavaActorReceiver</code> so as to store received data into Spark using <code>store(...)</code> methods. The supervisor strategy of
+<p>You need to extend <code class="language-plaintext highlighter-rouge">JavaActorReceiver</code> so as to store received data into Spark using <code class="language-plaintext highlighter-rouge">store(...)</code> methods. The supervisor strategy of
 this actor can be configured to handle failures, etc.</p>
 
-<pre><code>class CustomActor extends JavaActorReceiver {
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>class CustomActor extends JavaActorReceiver {
     @Override
     public void onReceive(Object msg) throws Exception {
         store((String) msg);
@@ -277,7 +259,7 @@ this actor can be configured to handle failures, etc.</p>
 // A new input stream can be created with this custom actor as
 JavaStreamingContext jssc = ...;
 JavaDStream&lt;String&gt; lines = AkkaUtils.&lt;String&gt;createStream(jssc, Props.create(CustomActor.class), "CustomReceiver");
-</code></pre>
+</code></pre></div></div>
 
 <p>See end-to-end examples at <a href="https://github.com/apache/bahir/tree/master/streaming-akka/examples">Akka Examples</a></p>
 
diff --git a/content/docs/spark/2.0.0/spark-streaming-mqtt/index.html b/content/docs/spark/2.0.0/spark-streaming-mqtt/index.html
index c90a519..c88e6d7 100644
--- a/content/docs/spark/2.0.0/spark-streaming-mqtt/index.html
+++ b/content/docs/spark/2.0.0/spark-streaming-mqtt/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -219,26 +201,26 @@
 
 <p>Using SBT:</p>
 
-<pre><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-mqtt" % "2.0.0"
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-mqtt" % "2.0.0"
+</code></pre></div></div>
 
 <p>Using Maven:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
     &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
     &lt;artifactId&gt;spark-streaming-mqtt_2.11&lt;/artifactId&gt;
     &lt;version&gt;2.0.0&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
-<p>This library can also be added to Spark jobs launched through <code>spark-shell</code> or <code>spark-submit</code> by using the <code>--packages</code> command line option.
+<p>This library can also be added to Spark jobs launched through <code class="language-plaintext highlighter-rouge">spark-shell</code> or <code class="language-plaintext highlighter-rouge">spark-submit</code> by using the <code class="language-plaintext highlighter-rouge">--packages</code> command line option.
 For example, to include it when starting the spark shell:</p>
 
-<pre><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-mqtt_2.11:2.0.0
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-mqtt_2.11:2.0.0
+</code></pre></div></div>
 
-<p>Unlike using <code>--jars</code>, using <code>--packages</code> ensures that this library and its dependencies will be added to the classpath.
-The <code>--packages</code> argument can also be used with <code>bin/spark-submit</code>.</p>
+<p>Unlike using <code class="language-plaintext highlighter-rouge">--jars</code>, using <code class="language-plaintext highlighter-rouge">--packages</code> ensures that this library and its dependencies will be added to the classpath.
+The <code class="language-plaintext highlighter-rouge">--packages</code> argument can also be used with <code class="language-plaintext highlighter-rouge">bin/spark-submit</code>.</p>
 
 <p>This library is cross-published for Scala 2.10 and Scala 2.11, so users should substitute the appropriate Scala version (2.10 or 2.11) in the commands listed above.</p>
 
@@ -246,19 +228,19 @@ The <code>--packages</code> argument can also be used with <code>bin/spark-submi
 
 <h3 id="scala-api">Scala API</h3>
 
-<p>Create an input DStream that receives messages from an MQTT broker using <code>MQTTUtils.createStream(...)</code>; each element of the
+<p>Create an input DStream that receives messages from an MQTT broker using <code class="language-plaintext highlighter-rouge">MQTTUtils.createStream(...)</code>; each element of the
 returned DStream is the payload of one MQTT message.</p>
 
-<pre><code>val lines = MQTTUtils.createStream(ssc, brokerUrl, topic)
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>val lines = MQTTUtils.createStream(ssc, brokerUrl, topic)
+</code></pre></div></div>
 
 <h3 id="java-api">Java API</h3>
 
-<p>Create an input JavaDStream that receives messages from an MQTT broker using <code>MQTTUtils.createStream(...)</code>; each element of the
+<p>Create an input JavaDStream that receives messages from an MQTT broker using <code class="language-plaintext highlighter-rouge">MQTTUtils.createStream(...)</code>; each element of the
 returned stream is the payload of one MQTT message.</p>
 
-<pre><code>JavaDStream&lt;String&gt; lines = MQTTUtils.createStream(jssc, brokerUrl, topic);
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>JavaDStream&lt;String&gt; lines = MQTTUtils.createStream(jssc, brokerUrl, topic);
+</code></pre></div></div>
 
 <p>See end-to-end examples at <a href="https://github.com/apache/bahir/tree/master/streaming-mqtt/examples">MQTT Examples</a></p>
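 
 <p>For quick reference, a minimal, self-contained word count over the MQTT stream might look as follows. This is a sketch, not the bundled example: the broker URL, topic, and batch interval are illustrative placeholders.</p>
 
 <div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import org.apache.spark.SparkConf
 import org.apache.spark.streaming.{Seconds, StreamingContext}
 import org.apache.spark.streaming.mqtt.MQTTUtils
 
 val sparkConf = new SparkConf().setAppName("MQTTWordCount")
 val ssc = new StreamingContext(sparkConf, Seconds(2))
 
 // Each element of the DStream is the payload of one MQTT message.
 val lines = MQTTUtils.createStream(ssc, "tcp://localhost:1883", "mytopic")
 val wordCounts = lines.flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _)
 wordCounts.print()
 
 ssc.start()
 ssc.awaitTermination()
 </code></pre></div></div>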
 
diff --git a/content/docs/spark/2.0.0/spark-streaming-twitter/index.html b/content/docs/spark/2.0.0/spark-streaming-twitter/index.html
index cd79e59..f039889 100644
--- a/content/docs/spark/2.0.0/spark-streaming-twitter/index.html
+++ b/content/docs/spark/2.0.0/spark-streaming-twitter/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -219,47 +201,47 @@
 
 <p>Using SBT:</p>
 
-<pre><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-twitter" % "2.0.0"
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-twitter" % "2.0.0"
+</code></pre></div></div>
 
 <p>Using Maven:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
     &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
     &lt;artifactId&gt;spark-streaming-twitter_2.11&lt;/artifactId&gt;
     &lt;version&gt;2.0.0&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
-<p>This library can also be added to Spark jobs launched through <code>spark-shell</code> or <code>spark-submit</code> by using the <code>--packages</code> command line option.
+<p>This library can also be added to Spark jobs launched through <code class="language-plaintext highlighter-rouge">spark-shell</code> or <code class="language-plaintext highlighter-rouge">spark-submit</code> by using the <code class="language-plaintext highlighter-rouge">--packages</code> command line option.
 For example, to include it when starting the spark shell:</p>
 
-<pre><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-twitter_2.11:2.0.0
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-twitter_2.11:2.0.0
+</code></pre></div></div>
 
-<p>Unlike using <code>--jars</code>, using <code>--packages</code> ensures that this library and its dependencies will be added to the classpath.
-The <code>--packages</code> argument can also be used with <code>bin/spark-submit</code>.</p>
+<p>Unlike using <code class="language-plaintext highlighter-rouge">--jars</code>, using <code class="language-plaintext highlighter-rouge">--packages</code> ensures that this library and its dependencies will be added to the classpath.
+The <code class="language-plaintext highlighter-rouge">--packages</code> argument can also be used with <code class="language-plaintext highlighter-rouge">bin/spark-submit</code>.</p>
 
 <p>This library is cross-published for Scala 2.10 and Scala 2.11, so users should substitute the appropriate Scala version (2.10 or 2.11) in the commands listed above.</p>
 
 <h2 id="examples">Examples</h2>
 
-<p><code>TwitterUtils</code> uses Twitter4j to get the public stream of tweets using <a href="https://dev.twitter.com/docs/streaming-apis">Twitter’s Streaming API</a>. Authentication information
-can be provided by any of the <a href="http://twitter4j.org/en/configuration.html">methods</a> supported by the Twitter4J library. You can import the <code>TwitterUtils</code> class and create a DStream with <code>TwitterUtils.createStream</code> as shown below.</p>
+<p><code class="language-plaintext highlighter-rouge">TwitterUtils</code> uses Twitter4j to get the public stream of tweets using <a href="https://dev.twitter.com/docs/streaming-apis">Twitter’s Streaming API</a>. Authentication information
+can be provided by any of the <a href="http://twitter4j.org/en/configuration.html">methods</a> supported by the Twitter4J library. You can import the <code class="language-plaintext highlighter-rouge">TwitterUtils</code> class and create a DStream with <code class="language-plaintext highlighter-rouge">TwitterUtils.createStream</code> as shown below.</p>
 
 <h3 id="scala-api">Scala API</h3>
 
-<pre><code>import org.apache.spark.streaming.twitter._
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import org.apache.spark.streaming.twitter._
 
 TwitterUtils.createStream(ssc, None)
-</code></pre>
+</code></pre></div></div>
 
 <h3 id="java-api">Java API</h3>
 
-<pre><code>import org.apache.spark.streaming.twitter.*;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import org.apache.spark.streaming.twitter.*;
 
 TwitterUtils.createStream(jssc);
-</code></pre>
+</code></pre></div></div>
 
 <p>You can either get the full public stream or a stream filtered by keywords, as in the sketch below.
 See end-to-end examples at <a href="https://github.com/apache/bahir/tree/master/streaming-twitter/examples">Twitter Examples</a>.</p>
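 
 <p>As a sketch of the keyword-filtered variant (the keywords are placeholders; authentication is assumed to come from one of the Twitter4J configuration methods above):</p>
 
 <div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import org.apache.spark.streaming.twitter._
 
 // Track only tweets mentioning these terms; passing None picks up
 // authentication from the Twitter4J configuration.
 val filters = Seq("spark", "bahir")
 val tweets = TwitterUtils.createStream(ssc, None, filters)
 tweets.map(status =&gt; status.getText).print()
 </code></pre></div></div>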
diff --git a/content/docs/spark/2.0.0/spark-streaming-zeromq/index.html b/content/docs/spark/2.0.0/spark-streaming-zeromq/index.html
index 400934e..1131cc5 100644
--- a/content/docs/spark/2.0.0/spark-streaming-zeromq/index.html
+++ b/content/docs/spark/2.0.0/spark-streaming-zeromq/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -219,26 +201,26 @@
 
 <p>Using SBT:</p>
 
-<pre><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-zeromq" % "2.0.0"
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-zeromq" % "2.0.0"
+</code></pre></div></div>
 
 <p>Using Maven:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
     &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
     &lt;artifactId&gt;spark-streaming-zeromq_2.11&lt;/artifactId&gt;
     &lt;version&gt;2.0.0&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
-<p>This library can also be added to Spark jobs launched through <code>spark-shell</code> or <code>spark-submit</code> by using the <code>--packages</code> command line option.
+<p>This library can also be added to Spark jobs launched through <code class="language-plaintext highlighter-rouge">spark-shell</code> or <code class="language-plaintext highlighter-rouge">spark-submit</code> by using the <code class="language-plaintext highlighter-rouge">--packages</code> command line option.
 For example, to include it when starting the spark shell:</p>
 
-<pre><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-zeromq_2.11:2.0.0
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-zeromq_2.11:2.0.0
+</code></pre></div></div>
 
-<p>Unlike using <code>--jars</code>, using <code>--packages</code> ensures that this library and its dependencies will be added to the classpath.
-The <code>--packages</code> argument can also be used with <code>bin/spark-submit</code>.</p>
+<p>Unlike using <code class="language-plaintext highlighter-rouge">--jars</code>, using <code class="language-plaintext highlighter-rouge">--packages</code> ensures that this library and its dependencies will be added to the classpath.
+The <code class="language-plaintext highlighter-rouge">--packages</code> argument can also be used with <code class="language-plaintext highlighter-rouge">bin/spark-submit</code>.</p>
 
 <p>This library is cross-published for Scala 2.10 and Scala 2.11, so users should substitute the appropriate Scala version (2.10 or 2.11) in the commands listed above.</p>
 
@@ -246,13 +228,13 @@ The <code>--packages</code> argument can also be used with <code>bin/spark-submi
 
 <h3 id="scala-api">Scala API</h3>
 
-<pre><code>val lines = ZeroMQUtils.createStream(ssc, ...)
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>val lines = ZeroMQUtils.createStream(ssc, ...)
+</code></pre></div></div>
 
 <h3 id="java-api">Java API</h3>
 
-<pre><code>JavaDStream&lt;String&gt; lines = ZeroMQUtils.createStream(jssc, ...);
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>JavaDStream&lt;String&gt; lines = ZeroMQUtils.createStream(jssc, ...);
+</code></pre></div></div>
 
 <p>See end-to-end examples at <a href="https://github.com/apache/bahir/tree/master/streaming-zeromq/examples">ZeroMQ Examples</a></p>
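 
 <p>The elided arguments above depend on the connector version, so treat the following as a rough sketch only and check the <code class="language-plaintext highlighter-rouge">ZeroMQUtils</code> scaladoc for the exact signature; the publisher URL, topic, and decoder here are illustrative assumptions:</p>
 
 <div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import akka.util.ByteString
 import akka.zeromq.Subscribe
 import org.apache.spark.streaming.zeromq.ZeroMQUtils
 
 // Decode each multi-part ZeroMQ message into UTF-8 strings.
 def bytesToStringIterator(bytes: Seq[ByteString]): Iterator[String] =
   bytes.map(_.utf8String).iterator
 
 val lines = ZeroMQUtils.createStream(
   ssc, "tcp://127.0.0.1:5553", Subscribe("mytopic"), bytesToStringIterator _)
 </code></pre></div></div>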
 
diff --git a/content/docs/spark/2.0.1/documentation/index.html b/content/docs/spark/2.0.1/documentation/index.html
index 2a6d7de..4e41a15 100644
--- a/content/docs/spark/2.0.1/documentation/index.html
+++ b/content/docs/spark/2.0.1/documentation/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
diff --git a/content/docs/spark/2.0.1/spark-sql-streaming-mqtt/index.html b/content/docs/spark/2.0.1/spark-sql-streaming-mqtt/index.html
index f67f5c5..59d77a3 100644
--- a/content/docs/spark/2.0.1/spark-sql-streaming-mqtt/index.html
+++ b/content/docs/spark/2.0.1/spark-sql-streaming-mqtt/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -219,26 +201,26 @@
 
 <p>Using SBT:</p>
 
-<pre><code>libraryDependencies += "org.apache.bahir" %% "spark-sql-streaming-mqtt" % "2.0.1"
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>libraryDependencies += "org.apache.bahir" %% "spark-sql-streaming-mqtt" % "2.0.1"
+</code></pre></div></div>
 
 <p>Using Maven:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
     &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
     &lt;artifactId&gt;spark-sql-streaming-mqtt_2.11&lt;/artifactId&gt;
     &lt;version&gt;2.0.1&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
-<p>This library can also be added to Spark jobs launched through <code>spark-shell</code> or <code>spark-submit</code> by using the <code>--packages</code> command line option.
+<p>This library can also be added to Spark jobs launched through <code class="language-plaintext highlighter-rouge">spark-shell</code> or <code class="language-plaintext highlighter-rouge">spark-submit</code> by using the <code class="language-plaintext highlighter-rouge">--packages</code> command line option.
 For example, to include it when starting the spark shell:</p>
 
-<pre><code>$ bin/spark-shell --packages org.apache.bahir:spark-sql-streaming-mqtt_2.11:2.0.1
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>$ bin/spark-shell --packages org.apache.bahir:spark-sql-streaming-mqtt_2.11:2.0.1
+</code></pre></div></div>
 
-<p>Unlike using <code>--jars</code>, using <code>--packages</code> ensures that this library and its dependencies will be added to the classpath.
-The <code>--packages</code> argument can also be used with <code>bin/spark-submit</code>.</p>
+<p>Unlike using <code class="language-plaintext highlighter-rouge">--jars</code>, using <code class="language-plaintext highlighter-rouge">--packages</code> ensures that this library and its dependencies will be added to the classpath.
+The <code class="language-plaintext highlighter-rouge">--packages</code> argument can also be used with <code class="language-plaintext highlighter-rouge">bin/spark-submit</code>.</p>
 
 <p>This library is compiled for Scala 2.11 only, and is intended to support Spark 2.0 onwards.</p>
 
@@ -246,47 +228,47 @@ The <code>--packages</code> argument can also be used with <code>bin/spark-submi
 
 <p>A SQL stream can be created from the data streams received from an MQTT server as follows:</p>
 
-<pre><code>sqlContext.readStream
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>sqlContext.readStream
     .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
     .option("topic", "mytopic")
     .load("tcp://localhost:1883")
-</code></pre>
+</code></pre></div></div>
 
 <h2 id="enable-recovering-from-failures">Enable recovering from failures.</h2>
 
-<p>Setting the options <code>localStorage</code> and <code>clientId</code> enables recovery after a restart, restoring the stream from the state where it left off before the shutdown.</p>
+<p>Setting the options <code class="language-plaintext highlighter-rouge">localStorage</code> and <code class="language-plaintext highlighter-rouge">clientId</code> enables recovery after a restart, restoring the stream from the state where it left off before the shutdown.</p>
 
-<pre><code>sqlContext.readStream
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>sqlContext.readStream
     .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
     .option("topic", "mytopic")
     .option("localStorage", "/path/to/localdir")
     .option("clientId", "some-client-id")
     .load("tcp://localhost:1883")
-</code></pre>
+</code></pre></div></div>
 
 <h2 id="configuration-options">Configuration options.</h2>
 
 <p>This source uses <a href="https://eclipse.org/paho/clients/java/">Eclipse Paho Java Client</a>. Client API documentation is located <a href="http://www.eclipse.org/paho/files/javadoc/index.html">here</a>. The options below can be combined; a sketch follows this list.</p>
 
 <ul>
-  <li><code>brokerUrl</code> The URL the MqttClient connects to. Set this or <code>path</code> to the URL of the MQTT server, e.g. tcp://localhost:1883.</li>
-  <li><code>persistence</code> By default incoming messages are persisted on disk. If <code>memory</code> is provided as the value for this option, recovery on restart is not supported.</li>
-  <li><code>topic</code> Topic the MqttClient subscribes to.</li>
-  <li><code>clientId</code> The client ID this client is associated with. Provide the same value to recover a stopped client.</li>
-  <li><code>QoS</code> The maximum quality of service to subscribe each topic at. Messages published at a lower quality of service will be received at the published QoS. Messages published at a higher quality of service will be received using the QoS specified on the subscribe.</li>
-  <li><code>username</code> Sets the user name to use for the connection to the MQTT server. Leave it unset if the server does not require it; setting it to an empty value will lead to errors.</li>
-  <li><code>password</code> Sets the password to use for the connection.</li>
-  <li><code>cleanSession</code> Setting it to true starts a clean session, removing all messages checkpointed by a previous run of this source. It is set to false by default.</li>
-  <li><code>connectionTimeout</code> Sets the connection timeout; a value of 0 is interpreted as waiting until the client connects. See <code>MqttConnectOptions.setConnectionTimeout</code> for more information.</li>
-  <li><code>keepAlive</code> Same as <code>MqttConnectOptions.setKeepAliveInterval</code>.</li>
-  <li><code>mqttVersion</code> Same as <code>MqttConnectOptions.setMqttVersion</code>.</li>
+  <li><code class="language-plaintext highlighter-rouge">brokerUrl</code> The URL the MqttClient connects to. Set this or <code class="language-plaintext highlighter-rouge">path</code> to the URL of the MQTT server, e.g. tcp://localhost:1883.</li>
+  <li><code class="language-plaintext highlighter-rouge">persistence</code> By default incoming messages are persisted on disk. If <code class="language-plaintext highlighter-rouge">memory</code> is provided as the value for this option, recovery on restart is not supported.</li>
+  <li><code class="language-plaintext highlighter-rouge">topic</code> Topic the MqttClient subscribes to.</li>
+  <li><code class="language-plaintext highlighter-rouge">clientId</code> The client ID this client is associated with. Provide the same value to recover a stopped client.</li>
+  <li><code class="language-plaintext highlighter-rouge">QoS</code> The maximum quality of service to subscribe each topic at. Messages published at a lower quality of service will be received at the published QoS. Messages published at a higher quality of service will be received using the QoS specified on the subscribe.</li>
+  <li><code class="language-plaintext highlighter-rouge">username</code> Sets the user name to use for the connection to the MQTT server. Leave it unset if the server does not require it; setting it to an empty value will lead to errors.</li>
+  <li><code class="language-plaintext highlighter-rouge">password</code> Sets the password to use for the connection.</li>
+  <li><code class="language-plaintext highlighter-rouge">cleanSession</code> Setting it to true starts a clean session, removing all messages checkpointed by a previous run of this source. It is set to false by default.</li>
+  <li><code class="language-plaintext highlighter-rouge">connectionTimeout</code> Sets the connection timeout; a value of 0 is interpreted as waiting until the client connects. See <code class="language-plaintext highlighter-rouge">MqttConnectOptions.setConnectionTimeout</code> for more information.</li>
+  <li><code class="language-plaintext highlighter-rouge">keepAlive</code> Same as <code class="language-plaintext highlighter-rouge">MqttConnectOptions.setKeepAliveInterval</code>.</li>
+  <li><code class="language-plaintext highlighter-rouge">mqttVersion</code> Same as <code class="language-plaintext highlighter-rouge">MqttConnectOptions.setMqttVersion</code>.</li>
 </ul>
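 
 <p>A sketch combining several of these options (the credentials and values shown are placeholders, not defaults):</p>
 
 <div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>sqlContext.readStream
     .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
     .option("topic", "mytopic")
     .option("username", "user")
     .option("password", "secret")
     .option("QoS", "1")
     .option("connectionTimeout", "30")
     .option("cleanSession", "false")
     .load("tcp://localhost:1883")
 </code></pre></div></div>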
 
 <h3 id="scala-api">Scala API</h3>
 
 <p>An example of using the Scala API to count words from an incoming message stream:</p>
 
-<pre><code>// Create DataFrame representing the stream of input lines from connection to mqtt server
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>// Create DataFrame representing the stream of input lines from connection to mqtt server
 val lines = spark.readStream
   .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
   .option("topic", topic)
@@ -305,15 +287,15 @@ val query = wordCounts.writeStream
   .start()
 
 query.awaitTermination()
-</code></pre>
+</code></pre></div></div>
 
-<p>Please see <code>MQTTStreamWordCount.scala</code> for the full example.</p>
+<p>Please see <code class="language-plaintext highlighter-rouge">MQTTStreamWordCount.scala</code> for the full example.</p>
 
 <h3 id="java-api">Java API</h3>
 
 <p>An example of using the Java API to count words from an incoming message stream:</p>
 
-<pre><code>// Create DataFrame representing the stream of input lines from connection to mqtt server.
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>// Create DataFrame representing the stream of input lines from connection to mqtt server.
 Dataset&lt;String&gt; lines = spark
         .readStream()
         .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
@@ -338,9 +320,9 @@ StreamingQuery query = wordCounts.writeStream()
         .start();
 
 query.awaitTermination();
-</code></pre>
+</code></pre></div></div>
 
-<p>Please see <code>JavaMQTTStreamWordCount.java</code> for the full example.</p>
+<p>Please see <code class="language-plaintext highlighter-rouge">JavaMQTTStreamWordCount.java</code> for the full example.</p>
 
 
   </div>
diff --git a/content/docs/spark/2.0.1/spark-streaming-akka/index.html b/content/docs/spark/2.0.1/spark-streaming-akka/index.html
index 19fdfe0..0762897 100644
--- a/content/docs/spark/2.0.1/spark-streaming-akka/index.html
+++ b/content/docs/spark/2.0.1/spark-streaming-akka/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -219,39 +201,39 @@
 
 <p>Using SBT:</p>
 
-<pre><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-akka" % "2.0.1"
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-akka" % "2.0.1"
+</code></pre></div></div>
 
 <p>Using Maven:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
     &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
     &lt;artifactId&gt;spark-streaming-akka_2.11&lt;/artifactId&gt;
     &lt;version&gt;2.0.1&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
-<p>This library can also be added to Spark jobs launched through <code>spark-shell</code> or <code>spark-submit</code> by using the <code>--packages</code> command line option.
+<p>This library can also be added to Spark jobs launched through <code class="language-plaintext highlighter-rouge">spark-shell</code> or <code class="language-plaintext highlighter-rouge">spark-submit</code> by using the <code class="language-plaintext highlighter-rouge">--packages</code> command line option.
 For example, to include it when starting the spark shell:</p>
 
-<pre><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-akka_2.11:2.0.1
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-akka_2.11:2.0.1
+</code></pre></div></div>
 
-<p>Unlike using <code>--jars</code>, using <code>--packages</code> ensures that this library and its dependencies will be added to the classpath.
-The <code>--packages</code> argument can also be used with <code>bin/spark-submit</code>.</p>
+<p>Unlike using <code class="language-plaintext highlighter-rouge">--jars</code>, using <code class="language-plaintext highlighter-rouge">--packages</code> ensures that this library and its dependencies will be added to the classpath.
+The <code class="language-plaintext highlighter-rouge">--packages</code> argument can also be used with <code class="language-plaintext highlighter-rouge">bin/spark-submit</code>.</p>
 
 <p>This library is cross-published for Scala 2.10 and Scala 2.11, so users should substitute the appropriate Scala version (2.10 or 2.11) in the commands listed above.</p>
 
 <h2 id="examples">Examples</h2>
 
-<p>DStreams can be created from data streams received through Akka actors by using <code>AkkaUtils.createStream(ssc, actorProps, actor-name)</code>.</p>
+<p>DStreams can be created from data streams received through Akka actors by using <code class="language-plaintext highlighter-rouge">AkkaUtils.createStream(ssc, actorProps, actor-name)</code>.</p>
 
 <h3 id="scala-api">Scala API</h3>
 
-<p>You need to extend <code>ActorReceiver</code> so that received data can be stored in Spark using the <code>store(...)</code> methods. The supervisor strategy of
+<p>You need to extend <code class="language-plaintext highlighter-rouge">ActorReceiver</code> so that received data can be stored in Spark using the <code class="language-plaintext highlighter-rouge">store(...)</code> methods. The supervisor strategy of
 this actor can be configured to handle failures, etc.; a sketch follows at the end of this section.</p>
 
-<pre><code>class CustomActor extends ActorReceiver {
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>class CustomActor extends ActorReceiver {
   def receive = {
     case data: String =&gt; store(data)
   }
@@ -260,14 +242,14 @@ this actor can be configured to handle failures, etc.</p>
 // A new input stream can be created with this custom actor as
 val ssc: StreamingContext = ...
 val lines = AkkaUtils.createStream[String](ssc, Props[CustomActor](), "CustomReceiver")
-</code></pre>
+</code></pre></div></div>
 
 <h3 id="java-api">Java API</h3>
 
-<p>You need to extend <code>JavaActorReceiver</code> so that received data can be stored in Spark using the <code>store(...)</code> methods. The supervisor strategy of
+<p>You need to extend <code class="language-plaintext highlighter-rouge">JavaActorReceiver</code> so that received data can be stored in Spark using the <code class="language-plaintext highlighter-rouge">store(...)</code> methods. The supervisor strategy of
 this actor can be configured to handle failures, etc.</p>
 
-<pre><code>class CustomActor extends JavaActorReceiver {
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>class CustomActor extends JavaActorReceiver {
     @Override
     public void onReceive(Object msg) throws Exception {
         store((String) msg);
@@ -277,7 +259,7 @@ this actor can be configured to handle failures, etc.</p>
 // A new input stream can be created with this custom actor as
 JavaStreamingContext jssc = ...;
 JavaDStream&lt;String&gt; lines = AkkaUtils.&lt;String&gt;createStream(jssc, Props.create(CustomActor.class), "CustomReceiver");
-</code></pre>
+</code></pre></div></div>
 
 <p>See end-to-end examples at <a href="https://github.com/apache/bahir/tree/master/streaming-akka/examples">Akka Examples</a></p>
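 
 <p>The supervisor strategy mentioned above is ordinary Akka actor configuration. A sketch with illustrative restart limits (override <code class="language-plaintext highlighter-rouge">supervisorStrategy</code> inside the receiver actor as in any Akka actor):</p>
 
 <div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import scala.concurrent.duration._
 import akka.actor.OneForOneStrategy
 import akka.actor.SupervisorStrategy.Restart
 
 class CustomActor extends ActorReceiver {
   // Restart a failing child actor up to 10 times within one minute.
   override val supervisorStrategy =
     OneForOneStrategy(maxNrOfRetries = 10, withinTimeRange = 1.minute) {
       case _: RuntimeException =&gt; Restart
     }
   def receive = {
     case data: String =&gt; store(data)
   }
 }
 </code></pre></div></div>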
 
diff --git a/content/docs/spark/2.0.1/spark-streaming-mqtt/index.html b/content/docs/spark/2.0.1/spark-streaming-mqtt/index.html
index 912a0cd..f3d7cb8 100644
--- a/content/docs/spark/2.0.1/spark-streaming-mqtt/index.html
+++ b/content/docs/spark/2.0.1/spark-streaming-mqtt/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -219,26 +201,26 @@
 
 <p>Using SBT:</p>
 
-<pre><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-mqtt" % "2.0.1"
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-mqtt" % "2.0.1"
+</code></pre></div></div>
 
 <p>Using Maven:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
     &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
     &lt;artifactId&gt;spark-streaming-mqtt_2.11&lt;/artifactId&gt;
     &lt;version&gt;2.0.1&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
-<p>This library can also be added to Spark jobs launched through <code>spark-shell</code> or <code>spark-submit</code> by using the <code>--packages</code> command line option.
+<p>This library can also be added to Spark jobs launched through <code class="language-plaintext highlighter-rouge">spark-shell</code> or <code class="language-plaintext highlighter-rouge">spark-submit</code> by using the <code class="language-plaintext highlighter-rouge">--packages</code> command line option.
 For example, to include it when starting the spark shell:</p>
 
-<pre><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-mqtt_2.11:2.0.1
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-mqtt_2.11:2.0.1
+</code></pre></div></div>
 
-<p>Unlike using <code>--jars</code>, using <code>--packages</code> ensures that this library and its dependencies will be added to the classpath.
-The <code>--packages</code> argument can also be used with <code>bin/spark-submit</code>.</p>
+<p>Unlike using <code class="language-plaintext highlighter-rouge">--jars</code>, using <code class="language-plaintext highlighter-rouge">--packages</code> ensures that this library and its dependencies will be added to the classpath.
+The <code class="language-plaintext highlighter-rouge">--packages</code> argument can also be used with <code class="language-plaintext highlighter-rouge">bin/spark-submit</code>.</p>
 
 <p>This library is cross-published for Scala 2.10 and Scala 2.11, so users should substitute the appropriate Scala version (2.10 or 2.11) in the commands listed above.</p>
 
@@ -247,42 +229,41 @@ The <code>--packages</code> argument can also be used with <code>bin/spark-submi
 <p>This source uses the <a href="https://eclipse.org/paho/clients/java/">Eclipse Paho Java Client</a>. Client API documentation is located <a href="http://www.eclipse.org/paho/files/javadoc/index.html">here</a>.</p>
 
 <ul>
-  <li><code>brokerUrl</code> The URL the MqttClient connects to. Set this to the URL of the MQTT server, e.g. tcp://localhost:1883.</li>
-  <li><code>storageLevel</code> The storage level to use; by default incoming messages are stored on disk.</li>
-  <li><code>topic</code> Topic the MqttClient subscribes to.</li>
-  <li><code>clientId</code> The client ID this client is associated with. Provide the same value to recover a stopped client.</li>
-  <li><code>QoS</code> The maximum quality of service to subscribe each topic at. Messages published at a lower quality of service will be received at the published QoS. Messages published at a higher quality of service will be received using the QoS specified on the subscribe.</li>
-  <li><code>username</code> Sets the user name to use for the connection to the MQTT server. Leave it unset if the server does not require it; setting it to an empty value will lead to errors.</li>
-  <li><code>password</code> Sets the password to use for the connection.</li>
-  <li><code>cleanSession</code> Setting it to true starts a clean session, removing all messages checkpointed by a previous run of this source. It is set to false by default.</li>
-  <li><code>connectionTimeout</code> Sets the connection timeout; a value of 0 is interpreted as waiting until the client connects. See <code>MqttConnectOptions.setConnectionTimeout</code> for more information.</li>
-  <li><code>keepAlive</code> Same as <code>MqttConnectOptions.setKeepAliveInterval</code>.</li>
-  <li><code>mqttVersion</code> Same as <code>MqttConnectOptions.setMqttVersion</code>.</li>
+  <li><code class="language-plaintext highlighter-rouge">brokerUrl</code> The URL the MqttClient connects to. Set this to the URL of the MQTT server, e.g. tcp://localhost:1883.</li>
+  <li><code class="language-plaintext highlighter-rouge">storageLevel</code> The storage level to use; by default incoming messages are stored on disk.</li>
+  <li><code class="language-plaintext highlighter-rouge">topic</code> Topic the MqttClient subscribes to.</li>
+  <li><code class="language-plaintext highlighter-rouge">clientId</code> The client ID this client is associated with. Provide the same value to recover a stopped client.</li>
+  <li><code class="language-plaintext highlighter-rouge">QoS</code> The maximum quality of service to subscribe each topic at. Messages published at a lower quality of service will be received at the published QoS. Messages published at a higher quality of service will be received using the QoS specified on the subscribe.</li>
+  <li><code class="language-plaintext highlighter-rouge">username</code> Sets the user name to use for the connection to the MQTT server. Leave it unset if the server does not require it; setting it to an empty value will lead to errors.</li>
+  <li><code class="language-plaintext highlighter-rouge">password</code> Sets the password to use for the connection.</li>
+  <li><code class="language-plaintext highlighter-rouge">cleanSession</code> Setting it to true starts a clean session, removing all messages checkpointed by a previous run of this source. It is set to false by default.</li>
+  <li><code class="language-plaintext highlighter-rouge">connectionTimeout</code> Sets the connection timeout; a value of 0 is interpreted as waiting until the client connects. See <code class="language-plaintext highlighter-rouge">MqttConnectOptions.setConnectionTimeout</code> for more information.</li>
+  <li><code class="language-plaintext highlighter-rouge">keepAlive</code> Same as <code class="language-plaintext highlighter-rouge">MqttConnectOptions.setKeepAliveInterval</code>.</li>
+  <li><code class="language-plaintext highlighter-rouge">mqttVersion</code> Same as <code class="language-plaintext highlighter-rouge">MqttConnectOptions.setMqttVersion</code>.</li>
 </ul>
 
 <h2 id="examples">Examples</h2>
 
 <h3 id="scala-api">Scala API</h3>
 
-<p>Create an input DStream that receives messages from an MQTT broker using <code>MQTTUtils.createStream(...)</code>; each element of the
+<p>Create an input DStream that receives messages from an MQTT broker using <code class="language-plaintext highlighter-rouge">MQTTUtils.createStream(...)</code>; each element of the
 returned DStream is the payload of one MQTT message.</p>
 
-<pre><code>val lines = MQTTUtils.createStream(ssc, brokerUrl, topic)
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>val lines = MQTTUtils.createStream(ssc, brokerUrl, topic)
+</code></pre></div></div>
 
 <p>Additional MQTT connection options can be provided:</p>
 
-<p><code>Scala
-val lines = MQTTUtils.createStream(ssc, brokerUrl, topic, storageLevel, clientId, username, password, cleanSession, qos, connectionTimeout, keepAliveInterval, mqttVersion)
-</code></p>
+<pre><code class="language-Scala">val lines = MQTTUtils.createStream(ssc, brokerUrl, topic, storageLevel, clientId, username, password, cleanSession, qos, connectionTimeout, keepAliveInterval, mqttVersion)
+</code></pre>
 
 <h3 id="java-api">Java API</h3>
 
-<p>Create an input JavaDStream that receives messages from an MQTT broker using <code>MQTTUtils.createStream(...)</code>; each element of the
+<p>Create an input JavaDStream that receives messages from an MQTT broker using <code class="language-plaintext highlighter-rouge">MQTTUtils.createStream(...)</code>; each element of the
 returned stream is the payload of one MQTT message.</p>
 
-<pre><code>JavaDStream&lt;String&gt; lines = MQTTUtils.createStream(jssc, brokerUrl, topic);
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>JavaDStream&lt;String&gt; lines = MQTTUtils.createStream(jssc, brokerUrl, topic);
+</code></pre></div></div>
 
 <p>See end-to-end examples at <a href="https://github.com/apache/bahir/tree/master/streaming-mqtt/examples">MQTT Examples</a></p>
 
diff --git a/content/docs/spark/2.0.1/spark-streaming-twitter/index.html b/content/docs/spark/2.0.1/spark-streaming-twitter/index.html
index aebcb17..3d955f6 100644
--- a/content/docs/spark/2.0.1/spark-streaming-twitter/index.html
+++ b/content/docs/spark/2.0.1/spark-streaming-twitter/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -219,47 +201,47 @@
 
 <p>Using SBT:</p>
 
-<pre><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-twitter" % "2.0.1"
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-twitter" % "2.0.1"
+</code></pre></div></div>
 
 <p>Using Maven:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
     &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
     &lt;artifactId&gt;spark-streaming-twitter_2.11&lt;/artifactId&gt;
     &lt;version&gt;2.0.1&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
-<p>This library can also be added to Spark jobs launched through <code>spark-shell</code> or <code>spark-submit</code> by using the <code>--packages</code> command line option.
+<p>This library can also be added to Spark jobs launched through <code class="language-plaintext highlighter-rouge">spark-shell</code> or <code class="language-plaintext highlighter-rouge">spark-submit</code> by using the <code class="language-plaintext highlighter-rouge">--packages</code> command line option.
 For example, to include it when starting the spark shell:</p>
 
-<pre><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-twitter_2.11:2.0.1
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-twitter_2.11:2.0.1
+</code></pre></div></div>
 
-<p>Unlike using <code>--jars</code>, using <code>--packages</code> ensures that this library and its dependencies will be added to the classpath.
-The <code>--packages</code> argument can also be used with <code>bin/spark-submit</code>.</p>
+<p>Unlike using <code class="language-plaintext highlighter-rouge">--jars</code>, using <code class="language-plaintext highlighter-rouge">--packages</code> ensures that this library and its dependencies will be added to the classpath.
+The <code class="language-plaintext highlighter-rouge">--packages</code> argument can also be used with <code class="language-plaintext highlighter-rouge">bin/spark-submit</code>.</p>
 
 <p>This library is cross-published for Scala 2.10 and Scala 2.11, so users should replace the proper Scala version (2.10 or 2.11) in the commands listed above.</p>
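 
 <p>For example, a (hypothetical) application jar could be submitted with the Scala 2.10 artifact as sketched below; the jar name and main class are placeholders.</p>
 
 <pre><code>$ bin/spark-submit --packages org.apache.bahir:spark-streaming-twitter_2.10:2.0.1 \
     --class com.example.TwitterApp my-twitter-app.jar
 </code></pre>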
 
 <h2 id="examples">Examples</h2>
 
-<p><code>TwitterUtils</code> uses Twitter4j to get the public stream of tweets using <a href="https://dev.twitter.com/docs/streaming-apis">Twitter’s Streaming API</a>. Authentication information
-can be provided by any of the <a href="http://twitter4j.org/en/configuration.html">methods</a> supported by Twitter4J library. You can import the <code>TwitterUtils</code> class and create a DStream with <code>TwitterUtils.createStream</code> as shown below.</p>
+<p><code class="language-plaintext highlighter-rouge">TwitterUtils</code> uses Twitter4J to get the public stream of tweets using <a href="https://dev.twitter.com/docs/streaming-apis">Twitter’s Streaming API</a>. Authentication information
+can be provided by any of the <a href="http://twitter4j.org/en/configuration.html">methods</a> supported by the Twitter4J library. You can import the <code class="language-plaintext highlighter-rouge">TwitterUtils</code> class and create a DStream with <code class="language-plaintext highlighter-rouge">TwitterUtils.createStream</code> as shown below.</p>
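 
 <p>As a minimal sketch, OAuth credentials can be supplied through Twitter4J system properties before the stream is created. The property keys below are Twitter4J's standard configuration names; the placeholder values are assumptions.</p>
 
 <pre><code>// Hypothetical credentials for illustration only
 System.setProperty("twitter4j.oauth.consumerKey", "YOUR_CONSUMER_KEY")
 System.setProperty("twitter4j.oauth.consumerSecret", "YOUR_CONSUMER_SECRET")
 System.setProperty("twitter4j.oauth.accessToken", "YOUR_ACCESS_TOKEN")
 System.setProperty("twitter4j.oauth.accessTokenSecret", "YOUR_ACCESS_TOKEN_SECRET")
 </code></pre>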
 
 <h3 id="scala-api">Scala API</h3>
 
-<pre><code>import org.apache.spark.streaming.twitter._
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import org.apache.spark.streaming.twitter._
 
 TwitterUtils.createStream(ssc, None)
-</code></pre>
+</code></pre></div></div>
 
 <h3 id="java-api">Java API</h3>
 
-<pre><code>import org.apache.spark.streaming.twitter.*;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import org.apache.spark.streaming.twitter.*;
 
 TwitterUtils.createStream(jssc);
-</code></pre>
+</code></pre></div></div>
 
 <p>You can either consume the full public stream or a stream filtered by keywords.
 See end-to-end examples at <a href="https://github.com/apache/bahir/tree/master/streaming-twitter/examples">Twitter Examples</a>.</p>
diff --git a/content/docs/spark/2.0.1/spark-streaming-zeromq/index.html b/content/docs/spark/2.0.1/spark-streaming-zeromq/index.html
index 9c123ff..65de020 100644
--- a/content/docs/spark/2.0.1/spark-streaming-zeromq/index.html
+++ b/content/docs/spark/2.0.1/spark-streaming-zeromq/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -219,26 +201,26 @@
 
 <p>Using SBT:</p>
 
-<pre><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-zeromq" % "2.0.1"
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-zeromq" % "2.0.1"
+</code></pre></div></div>
 
 <p>Using Maven:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
     &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
     &lt;artifactId&gt;spark-streaming-zeromq_2.11&lt;/artifactId&gt;
     &lt;version&gt;2.0.1&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
-<p>This library can also be added to Spark jobs launched through <code>spark-shell</code> or <code>spark-submit</code> by using the <code>--packages</code> command line option.
+<p>This library can also be added to Spark jobs launched through <code class="language-plaintext highlighter-rouge">spark-shell</code> or <code class="language-plaintext highlighter-rouge">spark-submit</code> by using the <code class="language-plaintext highlighter-rouge">--packages</code> command line option.
 For example, to include it when starting the spark shell:</p>
 
-<pre><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-zeromq_2.11:2.0.1
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-zeromq_2.11:2.0.1
+</code></pre></div></div>
 
-<p>Unlike using <code>--jars</code>, using <code>--packages</code> ensures that this library and its dependencies will be added to the classpath.
-The <code>--packages</code> argument can also be used with <code>bin/spark-submit</code>.</p>
+<p>Unlike using <code class="language-plaintext highlighter-rouge">--jars</code>, using <code class="language-plaintext highlighter-rouge">--packages</code> ensures that this library and its dependencies will be added to the classpath.
+The <code class="language-plaintext highlighter-rouge">--packages</code> argument can also be used with <code class="language-plaintext highlighter-rouge">bin/spark-submit</code>.</p>
 
 <p>This library is cross-published for Scala 2.10 and Scala 2.11, so users should replace the proper Scala version (2.10 or 2.11) in the commands listed above.</p>
 
@@ -246,13 +228,13 @@ The <code>--packages</code> argument can also be used with <code>bin/spark-submi
 
 <h3 id="scala-api">Scala API</h3>
 
-<pre><code>val lines = ZeroMQUtils.createStream(ssc, ...)
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>val lines = ZeroMQUtils.createStream(ssc, ...)
+</code></pre></div></div>
 
 <h3 id="java-api">Java API</h3>
 
-<pre><code>JavaDStream&lt;String&gt; lines = ZeroMQUtils.createStream(jssc, ...);
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>JavaDStream&lt;String&gt; lines = ZeroMQUtils.createStream(jssc, ...);
+</code></pre></div></div>
 
 <p>See end-to-end examples at <a href="https://github.com/apache/bahir/tree/master/streaming-zeromq/examples">ZeroMQ Examples</a></p>
 
diff --git a/content/docs/spark/2.0.2/documentation/index.html b/content/docs/spark/2.0.2/documentation/index.html
index 3cdf493..f52d7ce 100644
--- a/content/docs/spark/2.0.2/documentation/index.html
+++ b/content/docs/spark/2.0.2/documentation/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
diff --git a/content/docs/spark/2.0.2/spark-sql-streaming-mqtt/index.html b/content/docs/spark/2.0.2/spark-sql-streaming-mqtt/index.html
index 7679139..6b9b8a2 100644
--- a/content/docs/spark/2.0.2/spark-sql-streaming-mqtt/index.html
+++ b/content/docs/spark/2.0.2/spark-sql-streaming-mqtt/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -219,26 +201,26 @@
 
 <p>Using SBT:</p>
 
-<pre><code>libraryDependencies += "org.apache.bahir" %% "spark-sql-streaming-mqtt" % "2.0.2"
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>libraryDependencies += "org.apache.bahir" %% "spark-sql-streaming-mqtt" % "2.0.2"
+</code></pre></div></div>
 
 <p>Using Maven:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
     &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
     &lt;artifactId&gt;spark-sql-streaming-mqtt_2.11&lt;/artifactId&gt;
     &lt;version&gt;2.0.2&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
-<p>This library can also be added to Spark jobs launched through <code>spark-shell</code> or <code>spark-submit</code> by using the <code>--packages</code> command line option.
+<p>This library can also be added to Spark jobs launched through <code class="language-plaintext highlighter-rouge">spark-shell</code> or <code class="language-plaintext highlighter-rouge">spark-submit</code> by using the <code class="language-plaintext highlighter-rouge">--packages</code> command line option.
 For example, to include it when starting the spark shell:</p>
 
-<pre><code>$ bin/spark-shell --packages org.apache.bahir:spark-sql-streaming-mqtt_2.11:2.0.2
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>$ bin/spark-shell --packages org.apache.bahir:spark-sql-streaming-mqtt_2.11:2.0.2
+</code></pre></div></div>
 
-<p>Unlike using <code>--jars</code>, using <code>--packages</code> ensures that this library and its dependencies will be added to the classpath.
-The <code>--packages</code> argument can also be used with <code>bin/spark-submit</code>.</p>
+<p>Unlike using <code class="language-plaintext highlighter-rouge">--jars</code>, using <code class="language-plaintext highlighter-rouge">--packages</code> ensures that this library and its dependencies will be added to the classpath.
+The <code class="language-plaintext highlighter-rouge">--packages</code> argument can also be used with <code class="language-plaintext highlighter-rouge">bin/spark-submit</code>.</p>
 
 <p>This library is compiled for Scala 2.11 only, and intends to support Spark 2.0 onwards.</p>
 
@@ -246,47 +228,47 @@ The <code>--packages</code> argument can also be used with <code>bin/spark-submi
 
 <p>A SQL stream can be created from data streams received through an MQTT server as follows:</p>
 
-<pre><code>sqlContext.readStream
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>sqlContext.readStream
     .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
     .option("topic", "mytopic")
     .load("tcp://localhost:1883")
-</code></pre>
+</code></pre></div></div>
 
 <h2 id="enable-recovering-from-failures">Enable recovering from failures.</h2>
 
-<p>Setting values for option <code>localStorage</code> and <code>clientId</code> helps in recovering in case of a restart, by restoring the state where it left off before the shutdown.</p>
+<p>Setting values for the options <code class="language-plaintext highlighter-rouge">localStorage</code> and <code class="language-plaintext highlighter-rouge">clientId</code> enables recovery after a restart, restoring the state where the source left off before the shutdown.</p>
 
-<pre><code>sqlContext.readStream
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>sqlContext.readStream
     .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
     .option("topic", "mytopic")
     .option("localStorage", "/path/to/localdir")
     .option("clientId", "some-client-id")
     .load("tcp://localhost:1883")
-</code></pre>
+</code></pre></div></div>
 
 <h2 id="configuration-options">Configuration options.</h2>
 
 <p>This source uses <a href="https://eclipse.org/paho/clients/java/">Eclipse Paho Java Client</a>. Client API documentation is located <a href="http://www.eclipse.org/paho/files/javadoc/index.html">here</a>.</p>
 
 <ul>
-  <li><code>brokerUrl</code> A url MqttClient connects to. Set this or <code>path</code> as the url of the Mqtt Server. e.g. tcp://localhost:1883.</li>
-  <li><code>persistence</code> By default it is used for storing incoming messages on disk. If <code>memory</code> is provided as value for this option, then recovery on restart is not supported.</li>
-  <li><code>topic</code> Topic MqttClient subscribes to.</li>
-  <li><code>clientId</code> clientId, this client is assoicated with. Provide the same value to recover a stopped client.</li>
-  <li><code>QoS</code> The maximum quality of service to subscribe each topic at. Messages published at a lower quality of service will be received at the published QoS. Messages published at a higher quality of service will be received using the QoS specified on the subscribe.</li>
-  <li><code>username</code> Sets the user name to use for the connection to Mqtt Server. Do not set it, if server does not need this. Setting it empty will lead to errors.</li>
-  <li><code>password</code> Sets the password to use for the connection.</li>
-  <li><code>cleanSession</code> Setting it true starts a clean session, removes all checkpointed messages by a previous run of this source. This is set to false by default.</li>
-  <li><code>connectionTimeout</code> Sets the connection timeout, a value of 0 is interpretted as wait until client connects. See <code>MqttConnectOptions.setConnectionTimeout</code> for more information.</li>
-  <li><code>keepAlive</code> Same as <code>MqttConnectOptions.setKeepAliveInterval</code>.</li>
-  <li><code>mqttVersion</code> Same as <code>MqttConnectOptions.setMqttVersion</code>.</li>
+  <li><code class="language-plaintext highlighter-rouge">brokerUrl</code> The URL the MqttClient connects to. Set this or <code class="language-plaintext highlighter-rouge">path</code> to the URL of the MQTT server, e.g. tcp://localhost:1883.</li>
+  <li><code class="language-plaintext highlighter-rouge">persistence</code> By default incoming messages are persisted on disk. If <code class="language-plaintext highlighter-rouge">memory</code> is provided as the value for this option, recovery on restart is not supported.</li>
+  <li><code class="language-plaintext highlighter-rouge">topic</code> The topic the MqttClient subscribes to.</li>
+  <li><code class="language-plaintext highlighter-rouge">clientId</code> The client ID this client is associated with. Provide the same value to recover a stopped client.</li>
+  <li><code class="language-plaintext highlighter-rouge">QoS</code> The maximum quality of service at which to subscribe to each topic. Messages published at a lower quality of service are received at the published QoS; messages published at a higher quality of service are received at the QoS specified on subscribe.</li>
+  <li><code class="language-plaintext highlighter-rouge">username</code> Sets the user name for the connection to the MQTT server. Do not set it if the server does not require it; setting it to an empty value leads to errors.</li>
+  <li><code class="language-plaintext highlighter-rouge">password</code> Sets the password to use for the connection.</li>
+  <li><code class="language-plaintext highlighter-rouge">cleanSession</code> Setting it to true starts a clean session, removing all messages checkpointed by a previous run of this source. It is set to false by default.</li>
+  <li><code class="language-plaintext highlighter-rouge">connectionTimeout</code> Sets the connection timeout; a value of 0 is interpreted as waiting until the client connects. See <code class="language-plaintext highlighter-rouge">MqttConnectOptions.setConnectionTimeout</code> for more information.</li>
+  <li><code class="language-plaintext highlighter-rouge">keepAlive</code> Same as <code class="language-plaintext highlighter-rouge">MqttConnectOptions.setKeepAliveInterval</code>.</li>
+  <li><code class="language-plaintext highlighter-rouge">mqttVersion</code> Same as <code class="language-plaintext highlighter-rouge">MqttConnectOptions.setMqttVersion</code>.</li>
 </ul>
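 
 <p>As a minimal sketch, several of these options can be combined on a single reader; the topic, credentials, and client id below are assumed values for illustration.</p>
 
 <pre><code>val lines = spark.readStream
   .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
   .option("topic", "sensors/temperature")     // hypothetical topic
   .option("username", "scott")                // hypothetical credentials
   .option("password", "tiger")
   .option("QoS", "1")
   .option("clientId", "example-client")       // reuse to recover a stopped client
   .load("tcp://localhost:1883")
 </code></pre>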
 
 <h3 id="scala-api">Scala API</h3>
 
 <p>An example of using the Scala API to count words from an incoming message stream:</p>
 
-<pre><code>// Create DataFrame representing the stream of input lines from connection to mqtt server
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>// Create DataFrame representing the stream of input lines from connection to mqtt server
 val lines = spark.readStream
   .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
   .option("topic", topic)
@@ -305,15 +287,15 @@ val query = wordCounts.writeStream
   .start()
 
 query.awaitTermination()
-</code></pre>
+</code></pre></div></div>
 
-<p>Please see <code>MQTTStreamWordCount.scala</code> for full example.</p>
+<p>Please see <code class="language-plaintext highlighter-rouge">MQTTStreamWordCount.scala</code> for the full example.</p>
 
 <h3 id="java-api">Java API</h3>
 
 <p>An example of using the Java API to count words from an incoming message stream:</p>
 
-<pre><code>// Create DataFrame representing the stream of input lines from connection to mqtt server.
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>// Create DataFrame representing the stream of input lines from connection to mqtt server.
 Dataset&lt;String&gt; lines = spark
         .readStream()
         .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
@@ -338,9 +320,9 @@ StreamingQuery query = wordCounts.writeStream()
         .start();
 
 query.awaitTermination();
-</code></pre>
+</code></pre></div></div>
 
-<p>Please see <code>JavaMQTTStreamWordCount.java</code> for full example.</p>
+<p>Please see <code class="language-plaintext highlighter-rouge">JavaMQTTStreamWordCount.java</code> for the full example.</p>
 
 
   </div>
diff --git a/content/docs/spark/2.0.2/spark-streaming-akka/index.html b/content/docs/spark/2.0.2/spark-streaming-akka/index.html
index ecfb9e7..565763f 100644
--- a/content/docs/spark/2.0.2/spark-streaming-akka/index.html
+++ b/content/docs/spark/2.0.2/spark-streaming-akka/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -219,39 +201,39 @@
 
 <p>Using SBT:</p>
 
-<pre><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-akka" % "2.0.2"
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-akka" % "2.0.2"
+</code></pre></div></div>
 
 <p>Using Maven:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
     &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
     &lt;artifactId&gt;spark-streaming-akka_2.11&lt;/artifactId&gt;
     &lt;version&gt;2.0.2&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
-<p>This library can also be added to Spark jobs launched through <code>spark-shell</code> or <code>spark-submit</code> by using the <code>--packages</code> command line option.
+<p>This library can also be added to Spark jobs launched through <code class="language-plaintext highlighter-rouge">spark-shell</code> or <code class="language-plaintext highlighter-rouge">spark-submit</code> by using the <code class="language-plaintext highlighter-rouge">--packages</code> command line option.
 For example, to include it when starting the spark shell:</p>
 
-<pre><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-akka_2.11:2.0.2
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-akka_2.11:2.0.2
+</code></pre></div></div>
 
-<p>Unlike using <code>--jars</code>, using <code>--packages</code> ensures that this library and its dependencies will be added to the classpath.
-The <code>--packages</code> argument can also be used with <code>bin/spark-submit</code>.</p>
+<p>Unlike using <code class="language-plaintext highlighter-rouge">--jars</code>, using <code class="language-plaintext highlighter-rouge">--packages</code> ensures that this library and its dependencies will be added to the classpath.
+The <code class="language-plaintext highlighter-rouge">--packages</code> argument can also be used with <code class="language-plaintext highlighter-rouge">bin/spark-submit</code>.</p>
 
 <p>This library is cross-published for Scala 2.10 and Scala 2.11, so users should replace the proper Scala version (2.10 or 2.11) in the commands listed above.</p>
 
 <h2 id="examples">Examples</h2>
 
-<p>DStreams can be created with data streams received through Akka actors by using <code>AkkaUtils.createStream(ssc, actorProps, actor-name)</code>.</p>
+<p>DStreams can be created from data streams received through Akka actors by using <code class="language-plaintext highlighter-rouge">AkkaUtils.createStream(ssc, actorProps, actor-name)</code>.</p>
 
 <h3 id="scala-api">Scala API</h3>
 
-<p>You need to extend <code>ActorReceiver</code> so as to store received data into Spark using <code>store(...)</code> methods. The supervisor strategy of
+<p>You need to extend <code class="language-plaintext highlighter-rouge">ActorReceiver</code> to store received data into Spark using the <code class="language-plaintext highlighter-rouge">store(...)</code> methods. The supervisor strategy of
 this actor can be configured to handle failures, etc.</p>
 
-<pre><code>class CustomActor extends ActorReceiver {
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>class CustomActor extends ActorReceiver {
   def receive = {
     case data: String =&gt; store(data)
   }
@@ -260,14 +242,14 @@ this actor can be configured to handle failures, etc.</p>
 // A new input stream can be created with this custom actor as
 val ssc: StreamingContext = ...
 val lines = AkkaUtils.createStream[String](ssc, Props[CustomActor](), "CustomReceiver")
-</code></pre>
+</code></pre></div></div>
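 
 <p>Since the receiver is an Akka actor, its supervisor strategy can be customized with standard Akka constructs. A minimal sketch, assuming a restart-on-exception policy suits the application:</p>
 
 <pre><code>import akka.actor.OneForOneStrategy
 import akka.actor.SupervisorStrategy._
 import scala.concurrent.duration._
 
 class SupervisedActor extends ActorReceiver {
   // Restart a failing actor up to 10 times within 1 minute (illustrative policy)
   override val supervisorStrategy =
     OneForOneStrategy(maxNrOfRetries = 10, withinTimeRange = 1.minute) {
       case _: RuntimeException =&gt; Restart
     }
 
   def receive = {
     case data: String =&gt; store(data)
   }
 }
 </code></pre>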
 
 <h3 id="java-api">Java API</h3>
 
-<p>You need to extend <code>JavaActorReceiver</code> so as to store received data into Spark using <code>store(...)</code> methods. The supervisor strategy of
+<p>You need to extend <code class="language-plaintext highlighter-rouge">JavaActorReceiver</code> to store received data into Spark using the <code class="language-plaintext highlighter-rouge">store(...)</code> methods. The supervisor strategy of
 this actor can be configured to handle failures, etc.</p>
 
-<pre><code>class CustomActor extends JavaActorReceiver {
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>class CustomActor extends JavaActorReceiver {
     @Override
     public void onReceive(Object msg) throws Exception {
         store((String) msg);
@@ -277,7 +259,7 @@ this actor can be configured to handle failures, etc.</p>
 // A new input stream can be created with this custom actor as
 JavaStreamingContext jssc = ...;
 JavaDStream&lt;String&gt; lines = AkkaUtils.&lt;String&gt;createStream(jssc, Props.create(CustomActor.class), "CustomReceiver");
-</code></pre>
+</code></pre></div></div>
 
 <p>See end-to-end examples at <a href="https://github.com/apache/bahir/tree/master/streaming-akka/examples">Akka Examples</a></p>
 
diff --git a/content/docs/spark/2.0.2/spark-streaming-mqtt/index.html b/content/docs/spark/2.0.2/spark-streaming-mqtt/index.html
index ad0a27c..bff226d 100644
--- a/content/docs/spark/2.0.2/spark-streaming-mqtt/index.html
+++ b/content/docs/spark/2.0.2/spark-streaming-mqtt/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -219,26 +201,26 @@
 
 <p>Using SBT:</p>
 
-<pre><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-mqtt" % "2.0.2"
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-mqtt" % "2.0.2"
+</code></pre></div></div>
 
 <p>Using Maven:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
     &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
     &lt;artifactId&gt;spark-streaming-mqtt_2.11&lt;/artifactId&gt;
     &lt;version&gt;2.0.2&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
-<p>This library can also be added to Spark jobs launched through <code>spark-shell</code> or <code>spark-submit</code> by using the <code>--packages</code> command line option.
+<p>This library can also be added to Spark jobs launched through <code class="language-plaintext highlighter-rouge">spark-shell</code> or <code class="language-plaintext highlighter-rouge">spark-submit</code> by using the <code class="language-plaintext highlighter-rouge">--packages</code> command line option.
 For example, to include it when starting the spark shell:</p>
 
-<pre><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-mqtt_2.11:2.0.2
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-mqtt_2.11:2.0.2
+</code></pre></div></div>
 
-<p>Unlike using <code>--jars</code>, using <code>--packages</code> ensures that this library and its dependencies will be added to the classpath.
-The <code>--packages</code> argument can also be used with <code>bin/spark-submit</code>.</p>
+<p>Unlike using <code class="language-plaintext highlighter-rouge">--jars</code>, using <code class="language-plaintext highlighter-rouge">--packages</code> ensures that this library and its dependencies will be added to the classpath.
+The <code class="language-plaintext highlighter-rouge">--packages</code> argument can also be used with <code class="language-plaintext highlighter-rouge">bin/spark-submit</code>.</p>
 
 <p>This library is cross-published for Scala 2.10 and Scala 2.11, so users should replace the proper Scala version (2.10 or 2.11) in the commands listed above.</p>
 
@@ -247,42 +229,41 @@ The <code>--packages</code> argument can also be used with <code>bin/spark-submi
 <p>This source uses the <a href="https://eclipse.org/paho/clients/java/">Eclipse Paho Java Client</a>. Client API documentation is located <a href="http://www.eclipse.org/paho/files/javadoc/index.html">here</a>.</p>
 
 <ul>
-  <li><code>brokerUrl</code> A url MqttClient connects to. Set this as the url of the Mqtt Server. e.g. tcp://localhost:1883.</li>
-  <li><code>storageLevel</code> By default it is used for storing incoming messages on disk.</li>
-  <li><code>topic</code> Topic MqttClient subscribes to.</li>
-  <li><code>clientId</code> clientId, this client is assoicated with. Provide the same value to recover a stopped client.</li>
-  <li><code>QoS</code> The maximum quality of service to subscribe each topic at. Messages published at a lower quality of service will be received at the published QoS. Messages published at a higher quality of service will be received using the QoS specified on the subscribe.</li>
-  <li><code>username</code> Sets the user name to use for the connection to Mqtt Server. Do not set it, if server does not need this. Setting it empty will lead to errors.</li>
-  <li><code>password</code> Sets the password to use for the connection.</li>
-  <li><code>cleanSession</code> Setting it true starts a clean session, removes all checkpointed messages by a previous run of this source. This is set to false by default.</li>
-  <li><code>connectionTimeout</code> Sets the connection timeout, a value of 0 is interpreted as wait until client connects. See <code>MqttConnectOptions.setConnectionTimeout</code> for more information.</li>
-  <li><code>keepAlive</code> Same as <code>MqttConnectOptions.setKeepAliveInterval</code>.</li>
-  <li><code>mqttVersion</code> Same as <code>MqttConnectOptions.setMqttVersion</code>.</li>
+  <li><code class="language-plaintext highlighter-rouge">brokerUrl</code> The URL the MqttClient connects to. Set this to the URL of the MQTT server, e.g. tcp://localhost:1883.</li>
+  <li><code class="language-plaintext highlighter-rouge">storageLevel</code> The storage level for incoming messages. By default messages are stored on disk.</li>
+  <li><code class="language-plaintext highlighter-rouge">topic</code> The topic the MqttClient subscribes to.</li>
+  <li><code class="language-plaintext highlighter-rouge">clientId</code> The client ID this client is associated with. Provide the same value to recover a stopped client.</li>
+  <li><code class="language-plaintext highlighter-rouge">QoS</code> The maximum quality of service at which to subscribe to each topic. Messages published at a lower quality of service are received at the published QoS; messages published at a higher quality of service are received at the QoS specified on subscribe.</li>
+  <li><code class="language-plaintext highlighter-rouge">username</code> Sets the user name for the connection to the MQTT server. Do not set it if the server does not require it; setting it to an empty value leads to errors.</li>
+  <li><code class="language-plaintext highlighter-rouge">password</code> Sets the password to use for the connection.</li>
+  <li><code class="language-plaintext highlighter-rouge">cleanSession</code> Setting it to true starts a clean session, removing all messages checkpointed by a previous run of this source. It is set to false by default.</li>
+  <li><code class="language-plaintext highlighter-rouge">connectionTimeout</code> Sets the connection timeout; a value of 0 is interpreted as waiting until the client connects. See <code class="language-plaintext highlighter-rouge">MqttConnectOptions.setConnectionTimeout</code> for more information.</li>
+  <li><code class="language-plaintext highlighter-rouge">keepAlive</code> Same as <code class="language-plaintext highlighter-rouge">MqttConnectOptions.setKeepAliveInterval</code>.</li>
+  <li><code class="language-plaintext highlighter-rouge">mqttVersion</code> Same as <code class="language-plaintext highlighter-rouge">MqttConnectOptions.setMqttVersion</code>.</li>
 </ul>
 
 <h2 id="examples">Examples</h2>
 
 <h3 id="scala-api">Scala API</h3>
 
-<p>You need to extend <code>ActorReceiver</code> so as to store received data into Spark using <code>store(...)</code> methods. The supervisor strategy of
-this actor can be configured to handle failures, etc.</p>
+<p>Create an input DStream that receives messages from the MQTT broker with <code class="language-plaintext highlighter-rouge">MQTTUtils.createStream</code>:</p>
 
-<pre><code>val lines = MQTTUtils.createStream(ssc, brokerUrl, topic)
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>val lines = MQTTUtils.createStream(ssc, brokerUrl, topic)
+</code></pre></div></div>
 
 <p>Additional mqtt connection options can be provided:</p>
 
-<p><code>Scala
-val lines = MQTTUtils.createStream(ssc, brokerUrl, topic, storageLevel, clientId, username, password, cleanSession, qos, connectionTimeout, keepAliveInterval, mqttVersion)
-</code></p>
+<pre><code class="language-Scala">val lines = MQTTUtils.createStream(ssc, brokerUrl, topic, storageLevel, clientId, username, password, cleanSession, qos, connectionTimeout, keepAliveInterval, mqttVersion)
+</code></pre>
 
 <h3 id="java-api">Java API</h3>
 
-<p>You need to extend <code>JavaActorReceiver</code> so as to store received data into Spark using <code>store(...)</code> methods. The supervisor strategy of
-this actor can be configured to handle failures, etc.</p>
+<p>Create an input JavaDStream that receives messages from the MQTT broker with <code class="language-plaintext highlighter-rouge">MQTTUtils.createStream</code>:</p>
 
-<pre><code>JavaDStream&lt;String&gt; lines = MQTTUtils.createStream(jssc, brokerUrl, topic);
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>JavaDStream&lt;String&gt; lines = MQTTUtils.createStream(jssc, brokerUrl, topic);
+</code></pre></div></div>
 
 <p>See end-to-end examples at <a href="https://github.com/apache/bahir/tree/master/streaming-mqtt/examples">MQTT Examples</a></p>
 
diff --git a/content/docs/spark/2.0.2/spark-streaming-twitter/index.html b/content/docs/spark/2.0.2/spark-streaming-twitter/index.html
index d4d3df0..3638af5 100644
--- a/content/docs/spark/2.0.2/spark-streaming-twitter/index.html
+++ b/content/docs/spark/2.0.2/spark-streaming-twitter/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -219,47 +201,47 @@
 
 <p>Using SBT:</p>
 
-<pre><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-twitter" % "2.0.2"
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-twitter" % "2.0.2"
+</code></pre></div></div>
 
 <p>Using Maven:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
     &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
     &lt;artifactId&gt;spark-streaming-twitter_2.11&lt;/artifactId&gt;
     &lt;version&gt;2.0.2&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
-<p>This library can also be added to Spark jobs launched through <code>spark-shell</code> or <code>spark-submit</code> by using the <code>--packages</code> command line option.
+<p>This library can also be added to Spark jobs launched through <code class="language-plaintext highlighter-rouge">spark-shell</code> or <code class="language-plaintext highlighter-rouge">spark-submit</code> by using the <code class="language-plaintext highlighter-rouge">--packages</code> command line option.
 For example, to include it when starting the spark shell:</p>
 
-<pre><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-twitter_2.11:2.0.2
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-twitter_2.11:2.0.2
+</code></pre></div></div>
 
-<p>Unlike using <code>--jars</code>, using <code>--packages</code> ensures that this library and its dependencies will be added to the classpath.
-The <code>--packages</code> argument can also be used with <code>bin/spark-submit</code>.</p>
+<p>Unlike using <code class="language-plaintext highlighter-rouge">--jars</code>, using <code class="language-plaintext highlighter-rouge">--packages</code> ensures that this library and its dependencies will be added to the classpath.
+The <code class="language-plaintext highlighter-rouge">--packages</code> argument can also be used with <code class="language-plaintext highlighter-rouge">bin/spark-submit</code>.</p>
 
 <p>This library is cross-published for Scala 2.10 and Scala 2.11, so users should use the appropriate Scala version (2.10 or 2.11) in the commands listed above.</p>
 
 <h2 id="examples">Examples</h2>
 
-<p><code>TwitterUtils</code> uses Twitter4j to get the public stream of tweets using <a href="https://dev.twitter.com/docs/streaming-apis">Twitter’s Streaming API</a>. Authentication information
-can be provided by any of the <a href="http://twitter4j.org/en/configuration.html">methods</a> supported by Twitter4J library. You can import the <code>TwitterUtils</code> class and create a DStream with <code>TwitterUtils.createStream</code> as shown below.</p>
+<p><code class="language-plaintext highlighter-rouge">TwitterUtils</code> uses Twitter4j to get the public stream of tweets using <a href="https://dev.twitter.com/docs/streaming-apis">Twitter’s Streaming API</a>. Authentication information
+can be provided by any of the <a href="http://twitter4j.org/en/configuration.html">methods</a> supported by the Twitter4J library. You can import the <code class="language-plaintext highlighter-rouge">TwitterUtils</code> class and create a DStream with <code class="language-plaintext highlighter-rouge">TwitterUtils.createStream</code> as shown below.</p>
 
 <h3 id="scala-api">Scala API</h3>
 
-<pre><code>import org.apache.spark.streaming.twitter._
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import org.apache.spark.streaming.twitter._
 
 TwitterUtils.createStream(ssc, None)
-</code></pre>
+</code></pre></div></div>
 
 <h3 id="java-api">Java API</h3>
 
-<pre><code>import org.apache.spark.streaming.twitter.*;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import org.apache.spark.streaming.twitter.*;
 
 TwitterUtils.createStream(jssc);
-</code></pre>
+</code></pre></div></div>
 
 <p>You can also either consume the public stream or a stream filtered by keywords, as sketched below.
 See end-to-end examples at <a href="https://github.com/apache/bahir/tree/master/streaming-twitter/examples">Twitter Examples</a>.</p>
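
A minimal sketch of the difference (assuming Twitter4J resolves OAuth credentials from twitter4j.properties; the keywords are illustrative only):

    import org.apache.spark.streaming.twitter._

    // Public stream; authentication is resolved by Twitter4J configuration
    val allTweets = TwitterUtils.createStream(ssc, None)

    // Stream restricted to tweets matching any of the given keywords
    val sparkTweets = TwitterUtils.createStream(ssc, None, Seq("spark", "bahir"))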
diff --git a/content/docs/spark/2.0.2/spark-streaming-zeromq/index.html b/content/docs/spark/2.0.2/spark-streaming-zeromq/index.html
index b16dcfd..6607fd4 100644
--- a/content/docs/spark/2.0.2/spark-streaming-zeromq/index.html
+++ b/content/docs/spark/2.0.2/spark-streaming-zeromq/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -219,26 +201,26 @@
 
 <p>Using SBT:</p>
 
-<pre><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-zeromq" % "2.0.2"
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-zeromq" % "2.0.2"
+</code></pre></div></div>
 
 <p>Using Maven:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
     &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
     &lt;artifactId&gt;spark-streaming-zeromq_2.11&lt;/artifactId&gt;
     &lt;version&gt;2.0.2&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
-<p>This library can also be added to Spark jobs launched through <code>spark-shell</code> or <code>spark-submit</code> by using the <code>--packages</code> command line option.
+<p>This library can also be added to Spark jobs launched through <code class="language-plaintext highlighter-rouge">spark-shell</code> or <code class="language-plaintext highlighter-rouge">spark-submit</code> by using the <code class="language-plaintext highlighter-rouge">--packages</code> command line option.
 For example, to include it when starting the spark shell:</p>
 
-<pre><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-zeromq_2.11:2.0.2
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-zeromq_2.11:2.0.2
+</code></pre></div></div>
 
-<p>Unlike using <code>--jars</code>, using <code>--packages</code> ensures that this library and its dependencies will be added to the classpath.
-The <code>--packages</code> argument can also be used with <code>bin/spark-submit</code>.</p>
+<p>Unlike using <code class="language-plaintext highlighter-rouge">--jars</code>, using <code class="language-plaintext highlighter-rouge">--packages</code> ensures that this library and its dependencies will be added to the classpath.
+The <code class="language-plaintext highlighter-rouge">--packages</code> argument can also be used with <code class="language-plaintext highlighter-rouge">bin/spark-submit</code>.</p>
 
 <p>This library is cross-published for Scala 2.10 and Scala 2.11, so users should use the appropriate Scala version (2.10 or 2.11) in the commands listed above.</p>
 
@@ -246,13 +228,13 @@ The <code>--packages</code> argument can also be used with <code>bin/spark-submi
 
 <h3 id="scala-api">Scala API</h3>
 
-<pre><code>val lines = ZeroMQUtils.createStream(ssc, ...)
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>val lines = ZeroMQUtils.createStream(ssc, ...)
+</code></pre></div></div>
 
 <h3 id="java-api">Java API</h3>
 
-<pre><code>JavaDStream&lt;String&gt; lines = ZeroMQUtils.createStream(jssc, ...);
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>JavaDStream&lt;String&gt; lines = ZeroMQUtils.createStream(jssc, ...);
+</code></pre></div></div>
 
 <p>See end-to-end examples at <a href="https://github.com/apache/bahir/tree/master/streaming-zeromq/examples">ZeroMQ Examples</a></p>
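
For the Akka-based API this release ships, a hedged sketch of the elided arguments (the endpoint, topic, and frame-to-string converter here are assumptions, not fixed names):

    import akka.util.ByteString
    import akka.zeromq.Subscribe
    import org.apache.spark.streaming.zeromq._

    // Each ZeroMQ message arrives as a sequence of frames (topic + payload);
    // the converter turns the payload frames into UTF-8 strings.
    val lines = ZeroMQUtils.createStream(
      ssc,
      "tcp://localhost:5553",
      Subscribe(ByteString("weather")),
      (frames: Seq[ByteString]) => frames.map(_.utf8String).iterator
    )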
 
diff --git a/content/docs/spark/2.1.0/documentation/index.html b/content/docs/spark/2.1.0/documentation/index.html
index ce6a7ef..b584a6d 100644
--- a/content/docs/spark/2.1.0/documentation/index.html
+++ b/content/docs/spark/2.1.0/documentation/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
diff --git a/content/docs/spark/2.1.0/spark-sql-streaming-mqtt/index.html b/content/docs/spark/2.1.0/spark-sql-streaming-mqtt/index.html
index 59adcd0..bf68ce4 100644
--- a/content/docs/spark/2.1.0/spark-sql-streaming-mqtt/index.html
+++ b/content/docs/spark/2.1.0/spark-sql-streaming-mqtt/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -219,26 +201,26 @@
 
 <p>Using SBT:</p>
 
-<pre><code>libraryDependencies += "org.apache.bahir" %% "spark-sql-streaming-mqtt" % "2.1.0"
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>libraryDependencies += "org.apache.bahir" %% "spark-sql-streaming-mqtt" % "2.1.0"
+</code></pre></div></div>
 
 <p>Using Maven:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
     &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
     &lt;artifactId&gt;spark-sql-streaming-mqtt_2.11&lt;/artifactId&gt;
     &lt;version&gt;2.1.0&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
-<p>This library can also be added to Spark jobs launched through <code>spark-shell</code> or <code>spark-submit</code> by using the <code>--packages</code> command line option.
+<p>This library can also be added to Spark jobs launched through <code class="language-plaintext highlighter-rouge">spark-shell</code> or <code class="language-plaintext highlighter-rouge">spark-submit</code> by using the <code class="language-plaintext highlighter-rouge">--packages</code> command line option.
 For example, to include it when starting the spark shell:</p>
 
-<pre><code>$ bin/spark-shell --packages org.apache.bahir:spark-sql-streaming-mqtt_2.11:2.1.0
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>$ bin/spark-shell --packages org.apache.bahir:spark-sql-streaming-mqtt_2.11:2.1.0
+</code></pre></div></div>
 
-<p>Unlike using <code>--jars</code>, using <code>--packages</code> ensures that this library and its dependencies will be added to the classpath.
-The <code>--packages</code> argument can also be used with <code>bin/spark-submit</code>.</p>
+<p>Unlike using <code class="language-plaintext highlighter-rouge">--jars</code>, using <code class="language-plaintext highlighter-rouge">--packages</code> ensures that this library and its dependencies will be added to the classpath.
+The <code class="language-plaintext highlighter-rouge">--packages</code> argument can also be used with <code class="language-plaintext highlighter-rouge">bin/spark-submit</code>.</p>
 
 <p>This library is compiled for Scala 2.11 only, and is intended to support Spark 2.0 onwards.</p>
 
@@ -246,47 +228,47 @@ The <code>--packages</code> argument can also be used with <code>bin/spark-submi
 
 <p>A SQL stream can be created from data streams received through an MQTT server using:</p>
 
-<pre><code>sqlContext.readStream
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>sqlContext.readStream
     .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
     .option("topic", "mytopic")
     .load("tcp://localhost:1883")
-</code></pre>
+</code></pre></div></div>
 
 <h2 id="enable-recovering-from-failures">Enable recovering from failures.</h2>
 
-<p>Setting values for option <code>localStorage</code> and <code>clientId</code> helps in recovering in case of a restart, by restoring the state where it left off before the shutdown.</p>
+<p>Setting values for the options <code class="language-plaintext highlighter-rouge">localStorage</code> and <code class="language-plaintext highlighter-rouge">clientId</code> helps in recovering from a restart by restoring the state where the stream left off before the shutdown.</p>
 
-<pre><code>sqlContext.readStream
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>sqlContext.readStream
     .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
     .option("topic", "mytopic")
     .option("localStorage", "/path/to/localdir")
     .option("clientId", "some-client-id")
     .load("tcp://localhost:1883")
-</code></pre>
+</code></pre></div></div>
 
 <h2 id="configuration-options">Configuration options.</h2>
 
 <p>This source uses <a href="https://eclipse.org/paho/clients/java/">Eclipse Paho Java Client</a>. Client API documentation is located <a href="http://www.eclipse.org/paho/files/javadoc/index.html">here</a>.</p>
 
 <ul>
-  <li><code>brokerUrl</code> A url MqttClient connects to. Set this or <code>path</code> as the url of the Mqtt Server. e.g. tcp://localhost:1883.</li>
-  <li><code>persistence</code> By default it is used for storing incoming messages on disk. If <code>memory</code> is provided as value for this option, then recovery on restart is not supported.</li>
-  <li><code>topic</code> Topic MqttClient subscribes to.</li>
-  <li><code>clientId</code> clientId, this client is assoicated with. Provide the same value to recover a stopped client.</li>
-  <li><code>QoS</code> The maximum quality of service to subscribe each topic at. Messages published at a lower quality of service will be received at the published QoS. Messages published at a higher quality of service will be received using the QoS specified on the subscribe.</li>
-  <li><code>username</code> Sets the user name to use for the connection to Mqtt Server. Do not set it, if server does not need this. Setting it empty will lead to errors.</li>
-  <li><code>password</code> Sets the password to use for the connection.</li>
-  <li><code>cleanSession</code> Setting it true starts a clean session, removes all checkpointed messages by a previous run of this source. This is set to false by default.</li>
-  <li><code>connectionTimeout</code> Sets the connection timeout, a value of 0 is interpretted as wait until client connects. See <code>MqttConnectOptions.setConnectionTimeout</code> for more information.</li>
-  <li><code>keepAlive</code> Same as <code>MqttConnectOptions.setKeepAliveInterval</code>.</li>
-  <li><code>mqttVersion</code> Same as <code>MqttConnectOptions.setMqttVersion</code>.</li>
+  <li><code class="language-plaintext highlighter-rouge">brokerUrl</code> A url MqttClient connects to. Set this or <code class="language-plaintext highlighter-rouge">path</code> as the url of the Mqtt Server. e.g. tcp://localhost:1883.</li>
+  <li><code class="language-plaintext highlighter-rouge">persistence</code> By default it is used for storing incoming messages on disk. If <code class="language-plaintext highlighter-rouge">memory</code> is provided as value for this option, then recovery on restart is not supported.</li>
+  <li><code class="language-plaintext highlighter-rouge">topic</code> Topic MqttClient subscribes to.</li>
+  <li><code class="language-plaintext highlighter-rouge">clientId</code> clientId, this client is assoicated with. Provide the same value to recover a stopped client.</li>
+  <li><code class="language-plaintext highlighter-rouge">QoS</code> The maximum quality of service to subscribe each topic at. Messages published at a lower quality of service will be received at the published QoS. Messages published at a higher quality of service will be received using the QoS specified on the subscribe.</li>
+  <li><code class="language-plaintext highlighter-rouge">username</code> Sets the user name to use for the connection to Mqtt Server. Do not set it, if server does not need this. Setting it empty will lead to errors.</li>
+  <li><code class="language-plaintext highlighter-rouge">password</code> Sets the password to use for the connection.</li>
+  <li><code class="language-plaintext highlighter-rouge">cleanSession</code> Setting it true starts a clean session, removes all checkpointed messages by a previous run of this source. This is set to false by default.</li>
+  <li><code class="language-plaintext highlighter-rouge">connectionTimeout</code> Sets the connection timeout, a value of 0 is interpretted as wait until client connects. See <code class="language-plaintext highlighter-rouge">MqttConnectOptions.setConnectionTimeout</code> for more information.</li>
+  <li><code class="language-plaintext highlighter-rouge">keepAlive</code> Same as <code class="language-plaintext highlighter-rouge">MqttConnectOptions.setKeepAliveInterval</code>.</li>
+  <li><code class="language-plaintext highlighter-rouge">mqttVersion</code> Same as <code class="language-plaintext highlighter-rouge">MqttConnectOptions.setMqttVersion</code>.</li>
 </ul>
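
Pulling a few of these options together (the broker address, credentials, and QoS below are placeholders, not defaults):

    val lines = spark.readStream
      .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
      .option("topic", "sensors/temperature")  // topic to subscribe to
      .option("username", "scott")             // only if the broker requires auth
      .option("password", "tiger")
      .option("QoS", "1")                      // at-least-once delivery
      .option("connectionTimeout", "30")       // seconds; 0 = wait until connected
      .load("tcp://localhost:1883")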
 
 <h3 id="scala-api">Scala API</h3>
 
 <p>An example for the Scala API to count words from an incoming message stream.</p>
 
-<pre><code>// Create DataFrame representing the stream of input lines from connection to mqtt server
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>// Create DataFrame representing the stream of input lines from connection to mqtt server
 val lines = spark.readStream
   .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
   .option("topic", topic)
@@ -305,15 +287,15 @@ val query = wordCounts.writeStream
   .start()
 
 query.awaitTermination()
-</code></pre>
+</code></pre></div></div>
 
-<p>Please see <code>MQTTStreamWordCount.scala</code> for full example.</p>
+<p>Please see <code class="language-plaintext highlighter-rouge">MQTTStreamWordCount.scala</code> for the full example.</p>
 
 <h3 id="java-api">Java API</h3>
 
 <p>An example for the Java API to count words from an incoming message stream.</p>
 
-<pre><code>// Create DataFrame representing the stream of input lines from connection to mqtt server.
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>// Create DataFrame representing the stream of input lines from connection to mqtt server.
 Dataset&lt;String&gt; lines = spark
         .readStream()
         .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
@@ -338,9 +320,9 @@ StreamingQuery query = wordCounts.writeStream()
         .start();
 
 query.awaitTermination();
-</code></pre>
+</code></pre></div></div>
 
-<p>Please see <code>JavaMQTTStreamWordCount.java</code> for full example.</p>
+<p>Please see <code class="language-plaintext highlighter-rouge">JavaMQTTStreamWordCount.java</code> for the full example.</p>
 
 
   </div>
diff --git a/content/docs/spark/2.1.0/spark-streaming-akka/index.html b/content/docs/spark/2.1.0/spark-streaming-akka/index.html
index dd96ad7..b112b1a 100644
--- a/content/docs/spark/2.1.0/spark-streaming-akka/index.html
+++ b/content/docs/spark/2.1.0/spark-streaming-akka/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -219,39 +201,39 @@
 
 <p>Using SBT:</p>
 
-<pre><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-akka" % "2.1.0"
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-akka" % "2.1.0"
+</code></pre></div></div>
 
 <p>Using Maven:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
     &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
     &lt;artifactId&gt;spark-streaming-akka_2.11&lt;/artifactId&gt;
     &lt;version&gt;2.1.0&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
-<p>This library can also be added to Spark jobs launched through <code>spark-shell</code> or <code>spark-submit</code> by using the <code>--packages</code> command line option.
+<p>This library can also be added to Spark jobs launched through <code class="language-plaintext highlighter-rouge">spark-shell</code> or <code class="language-plaintext highlighter-rouge">spark-submit</code> by using the <code class="language-plaintext highlighter-rouge">--packages</code> command line option.
 For example, to include it when starting the spark shell:</p>
 
-<pre><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-akka_2.11:2.1.0
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-akka_2.11:2.1.0
+</code></pre></div></div>
 
-<p>Unlike using <code>--jars</code>, using <code>--packages</code> ensures that this library and its dependencies will be added to the classpath.
-The <code>--packages</code> argument can also be used with <code>bin/spark-submit</code>.</p>
+<p>Unlike using <code class="language-plaintext highlighter-rouge">--jars</code>, using <code class="language-plaintext highlighter-rouge">--packages</code> ensures that this library and its dependencies will be added to the classpath.
+The <code class="language-plaintext highlighter-rouge">--packages</code> argument can also be used with <code class="language-plaintext highlighter-rouge">bin/spark-submit</code>.</p>
 
 <p>This library is cross-published for Scala 2.10 and Scala 2.11, so users should use the appropriate Scala version (2.10 or 2.11) in the commands listed above.</p>
 
 <h2 id="examples">Examples</h2>
 
-<p>DStreams can be created with data streams received through Akka actors by using <code>AkkaUtils.createStream(ssc, actorProps, actor-name)</code>.</p>
+<p>DStreams can be created from data streams received through Akka actors by using <code class="language-plaintext highlighter-rouge">AkkaUtils.createStream(ssc, actorProps, actor-name)</code>.</p>
 
 <h3 id="scala-api">Scala API</h3>
 
-<p>You need to extend <code>ActorReceiver</code> so as to store received data into Spark using <code>store(...)</code> methods. The supervisor strategy of
+<p>You need to extend <code class="language-plaintext highlighter-rouge">ActorReceiver</code> to store received data into Spark using the <code class="language-plaintext highlighter-rouge">store(...)</code> methods. The supervisor strategy of
 this actor can be configured to handle failures, etc.</p>
 
-<pre><code>class CustomActor extends ActorReceiver {
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>class CustomActor extends ActorReceiver {
   def receive = {
     case data: String =&gt; store(data)
   }
@@ -260,14 +242,14 @@ this actor can be configured to handle failures, etc.</p>
 // A new input stream can be created with this custom actor as
 val ssc: StreamingContext = ...
 val lines = AkkaUtils.createStream[String](ssc, Props[CustomActor](), "CustomReceiver")
-</code></pre>
+</code></pre></div></div>
 
 <h3 id="java-api">Java API</h3>
 
-<p>You need to extend <code>JavaActorReceiver</code> so as to store received data into Spark using <code>store(...)</code> methods. The supervisor strategy of
+<p>You need to extend <code class="language-plaintext highlighter-rouge">JavaActorReceiver</code> to store received data into Spark using the <code class="language-plaintext highlighter-rouge">store(...)</code> methods. The supervisor strategy of
 this actor can be configured to handle failures, etc.</p>
 
-<pre><code>class CustomActor extends JavaActorReceiver {
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>class CustomActor extends JavaActorReceiver {
     @Override
     public void onReceive(Object msg) throws Exception {
         store((String) msg);
@@ -277,7 +259,7 @@ this actor can be configured to handle failures, etc.</p>
 // A new input stream can be created with this custom actor as
 JavaStreamingContext jssc = ...;
 JavaDStream&lt;String&gt; lines = AkkaUtils.&lt;String&gt;createStream(jssc, Props.create(CustomActor.class), "CustomReceiver");
-</code></pre>
+</code></pre></div></div>
 
 <p>See end-to-end examples at <a href="https://github.com/apache/bahir/tree/master/streaming-akka/examples">Akka Examples</a></p>
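
Once created, the actor-fed Scala stream above behaves like any other DStream; a small illustrative word count (names are arbitrary):

    val words = lines.flatMap(_.split(" "))
    val counts = words.map((_, 1)).reduceByKey(_ + _)
    counts.print()

    ssc.start()
    ssc.awaitTermination()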
 
diff --git a/content/docs/spark/2.1.0/spark-streaming-mqtt/index.html b/content/docs/spark/2.1.0/spark-streaming-mqtt/index.html
index 2c9597b..90438c6 100644
--- a/content/docs/spark/2.1.0/spark-streaming-mqtt/index.html
+++ b/content/docs/spark/2.1.0/spark-streaming-mqtt/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -219,26 +201,26 @@
 
 <p>Using SBT:</p>
 
-<pre><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-mqtt" % "2.1.0"
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-mqtt" % "2.1.0"
+</code></pre></div></div>
 
 <p>Using Maven:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
     &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
     &lt;artifactId&gt;spark-streaming-mqtt_2.11&lt;/artifactId&gt;
     &lt;version&gt;2.1.0&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
-<p>This library can also be added to Spark jobs launched through <code>spark-shell</code> or <code>spark-submit</code> by using the <code>--packages</code> command line option.
+<p>This library can also be added to Spark jobs launched through <code class="language-plaintext highlighter-rouge">spark-shell</code> or <code class="language-plaintext highlighter-rouge">spark-submit</code> by using the <code class="language-plaintext highlighter-rouge">--packages</code> command line option.
 For example, to include it when starting the spark shell:</p>
 
-<pre><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-mqtt_2.11:2.1.0
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-mqtt_2.11:2.1.0
+</code></pre></div></div>
 
-<p>Unlike using <code>--jars</code>, using <code>--packages</code> ensures that this library and its dependencies will be added to the classpath.
-The <code>--packages</code> argument can also be used with <code>bin/spark-submit</code>.</p>
+<p>Unlike using <code class="language-plaintext highlighter-rouge">--jars</code>, using <code class="language-plaintext highlighter-rouge">--packages</code> ensures that this library and its dependencies will be added to the classpath.
+The <code class="language-plaintext highlighter-rouge">--packages</code> argument can also be used with <code class="language-plaintext highlighter-rouge">bin/spark-submit</code>.</p>
 
 <p>This library is cross-published for Scala 2.10 and Scala 2.11, so users should use the appropriate Scala version (2.10 or 2.11) in the commands listed above.</p>
 
@@ -247,42 +229,41 @@ The <code>--packages</code> argument can also be used with <code>bin/spark-submi
 <p>This source uses the <a href="https://eclipse.org/paho/clients/java/">Eclipse Paho Java Client</a>. Client API documentation is located <a href="http://www.eclipse.org/paho/files/javadoc/index.html">here</a>.</p>
 
 <ul>
-  <li><code>brokerUrl</code> A url MqttClient connects to. Set this as the url of the Mqtt Server. e.g. tcp://localhost:1883.</li>
-  <li><code>storageLevel</code> By default it is used for storing incoming messages on disk.</li>
-  <li><code>topic</code> Topic MqttClient subscribes to.</li>
-  <li><code>clientId</code> clientId, this client is assoicated with. Provide the same value to recover a stopped client.</li>
-  <li><code>QoS</code> The maximum quality of service to subscribe each topic at. Messages published at a lower quality of service will be received at the published QoS. Messages published at a higher quality of service will be received using the QoS specified on the subscribe.</li>
-  <li><code>username</code> Sets the user name to use for the connection to Mqtt Server. Do not set it, if server does not need this. Setting it empty will lead to errors.</li>
-  <li><code>password</code> Sets the password to use for the connection.</li>
-  <li><code>cleanSession</code> Setting it true starts a clean session, removes all checkpointed messages by a previous run of this source. This is set to false by default.</li>
-  <li><code>connectionTimeout</code> Sets the connection timeout, a value of 0 is interpreted as wait until client connects. See <code>MqttConnectOptions.setConnectionTimeout</code> for more information.</li>
-  <li><code>keepAlive</code> Same as <code>MqttConnectOptions.setKeepAliveInterval</code>.</li>
-  <li><code>mqttVersion</code> Same as <code>MqttConnectOptions.setMqttVersion</code>.</li>
+  <li><code class="language-plaintext highlighter-rouge">brokerUrl</code> A url MqttClient connects to. Set this as the url of the Mqtt Server. e.g. tcp://localhost:1883.</li>
+  <li><code class="language-plaintext highlighter-rouge">storageLevel</code> By default it is used for storing incoming messages on disk.</li>
+  <li><code class="language-plaintext highlighter-rouge">topic</code> Topic MqttClient subscribes to.</li>
+  <li><code class="language-plaintext highlighter-rouge">clientId</code> clientId, this client is assoicated with. Provide the same value to recover a stopped client.</li>
+  <li><code class="language-plaintext highlighter-rouge">QoS</code> The maximum quality of service to subscribe each topic at. Messages published at a lower quality of service will be received at the published QoS. Messages published at a higher quality of service will be received using the QoS specified on the subscribe.</li>
+  <li><code class="language-plaintext highlighter-rouge">username</code> Sets the user name to use for the connection to Mqtt Server. Do not set it, if server does not need this. Setting it empty will lead to errors.</li>
+  <li><code class="language-plaintext highlighter-rouge">password</code> Sets the password to use for the connection.</li>
+  <li><code class="language-plaintext highlighter-rouge">cleanSession</code> Setting it true starts a clean session, removes all checkpointed messages by a previous run of this source. This is set to false by default.</li>
+  <li><code class="language-plaintext highlighter-rouge">connectionTimeout</code> Sets the connection timeout, a value of 0 is interpreted as wait until client connects. See <code class="language-plaintext highlighter-rouge">MqttConnectOptions.setConnectionTimeout</code> for more information.</li>
+  <li><code class="language-plaintext highlighter-rouge">keepAlive</code> Same as <code class="language-plaintext highlighter-rouge">MqttConnectOptions.setKeepAliveInterval</code>.</li>
+  <li><code class="language-plaintext highlighter-rouge">mqttVersion</code> Same as <code class="language-plaintext highlighter-rouge">MqttConnectOptions.setMqttVersion</code>.</li>
 </ul>
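
As an illustrative mapping of these options onto the full-argument factory shown further below (the Option-typed parameters and all values here are assumptions for the sketch, not a confirmed signature):

    import org.apache.spark.storage.StorageLevel
    import org.apache.spark.streaming.mqtt._

    val lines = MQTTUtils.createStream(
      ssc,
      "tcp://localhost:1883",              // brokerUrl
      "sensors/temperature",               // topic
      StorageLevel.MEMORY_AND_DISK_SER_2,  // storageLevel
      Some("bahir-example-client"),        // clientId: reuse to recover a stopped client
      Some("scott"),                       // username (omit if the broker is open)
      Some("tiger"),                       // password
      Some(false),                         // cleanSession: keep checkpointed messages
      Some(1),                             // QoS
      Some(30),                            // connectionTimeout, seconds
      Some(60),                            // keepAliveInterval, seconds
      Some(4)                              // mqttVersion (4 = MQTT v3.1.1)
    )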
 
 <h2 id="examples">Examples</h2>
 
 <h3 id="scala-api">Scala API</h3>
 
-<p>An input stream of messages from the MQTT broker can be created with <code>MQTTUtils.createStream</code>, passing in the
+<p>An input stream of messages from the MQTT broker can be created with <code class="language-plaintext highlighter-rouge">MQTTUtils.createStream</code>, passing in the
 broker URL and the topic to subscribe to.</p>
 
-<pre><code>val lines = MQTTUtils.createStream(ssc, brokerUrl, topic)
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>val lines = MQTTUtils.createStream(ssc, brokerUrl, topic)
+</code></pre></div></div>
 
 <p>Additional MQTT connection options can be provided:</p>
 
-<p><code>Scala
-val lines = MQTTUtils.createStream(ssc, brokerUrl, topic, storageLevel, clientId, username, password, cleanSession, qos, connectionTimeout, keepAliveInterval, mqttVersion)
-</code></p>
+<pre><code class="language-Scala">val lines = MQTTUtils.createStream(ssc, brokerUrl, topic, storageLevel, clientId, username, password, cleanSession, qos, connectionTimeout, keepAliveInterval, mqttVersion)
+</code></pre>
 
 <h3 id="java-api">Java API</h3>
 
-<p>The Java API is analogous: <code>MQTTUtils.createStream</code> returns a <code>JavaDStream</code> of the messages
+<p>The Java API is analogous: <code class="language-plaintext highlighter-rouge">MQTTUtils.createStream</code> returns a <code class="language-plaintext highlighter-rouge">JavaDStream</code> of the messages
 received on the subscribed topic.</p>
 
-<pre><code>JavaDStream&lt;String&gt; lines = MQTTUtils.createStream(jssc, brokerUrl, topic);
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>JavaDStream&lt;String&gt; lines = MQTTUtils.createStream(jssc, brokerUrl, topic);
+</code></pre></div></div>
 
 <p>See end-to-end examples at <a href="https://github.com/apache/bahir/tree/master/streaming-mqtt/examples">MQTT Examples</a></p>
 
diff --git a/content/docs/spark/2.1.0/spark-streaming-twitter/index.html b/content/docs/spark/2.1.0/spark-streaming-twitter/index.html
index 3a9ee28..d136d12 100644
--- a/content/docs/spark/2.1.0/spark-streaming-twitter/index.html
+++ b/content/docs/spark/2.1.0/spark-streaming-twitter/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -219,47 +201,47 @@
 
 <p>Using SBT:</p>
 
-<pre><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-twitter" % "2.1.0"
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-twitter" % "2.1.0"
+</code></pre></div></div>
 
 <p>Using Maven:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
     &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
     &lt;artifactId&gt;spark-streaming-twitter_2.11&lt;/artifactId&gt;
     &lt;version&gt;2.1.0&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
-<p>This library can also be added to Spark jobs launched through <code>spark-shell</code> or <code>spark-submit</code> by using the <code>--packages</code> command line option.
+<p>This library can also be added to Spark jobs launched through <code class="language-plaintext highlighter-rouge">spark-shell</code> or <code class="language-plaintext highlighter-rouge">spark-submit</code> by using the <code class="language-plaintext highlighter-rouge">--packages</code> command line option.
 For example, to include it when starting the spark shell:</p>
 
-<pre><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-twitter_2.11:2.1.0
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-twitter_2.11:2.1.0
+</code></pre></div></div>
 
-<p>Unlike using <code>--jars</code>, using <code>--packages</code> ensures that this library and its dependencies will be added to the classpath.
-The <code>--packages</code> argument can also be used with <code>bin/spark-submit</code>.</p>
+<p>Unlike using <code class="language-plaintext highlighter-rouge">--jars</code>, using <code class="language-plaintext highlighter-rouge">--packages</code> ensures that this library and its dependencies will be added to the classpath.
+The <code class="language-plaintext highlighter-rouge">--packages</code> argument can also be used with <code class="language-plaintext highlighter-rouge">bin/spark-submit</code>.</p>
 
 <p>This library is cross-published for Scala 2.10 and Scala 2.11, so users should use the appropriate Scala version (2.10 or 2.11) in the commands listed above.</p>
 
 <h2 id="examples">Examples</h2>
 
-<p><code>TwitterUtils</code> uses Twitter4j to get the public stream of tweets using <a href="https://dev.twitter.com/docs/streaming-apis">Twitter’s Streaming API</a>. Authentication information
-can be provided by any of the <a href="http://twitter4j.org/en/configuration.html">methods</a> supported by Twitter4J library. You can import the <code>TwitterUtils</code> class and create a DStream with <code>TwitterUtils.createStream</code> as shown below.</p>
+<p><code class="language-plaintext highlighter-rouge">TwitterUtils</code> uses Twitter4j to get the public stream of tweets using <a href="https://dev.twitter.com/docs/streaming-apis">Twitter’s Streaming API</a>. Authentication information
+can be provided by any of the <a href="http://twitter4j.org/en/configuration.html">methods</a> supported by the Twitter4J library. You can import the <code class="language-plaintext highlighter-rouge">TwitterUtils</code> class and create a DStream with <code class="language-plaintext highlighter-rouge">TwitterUtils.createStream</code> as shown below.</p>
 
 <h3 id="scala-api">Scala API</h3>
 
-<pre><code>import org.apache.spark.streaming.twitter._
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import org.apache.spark.streaming.twitter._
 
 TwitterUtils.createStream(ssc, None)
-</code></pre>
+</code></pre></div></div>
 
 <h3 id="java-api">Java API</h3>
 
-<pre><code>import org.apache.spark.streaming.twitter.*;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import org.apache.spark.streaming.twitter.*;
 
 TwitterUtils.createStream(jssc);
-</code></pre>
+</code></pre></div></div>
 
 <p>You can also either consume the public stream or a stream filtered by keywords, as sketched below.
 See end-to-end examples at <a href="https://github.com/apache/bahir/tree/master/streaming-twitter/examples">Twitter Examples</a>.</p>
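
Where explicit credentials are preferred over twitter4j.properties, a hedged sketch (Twitter4J builder keys as documented upstream; all credential values are placeholders):

    import twitter4j.auth.OAuthAuthorization
    import twitter4j.conf.ConfigurationBuilder
    import org.apache.spark.streaming.twitter._

    // Build Twitter4J configuration with explicit OAuth credentials
    val conf = new ConfigurationBuilder()
      .setOAuthConsumerKey("...")            // supply real credentials here
      .setOAuthConsumerSecret("...")
      .setOAuthAccessToken("...")
      .setOAuthAccessTokenSecret("...")
      .build()

    // Filtered stream using the explicit authorization
    val auth = Some(new OAuthAuthorization(conf))
    val tweets = TwitterUtils.createStream(ssc, auth, Seq("spark"))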
diff --git a/content/docs/spark/2.1.0/spark-streaming-zeromq/index.html b/content/docs/spark/2.1.0/spark-streaming-zeromq/index.html
index 2ef450a..42edfb9 100644
--- a/content/docs/spark/2.1.0/spark-streaming-zeromq/index.html
+++ b/content/docs/spark/2.1.0/spark-streaming-zeromq/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -219,26 +201,26 @@
 
 <p>Using SBT:</p>
 
-<pre><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-zeromq" % "2.1.0"
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-zeromq" % "2.1.0"
+</code></pre></div></div>
 
 <p>Using Maven:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
     &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
     &lt;artifactId&gt;spark-streaming-zeromq_2.11&lt;/artifactId&gt;
     &lt;version&gt;2.1.0&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
-<p>This library can also be added to Spark jobs launched through <code>spark-shell</code> or <code>spark-submit</code> by using the <code>--packages</code> command line option.
+<p>This library can also be added to Spark jobs launched through <code class="language-plaintext highlighter-rouge">spark-shell</code> or <code class="language-plaintext highlighter-rouge">spark-submit</code> by using the <code class="language-plaintext highlighter-rouge">--packages</code> command line option.
 For example, to include it when starting the spark shell:</p>
 
-<pre><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-zeromq_2.11:2.1.0
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-zeromq_2.11:2.1.0
+</code></pre></div></div>
 
-<p>Unlike using <code>--jars</code>, using <code>--packages</code> ensures that this library and its dependencies will be added to the classpath.
-The <code>--packages</code> argument can also be used with <code>bin/spark-submit</code>.</p>
+<p>Unlike using <code class="language-plaintext highlighter-rouge">--jars</code>, using <code class="language-plaintext highlighter-rouge">--packages</code> ensures that this library and its dependencies will be added to the classpath.
+The <code class="language-plaintext highlighter-rouge">--packages</code> argument can also be used with <code class="language-plaintext highlighter-rouge">bin/spark-submit</code>.</p>
 
 <p>This library is cross-published for Scala 2.10 and Scala 2.11, so users should substitute the appropriate Scala version (2.10 or 2.11) in the commands listed above.</p>
 
@@ -246,13 +228,13 @@ The <code>--packages</code> argument can also be used with <code>bin/spark-submi
 
 <h3 id="scala-api">Scala API</h3>
 
-<pre><code>val lines = ZeroMQUtils.createStream(ssc, ...)
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>val lines = ZeroMQUtils.createStream(ssc, ...)
+</code></pre></div></div>
 
 <h3 id="java-api">Java API</h3>
 
-<pre><code>JavaDStream&lt;String&gt; lines = ZeroMQUtils.createStream(jssc, ...);
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>JavaDStream&lt;String&gt; lines = ZeroMQUtils.createStream(jssc, ...);
+</code></pre></div></div>
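+
+<p>As a minimal sketch (assuming the Akka-backed API of this release; the publisher URL, topic, and converter below are placeholders), a stream of strings could be created like this:</p>
+
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import akka.util.ByteString
+import akka.zeromq.Subscribe
+import org.apache.spark.storage.StorageLevel
+import org.apache.spark.streaming.zeromq.ZeroMQUtils
+
+// Convert each multipart ZeroMQ message into an iterator of strings.
+def bytesToStringIterator(msg: Seq[ByteString]): Iterator[String] = msg.map(_.utf8String).iterator
+
+// Subscribe to topic "foo" on a local publisher.
+val lines = ZeroMQUtils.createStream(
+  ssc,
+  "tcp://127.0.0.1:1234",
+  Subscribe("foo"),
+  bytesToStringIterator _,
+  StorageLevel.MEMORY_AND_DISK_SER_2)
+</code></pre></div></div>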
 
 <p>See end-to-end examples at <a href="https://github.com/apache/bahir/tree/master/streaming-zeromq/examples">ZeroMQ Examples</a>.</p>
 
diff --git a/content/docs/spark/2.1.1/documentation/index.html b/content/docs/spark/2.1.1/documentation/index.html
index 20448fb..d9a3516 100644
--- a/content/docs/spark/2.1.1/documentation/index.html
+++ b/content/docs/spark/2.1.1/documentation/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
diff --git a/content/docs/spark/2.1.1/spark-sql-cloudant/index.html b/content/docs/spark/2.1.1/spark-sql-cloudant/index.html
index cab5160..9b2245d 100644
--- a/content/docs/spark/2.1.1/spark-sql-cloudant/index.html
+++ b/content/docs/spark/2.1.1/spark-sql-cloudant/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -227,35 +209,35 @@ clusters, desktop PCs, and mobile devices.</p>
 
 <p>Using SBT:</p>
 
-<pre><code>libraryDependencies += "org.apache.bahir" %% "spark-sql-cloudant" % "2.1.1"
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>libraryDependencies += "org.apache.bahir" %% "spark-sql-cloudant" % "2.1.1"
+</code></pre></div></div>
 
 <p>Using Maven:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
     &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
     &lt;artifactId&gt;spark-sql-cloudant_2.11&lt;/artifactId&gt;
     &lt;version&gt;2.1.1&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
-<p>This library can also be added to Spark jobs launched through <code>spark-shell</code> or <code>spark-submit</code> by using the <code>--packages</code> command line option.</p>
+<p>This library can also be added to Spark jobs launched through <code class="language-plaintext highlighter-rouge">spark-shell</code> or <code class="language-plaintext highlighter-rouge">spark-submit</code> by using the <code class="language-plaintext highlighter-rouge">--packages</code> command line option.</p>
 
-<pre><code>$ bin/spark-shell --packages org.apache.bahir:spark-sql-cloudant_2.11:2.1.1
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>$ bin/spark-shell --packages org.apache.bahir:spark-sql-cloudant_2.11:2.1.1
+</code></pre></div></div>
 
-<p>Unlike using <code>--jars</code>, using <code>--packages</code> ensures that this library and its dependencies will be added to the classpath.
-The <code>--packages</code> argument can also be used with <code>bin/spark-submit</code>.</p>
+<p>Unlike using <code class="language-plaintext highlighter-rouge">--jars</code>, using <code class="language-plaintext highlighter-rouge">--packages</code> ensures that this library and its dependencies will be added to the classpath.
+The <code class="language-plaintext highlighter-rouge">--packages</code> argument can also be used with <code class="language-plaintext highlighter-rouge">bin/spark-submit</code>.</p>
 
 <p>Submit a job in Python:</p>
 
-<pre><code>spark-submit  --master local[4] --jars &lt;path to cloudant-spark.jar&gt;  &lt;path to python script&gt;
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>spark-submit  --master local[4] --jars &lt;path to cloudant-spark.jar&gt;  &lt;path to python script&gt;
+</code></pre></div></div>
 
 <p>Submit a job in Scala:</p>
 
-<pre><code>spark-submit --class "&lt;your class&gt;" --master local[4] --jars &lt;path to cloudant-spark.jar&gt; &lt;path to your app jar&gt;
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>spark-submit --class "&lt;your class&gt;" --master local[4] --jars &lt;path to cloudant-spark.jar&gt; &lt;path to your app jar&gt;
+</code></pre></div></div>
 
 <p>This library is compiled for Scala 2.11 only, and is intended to support Spark 2.0 onwards.</p>
 
@@ -404,12 +386,11 @@ The <code>--packages</code> argument can also be used with <code>bin/spark-submi
   </tbody>
 </table>
 
-<p>For fast loading, views are loaded without include_docs. Thus, a derived schema will always be: <code>{id, key, value}</code>, where <code>value </code>can be a compount field. An example of loading data from a view:</p>
+<p>For fast loading, views are loaded without include_docs. Thus, a derived schema will always be: <code class="language-plaintext highlighter-rouge">{id, key, value}</code>, where <code class="language-plaintext highlighter-rouge">value</code> can be a compound field. An example of loading data from a view:</p>
 
-<p>```python
-spark.sql(“ CREATE TEMPORARY TABLE flightTable1 USING org.apache.bahir.cloudant OPTIONS ( database ‘n_flight’, view ‘_design/view/_view/AA0’)”)</p>
+<div class="language-python highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="n">spark</span><span class="p">.</span><span class="n">sql</span><span class="p">(</span><span class="s">" CREATE TEMPORARY TABLE flightTable1 USING org.apache.bahir.cloudant OPTIONS ( database 'n_flight', view '_design/view/_view/AA0')"</span><span class="p">)</span>
 
-<p>```</p>
+</code></pre></div></div>
 
 <h3 id="configuration-on-cloudant-receiver-for-spark-streaming">Configuration on Cloudant Receiver for Spark Streaming</h3>
 
@@ -450,9 +431,9 @@ spark.sql(“ CREATE TEMPORARY TABLE flightTable1 USING org.apache.bahir.cloudan
   </tbody>
 </table>
 
-<h3 id="configuration-in-spark-submit-using---conf-option">Configuration in spark-submit using –conf option</h3>
+<h3 id="configuration-in-spark-submit-using-conf-option">Configuration in spark-submit using –conf option</h3>
 
-<p>The above stated configuration keys can also be set using <code>spark-submit --conf</code> option. When passing configuration in spark-submit, make sure adding “spark.” as prefix to the keys.</p>
+<p>The configuration keys stated above can also be set using the <code class="language-plaintext highlighter-rouge">spark-submit --conf</code> option. When passing configuration in spark-submit, make sure to add “spark.” as a prefix to the keys.</p>
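+
+<p>For example (the account values below are placeholders):</p>
+
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>spark-submit --conf spark.cloudant.host=ACCOUNT.cloudant.com --conf spark.cloudant.username=USERNAME --conf spark.cloudant.password=PASSWORD ...
+</code></pre></div></div>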
 
 <h2 id="examples">Examples</h2>
 
@@ -460,61 +441,58 @@ spark.sql(“ CREATE TEMPORARY TABLE flightTable1 USING org.apache.bahir.cloudan
 
 <h4 id="using-sql-in-python">Using SQL In Python</h4>
 
-<p>```python
-spark = SparkSession\
-    .builder\
-    .appName(“Cloudant Spark SQL Example in Python using temp tables”)\
-    .config(“cloudant.host”,”ACCOUNT.cloudant.com”)\
-    .config(“cloudant.username”, “USERNAME”)\
-    .config(“cloudant.password”,”PASSWORD”)\
-    .getOrCreate()</p>
-
-<h1 id="loading-temp-table-from-cloudant-db">Loading temp table from Cloudant db</h1>
-<p>spark.sql(“ CREATE TEMPORARY TABLE airportTable USING org.apache.bahir.cloudant OPTIONS ( database ‘n_airportcodemapping’)”)
-airportData = spark.sql(“SELECT _id, airportName FROM airportTable WHERE _id &gt;= ‘CAA’ AND _id &lt;= ‘GAA’ ORDER BY _id”)
-airportData.printSchema()
-print ‘Total # of rows in airportData: ‘ + str(airportData.count())
-for code in airportData.collect():
-    print code._id
-```</p>
+<div class="language-python highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="n">spark</span> <span class="o">=</span> <span class="n">SparkSession</span>\
+    <span class="p">.</span><span class="n">builder</span>\
+    <span class="p">.</span><span class="n">appName</span><span class="p">(</span><span class="s">"Cloudant Spark SQL Example in Python using temp tables"</span><span class="p">)</span>\
+    <span class="p">.</span><span class="n">config</span><span class="p">(</span><span class="s">"cloudant.host"</span><span class="p">,</span><span class="s">"ACCOUNT.cloudant.com"</span><span class="p">)</span>\
+    <span class="p">.</span><span class="n">config</span><span class="p">(</span><span class="s">"cloudant.username"</span><span class="p">,</span> <span class="s">"USERNAME"</span><span class="p">)</span>\
+    <span class="p">.</span><span class="n">config</span><span class="p">(</span><span class="s">"cloudant.password"</span><span class="p">,</span><span class="s">"PASSWORD"</span><span class="p">)</span>\
+    <span class="p">.</span><span class="n">getOrCreate</span><span class="p">()</span>
+
+
+<span class="c1"># Loading temp table from Cloudant db
+</span><span class="n">spark</span><span class="p">.</span><span class="n">sql</span><span class="p">(</span><span class="s">" CREATE TEMPORARY TABLE airportTable USING org.apache.bahir.cloudant OPTIONS ( database 'n_airportcodemapping')"</span><span class="p">)</span>
+<span class="n">airportData</span> <span class="o">=</span> <span class="n">spark</span><span class="p">.</span><span class="n">sql</span><span class="p">(</span><span class="s">"SELECT _id, airportName FROM airportTable WHERE _id &gt;= 'CAA' AND _id &lt;= 'GAA' ORDER BY _id"</span><span class="p">)</span>
+<span class="n">airportData</span><span class="p">.</span><span class="n">printSchema</span><span class="p">()</span>
+<span class="k">print</span> <span class="s">'Total # of rows in airportData: '</span> <span class="o">+</span> <span class="nb">str</span><span class="p">(</span><span class="n">airportData</span><span class="p">.</span><span class="n">count</span><span class="p">())</span>
+<span class="k">for</span> <span class="n">code</span> <span class="ow">in</span> <span class="n">airportData</span><span class="p">.</span><span class="n">collect</span><span class="p">():</span>
+    <span class="k">print</span> <span class="n">code</span><span class="p">.</span><span class="n">_id</span>
+</code></pre></div></div>
 
 <p>See <a href="examples/python/CloudantApp.py">CloudantApp.py</a> for examples.</p>
 
-<p>Submit job example:
-<code>
-spark-submit  --packages org.apache.bahir:spark-sql-cloudant_2.11:2.1.1 --conf spark.cloudant.host=ACCOUNT.cloudant.com --conf spark.cloudant.username=USERNAME --conf spark.cloudant.password=PASSWORD sql-cloudant/examples/python/CloudantApp.py
-</code></p>
+<p>Submit job example:</p>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>spark-submit  --packages org.apache.bahir:spark-sql-cloudant_2.11:2.1.1 --conf spark.cloudant.host=ACCOUNT.cloudant.com --conf spark.cloudant.username=USERNAME --conf spark.cloudant.password=PASSWORD sql-cloudant/examples/python/CloudantApp.py
+</code></pre></div></div>
 
 <h4 id="using-dataframe-in-python">Using DataFrame In Python</h4>
 
-<p>```python
-spark = SparkSession\
-    .builder\
-    .appName(“Cloudant Spark SQL Example in Python using dataframes”)\
-    .config(“cloudant.host”,”ACCOUNT.cloudant.com”)\
-    .config(“cloudant.username”, “USERNAME”)\
-    .config(“cloudant.password”,”PASSWORD”)\
-    .config(“jsonstore.rdd.partitions”, 8)\
-    .getOrCreate()</p>
-
-<h1 id="loading-dataframe-from-cloudant-db">***1. Loading dataframe from Cloudant db</h1>
-<p>df = spark.read.load(“n_airportcodemapping”, “org.apache.bahir.cloudant”)
-df.cache()
-df.printSchema()
-df.filter(df.airportName &gt;= ‘Moscow’).select(“_id”,’airportName’).show()
-df.filter(df._id &gt;= ‘CAA’).select(“_id”,’airportName’).show()	  <br />
-```</p>
+<div class="language-python highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="n">spark</span> <span class="o">=</span> <span class="n">SparkSession</span>\
+    <span class="p">.</span><span class="n">builder</span>\
+    <span class="p">.</span><span class="n">appName</span><span class="p">(</span><span class="s">"Cloudant Spark SQL Example in Python using dataframes"</span><span class="p">)</span>\
+    <span class="p">.</span><span class="n">config</span><span class="p">(</span><span class="s">"cloudant.host"</span><span class="p">,</span><span class="s">"ACCOUNT.cloudant.com"</span><span class="p">)</span>\
+    <span class="p">.</span><span class="n">config</span><span class="p">(</span><span class="s">"cloudant.username"</span><span class="p">,</span> <span class="s">"USERNAME"</span><span class="p">)</span>\
+    <span class="p">.</span><span class="n">config</span><span class="p">(</span><span class="s">"cloudant.password"</span><span class="p">,</span><span class="s">"PASSWORD"</span><span class="p">)</span>\
+    <span class="p">.</span><span class="n">config</span><span class="p">(</span><span class="s">"jsonstore.rdd.partitions"</span><span class="p">,</span> <span class="mi">8</span><span class="p">)</span>\
+    <span class="p">.</span><span class="n">getOrCreate</span><span class="p">()</span>
+
+<span class="c1"># ***1. Loading dataframe from Cloudant db
+</span><span class="n">df</span> <span class="o">=</span> <span class="n">spark</span><span class="p">.</span><span class="n">read</span><span class="p">.</span><span class="n">load</span><span class="p">(</span><span class="s">"n_airportcodemapping"</span><span class="p">,</span> <span class="s">"org.apache.bahir.cloudant"</span><span class="p">)</span>
+<span class="n">df</span><span class="p">.</span><span class="n">cache</span><span class="p">()</span>
+<span class="n">df</span><span class="p">.</span><span class="n">printSchema</span><span class="p">()</span>
+<span class="n">df</span><span class="p">.</span><span class="nb">filter</span><span class="p">(</span><span class="n">df</span><span class="p">.</span><span class="n">airportName</span> <span class="o">&gt;=</span> <span class="s">'Moscow'</span><span class="p">).</span><span class="n">select</span><span class="p">(</span><span class="s">"_id"</span><span class="p">,</span><span class="s">'airportName'</span><span class="p">).</span><span class="n">show</span><span class="p">()</span>
+<span class="n">df</span><span class="p">.</span><span class="nb">filter</span><span class="p">(</span><span class="n">df</span><span class="p">.</span><span class="n">_id</span> <span class="o">&gt;=</span> <span class="s">'CAA'</span><span class="p">).</span><span class="n">select</span><span class="p">(</span><span class="s">"_id"</span><span class="p">,</span><span class="s">'airportName'</span><span class="p">).</span><span class="n">show</span><span class="p">()</span>	    
+</code></pre></div></div>
 
 <p>See <a href="examples/python/CloudantDF.py">CloudantDF.py</a> for examples.</p>
 
 <p>When performing multiple operations on a DataFrame (select, filter, etc.),
 you should persist it. Otherwise, every operation on the DataFrame will load the same data from Cloudant again.
-Persisting will also speed up computation. This statement will persist an RDD in memory: <code>df.cache()</code>.  Alternatively for large dbs to persist in memory &amp; disk, use:</p>
+Persisting will also speed up computation. This statement persists the DataFrame in memory: <code class="language-plaintext highlighter-rouge">df.cache()</code>. Alternatively, for large databases, persist to both memory and disk:</p>
 
-<p><code>python
-from pyspark import StorageLevel
-df.persist(storageLevel = StorageLevel(True, True, False, True, 1))
-</code></p>
+<div class="language-python highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="kn">from</span> <span class="nn">pyspark</span> <span class="kn">import</span> <span class="n">StorageLevel</span>
+<span class="n">df</span><span class="p">.</span><span class="n">persist</span><span class="p">(</span><span class="n">storageLevel</span> <span class="o">=</span> <span class="n">StorageLevel</span><span class="p">(</span><span class="bp">True</span><span class="p">,</span> <span class="bp">True</span><span class="p">,</span> <span class="bp">False</span><span class="p">,</span> <span class="bp">True</span><span class="p">,</span> <span class="mi">1</span><span class="p">))</span>
+</code></pre></div></div>
 
 <p><a href="examples/python/CloudantDFOption.py">Sample code</a> on using DataFrame option to define cloudant configuration</p>
 
@@ -522,65 +500,62 @@ df.persist(storageLevel = StorageLevel(True, True, False, True, 1))
 
 <h4 id="using-sql-in-scala">Using SQL In Scala</h4>
 
-<p>```scala
-val spark = SparkSession
-      .builder()
-      .appName(“Cloudant Spark SQL Example”)
-      .config(“cloudant.host”,”ACCOUNT.cloudant.com”)
-      .config(“cloudant.username”, “USERNAME”)
-      .config(“cloudant.password”,”PASSWORD”)
-      .getOrCreate()</p>
-
-<p>// For implicit conversions of Dataframe to RDDs
-import spark.implicits._</p>
-
-<p>// create a temp table from Cloudant db and query it using sql syntax
-spark.sql(
-    s”””
+<div class="language-scala highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="k">val</span> <span class="nv">spark</span> <span class="k">=</span> <span class="nc">SparkSession</span>
+      <span class="o">.</span><span class="py">builder</span><span class="o">()</span>
+      <span class="o">.</span><span class="py">appName</span><span class="o">(</span><span class="s">"Cloudant Spark SQL Example"</span><span class="o">)</span>
+      <span class="o">.</span><span class="py">config</span><span class="o">(</span><span class="s">"cloudant.host"</span><span class="o">,</span><span class="s">"ACCOUNT.cloudant.com"</span><span class="o">)</span>
+      <span class="o">.</span><span class="py">config</span><span class="o">(</span><span class="s">"cloudant.username"</span><span class="o">,</span> <span class="s">"USERNAME"</span><span class="o">)</span>
+      <span class="o">.</span><span class="py">config</span><span class="o">(</span><span class="s">"cloudant.password"</span><span class="o">,</span><span class="s">"PASSWORD"</span><span class="o">)</span>
+      <span class="o">.</span><span class="py">getOrCreate</span><span class="o">()</span>
+
+<span class="c1">// For implicit conversions of Dataframe to RDDs</span>
+<span class="k">import</span> <span class="nn">spark.implicits._</span>
+
+<span class="c1">// create a temp table from Cloudant db and query it using sql syntax</span>
+<span class="nv">spark</span><span class="o">.</span><span class="py">sql</span><span class="o">(</span>
+    <span class="n">s</span><span class="s">"""
     |CREATE TEMPORARY TABLE airportTable
     |USING org.apache.bahir.cloudant
-    |OPTIONS ( database ‘n_airportcodemapping’)
-    “”“.stripMargin)
-// create a dataframe
-val airportData = spark.sql(“SELECT _id, airportName FROM airportTable WHERE _id &gt;= ‘CAA’ AND _id &lt;= ‘GAA’ ORDER BY _id”)
-airportData.printSchema()
-println(s”Total # of rows in airportData: “ + airportData.count())
-// convert dataframe to array of Rows, and process each row
-airportData.map(t =&gt; “code: “ + t(0) + “,name:” + t(1)).collect().foreach(println)
-```
-See <a href="examples/scala/src/main/scala/mytest/spark/CloudantApp.scala">CloudantApp.scala</a> for examples.</p>
-
-<p>Submit job example:
-<code>
-spark-submit --class org.apache.spark.examples.sql.cloudant.CloudantApp --packages org.apache.bahir:spark-sql-cloudant_2.11:2.1.1 --conf spark.cloudant.host=ACCOUNT.cloudant.com --conf spark.cloudant.username=USERNAME --conf spark.cloudant.password=PASSWORD  /path/to/spark-sql-cloudant_2.11-2.1.1-tests.jar
-</code></p>
+    |OPTIONS ( database 'n_airportcodemapping')
+    """</span><span class="o">.</span><span class="py">stripMargin</span><span class="o">)</span>
+<span class="c1">// create a dataframe</span>
+<span class="k">val</span> <span class="nv">airportData</span> <span class="k">=</span> <span class="nv">spark</span><span class="o">.</span><span class="py">sql</span><span class="o">(</span><span class="s">"SELECT _id, airportName FROM airportTable WHERE _id &gt;= 'CAA' AND _id &lt;= 'GAA' ORDER BY _id"</span><span class="o">)</span>
+<span class="nv">airportData</span><span class="o">.</span><span class="py">printSchema</span><span class="o">()</span>
+<span class="nf">println</span><span class="o">(</span><span class="n">s</span><span class="s">"Total # of rows in airportData: "</span> <span class="o">+</span> <span class="nv">airportData</span><span class="o">.</span><span class="py">count</span><span class="o">())</span>
+<span class="c1">// convert dataframe to array of Rows, and process each row</span>
+<span class="nv">airportData</span><span class="o">.</span><span class="py">map</span><span class="o">(</span><span class="n">t</span> <span class="k">=&gt;</span> <span class="s">"code: "</span> <span class="o">+</span> <span class="nf">t</span><span class="o">(</span><span class="mi">0</span><span class="o">)</span> <span class="o">+</span> <span class="s">",name:"</span> <span class="o">+</span> <span class="nf">t</span><span class="o">(</span><span class="mi">1</span><span class="o"> [...]
+</code></pre></div></div>
+<p>See <a href="examples/scala/src/main/scala/mytest/spark/CloudantApp.scala">CloudantApp.scala</a> for examples.</p>
+
+<p>Submit job example:</p>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>spark-submit --class org.apache.spark.examples.sql.cloudant.CloudantApp --packages org.apache.bahir:spark-sql-cloudant_2.11:2.1.1 --conf spark.cloudant.host=ACCOUNT.cloudant.com --conf spark.cloudant.username=USERNAME --conf spark.cloudant.password=PASSWORD  /path/to/spark-sql-cloudant_2.11-2.1.1-tests.jar
+</code></pre></div></div>
 
 <h3 id="using-dataframe-in-scala">Using DataFrame In Scala</h3>
 
-<p>```scala
-val spark = SparkSession
-      .builder()
-      .appName(“Cloudant Spark SQL Example with Dataframe”)
-      .config(“cloudant.host”,”ACCOUNT.cloudant.com”)
-      .config(“cloudant.username”, “USERNAME”)
-      .config(“cloudant.password”,”PASSWORD”)
-      .config(“createDBOnSave”,”true”) // to create a db on save
-      .config(“jsonstore.rdd.partitions”, “20”) // using 20 partitions
-      .getOrCreate()</p>
-
-<p>// 1. Loading data from Cloudant db
-val df = spark.read.format(“org.apache.bahir.cloudant”).load(“n_flight”)
-// Caching df in memory to speed computations
-// and not to retrieve data from cloudant again
-df.cache()
-df.printSchema()</p>
-
-<p>// 2. Saving dataframe to Cloudant db
-val df2 = df.filter(df(“flightSegmentId”) === “AA106”)
-    .select(“flightSegmentId”,”economyClassBaseCost”)
-df2.show()
-df2.write.format(“org.apache.bahir.cloudant”).save(“n_flight2”)
-```</p>
+<div class="language-scala highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="k">val</span> <span class="nv">spark</span> <span class="k">=</span> <span class="nc">SparkSession</span>
+      <span class="o">.</span><span class="py">builder</span><span class="o">()</span>
+      <span class="o">.</span><span class="py">appName</span><span class="o">(</span><span class="s">"Cloudant Spark SQL Example with Dataframe"</span><span class="o">)</span>
+      <span class="o">.</span><span class="py">config</span><span class="o">(</span><span class="s">"cloudant.host"</span><span class="o">,</span><span class="s">"ACCOUNT.cloudant.com"</span><span class="o">)</span>
+      <span class="o">.</span><span class="py">config</span><span class="o">(</span><span class="s">"cloudant.username"</span><span class="o">,</span> <span class="s">"USERNAME"</span><span class="o">)</span>
+      <span class="o">.</span><span class="py">config</span><span class="o">(</span><span class="s">"cloudant.password"</span><span class="o">,</span><span class="s">"PASSWORD"</span><span class="o">)</span>
+      <span class="o">.</span><span class="py">config</span><span class="o">(</span><span class="s">"createDBOnSave"</span><span class="o">,</span><span class="s">"true"</span><span class="o">)</span> <span class="c1">// to create a db on save</span>
+      <span class="o">.</span><span class="py">config</span><span class="o">(</span><span class="s">"jsonstore.rdd.partitions"</span><span class="o">,</span> <span class="s">"20"</span><span class="o">)</span> <span class="c1">// using 20 partitions</span>
+      <span class="o">.</span><span class="py">getOrCreate</span><span class="o">()</span>
+
+<span class="c1">// 1. Loading data from Cloudant db</span>
+<span class="k">val</span> <span class="nv">df</span> <span class="k">=</span> <span class="nv">spark</span><span class="o">.</span><span class="py">read</span><span class="o">.</span><span class="py">format</span><span class="o">(</span><span class="s">"org.apache.bahir.cloudant"</span><span class="o">).</span><span class="py">load</span><span class="o">(</span><span class="s">"n_flight"</span><span class="o">)</span>
+<span class="c1">// Caching df in memory to speed computations</span>
+<span class="c1">// and not to retrieve data from cloudant again</span>
+<span class="nv">df</span><span class="o">.</span><span class="py">cache</span><span class="o">()</span>
+<span class="nv">df</span><span class="o">.</span><span class="py">printSchema</span><span class="o">()</span>
+
+<span class="c1">// 2. Saving dataframe to Cloudant db</span>
+<span class="k">val</span> <span class="nv">df2</span> <span class="k">=</span> <span class="nv">df</span><span class="o">.</span><span class="py">filter</span><span class="o">(</span><span class="nf">df</span><span class="o">(</span><span class="s">"flightSegmentId"</span><span class="o">)</span> <span class="o">===</span> <span class="s">"AA106"</span><span class="o">)</span>
+    <span class="o">.</span><span class="py">select</span><span class="o">(</span><span class="s">"flightSegmentId"</span><span class="o">,</span><span class="s">"economyClassBaseCost"</span><span class="o">)</span>
+<span class="nv">df2</span><span class="o">.</span><span class="py">show</span><span class="o">()</span>
+<span class="nv">df2</span><span class="o">.</span><span class="py">write</span><span class="o">.</span><span class="py">format</span><span class="o">(</span><span class="s">"org.apache.bahir.cloudant"</span><span class="o">).</span><span class="py">save</span><span class="o">(</span><span class="s">"n_flight2"</span><span class="o">)</span>
+</code></pre></div></div>
 
 <p>See <a href="examples/scala/src/main/scala/mytest/spark/CloudantDF.scala">CloudantDF.scala</a> for examples.</p>
 
@@ -588,49 +563,47 @@ df2.write.format(“org.apache.bahir.cloudant”).save(“n_flight2”)
 
 <h3 id="using-streams-in-scala">Using Streams In Scala</h3>
 
-<p>```scala
-val ssc = new StreamingContext(sparkConf, Seconds(10))
-val changes = ssc.receiverStream(new CloudantReceiver(Map(
-  “cloudant.host” -&gt; “ACCOUNT.cloudant.com”,
-  “cloudant.username” -&gt; “USERNAME”,
-  “cloudant.password” -&gt; “PASSWORD”,
-  “database” -&gt; “n_airportcodemapping”)))</p>
-
-<p>changes.foreachRDD((rdd: RDD[String], time: Time) =&gt; {
-  // Get the singleton instance of SparkSession
-  val spark = SparkSessionSingleton.getInstance(rdd.sparkContext.getConf)</p>
-
-<p>println(s”========= $time =========”)
-  // Convert RDD[String] to DataFrame
-  val changesDataFrame = spark.read.json(rdd)
-  if (!changesDataFrame.schema.isEmpty) {
-    changesDataFrame.printSchema()
-    changesDataFrame.select(“*”).show()
-    ….
-  }
-})
-ssc.start()
-// run streaming for 120 secs
-Thread.sleep(120000L)
-ssc.stop(true)</p>
-
-<p>```</p>
+<div class="language-scala highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="k">val</span> <span class="nv">ssc</span> <span class="k">=</span> <span class="k">new</span> <span class="nc">StreamingContext</span><span class="o">(</span><span class="n">sparkConf</span><span class="o">,</span> <span class="nc">Seconds</span><span class="o">(</span><span class="mi">10</span><span class="o">))</span>
+<span class="k">val</span> <span class="nv">changes</span> <span class="k">=</span> <span class="nv">ssc</span><span class="o">.</span><span class="py">receiverStream</span><span class="o">(</span><span class="k">new</span> <span class="nc">CloudantReceiver</span><span class="o">(</span><span class="nc">Map</span><span class="o">(</span>
+  <span class="s">"cloudant.host"</span> <span class="o">-&gt;</span> <span class="s">"ACCOUNT.cloudant.com"</span><span class="o">,</span>
+  <span class="s">"cloudant.username"</span> <span class="o">-&gt;</span> <span class="s">"USERNAME"</span><span class="o">,</span>
+  <span class="s">"cloudant.password"</span> <span class="o">-&gt;</span> <span class="s">"PASSWORD"</span><span class="o">,</span>
+  <span class="s">"database"</span> <span class="o">-&gt;</span> <span class="s">"n_airportcodemapping"</span><span class="o">)))</span>
+
+<span class="nv">changes</span><span class="o">.</span><span class="py">foreachRDD</span><span class="o">((</span><span class="n">rdd</span><span class="k">:</span> <span class="kt">RDD</span><span class="o">[</span><span class="kt">String</span><span class="o">],</span> <span class="n">time</span><span class="k">:</span> <span class="kt">Time</span><span class="o">)</span> <span class="k">=&gt;</span> <span class="o">{</span>
+  <span class="c1">// Get the singleton instance of SparkSession</span>
+  <span class="k">val</span> <span class="nv">spark</span> <span class="k">=</span> <span class="nv">SparkSessionSingleton</span><span class="o">.</span><span class="py">getInstance</span><span class="o">(</span><span class="nv">rdd</span><span class="o">.</span><span class="py">sparkContext</span><span class="o">.</span><span class="py">getConf</span><span class="o">)</span>
+
+  <span class="nf">println</span><span class="o">(</span><span class="n">s</span><span class="s">"========= $time ========="</span><span class="o">)</span>
+  <span class="c1">// Convert RDD[String] to DataFrame</span>
+  <span class="k">val</span> <span class="nv">changesDataFrame</span> <span class="k">=</span> <span class="nv">spark</span><span class="o">.</span><span class="py">read</span><span class="o">.</span><span class="py">json</span><span class="o">(</span><span class="n">rdd</span><span class="o">)</span>
+  <span class="nf">if</span> <span class="o">(!</span><span class="nv">changesDataFrame</span><span class="o">.</span><span class="py">schema</span><span class="o">.</span><span class="py">isEmpty</span><span class="o">)</span> <span class="o">{</span>
+    <span class="nv">changesDataFrame</span><span class="o">.</span><span class="py">printSchema</span><span class="o">()</span>
+    <span class="nv">changesDataFrame</span><span class="o">.</span><span class="py">select</span><span class="o">(</span><span class="s">"*"</span><span class="o">).</span><span class="py">show</span><span class="o">()</span>
+    <span class="o">....</span>
+  <span class="o">}</span>
+<span class="o">})</span>
+<span class="nv">ssc</span><span class="o">.</span><span class="py">start</span><span class="o">()</span>
+<span class="c1">// run streaming for 120 secs</span>
+<span class="nv">Thread</span><span class="o">.</span><span class="py">sleep</span><span class="o">(</span><span class="mi">120000L</span><span class="o">)</span>
+<span class="nv">ssc</span><span class="o">.</span><span class="py">stop</span><span class="o">(</span><span class="kc">true</span><span class="o">)</span>
+
+</code></pre></div></div>
 
 <p>See <a href="examples/scala/src/main/scala/mytest/spark/CloudantStreaming.scala">CloudantStreaming.scala</a> for examples.</p>
 
 <p>By default, Spark Streaming will load all documents from a database. If you want to limit the loading to
-specific documents, use <code>selector</code> option of <code>CloudantReceiver</code> and specify your conditions
+specific documents, use the <code class="language-plaintext highlighter-rouge">selector</code> option of <code class="language-plaintext highlighter-rouge">CloudantReceiver</code> and specify your conditions
 (See <a href="examples/scala/src/main/scala/mytest/spark/CloudantStreamingSelector.scala">CloudantStreamingSelector.scala</a>
 example for more details):</p>
 
-<p><code>scala
-val changes = ssc.receiverStream(new CloudantReceiver(Map(
-  "cloudant.host" -&gt; "ACCOUNT.cloudant.com",
-  "cloudant.username" -&gt; "USERNAME",
-  "cloudant.password" -&gt; "PASSWORD",
-  "database" -&gt; "sales",
-  "selector" -&gt; "{\"month\":\"May\", \"rep\":\"John\"}")))
-</code></p>
+<div class="language-scala highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="k">val</span> <span class="nv">changes</span> <span class="k">=</span> <span class="nv">ssc</span><span class="o">.</span><span class="py">receiverStream</span><span class="o">(</span><span class="k">new</span> <span class="nc">CloudantReceiver</span><span class="o">(</span><span class="nc">Map</span><span class="o">(</span>
+  <span class="s">"cloudant.host"</span> <span class="o">-&gt;</span> <span class="s">"ACCOUNT.cloudant.com"</span><span class="o">,</span>
+  <span class="s">"cloudant.username"</span> <span class="o">-&gt;</span> <span class="s">"USERNAME"</span><span class="o">,</span>
+  <span class="s">"cloudant.password"</span> <span class="o">-&gt;</span> <span class="s">"PASSWORD"</span><span class="o">,</span>
+  <span class="s">"database"</span> <span class="o">-&gt;</span> <span class="s">"sales"</span><span class="o">,</span>
+  <span class="s">"selector"</span> <span class="o">-&gt;</span> <span class="s">"{\"month\":\"May\", \"rep\":\"John\"}"</span><span class="o">)))</span>
+</code></pre></div></div>
 
   </div>
 </div>
diff --git a/content/docs/spark/2.1.1/spark-sql-streaming-akka/index.html b/content/docs/spark/2.1.1/spark-sql-streaming-akka/index.html
index d5b2817..5c79bc9 100644
--- a/content/docs/spark/2.1.1/spark-sql-streaming-akka/index.html
+++ b/content/docs/spark/2.1.1/spark-sql-streaming-akka/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -219,26 +201,26 @@
 
 <p>Using SBT:</p>
 
-<pre><code>libraryDependencies += "org.apache.bahir" %% "spark-sql-streaming-akka" % "2.1.1"
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>libraryDependencies += "org.apache.bahir" %% "spark-sql-streaming-akka" % "2.1.1"
+</code></pre></div></div>
 
 <p>Using Maven:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
     &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
     &lt;artifactId&gt;spark-sql-streaming-akka_2.11&lt;/artifactId&gt;
     &lt;version&gt;2.1.1&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
-<p>This library can also be added to Spark jobs launched through <code>spark-shell</code> or <code>spark-submit</code> by using the <code>--packages</code> command line option.
+<p>This library can also be added to Spark jobs launched through <code class="language-plaintext highlighter-rouge">spark-shell</code> or <code class="language-plaintext highlighter-rouge">spark-submit</code> by using the <code class="language-plaintext highlighter-rouge">--packages</code> command line option.
 For example, to include it when starting the spark shell:</p>
 
-<pre><code>$ bin/spark-shell --packages org.apache.bahir:spark-sql-streaming-akka_2.11:2.1.1
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>$ bin/spark-shell --packages org.apache.bahir:spark-sql-streaming-akka_2.11:2.1.1
+</code></pre></div></div>
 
-<p>Unlike using <code>--jars</code>, using <code>--packages</code> ensures that this library and its dependencies will be added to the classpath.
-The <code>--packages</code> argument can also be used with <code>bin/spark-submit</code>.</p>
+<p>Unlike using <code class="language-plaintext highlighter-rouge">--jars</code>, using <code class="language-plaintext highlighter-rouge">--packages</code> ensures that this library and its dependencies will be added to the classpath.
+The <code class="language-plaintext highlighter-rouge">--packages</code> argument can also be used with <code class="language-plaintext highlighter-rouge">bin/spark-submit</code>.</p>
 
 <p>This library is compiled for Scala 2.11 only, and is intended to support Spark 2.0 onwards.</p>
 
@@ -246,37 +228,37 @@ The <code>--packages</code> argument can also be used with <code>bin/spark-submi
 
 <p>A SQL stream can be created with data streams received from an Akka Feeder actor as follows:</p>
 
-<pre><code>    sqlContext.readStream
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>    sqlContext.readStream
             .format("org.apache.bahir.sql.streaming.akka.AkkaStreamSourceProvider")
             .option("urlOfPublisher", "feederActorUri")
             .load()
-</code></pre>
+</code></pre></div></div>
 
 <h2 id="enable-recovering-from-failures">Enable recovering from failures.</h2>
 
-<p>Setting values for option <code>persistenceDirPath</code> helps in recovering in case of a restart, by restoring the state where it left off before the shutdown.</p>
+<p>Setting the <code class="language-plaintext highlighter-rouge">persistenceDirPath</code> option helps in recovering from a restart, by restoring the state from where it left off before the shutdown.</p>
 
-<pre><code>    sqlContext.readStream
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>    sqlContext.readStream
             .format("org.apache.bahir.sql.streaming.akka.AkkaStreamSourceProvider")
             .option("urlOfPublisher", "feederActorUri")
             .option("persistenceDirPath", "/path/to/localdir")
             .load()
-</code></pre>
+</code></pre></div></div>
 
 <h2 id="configuration-options">Configuration options.</h2>
 
 <p>This source uses the <a href="http://doc.akka.io/api/akka/2.4/akka/actor/Actor.html">Akka Actor API</a>.</p>
 
 <ul>
-  <li><code>urlOfPublisher</code> The url of Publisher or Feeder actor that the Receiver actor connects to. Set this as the tcp url of the Publisher or Feeder actor.</li>
-  <li><code>persistenceDirPath</code> By default it is used for storing incoming messages on disk.</li>
+  <li><code class="language-plaintext highlighter-rouge">urlOfPublisher</code> The url of Publisher or Feeder actor that the Receiver actor connects to. Set this as the tcp url of the Publisher or Feeder actor.</li>
+  <li><code class="language-plaintext highlighter-rouge">persistenceDirPath</code> By default it is used for storing incoming messages on disk.</li>
 </ul>
 
 <h3 id="scala-api">Scala API</h3>
 
 <p>An example using the Scala API to count words from an incoming message stream:</p>
 
-<pre><code>    // Create DataFrame representing the stream of input lines from connection
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>    // Create DataFrame representing the stream of input lines from connection
     // to publisher or feeder actor
     val lines = spark.readStream
                 .format("org.apache.bahir.sql.streaming.akka.AkkaStreamSourceProvider")
@@ -296,15 +278,15 @@ The <code>--packages</code> argument can also be used with <code>bin/spark-submi
                 .start()
 
     query.awaitTermination()
-</code></pre>
+</code></pre></div></div>
 
-<p>Please see <code>AkkaStreamWordCount.scala</code> for full example.</p>
+<p>Please see <code class="language-plaintext highlighter-rouge">AkkaStreamWordCount.scala</code> for full example.</p>
 
 <h3 id="java-api">Java API</h3>
 
 <p>An example using the Java API to count words from an incoming message stream:</p>
 
-<pre><code>    // Create DataFrame representing the stream of input lines from connection
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>    // Create DataFrame representing the stream of input lines from connection
     // to publisher or feeder actor
     Dataset&lt;String&gt; lines = spark
                             .readStream()
@@ -330,9 +312,9 @@ The <code>--packages</code> argument can also be used with <code>bin/spark-submi
                             .start();
 
     query.awaitTermination();   
-</code></pre>
+</code></pre></div></div>
 
-<p>Please see <code>JavaAkkaStreamWordCount.java</code> for full example.</p>
+<p>Please see <code class="language-plaintext highlighter-rouge">JavaAkkaStreamWordCount.java</code> for full example.</p>
 
   </div>
 </div>
diff --git a/content/docs/spark/2.1.1/spark-sql-streaming-mqtt/index.html b/content/docs/spark/2.1.1/spark-sql-streaming-mqtt/index.html
index 642df95..06b6f8b 100644
--- a/content/docs/spark/2.1.1/spark-sql-streaming-mqtt/index.html
+++ b/content/docs/spark/2.1.1/spark-sql-streaming-mqtt/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -219,26 +201,26 @@
 
 <p>Using SBT:</p>
 
-<pre><code>libraryDependencies += "org.apache.bahir" %% "spark-sql-streaming-mqtt" % "2.1.1"
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>libraryDependencies += "org.apache.bahir" %% "spark-sql-streaming-mqtt" % "2.1.1"
+</code></pre></div></div>
 
 <p>Using Maven:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
     &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
     &lt;artifactId&gt;spark-sql-streaming-mqtt_2.11&lt;/artifactId&gt;
     &lt;version&gt;2.1.1&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
-<p>This library can also be added to Spark jobs launched through <code>spark-shell</code> or <code>spark-submit</code> by using the <code>--packages</code> command line option.
+<p>This library can also be added to Spark jobs launched through <code class="language-plaintext highlighter-rouge">spark-shell</code> or <code class="language-plaintext highlighter-rouge">spark-submit</code> by using the <code class="language-plaintext highlighter-rouge">--packages</code> command line option.
 For example, to include it when starting the spark shell:</p>
 
-<pre><code>$ bin/spark-shell --packages org.apache.bahir:spark-sql-streaming-mqtt_2.11:2.1.1
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>$ bin/spark-shell --packages org.apache.bahir:spark-sql-streaming-mqtt_2.11:2.1.1
+</code></pre></div></div>
 
-<p>Unlike using <code>--jars</code>, using <code>--packages</code> ensures that this library and its dependencies will be added to the classpath.
-The <code>--packages</code> argument can also be used with <code>bin/spark-submit</code>.</p>
+<p>Unlike using <code class="language-plaintext highlighter-rouge">--jars</code>, using <code class="language-plaintext highlighter-rouge">--packages</code> ensures that this library and its dependencies will be added to the classpath.
+The <code class="language-plaintext highlighter-rouge">--packages</code> argument can also be used with <code class="language-plaintext highlighter-rouge">bin/spark-submit</code>.</p>
 
 <p>This library is compiled for Scala 2.11 only, and is intended to support Spark 2.0 onwards.</p>
 
@@ -246,47 +228,47 @@ The <code>--packages</code> argument can also be used with <code>bin/spark-submi
 
 <p>A SQL stream can be created from data streams received through an MQTT server using:</p>
 
-<pre><code>sqlContext.readStream
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>sqlContext.readStream
     .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
     .option("topic", "mytopic")
     .load("tcp://localhost:1883")
-</code></pre>
+</code></pre></div></div>
 
 <h2 id="enable-recovering-from-failures">Enabling recovery from failures</h2>
 
-<p>Setting values for option <code>localStorage</code> and <code>clientId</code> helps in recovering in case of a restart, by restoring the state where it left off before the shutdown.</p>
+<p>Setting values for the options <code class="language-plaintext highlighter-rouge">localStorage</code> and <code class="language-plaintext highlighter-rouge">clientId</code> enables recovery after a restart, restoring the state the source was in before the shutdown.</p>
 
-<pre><code>sqlContext.readStream
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>sqlContext.readStream
     .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
     .option("topic", "mytopic")
     .option("localStorage", "/path/to/localdir")
     .option("clientId", "some-client-id")
     .load("tcp://localhost:1883")
-</code></pre>
+</code></pre></div></div>
 
 <h2 id="configuration-options">Configuration options</h2>
 
 <p>This source uses the <a href="https://eclipse.org/paho/clients/java/">Eclipse Paho Java Client</a>. Client API documentation is located <a href="http://www.eclipse.org/paho/files/javadoc/index.html">here</a>. The supported options are listed below, followed by a short combined example.</p>
 
 <ul>
-  <li><code>brokerUrl</code> A url MqttClient connects to. Set this or <code>path</code> as the url of the Mqtt Server. e.g. tcp://localhost:1883.</li>
-  <li><code>persistence</code> By default it is used for storing incoming messages on disk. If <code>memory</code> is provided as value for this option, then recovery on restart is not supported.</li>
-  <li><code>topic</code> Topic MqttClient subscribes to.</li>
-  <li><code>clientId</code> clientId, this client is assoicated with. Provide the same value to recover a stopped client.</li>
-  <li><code>QoS</code> The maximum quality of service to subscribe each topic at. Messages published at a lower quality of service will be received at the published QoS. Messages published at a higher quality of service will be received using the QoS specified on the subscribe.</li>
-  <li><code>username</code> Sets the user name to use for the connection to Mqtt Server. Do not set it, if server does not need this. Setting it empty will lead to errors.</li>
-  <li><code>password</code> Sets the password to use for the connection.</li>
-  <li><code>cleanSession</code> Setting it true starts a clean session, removes all checkpointed messages by a previous run of this source. This is set to false by default.</li>
-  <li><code>connectionTimeout</code> Sets the connection timeout, a value of 0 is interpretted as wait until client connects. See <code>MqttConnectOptions.setConnectionTimeout</code> for more information.</li>
-  <li><code>keepAlive</code> Same as <code>MqttConnectOptions.setKeepAliveInterval</code>.</li>
-  <li><code>mqttVersion</code> Same as <code>MqttConnectOptions.setMqttVersion</code>.</li>
+  <li><code class="language-plaintext highlighter-rouge">brokerUrl</code> The URL the MqttClient connects to. Set this or <code class="language-plaintext highlighter-rouge">path</code> to the URL of the MQTT server, e.g. tcp://localhost:1883.</li>
+  <li><code class="language-plaintext highlighter-rouge">persistence</code> Controls how incoming messages are stored; by default they are persisted on disk. If <code class="language-plaintext highlighter-rouge">memory</code> is provided as the value for this option, recovery on restart is not supported.</li>
+  <li><code class="language-plaintext highlighter-rouge">topic</code> Topic the MqttClient subscribes to.</li>
+  <li><code class="language-plaintext highlighter-rouge">clientId</code> The client identifier this client is associated with. Provide the same value to recover a stopped client.</li>
+  <li><code class="language-plaintext highlighter-rouge">QoS</code> The maximum quality of service to subscribe to each topic at. Messages published at a lower quality of service will be received at the published QoS. Messages published at a higher quality of service will be received using the QoS specified on the subscribe.</li>
+  <li><code class="language-plaintext highlighter-rouge">username</code> Sets the user name to use for the connection to the MQTT server. Do not set it if the server does not require it; setting it to an empty value will lead to errors.</li>
+  <li><code class="language-plaintext highlighter-rouge">password</code> Sets the password to use for the connection.</li>
+  <li><code class="language-plaintext highlighter-rouge">cleanSession</code> Setting it to true starts a clean session, removing all messages checkpointed by a previous run of this source. This is set to false by default.</li>
+  <li><code class="language-plaintext highlighter-rouge">connectionTimeout</code> Sets the connection timeout; a value of 0 is interpreted as wait until the client connects. See <code class="language-plaintext highlighter-rouge">MqttConnectOptions.setConnectionTimeout</code> for more information.</li>
+  <li><code class="language-plaintext highlighter-rouge">keepAlive</code> Same as <code class="language-plaintext highlighter-rouge">MqttConnectOptions.setKeepAliveInterval</code>.</li>
+  <li><code class="language-plaintext highlighter-rouge">mqttVersion</code> Same as <code class="language-plaintext highlighter-rouge">MqttConnectOptions.setMqttVersion</code>.</li>
 </ul>
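+
+<p>For illustration, here is a minimal sketch combining several of the options above (the topic, client id, credentials, and timeout values are placeholders, not defaults):</p>
+
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>// Hypothetical option values, shown for illustration only.
+val lines = sqlContext.readStream
+  .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
+  .option("topic", "sensors/temperature")   // topic to subscribe to
+  .option("clientId", "spark-mqtt-example") // stable id enables recovery
+  .option("QoS", "1")                       // at-least-once delivery
+  .option("username", "scott")              // omit if the broker needs no auth
+  .option("password", "tiger")
+  .option("connectionTimeout", "30")        // 0 means wait until connected
+  .load("tcp://localhost:1883")
+</code></pre></div></div>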
 
 <h3 id="scala-api">Scala API</h3>
 
 <p>An example using the Scala API to count words from an incoming message stream.</p>
 
-<pre><code>// Create DataFrame representing the stream of input lines from connection to mqtt server
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>// Create DataFrame representing the stream of input lines from connection to mqtt server
 val lines = spark.readStream
   .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
   .option("topic", topic)
@@ -305,15 +287,15 @@ val query = wordCounts.writeStream
   .start()
 
 query.awaitTermination()
-</code></pre>
+</code></pre></div></div>
 
-<p>Please see <code>MQTTStreamWordCount.scala</code> for full example.</p>
+<p>Please see <code class="language-plaintext highlighter-rouge">MQTTStreamWordCount.scala</code> for the full example.</p>
 
 <h3 id="java-api">Java API</h3>
 
 <p>An example using the Java API to count words from an incoming message stream.</p>
 
-<pre><code>// Create DataFrame representing the stream of input lines from connection to mqtt server.
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>// Create DataFrame representing the stream of input lines from connection to mqtt server.
 Dataset&lt;String&gt; lines = spark
         .readStream()
         .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
@@ -338,9 +320,9 @@ StreamingQuery query = wordCounts.writeStream()
         .start();
 
 query.awaitTermination();
-</code></pre>
+</code></pre></div></div>
 
-<p>Please see <code>JavaMQTTStreamWordCount.java</code> for full example.</p>
+<p>Please see <code class="language-plaintext highlighter-rouge">JavaMQTTStreamWordCount.java</code> for the full example.</p>
 
   </div>
 </div>
diff --git a/content/docs/spark/2.1.1/spark-streaming-akka/index.html b/content/docs/spark/2.1.1/spark-streaming-akka/index.html
index eba5792..f489fca 100644
--- a/content/docs/spark/2.1.1/spark-streaming-akka/index.html
+++ b/content/docs/spark/2.1.1/spark-streaming-akka/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -219,39 +201,39 @@
 
 <p>Using SBT:</p>
 
-<pre><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-akka" % "2.1.1"
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-akka" % "2.1.1"
+</code></pre></div></div>
 
 <p>Using Maven:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
     &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
     &lt;artifactId&gt;spark-streaming-akka_2.11&lt;/artifactId&gt;
     &lt;version&gt;2.1.1&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
-<p>This library can also be added to Spark jobs launched through <code>spark-shell</code> or <code>spark-submit</code> by using the <code>--packages</code> command line option.
+<p>This library can also be added to Spark jobs launched through <code class="language-plaintext highlighter-rouge">spark-shell</code> or <code class="language-plaintext highlighter-rouge">spark-submit</code> by using the <code class="language-plaintext highlighter-rouge">--packages</code> command line option.
 For example, to include it when starting the spark shell:</p>
 
-<pre><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-akka_2.11:2.1.1
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-akka_2.11:2.1.1
+</code></pre></div></div>
 
-<p>Unlike using <code>--jars</code>, using <code>--packages</code> ensures that this library and its dependencies will be added to the classpath.
-The <code>--packages</code> argument can also be used with <code>bin/spark-submit</code>.</p>
+<p>Unlike using <code class="language-plaintext highlighter-rouge">--jars</code>, using <code class="language-plaintext highlighter-rouge">--packages</code> ensures that this library and its dependencies will be added to the classpath.
+The <code class="language-plaintext highlighter-rouge">--packages</code> argument can also be used with <code class="language-plaintext highlighter-rouge">bin/spark-submit</code>.</p>
 
 <p>This library is cross-published for Scala 2.10 and Scala 2.11, so users should substitute the appropriate Scala version (2.10 or 2.11) in the commands listed above.</p>
 
 <h2 id="examples">Examples</h2>
 
-<p>DStreams can be created with data streams received through Akka actors by using <code>AkkaUtils.createStream(ssc, actorProps, actor-name)</code>.</p>
+<p>DStreams can be created with data streams received through Akka actors by using <code class="language-plaintext highlighter-rouge">AkkaUtils.createStream(ssc, actorProps, actor-name)</code>.</p>
 
 <h3 id="scala-api">Scala API</h3>
 
-<p>You need to extend <code>ActorReceiver</code> so as to store received data into Spark using <code>store(...)</code> methods. The supervisor strategy of
+<p>You need to extend <code class="language-plaintext highlighter-rouge">ActorReceiver</code> to store received data into Spark using the <code class="language-plaintext highlighter-rouge">store(...)</code> methods. The supervisor strategy of
 this actor can be configured to handle failures, etc.</p>
 
-<pre><code>class CustomActor extends ActorReceiver {
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>class CustomActor extends ActorReceiver {
   def receive = {
     case data: String =&gt; store(data)
   }
@@ -260,14 +242,14 @@ this actor can be configured to handle failures, etc.</p>
 // A new input stream can be created with this custom actor as
 val ssc: StreamingContext = ...
 val lines = AkkaUtils.createStream[String](ssc, Props[CustomActor](), "CustomReceiver")
-</code></pre>
+</code></pre></div></div>
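+
+<p>The supervisor strategy mentioned above can be passed to <code class="language-plaintext highlighter-rouge">createStream</code>. A sketch, assuming the overload that also accepts a storage level and an <code class="language-plaintext highlighter-rouge">akka.actor.SupervisorStrategy</code> (the restart limits below are arbitrary):</p>
+
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import scala.concurrent.duration._
+import akka.actor.{OneForOneStrategy, Props}
+import akka.actor.SupervisorStrategy.Restart
+import org.apache.spark.storage.StorageLevel
+
+// Restart the receiver actor on any exception, at most 10 times per minute.
+// Assumes createStream accepts (props, name, storageLevel, supervisorStrategy).
+val strategy = OneForOneStrategy(maxNrOfRetries = 10, withinTimeRange = 1.minute) {
+  case _: Exception =&gt; Restart
+}
+
+val lines = AkkaUtils.createStream[String](
+  ssc, Props[CustomActor](), "CustomReceiver",
+  StorageLevel.MEMORY_AND_DISK_SER_2, strategy)
+</code></pre></div></div>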
 
 <h3 id="java-api">Java API</h3>
 
-<p>You need to extend <code>JavaActorReceiver</code> so as to store received data into Spark using <code>store(...)</code> methods. The supervisor strategy of
+<p>You need to extend <code class="language-plaintext highlighter-rouge">JavaActorReceiver</code> to store received data into Spark using the <code class="language-plaintext highlighter-rouge">store(...)</code> methods. The supervisor strategy of
 this actor can be configured to handle failures, etc.</p>
 
-<pre><code>class CustomActor extends JavaActorReceiver {
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>class CustomActor extends JavaActorReceiver {
     @Override
     public void onReceive(Object msg) throws Exception {
         store((String) msg);
@@ -277,7 +259,7 @@ this actor can be configured to handle failures, etc.</p>
 // A new input stream can be created with this custom actor as
 JavaStreamingContext jssc = ...;
 JavaDStream&lt;String&gt; lines = AkkaUtils.&lt;String&gt;createStream(jssc, Props.create(CustomActor.class), "CustomReceiver");
-</code></pre>
+</code></pre></div></div>
 
 <p>See end-to-end examples at <a href="https://github.com/apache/bahir/tree/master/streaming-akka/examples">Akka Examples</a></p>
 
diff --git a/content/docs/spark/2.1.1/spark-streaming-mqtt/index.html b/content/docs/spark/2.1.1/spark-streaming-mqtt/index.html
index c70bee3..b7ac84b 100644
--- a/content/docs/spark/2.1.1/spark-streaming-mqtt/index.html
+++ b/content/docs/spark/2.1.1/spark-streaming-mqtt/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -219,26 +201,26 @@
 
 <p>Using SBT:</p>
 
-<pre><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-mqtt" % "2.1.1"
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-mqtt" % "2.1.1"
+</code></pre></div></div>
 
 <p>Using Maven:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
     &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
     &lt;artifactId&gt;spark-streaming-mqtt_2.11&lt;/artifactId&gt;
     &lt;version&gt;2.1.1&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
-<p>This library can also be added to Spark jobs launched through <code>spark-shell</code> or <code>spark-submit</code> by using the <code>--packages</code> command line option.
+<p>This library can also be added to Spark jobs launched through <code class="language-plaintext highlighter-rouge">spark-shell</code> or <code class="language-plaintext highlighter-rouge">spark-submit</code> by using the <code class="language-plaintext highlighter-rouge">--packages</code> command line option.
 For example, to include it when starting the spark shell:</p>
 
-<pre><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-mqtt_2.11:2.1.1
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-mqtt_2.11:2.1.1
+</code></pre></div></div>
 
-<p>Unlike using <code>--jars</code>, using <code>--packages</code> ensures that this library and its dependencies will be added to the classpath.
-The <code>--packages</code> argument can also be used with <code>bin/spark-submit</code>.</p>
+<p>Unlike using <code class="language-plaintext highlighter-rouge">--jars</code>, using <code class="language-plaintext highlighter-rouge">--packages</code> ensures that this library and its dependencies will be added to the classpath.
+The <code class="language-plaintext highlighter-rouge">--packages</code> argument can also be used with <code class="language-plaintext highlighter-rouge">bin/spark-submit</code>.</p>
 
 <p>This library is cross-published for Scala 2.10 and Scala 2.11, so users should substitute the appropriate Scala version (2.10 or 2.11) in the commands listed above.</p>
 
@@ -247,46 +229,45 @@ The <code>--packages</code> argument can also be used with <code>bin/spark-submi
 <p>This source uses the <a href="https://eclipse.org/paho/clients/java/">Eclipse Paho Java Client</a>. Client API documentation is located <a href="http://www.eclipse.org/paho/files/javadoc/index.html">here</a>.</p>
 
 <ul>
-  <li><code>brokerUrl</code> A url MqttClient connects to. Set this as the url of the Mqtt Server. e.g. tcp://localhost:1883.</li>
-  <li><code>storageLevel</code> By default it is used for storing incoming messages on disk.</li>
-  <li><code>topic</code> Topic MqttClient subscribes to.</li>
-  <li><code>topics</code> List of topics MqttClient subscribes to.</li>
-  <li><code>clientId</code> clientId, this client is assoicated with. Provide the same value to recover a stopped client.</li>
-  <li><code>QoS</code> The maximum quality of service to subscribe each topic at. Messages published at a lower quality of service will be received at the published QoS. Messages published at a higher quality of service will be received using the QoS specified on the subscribe.</li>
-  <li><code>username</code> Sets the user name to use for the connection to Mqtt Server. Do not set it, if server does not need this. Setting it empty will lead to errors.</li>
-  <li><code>password</code> Sets the password to use for the connection.</li>
-  <li><code>cleanSession</code> Setting it true starts a clean session, removes all checkpointed messages by a previous run of this source. This is set to false by default.</li>
-  <li><code>connectionTimeout</code> Sets the connection timeout, a value of 0 is interpreted as wait until client connects. See <code>MqttConnectOptions.setConnectionTimeout</code> for more information.</li>
-  <li><code>keepAlive</code> Same as <code>MqttConnectOptions.setKeepAliveInterval</code>.</li>
-  <li><code>mqttVersion</code> Same as <code>MqttConnectOptions.setMqttVersion</code>.</li>
+  <li><code class="language-plaintext highlighter-rouge">brokerUrl</code> The URL the MqttClient connects to. Set this to the URL of the MQTT server, e.g. tcp://localhost:1883.</li>
+  <li><code class="language-plaintext highlighter-rouge">storageLevel</code> Storage level for incoming messages; by default they are stored on disk.</li>
+  <li><code class="language-plaintext highlighter-rouge">topic</code> Topic the MqttClient subscribes to.</li>
+  <li><code class="language-plaintext highlighter-rouge">topics</code> List of topics the MqttClient subscribes to.</li>
+  <li><code class="language-plaintext highlighter-rouge">clientId</code> The client identifier this client is associated with. Provide the same value to recover a stopped client.</li>
+  <li><code class="language-plaintext highlighter-rouge">QoS</code> The maximum quality of service to subscribe to each topic at. Messages published at a lower quality of service will be received at the published QoS. Messages published at a higher quality of service will be received using the QoS specified on the subscribe.</li>
+  <li><code class="language-plaintext highlighter-rouge">username</code> Sets the user name to use for the connection to the MQTT server. Do not set it if the server does not require it; setting it to an empty value will lead to errors.</li>
+  <li><code class="language-plaintext highlighter-rouge">password</code> Sets the password to use for the connection.</li>
+  <li><code class="language-plaintext highlighter-rouge">cleanSession</code> Setting it to true starts a clean session, removing all messages checkpointed by a previous run of this source. This is set to false by default.</li>
+  <li><code class="language-plaintext highlighter-rouge">connectionTimeout</code> Sets the connection timeout; a value of 0 is interpreted as wait until the client connects. See <code class="language-plaintext highlighter-rouge">MqttConnectOptions.setConnectionTimeout</code> for more information.</li>
+  <li><code class="language-plaintext highlighter-rouge">keepAlive</code> Same as <code class="language-plaintext highlighter-rouge">MqttConnectOptions.setKeepAliveInterval</code>.</li>
+  <li><code class="language-plaintext highlighter-rouge">mqttVersion</code> Same as <code class="language-plaintext highlighter-rouge">MqttConnectOptions.setMqttVersion</code>.</li>
 </ul>
 
 <h2 id="examples">Examples</h2>
 
 <h3 id="scala-api">Scala API</h3>
 
-<p>You need to extend <code>ActorReceiver</code> so as to store received data into Spark using <code>store(...)</code> methods. The supervisor strategy of
-this actor can be configured to handle failures, etc.</p>
+<p>DStreams can be created with data streams received from an MQTT server by using <code class="language-plaintext highlighter-rouge">MQTTUtils.createStream(...)</code>:</p>
 
-<pre><code>val lines = MQTTUtils.createStream(ssc, brokerUrl, topic)
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>val lines = MQTTUtils.createStream(ssc, brokerUrl, topic)
 val lines = MQTTUtils.createPairedStream(ssc, brokerUrl, topic)
-</code></pre>
+</code></pre></div></div>
 
 <p>Additional MQTT connection options can be provided:</p>
 
-<p><code>Scala
-val lines = MQTTUtils.createStream(ssc, brokerUrl, topic, storageLevel, clientId, username, password, cleanSession, qos, connectionTimeout, keepAliveInterval, mqttVersion)
+<pre><code class="language-Scala">val lines = MQTTUtils.createStream(ssc, brokerUrl, topic, storageLevel, clientId, username, password, cleanSession, qos, connectionTimeout, keepAliveInterval, mqttVersion)
 val lines = MQTTUtils.createPairedStream(ssc, brokerUrl, topics, storageLevel, clientId, username, password, cleanSession, qos, connectionTimeout, keepAliveInterval, mqttVersion)
-</code></p>
+</code></pre>
 
 <h3 id="java-api">Java API</h3>
 
-<p>You need to extend <code>JavaActorReceiver</code> so as to store received data into Spark using <code>store(...)</code> methods. The supervisor strategy of
-this actor can be configured to handle failures, etc.</p>
+<p>In Java, use <code class="language-plaintext highlighter-rouge">MQTTUtils.createStream(...)</code> in the same way to create a DStream from an MQTT server:</p>
 
-<pre><code>JavaDStream&lt;String&gt; lines = MQTTUtils.createStream(jssc, brokerUrl, topic);
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>JavaDStream&lt;String&gt; lines = MQTTUtils.createStream(jssc, brokerUrl, topic);
 JavaReceiverInputDStream&lt;Tuple2&lt;String, String&gt;&gt; lines = MQTTUtils.createPairedStream(jssc, brokerUrl, topics);
-</code></pre>
+</code></pre></div></div>
 
 <p>See end-to-end examples at <a href="https://github.com/apache/bahir/tree/master/streaming-mqtt/examples">MQTT Examples</a></p>
 
diff --git a/content/docs/spark/2.1.1/spark-streaming-pubsub/index.html b/content/docs/spark/2.1.1/spark-streaming-pubsub/index.html
index 38617d5..7af0ed5 100644
--- a/content/docs/spark/2.1.1/spark-streaming-pubsub/index.html
+++ b/content/docs/spark/2.1.1/spark-streaming-pubsub/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -219,48 +201,50 @@
 
 <p>Using SBT:</p>
 
-<pre><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-pubsub" % "2.1.1"
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-pubsub" % "2.1.1"
+</code></pre></div></div>
 
 <p>Using Maven:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
     &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
     &lt;artifactId&gt;spark-streaming-pubsub_2.11&lt;/artifactId&gt;
     &lt;version&gt;2.1.1&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
-<p>This library can also be added to Spark jobs launched through <code>spark-shell</code> or <code>spark-submit</code> by using the <code>--packages</code> command line option.
+<p>This library can also be added to Spark jobs launched through <code class="language-plaintext highlighter-rouge">spark-shell</code> or <code class="language-plaintext highlighter-rouge">spark-submit</code> by using the <code class="language-plaintext highlighter-rouge">--packages</code> command line option.
 For example, to include it when starting the spark shell:</p>
 
-<pre><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-pubsub_2.11:2.1.1
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-pubsub_2.11:2.1.1
+</code></pre></div></div>
 
-<p>Unlike using <code>--jars</code>, using <code>--packages</code> ensures that this library and its dependencies will be added to the classpath.
-The <code>--packages</code> argument can also be used with <code>bin/spark-submit</code>.</p>
+<p>Unlike using <code class="language-plaintext highlighter-rouge">--jars</code>, using <code class="language-plaintext highlighter-rouge">--packages</code> ensures that this library and its dependencies will be added to the classpath.
+The <code class="language-plaintext highlighter-rouge">--packages</code> argument can also be used with <code class="language-plaintext highlighter-rouge">bin/spark-submit</code>.</p>
 
 <h2 id="examples">Examples</h2>
 
-<p>First you need to create credential by SparkGCPCredentials, it support four type of credentials
-* application default
-    <code>SparkGCPCredentials.builder.build()</code>
-* json type service account
-    <code>SparkGCPCredentials.builder.jsonServiceAccount(PATH_TO_JSON_KEY).build()</code>
-* p12 type service account
-    <code>SparkGCPCredentials.builder.p12ServiceAccount(PATH_TO_P12_KEY, EMAIL_ACCOUNT).build()</code>
-* metadata service account(running on dataproc)
-    <code>SparkGCPCredentials.builder.metadataServiceAccount().build()</code></p>
+<p>First you need to create a credential using <code class="language-plaintext highlighter-rouge">SparkGCPCredentials</code>; it supports four types of credentials (a combined sketch follows this list):</p>
+<ul>
+  <li>application default:
+  <code class="language-plaintext highlighter-rouge">SparkGCPCredentials.builder.build()</code></li>
+  <li>JSON-type service account:
+  <code class="language-plaintext highlighter-rouge">SparkGCPCredentials.builder.jsonServiceAccount(PATH_TO_JSON_KEY).build()</code></li>
+  <li>P12-type service account:
+  <code class="language-plaintext highlighter-rouge">SparkGCPCredentials.builder.p12ServiceAccount(PATH_TO_P12_KEY, EMAIL_ACCOUNT).build()</code></li>
+  <li>metadata service account (when running on Dataproc):
+  <code class="language-plaintext highlighter-rouge">SparkGCPCredentials.builder.metadataServiceAccount().build()</code></li>
+</ul>
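+
+<p>A minimal sketch tying a credential to stream creation (the project, subscription, and key path are placeholders, and the trailing storage-level argument is an assumption about the parameters elided in the API snippets below):</p>
+
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import org.apache.spark.storage.StorageLevel
+
+// Build a credential from a service-account JSON key (hypothetical path).
+val credential = SparkGCPCredentials.builder
+  .jsonServiceAccount("/path/to/key.json")
+  .build()
+
+// Storage-level parameter assumed from the elided signature below.
+val lines = PubsubUtils.createStream(
+  ssc, "my-gcp-project", "my-subscription", credential,
+  StorageLevel.MEMORY_AND_DISK_SER_2)
+</code></pre></div></div>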
 
 <h3 id="scala-api">Scala API</h3>
 
-<pre><code>val lines = PubsubUtils.createStream(ssc, projectId, subscriptionName, credential, ..)
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>val lines = PubsubUtils.createStream(ssc, projectId, subscriptionName, credential, ..)
+</code></pre></div></div>
 
 <h3 id="java-api">Java API</h3>
 
-<pre><code>JavaDStream&lt;SparkPubsubMessage&gt; lines = PubsubUtils.createStream(jssc, projectId, subscriptionName, credential...)
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>JavaDStream&lt;SparkPubsubMessage&gt; lines = PubsubUtils.createStream(jssc, projectId, subscriptionName, credential...)
+</code></pre></div></div>
 
 <p>See end-to-end examples at <a href="https://github.com/apache/bahir/tree/master/streaming-pubsub/examples">Google Cloud Pubsub Examples</a></p>
 
diff --git a/content/docs/spark/2.1.1/spark-streaming-twitter/index.html b/content/docs/spark/2.1.1/spark-streaming-twitter/index.html
index 57e2849..c60fe74 100644
--- a/content/docs/spark/2.1.1/spark-streaming-twitter/index.html
+++ b/content/docs/spark/2.1.1/spark-streaming-twitter/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -219,47 +201,47 @@
 
 <p>Using SBT:</p>
 
-<pre><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-twitter" % "2.1.1"
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-twitter" % "2.1.1"
+</code></pre></div></div>
 
 <p>Using Maven:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
     &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
     &lt;artifactId&gt;spark-streaming-twitter_2.11&lt;/artifactId&gt;
     &lt;version&gt;2.1.1&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
-<p>This library can also be added to Spark jobs launched through <code>spark-shell</code> or <code>spark-submit</code> by using the <code>--packages</code> command line option.
+<p>This library can also be added to Spark jobs launched through <code class="language-plaintext highlighter-rouge">spark-shell</code> or <code class="language-plaintext highlighter-rouge">spark-submit</code> by using the <code class="language-plaintext highlighter-rouge">--packages</code> command line option.
 For example, to include it when starting the spark shell:</p>
 
-<pre><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-twitter_2.11:2.1.1
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-twitter_2.11:2.1.1
+</code></pre></div></div>
 
-<p>Unlike using <code>--jars</code>, using <code>--packages</code> ensures that this library and its dependencies will be added to the classpath.
-The <code>--packages</code> argument can also be used with <code>bin/spark-submit</code>.</p>
+<p>Unlike using <code class="language-plaintext highlighter-rouge">--jars</code>, using <code class="language-plaintext highlighter-rouge">--packages</code> ensures that this library and its dependencies will be added to the classpath.
+The <code class="language-plaintext highlighter-rouge">--packages</code> argument can also be used with <code class="language-plaintext highlighter-rouge">bin/spark-submit</code>.</p>
 
 <p>This library is cross-published for Scala 2.10 and Scala 2.11, so users should substitute the appropriate Scala version (2.10 or 2.11) in the commands listed above.</p>
 
 <h2 id="examples">Examples</h2>
 
-<p><code>TwitterUtils</code> uses Twitter4j to get the public stream of tweets using <a href="https://dev.twitter.com/docs/streaming-apis">Twitter’s Streaming API</a>. Authentication information
-can be provided by any of the <a href="http://twitter4j.org/en/configuration.html">methods</a> supported by Twitter4J library. You can import the <code>TwitterUtils</code> class and create a DStream with <code>TwitterUtils.createStream</code> as shown below.</p>
+<p><code class="language-plaintext highlighter-rouge">TwitterUtils</code> uses Twitter4J to get the public stream of tweets using <a href="https://dev.twitter.com/docs/streaming-apis">Twitter’s Streaming API</a>. Authentication information
+can be provided by any of the <a href="http://twitter4j.org/en/configuration.html">methods</a> supported by the Twitter4J library. You can import the <code class="language-plaintext highlighter-rouge">TwitterUtils</code> class and create a DStream with <code class="language-plaintext highlighter-rouge">TwitterUtils.createStream</code> as shown below.</p>
 
 <h3 id="scala-api">Scala API</h3>
 
-<pre><code>import org.apache.spark.streaming.twitter._
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import org.apache.spark.streaming.twitter._
 
 TwitterUtils.createStream(ssc, None)
-</code></pre>
+</code></pre></div></div>
 
 <h3 id="java-api">Java API</h3>
 
-<pre><code>import org.apache.spark.streaming.twitter.*;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import org.apache.spark.streaming.twitter.*;
 
 TwitterUtils.createStream(jssc);
-</code></pre>
+</code></pre></div></div>
 
 <p>You can get either the full public stream or a stream filtered by keywords, as sketched below.
 See end-to-end examples at <a href="https://github.com/apache/bahir/tree/master/streaming-twitter/examples">Twitter Examples</a></p>
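+
+<p>A short sketch of a keyword-filtered stream (the track terms are placeholders; credentials are assumed to come from Twitter4J configuration, e.g. system properties):</p>
+
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import org.apache.spark.streaming.twitter._
+
+// Only receive tweets matching the given track terms.
+val filtered = TwitterUtils.createStream(ssc, None, Seq("spark", "bahir"))
+val hashTags = filtered.flatMap(status =&gt; status.getText.split(" ").filter(_.startsWith("#")))
+</code></pre></div></div>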
diff --git a/content/docs/spark/2.1.1/spark-streaming-zeromq/index.html b/content/docs/spark/2.1.1/spark-streaming-zeromq/index.html
index 493cf2f..eecb317 100644
--- a/content/docs/spark/2.1.1/spark-streaming-zeromq/index.html
+++ b/content/docs/spark/2.1.1/spark-streaming-zeromq/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -219,26 +201,26 @@
 
 <p>Using SBT:</p>
 
-<pre><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-zeromq" % "2.1.1"
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>libraryDependencies += "org.apache.bahir" %% "spark-streaming-zeromq" % "2.1.1"
+</code></pre></div></div>
 
 <p>Using Maven:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
     &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
     &lt;artifactId&gt;spark-streaming-zeromq_2.11&lt;/artifactId&gt;
     &lt;version&gt;2.1.1&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
-<p>This library can also be added to Spark jobs launched through <code>spark-shell</code> or <code>spark-submit</code> by using the <code>--packages</code> command line option.
+<p>This library can also be added to Spark jobs launched through <code class="language-plaintext highlighter-rouge">spark-shell</code> or <code class="language-plaintext highlighter-rouge">spark-submit</code> by using the <code class="language-plaintext highlighter-rouge">--packages</code> command line option.
 For example, to include it when starting the spark shell:</p>
 
-<pre><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-zeromq_2.11:2.1.1
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>$ bin/spark-shell --packages org.apache.bahir:spark-streaming-zeromq_2.11:2.1.1
+</code></pre></div></div>
 
-<p>Unlike using <code>--jars</code>, using <code>--packages</code> ensures that this library and its dependencies will be added to the classpath.
-The <code>--packages</code> argument can also be used with <code>bin/spark-submit</code>.</p>
+<p>Unlike using <code class="language-plaintext highlighter-rouge">--jars</code>, using <code class="language-plaintext highlighter-rouge">--packages</code> ensures that this library and its dependencies will be added to the classpath.
+The <code class="language-plaintext highlighter-rouge">--packages</code> argument can also be used with <code class="language-plaintext highlighter-rouge">bin/spark-submit</code>.</p>
 
 <p>This library is cross-published for Scala 2.10 and Scala 2.11, so users should substitute the appropriate Scala version (2.10 or 2.11) in the commands listed above.</p>
 
@@ -246,13 +228,13 @@ The <code>--packages</code> argument can also be used with <code>bin/spark-submi
 
 <h3 id="scala-api">Scala API</h3>
 
-<pre><code>val lines = ZeroMQUtils.createStream(ssc, ...)
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>val lines = ZeroMQUtils.createStream(ssc, ...)
+</code></pre></div></div>
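+
+<p>One way the elided arguments are typically filled in, assuming the akka-zeromq based API (a publisher URL, a <code class="language-plaintext highlighter-rouge">Subscribe</code> message, and a bytes-to-objects converter) that this version of the connector inherited from Spark’s old external module; treat the exact signature as an assumption and check the linked examples:</p>
+
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import akka.util.ByteString
+import akka.zeromq.Subscribe
+
+// Convert each multipart ZeroMQ message (a sequence of frames) to Strings.
+def bytesToStringIterator(frames: Seq[ByteString]): Iterator[String] =
+  frames.map(_.utf8String).iterator
+
+// URL and topic are placeholders, shown for illustration only.
+val lines = ZeroMQUtils.createStream(
+  ssc, "tcp://127.0.0.1:1234", Subscribe("foo"), bytesToStringIterator _)
+</code></pre></div></div>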
 
 <h3 id="java-api">Java API</h3>
 
-<pre><code>JavaDStream&lt;String&gt; lines = ZeroMQUtils.createStream(jssc, ...);
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>JavaDStream&lt;String&gt; lines = ZeroMQUtils.createStream(jssc, ...);
+</code></pre></div></div>
 
 <p>See end-to-end examples at <a href="https://github.com/apache/bahir/tree/master/streaming-zeromq/examples">ZeroMQ Examples</a></p>
 
diff --git a/content/docs/spark/2.1.2/documentation/index.html b/content/docs/spark/2.1.2/documentation/index.html
index a949286..e3cf87d 100644
--- a/content/docs/spark/2.1.2/documentation/index.html
+++ b/content/docs/spark/2.1.2/documentation/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
diff --git a/content/docs/spark/2.1.2/spark-sql-cloudant/index.html b/content/docs/spark/2.1.2/spark-sql-cloudant/index.html
index c5bf0fc..8e5ecd2 100644
--- a/content/docs/spark/2.1.2/spark-sql-cloudant/index.html
+++ b/content/docs/spark/2.1.2/spark-sql-cloudant/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -227,35 +209,35 @@ clusters, desktop PCs, and mobile devices.</p>
 
 <p>Using SBT:</p>
 
-<pre><code>libraryDependencies += "org.apache.bahir" %% "spark-sql-cloudant" % "2.1.2"
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>libraryDependencies += "org.apache.bahir" %% "spark-sql-cloudant" % "2.1.2"
+</code></pre></div></div>
 
 <p>Using Maven:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
     &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
     &lt;artifactId&gt;spark-sql-cloudant_2.11&lt;/artifactId&gt;
     &lt;version&gt;2.1.2&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
-<p>This library can also be added to Spark jobs launched through <code>spark-shell</code> or <code>spark-submit</code> by using the <code>--packages</code> command line option.</p>
+<p>This library can also be added to Spark jobs launched through <code class="language-plaintext highlighter-rouge">spark-shell</code> or <code class="language-plaintext highlighter-rouge">spark-submit</code> by using the <code class="language-plaintext highlighter-rouge">--packages</code> command line option.</p>
 
-<pre><code>$ bin/spark-shell --packages org.apache.bahir:spark-sql-cloudant_2.11:2.1.2
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>$ bin/spark-shell --packages org.apache.bahir:spark-sql-cloudant_2.11:2.1.2
+</code></pre></div></div>
 
-<p>Unlike using <code>--jars</code>, using <code>--packages</code> ensures that this library and its dependencies will be added to the classpath.
-The <code>--packages</code> argument can also be used with <code>bin/spark-submit</code>.</p>
+<p>Unlike using <code class="language-plaintext highlighter-rouge">--jars</code>, using <code class="language-plaintext highlighter-rouge">--packages</code> ensures that this library and its dependencies will be added to the classpath.
+The <code class="language-plaintext highlighter-rouge">--packages</code> argument can also be used with <code class="language-plaintext highlighter-rouge">bin/spark-submit</code>.</p>
 
 <p>Submit a job in Python:</p>
 
-<pre><code>spark-submit  --master local[4] --packages org.apache.bahir:spark-sql-cloudant_2.11:2.1.2  &lt;path to python script&gt;
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>spark-submit  --master local[4] --packages org.apache.bahir:spark-sql-cloudant_2.11:2.1.2  &lt;path to python script&gt;
+</code></pre></div></div>
 
 <p>Submit a job in Scala:</p>
 
-<pre><code>spark-submit --class "&lt;your class&gt;" --master local[4] --packages org.apache.bahir:spark-sql-cloudant_2.11:2.1.2 &lt;path to spark-sql-cloudant jar&gt;
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>spark-submit --class "&lt;your class&gt;" --master local[4] --packages org.apache.bahir:spark-sql-cloudant_2.11:2.1.2 &lt;path to spark-sql-cloudant jar&gt;
+</code></pre></div></div>
 
 <p>This library is compiled for Scala 2.11 only, and intends to support Spark 2.0 onwards.</p>
 
@@ -288,12 +270,12 @@ The <code>--packages</code> argument can also be used with <code>bin/spark-submi
     <tr>
       <td>cloudant.batchInterval</td>
       <td style="text-align: center">8</td>
-      <td>number of seconds to set for streaming all documents from <code>_changes</code> endpoint into Spark dataframe.  See <a href="https://spark.apache.org/docs/latest/streaming-programming-guide.html#setting-the-right-batch-interval">Setting the right batch interval</a> for tuning this value.</td>
+      <td>number of seconds to use as the batch interval when streaming all documents from the <code class="language-plaintext highlighter-rouge">_changes</code> endpoint into a Spark DataFrame.  See <a href="https://spark.apache.org/docs/latest/streaming-programming-guide.html#setting-the-right-batch-interval">Setting the right batch interval</a> for tuning this value.</td>
     </tr>
     <tr>
       <td>cloudant.endpoint</td>
-      <td style="text-align: center"><code>_all_docs</code></td>
-      <td>endpoint for RelationProvider when loading data from Cloudant to DataFrames or SQL temporary tables. Select between the Cloudant <code>_all_docs</code> or <code>_changes</code> API endpoint.  See <strong>Note</strong> below for differences between endpoints.</td>
+      <td style="text-align: center"><code class="language-plaintext highlighter-rouge">_all_docs</code></td>
+      <td>endpoint for RelationProvider when loading data from Cloudant to DataFrames or SQL temporary tables. Select between the Cloudant <code class="language-plaintext highlighter-rouge">_all_docs</code> or <code class="language-plaintext highlighter-rouge">_changes</code> API endpoint.  See <strong>Note</strong> below for differences between endpoints.</td>
     </tr>
     <tr>
       <td>cloudant.protocol</td>
@@ -318,32 +300,32 @@ The <code>--packages</code> argument can also be used with <code>bin/spark-submi
     <tr>
       <td>cloudant.numberOfRetries</td>
       <td style="text-align: center">3</td>
-      <td>number of times to replay a request that received a 429 <code>Too Many Requests</code> response</td>
+      <td>number of times to replay a request that received a 429 <code class="language-plaintext highlighter-rouge">Too Many Requests</code> response</td>
     </tr>
     <tr>
       <td>cloudant.useQuery</td>
       <td style="text-align: center">false</td>
-      <td>by default, <code>_all_docs</code> endpoint is used if configuration ‘view’ and ‘index’ (see below) are not set. When useQuery is enabled, <code>_find</code> endpoint will be used in place of <code>_all_docs</code> when query condition is not on primary key field (_id), so that query predicates may be driven into datastore.</td>
+      <td>by default, the <code class="language-plaintext highlighter-rouge">_all_docs</code> endpoint is used if the ‘view’ and ‘index’ configuration options (see below) are not set. When useQuery is enabled, the <code class="language-plaintext highlighter-rouge">_find</code> endpoint will be used in place of <code class="language-plaintext highlighter-rouge">_all_docs</code> when the query condition is not on the primary key field (_id), so that query predicates may be pushed down into the datastore.</td>
     </tr>
     <tr>
       <td>cloudant.queryLimit</td>
       <td style="text-align: center">25</td>
-      <td>the maximum number of results returned when querying the <code>_find</code> endpoint.</td>
+      <td>the maximum number of results returned when querying the <code class="language-plaintext highlighter-rouge">_find</code> endpoint.</td>
     </tr>
     <tr>
       <td>cloudant.storageLevel</td>
       <td style="text-align: center">MEMORY_ONLY</td>
-      <td>the storage level for persisting Spark RDDs during load when <code>cloudant.endpoint</code> is set to <code>_changes</code>.  See <a href="https://spark.apache.org/docs/latest/programming-guide.html#rdd-persistence">RDD Persistence section</a> in Spark’s Progamming Guide for all available storage level options.</td>
+      <td>the storage level for persisting Spark RDDs during load when <code class="language-plaintext highlighter-rouge">cloudant.endpoint</code> is set to <code class="language-plaintext highlighter-rouge">_changes</code>.  See the <a href="https://spark.apache.org/docs/latest/programming-guide.html#rdd-persistence">RDD Persistence section</a> in Spark’s Programming Guide for all available storage level options.</td>
     </tr>
     <tr>
       <td>cloudant.timeout</td>
       <td style="text-align: center">60000</td>
-      <td>stop the response after waiting the defined number of milliseconds for data.  Only supported with <code>changes</code> endpoint.</td>
+      <td>stop the response after waiting the defined number of milliseconds for data.  Only supported with the <code class="language-plaintext highlighter-rouge">_changes</code> endpoint.</td>
     </tr>
     <tr>
       <td>jsonstore.rdd.partitions</td>
       <td style="text-align: center">10</td>
-      <td>the number of partitions intent used to drive JsonStoreRDD loading query result in parallel. The actual number is calculated based on total rows returned and satisfying maxInPartition and minInPartition. Only supported with <code>_all_docs</code> endpoint.</td>
+      <td>the number of partitions intended to be used when the JsonStoreRDD loads query results in parallel. The actual number is calculated based on the total rows returned while satisfying maxInPartition and minInPartition. Only supported with the <code class="language-plaintext highlighter-rouge">_all_docs</code> endpoint.</td>
     </tr>
     <tr>
       <td>jsonstore.rdd.maxInPartition</td>
@@ -368,7 +350,7 @@ The <code>--packages</code> argument can also be used with <code>bin/spark-submi
     <tr>
       <td>schemaSampleSize</td>
       <td style="text-align: center">-1</td>
-      <td>the sample size for RDD schema discovery. 1 means we are using only the first document for schema discovery; -1 means all documents; 0 will be treated as 1; any number N means min(N, total) docs. Only supported with <code>_all_docs</code> endpoint.</td>
+      <td>the sample size for RDD schema discovery. 1 means we are using only the first document for schema discovery; -1 means all documents; 0 will be treated as 1; any number N means min(N, total) docs. Only supported with <code class="language-plaintext highlighter-rouge">_all_docs</code> endpoint.</td>
     </tr>
     <tr>
       <td>createDBOnSave</td>
@@ -378,27 +360,31 @@ The <code>--packages</code> argument can also be used with <code>bin/spark-submi
   </tbody>
 </table>
 
-<p>The <code>cloudant.endpoint</code> option sets ` _changes<code> or </code>_all_docs` API endpoint to be called while loading Cloudant data into Spark DataFrames or SQL Tables.</p>
+<p>The <code class="language-plaintext highlighter-rouge">cloudant.endpoint</code> option sets ` _changes<code class="language-plaintext highlighter-rouge"> or </code>_all_docs` API endpoint to be called while loading Cloudant data into Spark DataFrames or SQL Tables.</p>
 
-<p><strong>Note:</strong> When using <code>_changes</code> API, please consider:
-1. Results are partially ordered and may not be be presented in order in
-which documents were updated.
-2. In case of shards’ unavailability, you may see duplicate results (changes that have been seen already)
-3. Can use <code>selector</code> option to filter Cloudant docs during load
-4. Supports a real snapshot of the database and represents it in a single point of time.
-5. Only supports a single partition.</p>
+<p><strong>Note:</strong> When using the <code class="language-plaintext highlighter-rouge">_changes</code> API, please consider:</p>
+<ol>
+  <li>Results are partially ordered and may not be presented in the order in
+which documents were updated.</li>
+  <li>In case of shard unavailability, you may see duplicate results (changes that have already been seen).</li>
+  <li>The <code class="language-plaintext highlighter-rouge">selector</code> option can be used to filter Cloudant docs during load.</li>
+  <li>Provides a real snapshot of the database, representing it at a single point in time.</li>
+  <li>Only supports a single partition.</li>
+</ol>
 
-<p>When using <code>_all_docs</code> API:
-1. Supports parallel reads (using offset and range) and partitioning.
-2. Using partitions may not represent the true snapshot of a database.  Some docs
-   may be added or deleted in the database between loading data into different
-   Spark partitions.</p>
+<p>When using the <code class="language-plaintext highlighter-rouge">_all_docs</code> API:</p>
+<ol>
+  <li>Supports parallel reads (using offset and range) and partitioning.</li>
+  <li>Using partitions may not represent a true snapshot of the database.  Some docs
+may be added or deleted in the database between loading data into different
+Spark partitions.</li>
+</ol>
 
-<p>If loading Cloudant docs from a database greater than 100 MB, set <code>cloudant.endpoint</code> to <code>_changes</code> and <code>spark.streaming.unpersist</code> to <code>false</code>.
-This will enable RDD persistence during load against <code>_changes</code> endpoint and allow the persisted RDDs to be accessible after streaming completes.</p>
+<p>If loading Cloudant docs from a database larger than 100 MB, set <code class="language-plaintext highlighter-rouge">cloudant.endpoint</code> to <code class="language-plaintext highlighter-rouge">_changes</code> and <code class="language-plaintext highlighter-rouge">spark.streaming.unpersist</code> to <code class="language-plaintext highlighter-rouge">false</code>.
+This will enable RDD persistence during load against the <code class="language-plaintext highlighter-rouge">_changes</code> endpoint and allow the persisted RDDs to remain accessible after streaming completes.</p>
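+
+<p>As a sketch, assuming the “spark.” prefix convention described below, these two settings could also be passed at launch time:</p>
+
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>$ bin/spark-shell --packages org.apache.bahir:spark-sql-cloudant_2.11:2.1.2 --conf spark.cloudant.endpoint=_changes --conf spark.streaming.unpersist=false
+</code></pre></div></div>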
 
 <p>See <a href="src/test/scala/org/apache/bahir/cloudant/CloudantChangesDFSuite.scala">CloudantChangesDFSuite</a>
-for examples of loading data into a Spark DataFrame with <code>_changes</code> API.</p>
+for examples of loading data into a Spark DataFrame with <code class="language-plaintext highlighter-rouge">_changes</code> API.</p>
 
 <h3 id="configuration-on-spark-sql-temporary-table-or-dataframe">Configuration on Spark SQL Temporary Table or DataFrame</h3>
 
@@ -446,7 +432,7 @@ for examples of loading data into a Spark DataFrame with <code>_changes</code> A
     <tr>
       <td>selector</td>
       <td style="text-align: center">all documents</td>
-      <td>a selector written in Cloudant Query syntax, specifying conditions for selecting documents when the <code>cloudant.endpoint</code> option is set to <code>_changes</code>. Only documents satisfying the selector’s conditions will be retrieved from Cloudant and loaded into Spark.</td>
+      <td>a selector written in Cloudant Query syntax, specifying conditions for selecting documents when the <code class="language-plaintext highlighter-rouge">cloudant.endpoint</code> option is set to <code class="language-plaintext highlighter-rouge">_changes</code>. Only documents satisfying the selector’s conditions will be retrieved from Cloudant and loaded into Spark.</td>
     </tr>
     <tr>
       <td>view</td>
@@ -456,12 +442,11 @@ for examples of loading data into a Spark DataFrame with <code>_changes</code> A
   </tbody>
 </table>
 
-<p>For fast loading, views are loaded without include_docs. Thus, a derived schema will always be: <code>{id, key, value}</code>, where <code>value </code>can be a compount field. An example of loading data from a view:</p>
+<p>For fast loading, views are loaded without include_docs. Thus, a derived schema will always be: <code class="language-plaintext highlighter-rouge">{id, key, value}</code>, where <code class="language-plaintext highlighter-rouge">value</code> can be a compound field. An example of loading data from a view:</p>
 
-<p>```python
-spark.sql(“ CREATE TEMPORARY TABLE flightTable1 USING org.apache.bahir.cloudant OPTIONS ( database ‘n_flight’, view ‘_design/view/_view/AA0’)”)</p>
+<div class="language-python highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="n">spark</span><span class="p">.</span><span class="n">sql</span><span class="p">(</span><span class="s">" CREATE TEMPORARY TABLE flightTable1 USING org.apache.bahir.cloudant OPTIONS ( database 'n_flight', view '_design/view/_view/AA0')"</span><span class="p">)</span>
 
-<p>```</p>
+</code></pre></div></div>
 
 <h3 id="configuration-on-cloudant-receiver-for-spark-streaming">Configuration on Cloudant Receiver for Spark Streaming</h3>
 
@@ -502,9 +487,9 @@ spark.sql(“ CREATE TEMPORARY TABLE flightTable1 USING org.apache.bahir.cloudan
   </tbody>
 </table>
 
-<h3 id="configuration-in-spark-submit-using---conf-option">Configuration in spark-submit using –conf option</h3>
+<h3 id="configuration-in-spark-submit-using-conf-option">Configuration in spark-submit using –conf option</h3>
 
-<p>The above stated configuration keys can also be set using <code>spark-submit --conf</code> option. When passing configuration in spark-submit, make sure adding “spark.” as prefix to the keys.</p>
+<p>The configuration keys stated above can also be set using the <code class="language-plaintext highlighter-rouge">spark-submit --conf</code> option. When passing configuration in spark-submit, make sure to add “spark.” as a prefix to the keys.</p>
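+
+<p>For example, the <code class="language-plaintext highlighter-rouge">cloudant.host</code> key becomes <code class="language-plaintext highlighter-rouge">spark.cloudant.host</code> when passed with <code class="language-plaintext highlighter-rouge">--conf</code>:</p>
+
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>spark-submit --packages org.apache.bahir:spark-sql-cloudant_2.11:2.1.2 --conf spark.cloudant.host=ACCOUNT.cloudant.com --conf spark.cloudant.username=USERNAME --conf spark.cloudant.password=PASSWORD &lt;path to python script&gt;
+</code></pre></div></div>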
 
 <h2 id="examples">Examples</h2>
 
@@ -512,61 +497,58 @@ spark.sql(“ CREATE TEMPORARY TABLE flightTable1 USING org.apache.bahir.cloudan
 
 <h4 id="using-sql-in-python">Using SQL In Python</h4>
 
-<p>```python
-spark = SparkSession\
-    .builder\
-    .appName(“Cloudant Spark SQL Example in Python using temp tables”)\
-    .config(“cloudant.host”,”ACCOUNT.cloudant.com”)\
-    .config(“cloudant.username”, “USERNAME”)\
-    .config(“cloudant.password”,”PASSWORD”)\
-    .getOrCreate()</p>
-
-<h1 id="loading-temp-table-from-cloudant-db">Loading temp table from Cloudant db</h1>
-<p>spark.sql(“ CREATE TEMPORARY TABLE airportTable USING org.apache.bahir.cloudant OPTIONS ( database ‘n_airportcodemapping’)”)
-airportData = spark.sql(“SELECT _id, airportName FROM airportTable WHERE _id &gt;= ‘CAA’ AND _id &lt;= ‘GAA’ ORDER BY _id”)
-airportData.printSchema()
-print ‘Total # of rows in airportData: ‘ + str(airportData.count())
-for code in airportData.collect():
-    print code._id
-```</p>
+<div class="language-python highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="n">spark</span> <span class="o">=</span> <span class="n">SparkSession</span>\
+    <span class="p">.</span><span class="n">builder</span>\
+    <span class="p">.</span><span class="n">appName</span><span class="p">(</span><span class="s">"Cloudant Spark SQL Example in Python using temp tables"</span><span class="p">)</span>\
+    <span class="p">.</span><span class="n">config</span><span class="p">(</span><span class="s">"cloudant.host"</span><span class="p">,</span><span class="s">"ACCOUNT.cloudant.com"</span><span class="p">)</span>\
+    <span class="p">.</span><span class="n">config</span><span class="p">(</span><span class="s">"cloudant.username"</span><span class="p">,</span> <span class="s">"USERNAME"</span><span class="p">)</span>\
+    <span class="p">.</span><span class="n">config</span><span class="p">(</span><span class="s">"cloudant.password"</span><span class="p">,</span><span class="s">"PASSWORD"</span><span class="p">)</span>\
+    <span class="p">.</span><span class="n">getOrCreate</span><span class="p">()</span>
+
+
+<span class="c1"># Loading temp table from Cloudant db
+</span><span class="n">spark</span><span class="p">.</span><span class="n">sql</span><span class="p">(</span><span class="s">" CREATE TEMPORARY TABLE airportTable USING org.apache.bahir.cloudant OPTIONS ( database 'n_airportcodemapping')"</span><span class="p">)</span>
+<span class="n">airportData</span> <span class="o">=</span> <span class="n">spark</span><span class="p">.</span><span class="n">sql</span><span class="p">(</span><span class="s">"SELECT _id, airportName FROM airportTable WHERE _id &gt;= 'CAA' AND _id &lt;= 'GAA' ORDER BY _id"</span><span class="p">)</span>
+<span class="n">airportData</span><span class="p">.</span><span class="n">printSchema</span><span class="p">()</span>
+<span class="k">print</span> <span class="s">'Total # of rows in airportData: '</span> <span class="o">+</span> <span class="nb">str</span><span class="p">(</span><span class="n">airportData</span><span class="p">.</span><span class="n">count</span><span class="p">())</span>
+<span class="k">for</span> <span class="n">code</span> <span class="ow">in</span> <span class="n">airportData</span><span class="p">.</span><span class="n">collect</span><span class="p">():</span>
+    <span class="k">print</span> <span class="n">code</span><span class="p">.</span><span class="n">_id</span>
+</code></pre></div></div>
 
 <p>See <a href="examples/python/CloudantApp.py">CloudantApp.py</a> for examples.</p>
 
-<p>Submit job example:
-<code>
-spark-submit  --packages org.apache.bahir:spark-sql-cloudant_2.11:2.1.2 --conf spark.cloudant.host=ACCOUNT.cloudant.com --conf spark.cloudant.username=USERNAME --conf spark.cloudant.password=PASSWORD sql-cloudant/examples/python/CloudantApp.py
-</code></p>
+<p>Submit job example:</p>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>spark-submit  --packages org.apache.bahir:spark-sql-cloudant_2.11:2.1.2 --conf spark.cloudant.host=ACCOUNT.cloudant.com --conf spark.cloudant.username=USERNAME --conf spark.cloudant.password=PASSWORD sql-cloudant/examples/python/CloudantApp.py
+</code></pre></div></div>
 
 <h4 id="using-dataframe-in-python">Using DataFrame In Python</h4>
 
-<p>```python
-spark = SparkSession\
-    .builder\
-    .appName(“Cloudant Spark SQL Example in Python using dataframes”)\
-    .config(“cloudant.host”,”ACCOUNT.cloudant.com”)\
-    .config(“cloudant.username”, “USERNAME”)\
-    .config(“cloudant.password”,”PASSWORD”)\
-    .config(“jsonstore.rdd.partitions”, 8)\
-    .getOrCreate()</p>
-
-<h1 id="loading-dataframe-from-cloudant-db">***1. Loading dataframe from Cloudant db</h1>
-<p>df = spark.read.load(“n_airportcodemapping”, “org.apache.bahir.cloudant”)
-df.cache()
-df.printSchema()
-df.filter(df.airportName &gt;= ‘Moscow’).select(“_id”,’airportName’).show()
-df.filter(df._id &gt;= ‘CAA’).select(“_id”,’airportName’).show()	  <br />
-```</p>
+<div class="language-python highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="n">spark</span> <span class="o">=</span> <span class="n">SparkSession</span>\
+    <span class="p">.</span><span class="n">builder</span>\
+    <span class="p">.</span><span class="n">appName</span><span class="p">(</span><span class="s">"Cloudant Spark SQL Example in Python using dataframes"</span><span class="p">)</span>\
+    <span class="p">.</span><span class="n">config</span><span class="p">(</span><span class="s">"cloudant.host"</span><span class="p">,</span><span class="s">"ACCOUNT.cloudant.com"</span><span class="p">)</span>\
+    <span class="p">.</span><span class="n">config</span><span class="p">(</span><span class="s">"cloudant.username"</span><span class="p">,</span> <span class="s">"USERNAME"</span><span class="p">)</span>\
+    <span class="p">.</span><span class="n">config</span><span class="p">(</span><span class="s">"cloudant.password"</span><span class="p">,</span><span class="s">"PASSWORD"</span><span class="p">)</span>\
+    <span class="p">.</span><span class="n">config</span><span class="p">(</span><span class="s">"jsonstore.rdd.partitions"</span><span class="p">,</span> <span class="mi">8</span><span class="p">)</span>\
+    <span class="p">.</span><span class="n">getOrCreate</span><span class="p">()</span>
+
+<span class="c1"># ***1. Loading dataframe from Cloudant db
+</span><span class="n">df</span> <span class="o">=</span> <span class="n">spark</span><span class="p">.</span><span class="n">read</span><span class="p">.</span><span class="n">load</span><span class="p">(</span><span class="s">"n_airportcodemapping"</span><span class="p">,</span> <span class="s">"org.apache.bahir.cloudant"</span><span class="p">)</span>
+<span class="n">df</span><span class="p">.</span><span class="n">cache</span><span class="p">()</span>
+<span class="n">df</span><span class="p">.</span><span class="n">printSchema</span><span class="p">()</span>
+<span class="n">df</span><span class="p">.</span><span class="nb">filter</span><span class="p">(</span><span class="n">df</span><span class="p">.</span><span class="n">airportName</span> <span class="o">&gt;=</span> <span class="s">'Moscow'</span><span class="p">).</span><span class="n">select</span><span class="p">(</span><span class="s">"_id"</span><span class="p">,</span><span class="s">'airportName'</span><span class="p">).</span><span class="n">show</span><span class="p">()</span>
+<span class="n">df</span><span class="p">.</span><span class="nb">filter</span><span class="p">(</span><span class="n">df</span><span class="p">.</span><span class="n">_id</span> <span class="o">&gt;=</span> <span class="s">'CAA'</span><span class="p">).</span><span class="n">select</span><span class="p">(</span><span class="s">"_id"</span><span class="p">,</span><span class="s">'airportName'</span><span class="p">).</span><span class="n">show</span><span class="p">()</span>	    
+</code></pre></div></div>
 
 <p>See <a href="examples/python/CloudantDF.py">CloudantDF.py</a> for examples.</p>
 
 <p>When performing multiple operations on a DataFrame (select, filter, etc.),
 you should persist it. Otherwise, every operation on the DataFrame will load the same data from Cloudant again.
-Persisting will also speed up computation. This statement will persist an RDD in memory: <code>df.cache()</code>.  Alternatively for large dbs to persist in memory &amp; disk, use:</p>
+Persisting will also speed up computation. This statement persists the underlying RDD in memory: <code class="language-plaintext highlighter-rouge">df.cache()</code>.  Alternatively, for large databases, to persist in memory &amp; disk, use:</p>
 
-<p><code>python
-from pyspark import StorageLevel
-df.persist(storageLevel = StorageLevel(True, True, False, True, 1))
-</code></p>
+<div class="language-python highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="kn">from</span> <span class="nn">pyspark</span> <span class="kn">import</span> <span class="n">StorageLevel</span>
+<span class="n">df</span><span class="p">.</span><span class="n">persist</span><span class="p">(</span><span class="n">storageLevel</span> <span class="o">=</span> <span class="n">StorageLevel</span><span class="p">(</span><span class="bp">True</span><span class="p">,</span> <span class="bp">True</span><span class="p">,</span> <span class="bp">False</span><span class="p">,</span> <span class="bp">True</span><span class="p">,</span> <span class="mi">1</span><span class="p">))</span>
+</code></pre></div></div>
 
 <p><a href="examples/python/CloudantDFOption.py">Sample code</a> on using DataFrame option to define cloudant configuration</p>
 
@@ -574,65 +556,62 @@ df.persist(storageLevel = StorageLevel(True, True, False, True, 1))
 
 <h4 id="using-sql-in-scala">Using SQL In Scala</h4>
 
-<p>```scala
-val spark = SparkSession
-      .builder()
-      .appName(“Cloudant Spark SQL Example”)
-      .config(“cloudant.host”,”ACCOUNT.cloudant.com”)
-      .config(“cloudant.username”, “USERNAME”)
-      .config(“cloudant.password”,”PASSWORD”)
-      .getOrCreate()</p>
-
-<p>// For implicit conversions of Dataframe to RDDs
-import spark.implicits._</p>
-
-<p>// create a temp table from Cloudant db and query it using sql syntax
-spark.sql(
-    s”””
+<div class="language-scala highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="k">val</span> <span class="nv">spark</span> <span class="k">=</span> <span class="nc">SparkSession</span>
+      <span class="o">.</span><span class="py">builder</span><span class="o">()</span>
+      <span class="o">.</span><span class="py">appName</span><span class="o">(</span><span class="s">"Cloudant Spark SQL Example"</span><span class="o">)</span>
+      <span class="o">.</span><span class="py">config</span><span class="o">(</span><span class="s">"cloudant.host"</span><span class="o">,</span><span class="s">"ACCOUNT.cloudant.com"</span><span class="o">)</span>
+      <span class="o">.</span><span class="py">config</span><span class="o">(</span><span class="s">"cloudant.username"</span><span class="o">,</span> <span class="s">"USERNAME"</span><span class="o">)</span>
+      <span class="o">.</span><span class="py">config</span><span class="o">(</span><span class="s">"cloudant.password"</span><span class="o">,</span><span class="s">"PASSWORD"</span><span class="o">)</span>
+      <span class="o">.</span><span class="py">getOrCreate</span><span class="o">()</span>
+
+<span class="c1">// For implicit conversions of Dataframe to RDDs</span>
+<span class="k">import</span> <span class="nn">spark.implicits._</span>
+
+<span class="c1">// create a temp table from Cloudant db and query it using sql syntax</span>
+<span class="nv">spark</span><span class="o">.</span><span class="py">sql</span><span class="o">(</span>
+    <span class="n">s</span><span class="s">"""
     |CREATE TEMPORARY TABLE airportTable
     |USING org.apache.bahir.cloudant
-    |OPTIONS ( database ‘n_airportcodemapping’)
-    “”“.stripMargin)
-// create a dataframe
-val airportData = spark.sql(“SELECT _id, airportName FROM airportTable WHERE _id &gt;= ‘CAA’ AND _id &lt;= ‘GAA’ ORDER BY _id”)
-airportData.printSchema()
-println(s”Total # of rows in airportData: “ + airportData.count())
-// convert dataframe to array of Rows, and process each row
-airportData.map(t =&gt; “code: “ + t(0) + “,name:” + t(1)).collect().foreach(println)
-```
-See <a href="examples/scala/src/main/scala/mytest/spark/CloudantApp.scala">CloudantApp.scala</a> for examples.</p>
-
-<p>Submit job example:
-<code>
-spark-submit --class org.apache.spark.examples.sql.cloudant.CloudantApp --packages org.apache.bahir:spark-sql-cloudant_2.11:2.1.2 --conf spark.cloudant.host=ACCOUNT.cloudant.com --conf spark.cloudant.username=USERNAME --conf spark.cloudant.password=PASSWORD  /path/to/spark-sql-cloudant_2.11-2.1.2-tests.jar
-</code></p>
+    |OPTIONS ( database 'n_airportcodemapping')
+    """</span><span class="o">.</span><span class="py">stripMargin</span><span class="o">)</span>
+<span class="c1">// create a dataframe</span>
+<span class="k">val</span> <span class="nv">airportData</span> <span class="k">=</span> <span class="nv">spark</span><span class="o">.</span><span class="py">sql</span><span class="o">(</span><span class="s">"SELECT _id, airportName FROM airportTable WHERE _id &gt;= 'CAA' AND _id &lt;= 'GAA' ORDER BY _id"</span><span class="o">)</span>
+<span class="nv">airportData</span><span class="o">.</span><span class="py">printSchema</span><span class="o">()</span>
+<span class="nf">println</span><span class="o">(</span><span class="n">s</span><span class="s">"Total # of rows in airportData: "</span> <span class="o">+</span> <span class="nv">airportData</span><span class="o">.</span><span class="py">count</span><span class="o">())</span>
+<span class="c1">// convert dataframe to array of Rows, and process each row</span>
+<span class="nv">airportData</span><span class="o">.</span><span class="py">map</span><span class="o">(</span><span class="n">t</span> <span class="k">=&gt;</span> <span class="s">"code: "</span> <span class="o">+</span> <span class="nf">t</span><span class="o">(</span><span class="mi">0</span><span class="o">)</span> <span class="o">+</span> <span class="s">",name:"</span> <span class="o">+</span> <span class="nf">t</span><span class="o">(</span><span class="mi">1</span><span class="o"> [...]
+</code></pre></div></div>
+<p>See <a href="examples/scala/src/main/scala/mytest/spark/CloudantApp.scala">CloudantApp.scala</a> for examples.</p>
+
+<p>Submit job example:</p>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>spark-submit --class org.apache.spark.examples.sql.cloudant.CloudantApp --packages org.apache.bahir:spark-sql-cloudant_2.11:2.1.2 --conf spark.cloudant.host=ACCOUNT.cloudant.com --conf spark.cloudant.username=USERNAME --conf spark.cloudant.password=PASSWORD  /path/to/spark-sql-cloudant_2.11-2.1.2-tests.jar
+</code></pre></div></div>
 
 <h3 id="using-dataframe-in-scala">Using DataFrame In Scala</h3>
 
-<p>```scala
-val spark = SparkSession
-      .builder()
-      .appName(“Cloudant Spark SQL Example with Dataframe”)
-      .config(“cloudant.host”,”ACCOUNT.cloudant.com”)
-      .config(“cloudant.username”, “USERNAME”)
-      .config(“cloudant.password”,”PASSWORD”)
-      .config(“createDBOnSave”,”true”) // to create a db on save
-      .config(“jsonstore.rdd.partitions”, “20”) // using 20 partitions
-      .getOrCreate()</p>
-
-<p>// 1. Loading data from Cloudant db
-val df = spark.read.format(“org.apache.bahir.cloudant”).load(“n_flight”)
-// Caching df in memory to speed computations
-// and not to retrieve data from cloudant again
-df.cache()
-df.printSchema()</p>
-
-<p>// 2. Saving dataframe to Cloudant db
-val df2 = df.filter(df(“flightSegmentId”) === “AA106”)
-    .select(“flightSegmentId”,”economyClassBaseCost”)
-df2.show()
-df2.write.format(“org.apache.bahir.cloudant”).save(“n_flight2”)
-```</p>
+<div class="language-scala highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="k">val</span> <span class="nv">spark</span> <span class="k">=</span> <span class="nc">SparkSession</span>
+      <span class="o">.</span><span class="py">builder</span><span class="o">()</span>
+      <span class="o">.</span><span class="py">appName</span><span class="o">(</span><span class="s">"Cloudant Spark SQL Example with Dataframe"</span><span class="o">)</span>
+      <span class="o">.</span><span class="py">config</span><span class="o">(</span><span class="s">"cloudant.host"</span><span class="o">,</span><span class="s">"ACCOUNT.cloudant.com"</span><span class="o">)</span>
+      <span class="o">.</span><span class="py">config</span><span class="o">(</span><span class="s">"cloudant.username"</span><span class="o">,</span> <span class="s">"USERNAME"</span><span class="o">)</span>
+      <span class="o">.</span><span class="py">config</span><span class="o">(</span><span class="s">"cloudant.password"</span><span class="o">,</span><span class="s">"PASSWORD"</span><span class="o">)</span>
+      <span class="o">.</span><span class="py">config</span><span class="o">(</span><span class="s">"createDBOnSave"</span><span class="o">,</span><span class="s">"true"</span><span class="o">)</span> <span class="c1">// to create a db on save</span>
+      <span class="o">.</span><span class="py">config</span><span class="o">(</span><span class="s">"jsonstore.rdd.partitions"</span><span class="o">,</span> <span class="s">"20"</span><span class="o">)</span> <span class="c1">// using 20 partitions</span>
+      <span class="o">.</span><span class="py">getOrCreate</span><span class="o">()</span>
+
+<span class="c1">// 1. Loading data from Cloudant db</span>
+<span class="k">val</span> <span class="nv">df</span> <span class="k">=</span> <span class="nv">spark</span><span class="o">.</span><span class="py">read</span><span class="o">.</span><span class="py">format</span><span class="o">(</span><span class="s">"org.apache.bahir.cloudant"</span><span class="o">).</span><span class="py">load</span><span class="o">(</span><span class="s">"n_flight"</span><span class="o">)</span>
+<span class="c1">// Caching df in memory to speed computations</span>
+<span class="c1">// and not to retrieve data from cloudant again</span>
+<span class="nv">df</span><span class="o">.</span><span class="py">cache</span><span class="o">()</span>
+<span class="nv">df</span><span class="o">.</span><span class="py">printSchema</span><span class="o">()</span>
+
+<span class="c1">// 2. Saving dataframe to Cloudant db</span>
+<span class="k">val</span> <span class="nv">df2</span> <span class="k">=</span> <span class="nv">df</span><span class="o">.</span><span class="py">filter</span><span class="o">(</span><span class="nf">df</span><span class="o">(</span><span class="s">"flightSegmentId"</span><span class="o">)</span> <span class="o">===</span> <span class="s">"AA106"</span><span class="o">)</span>
+    <span class="o">.</span><span class="py">select</span><span class="o">(</span><span class="s">"flightSegmentId"</span><span class="o">,</span><span class="s">"economyClassBaseCost"</span><span class="o">)</span>
+<span class="nv">df2</span><span class="o">.</span><span class="py">show</span><span class="o">()</span>
+<span class="nv">df2</span><span class="o">.</span><span class="py">write</span><span class="o">.</span><span class="py">format</span><span class="o">(</span><span class="s">"org.apache.bahir.cloudant"</span><span class="o">).</span><span class="py">save</span><span class="o">(</span><span class="s">"n_flight2"</span><span class="o">)</span>
+</code></pre></div></div>
 
 <p>See <a href="examples/scala/src/main/scala/mytest/spark/CloudantDF.scala">CloudantDF.scala</a> for examples.</p>
 
@@ -640,49 +619,47 @@ df2.write.format(“org.apache.bahir.cloudant”).save(“n_flight2”)
 
 <h3 id="using-streams-in-scala">Using Streams In Scala</h3>
 
-<p>```scala
-val ssc = new StreamingContext(sparkConf, Seconds(10))
-val changes = ssc.receiverStream(new CloudantReceiver(Map(
-  “cloudant.host” -&gt; “ACCOUNT.cloudant.com”,
-  “cloudant.username” -&gt; “USERNAME”,
-  “cloudant.password” -&gt; “PASSWORD”,
-  “database” -&gt; “n_airportcodemapping”)))</p>
-
-<p>changes.foreachRDD((rdd: RDD[String], time: Time) =&gt; {
-  // Get the singleton instance of SparkSession
-  val spark = SparkSessionSingleton.getInstance(rdd.sparkContext.getConf)</p>
-
-<p>println(s”========= $time =========”)
-  // Convert RDD[String] to DataFrame
-  val changesDataFrame = spark.read.json(rdd)
-  if (!changesDataFrame.schema.isEmpty) {
-    changesDataFrame.printSchema()
-    changesDataFrame.select(“*”).show()
-    ….
-  }
-})
-ssc.start()
-// run streaming for 120 secs
-Thread.sleep(120000L)
-ssc.stop(true)</p>
-
-<p>```</p>
+<div class="language-scala highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="k">val</span> <span class="nv">ssc</span> <span class="k">=</span> <span class="k">new</span> <span class="nc">StreamingContext</span><span class="o">(</span><span class="n">sparkConf</span><span class="o">,</span> <span class="nc">Seconds</span><span class="o">(</span><span class="mi">10</span><span class="o">))</span>
+<span class="k">val</span> <span class="nv">changes</span> <span class="k">=</span> <span class="nv">ssc</span><span class="o">.</span><span class="py">receiverStream</span><span class="o">(</span><span class="k">new</span> <span class="nc">CloudantReceiver</span><span class="o">(</span><span class="nc">Map</span><span class="o">(</span>
+  <span class="s">"cloudant.host"</span> <span class="o">-&gt;</span> <span class="s">"ACCOUNT.cloudant.com"</span><span class="o">,</span>
+  <span class="s">"cloudant.username"</span> <span class="o">-&gt;</span> <span class="s">"USERNAME"</span><span class="o">,</span>
+  <span class="s">"cloudant.password"</span> <span class="o">-&gt;</span> <span class="s">"PASSWORD"</span><span class="o">,</span>
+  <span class="s">"database"</span> <span class="o">-&gt;</span> <span class="s">"n_airportcodemapping"</span><span class="o">)))</span>
+
+<span class="nv">changes</span><span class="o">.</span><span class="py">foreachRDD</span><span class="o">((</span><span class="n">rdd</span><span class="k">:</span> <span class="kt">RDD</span><span class="o">[</span><span class="kt">String</span><span class="o">],</span> <span class="n">time</span><span class="k">:</span> <span class="kt">Time</span><span class="o">)</span> <span class="k">=&gt;</span> <span class="o">{</span>
+  <span class="c1">// Get the singleton instance of SparkSession</span>
+  <span class="k">val</span> <span class="nv">spark</span> <span class="k">=</span> <span class="nv">SparkSessionSingleton</span><span class="o">.</span><span class="py">getInstance</span><span class="o">(</span><span class="nv">rdd</span><span class="o">.</span><span class="py">sparkContext</span><span class="o">.</span><span class="py">getConf</span><span class="o">)</span>
+
+  <span class="nf">println</span><span class="o">(</span><span class="n">s</span><span class="s">"========= $time ========="</span><span class="o">)</span>
+  <span class="c1">// Convert RDD[String] to DataFrame</span>
+  <span class="k">val</span> <span class="nv">changesDataFrame</span> <span class="k">=</span> <span class="nv">spark</span><span class="o">.</span><span class="py">read</span><span class="o">.</span><span class="py">json</span><span class="o">(</span><span class="n">rdd</span><span class="o">)</span>
+  <span class="nf">if</span> <span class="o">(!</span><span class="nv">changesDataFrame</span><span class="o">.</span><span class="py">schema</span><span class="o">.</span><span class="py">isEmpty</span><span class="o">)</span> <span class="o">{</span>
+    <span class="nv">changesDataFrame</span><span class="o">.</span><span class="py">printSchema</span><span class="o">()</span>
+    <span class="nv">changesDataFrame</span><span class="o">.</span><span class="py">select</span><span class="o">(</span><span class="s">"*"</span><span class="o">).</span><span class="py">show</span><span class="o">()</span>
+    <span class="o">....</span>
+  <span class="o">}</span>
+<span class="o">})</span>
+<span class="nv">ssc</span><span class="o">.</span><span class="py">start</span><span class="o">()</span>
+<span class="c1">// run streaming for 120 secs</span>
+<span class="nv">Thread</span><span class="o">.</span><span class="py">sleep</span><span class="o">(</span><span class="mi">120000L</span><span class="o">)</span>
+<span class="nv">ssc</span><span class="o">.</span><span class="py">stop</span><span class="o">(</span><span class="kc">true</span><span class="o">)</span>
+
+</code></pre></div></div>
 
 <p>See <a href="examples/scala/src/main/scala/mytest/spark/CloudantStreaming.scala">CloudantStreaming.scala</a> for examples.</p>
 
 <p>By default, Spark Streaming will load all documents from a database. If you want to limit the loading to
-specific documents, use <code>selector</code> option of <code>CloudantReceiver</code> and specify your conditions
+specific documents, use the <code class="language-plaintext highlighter-rouge">selector</code> option of <code class="language-plaintext highlighter-rouge">CloudantReceiver</code> and specify your conditions
 (See <a href="examples/scala/src/main/scala/mytest/spark/CloudantStreamingSelector.scala">CloudantStreamingSelector.scala</a>
 example for more details):</p>
 
-<p><code>scala
-val changes = ssc.receiverStream(new CloudantReceiver(Map(
-  "cloudant.host" -&gt; "ACCOUNT.cloudant.com",
-  "cloudant.username" -&gt; "USERNAME",
-  "cloudant.password" -&gt; "PASSWORD",
-  "database" -&gt; "sales",
-  "selector" -&gt; "{\"month\":\"May\", \"rep\":\"John\"}")))
-</code></p>
+<div class="language-scala highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="k">val</span> <span class="nv">changes</span> <span class="k">=</span> <span class="nv">ssc</span><span class="o">.</span><span class="py">receiverStream</span><span class="o">(</span><span class="k">new</span> <span class="nc">CloudantReceiver</span><span class="o">(</span><span class="nc">Map</span><span class="o">(</span>
+  <span class="s">"cloudant.host"</span> <span class="o">-&gt;</span> <span class="s">"ACCOUNT.cloudant.com"</span><span class="o">,</span>
+  <span class="s">"cloudant.username"</span> <span class="o">-&gt;</span> <span class="s">"USERNAME"</span><span class="o">,</span>
+  <span class="s">"cloudant.password"</span> <span class="o">-&gt;</span> <span class="s">"PASSWORD"</span><span class="o">,</span>
+  <span class="s">"database"</span> <span class="o">-&gt;</span> <span class="s">"sales"</span><span class="o">,</span>
+  <span class="s">"selector"</span> <span class="o">-&gt;</span> <span class="s">"{\"month\":\"May\", \"rep\":\"John\"}"</span><span class="o">)))</span>
+</code></pre></div></div>
 
   </div>
 </div>
diff --git a/content/docs/spark/2.1.2/spark-sql-streaming-akka/index.html b/content/docs/spark/2.1.2/spark-sql-streaming-akka/index.html
index 7139e4b..7844057 100644
--- a/content/docs/spark/2.1.2/spark-sql-streaming-akka/index.html
+++ b/content/docs/spark/2.1.2/spark-sql-streaming-akka/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -219,26 +201,26 @@
 
 <p>Using SBT:</p>
 
-<pre><code>libraryDependencies += "org.apache.bahir" %% "spark-sql-streaming-akka" % "2.1.2"
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>libraryDependencies += "org.apache.bahir" %% "spark-sql-streaming-akka" % "2.1.2"
+</code></pre></div></div>
 
 <p>Using Maven:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
     &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
     &lt;artifactId&gt;spark-sql-streaming-akka_2.11&lt;/artifactId&gt;
     &lt;version&gt;2.1.2&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
-<p>This library can also be added to Spark jobs launched through <code>spark-shell</code> or <code>spark-submit</code> by using the <code>--packages</code> command line option.
+<p>This library can also be added to Spark jobs launched through <code class="language-plaintext highlighter-rouge">spark-shell</code> or <code class="language-plaintext highlighter-rouge">spark-submit</code> by using the <code class="language-plaintext highlighter-rouge">--packages</code> command line option.
 For example, to include it when starting the spark shell:</p>
 
-<pre><code>$ bin/spark-shell --packages org.apache.bahir:spark-sql-streaming-akka_2.11:2.1.2
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>$ bin/spark-shell --packages org.apache.bahir:spark-sql-streaming-akka_2.11:2.1.2
+</code></pre></div></div>
 
-<p>Unlike using <code>--jars</code>, using <code>--packages</code> ensures that this library and its dependencies will be added to the classpath.
-The <code>--packages</code> argument can also be used with <code>bin/spark-submit</code>.</p>
+<p>Unlike using <code class="language-plaintext highlighter-rouge">--jars</code>, using <code class="language-plaintext highlighter-rouge">--packages</code> ensures that this library and its dependencies will be added to the classpath.
+The <code class="language-plaintext highlighter-rouge">--packages</code> argument can also be used with <code class="language-plaintext highlighter-rouge">bin/spark-submit</code>.</p>
 
 <p>This library is compiled for Scala 2.11 only, and intends to support Spark 2.0 onwards.</p>
 
@@ -246,37 +228,37 @@ The <code>--packages</code> argument can also be used with <code>bin/spark-submi
 
 <p>A SQL stream can be created from data streams received from an Akka Feeder actor using:</p>
 
-<pre><code>    sqlContext.readStream
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>    sqlContext.readStream
             .format("org.apache.bahir.sql.streaming.akka.AkkaStreamSourceProvider")
             .option("urlOfPublisher", "feederActorUri")
             .load()
-</code></pre>
+</code></pre></div></div>
 
 <h2 id="enable-recovering-from-failures">Enable recovering from failures.</h2>
 
-<p>Setting values for option <code>persistenceDirPath</code> helps in recovering in case of a restart, by restoring the state where it left off before the shutdown.</p>
+<p>Setting a value for the <code class="language-plaintext highlighter-rouge">persistenceDirPath</code> option helps in recovering from failures after a restart, by restoring the state where it left off before the shutdown.</p>
 
-<pre><code>    sqlContext.readStream
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>    sqlContext.readStream
             .format("org.apache.bahir.sql.streaming.akka.AkkaStreamSourceProvider")
             .option("urlOfPublisher", "feederActorUri")
             .option("persistenceDirPath", "/path/to/localdir")
             .load() 
-</code></pre>
+</code></pre></div></div>
 
 <h2 id="configuration-options">Configuration options.</h2>
 
 <p>This source uses the <a href="http://doc.akka.io/api/akka/2.4/akka/actor/Actor.html">Akka Actor API</a>.</p>
 
 <ul>
-  <li><code>urlOfPublisher</code> The url of Publisher or Feeder actor that the Receiver actor connects to. Set this as the tcp url of the Publisher or Feeder actor.</li>
-  <li><code>persistenceDirPath</code> By default it is used for storing incoming messages on disk.</li>
+  <li><code class="language-plaintext highlighter-rouge">urlOfPublisher</code> The URL of the Publisher or Feeder actor that the Receiver actor connects to. Set this to the TCP URL of the Publisher or Feeder actor; a sketch of the expected format follows this list.</li>
+  <li><code class="language-plaintext highlighter-rouge">persistenceDirPath</code> The local directory where incoming messages are persisted on disk.</li>
 </ul>
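+
+<p>A short sketch of wiring these options together. The actor system name, host, port, and actor name below are illustrative, and assume the feeder actor is reachable via Akka remoting:</p>
+
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>    sqlContext.readStream
+            .format("org.apache.bahir.sql.streaming.akka.AkkaStreamSourceProvider")
+            // remote path of the feeder actor: akka.tcp://&lt;system&gt;@&lt;host&gt;:&lt;port&gt;/user/&lt;actorName&gt; (illustrative)
+            .option("urlOfPublisher", "akka.tcp://feederActorSystem@127.0.0.1:9999/user/FeederActor")
+            // local directory where incoming messages are persisted for recovery
+            .option("persistenceDirPath", "/tmp/akka-stream-persistence")
+            .load()
+</code></pre></div></div>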
 
 <h3 id="scala-api">Scala API</h3>
 
 <p>An example using the Scala API to count words from an incoming message stream.</p>
 
-<pre><code>    // Create DataFrame representing the stream of input lines from connection
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>    // Create DataFrame representing the stream of input lines from connection
     // to publisher or feeder actor
     val lines = spark.readStream
                 .format("org.apache.bahir.sql.streaming.akka.AkkaStreamSourceProvider")
@@ -296,15 +278,15 @@ The <code>--packages</code> argument can also be used with <code>bin/spark-submi
                 .start()
 
     query.awaitTermination()
-</code></pre>
+</code></pre></div></div>
 
-<p>Please see <code>AkkaStreamWordCount.scala</code> for full example.</p>
+<p>Please see <code class="language-plaintext highlighter-rouge">AkkaStreamWordCount.scala</code> for the full example.</p>
 
 <h3 id="java-api">Java API</h3>
 
 <p>An example using the Java API to count words from an incoming message stream.</p>
 
-<pre><code>    // Create DataFrame representing the stream of input lines from connection
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>    // Create DataFrame representing the stream of input lines from connection
     // to publisher or feeder actor
     Dataset&lt;String&gt; lines = spark
                             .readStream()
@@ -330,9 +312,9 @@ The <code>--packages</code> argument can also be used with <code>bin/spark-submi
                             .start();
 
     query.awaitTermination();   
-</code></pre>
+</code></pre></div></div>
 
-<p>Please see <code>JavaAkkaStreamWordCount.java</code> for full example.</p>
+<p>Please see <code class="language-plaintext highlighter-rouge">JavaAkkaStreamWordCount.java</code> for the full example.</p>
 
   </div>
 </div>
diff --git a/content/docs/spark/2.1.2/spark-sql-streaming-mqtt/index.html b/content/docs/spark/2.1.2/spark-sql-streaming-mqtt/index.html
index d71815a..36e5f0a 100644
--- a/content/docs/spark/2.1.2/spark-sql-streaming-mqtt/index.html
+++ b/content/docs/spark/2.1.2/spark-sql-streaming-mqtt/index.html
@@ -65,11 +65,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/downloads/spark" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/downloads/flink" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -85,27 +83,21 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/community" target="_self">Get Involved</a></li>
               
               
-              
               <li><a href="/contributing" target="_self">Contributing</a></li>
               
               
-              
               <li><a href="/contributing-extensions" target="_self">Contributing Extensions</a></li>
               
               
-              
               <li><a href="https://issues.apache.org/jira/browse/BAHIR" target="_blank">Issue Tracker</a></li>
               
               
-              
               <li><a href="/community#source-code" target="_self">Source Code</a></li>
               
               
-              
               <li><a href="/community-members" target="_self">Project Committers</a></li>
               
             </ul>
@@ -121,11 +113,9 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="/docs/spark/overview" target="_self">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="/docs/flink/overview" target="_self">Bahir Flink Extensions</a></li>
               
             </ul>
@@ -141,15 +131,12 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="https://github.com/apache/bahir" target="_blank">Bahir Spark Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-flink" target="_blank">Bahir Flink Extensions</a></li>
               
               
-              
               <li><a href="https://github.com/apache/bahir-website" target="_blank">Bahir Website</a></li>
               
             </ul>
@@ -165,23 +152,18 @@
             <ul class="dropdown-menu dropdown-left">
               
               
-              
               <li><a href="http://www.apache.org/foundation/how-it-works.html" target="_blank">Apache Software Foundation</a></li>
               
               
-              
               <li><a href="http://www.apache.org/licenses/" target="_blank">Apache License</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/sponsorship" target="_blank">Sponsorship</a></li>
               
               
-              
               <li><a href="http://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a></li>
               
               
-              
               <li><a href="/privacy-policy" target="_self">Privacy Policy</a></li>
               
             </ul>
@@ -219,26 +201,26 @@
 
 <p>Using SBT:</p>
 
-<pre><code>libraryDependencies += "org.apache.bahir" %% "spark-sql-streaming-mqtt" % "2.1.2"
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>libraryDependencies += "org.apache.bahir" %% "spark-sql-streaming-mqtt" % "2.1.2"
+</code></pre></div></div>
 
 <p>Using Maven:</p>
 
-<pre><code>&lt;dependency&gt;
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;dependency&gt;
     &lt;groupId&gt;org.apache.bahir&lt;/groupId&gt;
     &lt;artifactId&gt;spark-sql-streaming-mqtt_2.11&lt;/artifactId&gt;
     &lt;version&gt;2.1.2&lt;/version&gt;
 &lt;/dependency&gt;
-</code></pre>
+</code></pre></div></div>
 
-<p>This library can also be added to Spark jobs launched through <code>spark-shell</code> or <code>spark-submit</code> by using the <code>--packages</code> command line option.
+<p>This library can also be added to Spark jobs launched through <code class="language-plaintext highlighter-rouge">spark-shell</code> or <code class="language-plaintext highlighter-rouge">spark-submit</code> by using the <code class="language-plaintext highlighter-rouge">--packages</code> command line option.
 For example, to include it when starting the spark shell:</p>
 
-<pre><code>$ bin/spark-shell --packages org.apache.bahir:spark-sql-streaming-mqtt_2.11:2.1.2
-</code></pre>
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>$ bin/spark-shell --packages org.apache.bahir:spark-sql-streaming-mqtt_2.11:2.1.2
+</code></pre></div></div>
 
-<p>Unlike using <code>--jars</code>, using <code>--packages</code> ensures that this library and its dependencies will be added to the classpath.
-The <code>--packages</code> argument can also be used with <code>bin/spark-submit</code>.</p>
+<p>Unlike using <code class="language-plaintext highlighter-rouge">--jars</code>, using <code class="language-plaintext highlighter-rouge">--packages</code> ensures that this library and its dependencies will be added to the classpath.
+The <code class="language-plaintext highlighter-rouge">--packages</code> argument can also be used with <code class="language-plaintext highlighter-rouge">bin/spark-submit</code>.</p>
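+
+<p>For example, a minimal <code class="language-plaintext highlighter-rouge">spark-submit</code> invocation might look like the following; the main class and application JAR are placeholders:</p>
+
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code># the main class and application JAR below are placeholders
+$ bin/spark-submit \
+    --packages org.apache.bahir:spark-sql-streaming-mqtt_2.11:2.1.2 \
+    --class com.example.MQTTStreamApp \
+    my-streaming-app.jar
+</code></pre></div></div>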
 
 <p>This library is compiled for Scala 2.11 only, and is intended to support Spark 2.0 onwards.</p>
 
@@ -246,47 +228,47 @@ The <code>--packages</code> argument can also be used with <code>bin/spark-submi
 
 <p>A SQL stream can be created from data streams received through an MQTT server as follows:</p>
 
-<pre><code>sqlContext.readStream
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>sqlContext.readStream
     .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
     .option("topic", "mytopic")
     .load("tcp://localhost:1883")
-</code></pre>
+</code></pre></div></div>
 
 <h2 id="enable-recovering-from-failures">Enable recovering from failures.</h2>
 
-<p>Setting values for option <code>localStorage</code> and <code>clientId</code> helps in recovering in case of a restart, by restoring the state where it left off before the shutdown.</p>
+<p>Setting values for the options <code class="language-plaintext highlighter-rouge">localStorage</code> and <code class="language-plaintext highlighter-rouge">clientId</code> helps the source recover from a restart by restoring the state it had before the shutdown.</p>
 
-<pre><code>sqlContext.readStream
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>sqlContext.readStream
     .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
     .option("topic", "mytopic")
     .option("localStorage", "/path/to/localdir")
     .option("clientId", "some-client-id")
     .load("tcp://localhost:1883")
-</code></pre>
+</code></pre></div></div>
 
 <h2 id="configuration-options">Configuration options.</h2>
 
 <p>This source uses the <a href="https://eclipse.org/paho/clients/java/">Eclipse Paho Java Client</a>. Client API documentation is located <a href="http://www.eclipse.org/paho/files/javadoc/index.html">here</a>.</p>
 
 <ul>
-  <li><code>brokerUrl</code> A url MqttClient connects to. Set this or <code>path</code> as the url of the Mqtt Server. e.g. tcp://localhost:1883.</li>
-  <li><code>persistence</code> By default it is used for storing incoming messages on disk. If <code>memory</code> is provided as value for this option, then recovery on restart is not supported.</li>
-  <li><code>topic</code> Topic MqttClient subscribes to.</li>
-  <li><code>clientId</code> clientId, this client is assoicated with. Provide the same value to recover a stopped client.</li>
-  <li><code>QoS</code> The maximum quality of service to subscribe each topic at. Messages published at a lower quality of service will be received at the published QoS. Messages published at a higher quality of service will be received using the QoS specified on the subscribe.</li>
-  <li><code>username</code> Sets the user name to use for the connection to Mqtt Server. Do not set it, if server does not need this. Setting it empty will lead to errors.</li>
-  <li><code>password</code> Sets the password to use for the connection.</li>
-  <li><code>cleanSession</code> Setting it true starts a clean session, removes all checkpointed messages by a previous run of this source. This is set to false by default.</li>
-  <li><code>connectionTimeout</code> Sets the connection timeout, a value of 0 is interpretted as wait until client connects. See <code>MqttConnectOptions.setConnectionTimeout</code> for more information.</li>
-  <li><code>keepAlive</code> Same as <code>MqttConnectOptions.setKeepAliveInterval</code>.</li>
-  <li><code>mqttVersion</code> Same as <code>MqttConnectOptions.setMqttVersion</code>.</li>
+  <li><code class="language-plaintext highlighter-rouge">brokerUrl</code> The URL the MqttClient connects to. Set this or <code class="language-plaintext highlighter-rouge">path</code> to the URL of the MQTT server, e.g. tcp://localhost:1883.</li>
+  <li><code class="language-plaintext highlighter-rouge">persistence</code> By default incoming messages are stored on disk. If <code class="language-plaintext highlighter-rouge">memory</code> is provided as the value for this option, recovery on restart is not supported.</li>
+  <li><code class="language-plaintext highlighter-rouge">topic</code> The topic the MqttClient subscribes to.</li>
+  <li><code class="language-plaintext highlighter-rouge">clientId</code> The client identifier this client is associated with. Provide the same value to recover a stopped client.</li>
+  <li><code class="language-plaintext highlighter-rouge">QoS</code> The maximum quality of service to subscribe each topic at. Messages published at a lower quality of service will be received at the published QoS. Messages published at a higher quality of service will be received using the QoS specified on the subscribe.</li>
+  <li><code class="language-plaintext highlighter-rouge">username</code> Sets the user name to use for the connection to the MQTT server. Do not set it if the server does not require it; setting it to an empty value will lead to errors.</li>
+  <li><code class="language-plaintext highlighter-rouge">password</code> Sets the password to use for the connection.</li>
+  <li><code class="language-plaintext highlighter-rouge">cleanSession</code> Setting it to true starts a clean session and removes all messages checkpointed by a previous run of this source. It is set to false by default.</li>
+  <li><code class="language-plaintext highlighter-rouge">connectionTimeout</code> Sets the connection timeout; a value of 0 is interpreted as waiting until the client connects. See <code class="language-plaintext highlighter-rouge">MqttConnectOptions.setConnectionTimeout</code> for more information.</li>
+  <li><code class="language-plaintext highlighter-rouge">keepAlive</code> Same as <code class="language-plaintext highlighter-rouge">MqttConnectOptions.setKeepAliveInterval</code>.</li>
+  <li><code class="language-plaintext highlighter-rouge">mqttVersion</code> Same as <code class="language-plaintext highlighter-rouge">MqttConnectOptions.setMqttVersion</code>. A sketch combining several of these options follows the list.</li>
 </ul>
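+
+<p>A short sketch combining several of the options above; the topic, credentials, client id, and timeouts below are illustrative:</p>
+
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>sqlContext.readStream
+    .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
+    .option("topic", "mytopic")
+    .option("clientId", "some-client-id")  // reuse the same id to recover a stopped client
+    .option("username", "scott")           // omit when the server does not require credentials
+    .option("password", "tiger")
+    .option("QoS", "1")
+    .option("cleanSession", "false")       // keep checkpointed messages for recovery
+    .option("connectionTimeout", "30")
+    .option("keepAlive", "60")
+    .load("tcp://localhost:1883")
+</code></pre></div></div>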
 
 <h3 id="scala-api">Scala API</h3>
 
 <p>An example using the Scala API to count words from an incoming message stream.</p>
 
-<pre><code>// Create DataFrame representing the stream of input lines from connection to mqtt server
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>// Create DataFrame representing the stream of input lines from connection to mqtt server
 val lines = spark.readStream
   .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
   .option("topic", topic)
@@ -305,15 +287,15 @@ val query = wordCounts.writeStream
   .start()
 
 query.awaitTermination()
-</code></pre>
+</code></pre></div></div>
 
-<p>Please see <code>MQTTStreamWordCount.scala</code> for full example.</p>
+<p>Please see <code class="language-plaintext highlighter-rouge">MQTTStreamWordCount.scala</code> for the full example.</p>
 
 <h3 id="java-api">Java API</h3>
 
 <p>An example using the Java API to count words from an incoming message stream.</p>
 
-<pre><code>// Create DataFrame representing the stream of input lines from connection to mqtt server.
+<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>// Create DataFrame representing the stream of input lines from connection to mqtt server.
 Dataset&lt;String&gt; lines = spark
         .readStream()
         .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
@@ -338,9 +320,9 @@ StreamingQuery query = wordCounts.writeStream()
         .start();
 
... 29637 lines suppressed ...