Posted to commits@storm.apache.org by pt...@apache.org on 2015/06/04 04:05:43 UTC

[01/50] [abbrv] storm git commit: prep for 0.2.2 release

Repository: storm
Updated Branches:
  refs/heads/0.10.x-branch efdf94966 -> b26429621


prep for 0.2.2 release


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/14970de2
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/14970de2
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/14970de2

Branch: refs/heads/0.10.x-branch
Commit: 14970de2091485f4ca777606735788657f7bec84
Parents: c3cffa0
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Wed Apr 1 13:25:52 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Wed Apr 1 13:25:52 2015 -0400

----------------------------------------------------------------------
 README.md | 7 ++++---
 1 file changed, 4 insertions(+), 3 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/14970de2/README.md
----------------------------------------------------------------------
diff --git a/README.md b/README.md
index 78a2723..92fec10 100644
--- a/README.md
+++ b/README.md
@@ -78,7 +78,7 @@ The current version of Flux is available in Maven Central at the following coord
 <dependency>
     <groupId>com.github.ptgoetz</groupId>
     <artifactId>flux-core</artifactId>
-    <version>0.2.2-SNAPSHOT</version>
+    <version>0.2.2</version>
 </dependency>
 ```
 
@@ -92,7 +92,7 @@ The example below illustrates Flux usage with the Maven shade plugin:
     <dependency>
         <groupId>com.github.ptgoetz</groupId>
         <artifactId>flux-core</artifactId>
-        <version>0.2.2-SNAPSHOT</version>
+        <version>0.2.2</version>
     </dependency>
 
     <!-- add user dependencies here... -->
@@ -171,6 +171,7 @@ usage: storm jar <my_topology_uber_jar.jar> org.apache.storm.flux.Flux
  -z,--zookeeper <host:port>   When running in local mode, use the
                               ZooKeeper at the specified <host>:<port>
                               instead of the in-process ZooKeeper.
+                              (requires Storm 0.9.3 or later)
 ```
 
 **NOTE:** Flux tries to avoid command line switch collision with the `storm` command, and allows any other command line
@@ -193,7 +194,7 @@ storm jar myTopology-0.1.0-SNAPSHOT.jar org.apache.storm.flux.Flux --remote my_c
 ╚═╝     ╚══════╝ ╚═════╝ ╚═╝  ╚═╝
 +-         Apache Storm        -+
 +-  data FLow User eXperience  -+
-Version: 0.2.2-SNAPSHOT
+Version: 0.2.2
 Parsing file: /Users/hsimpson/Projects/donut_domination/storm/shell_test.yaml
 ---------- TOPOLOGY DETAILS ----------
 Name: shell-topology


[06/50] [abbrv] storm git commit: add basic instructions for examples

Posted by pt...@apache.org.
add basic instructions for examples


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/8b690e63
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/8b690e63
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/8b690e63

Branch: refs/heads/0.10.x-branch
Commit: 8b690e63954b14aa8f079f904d42d2f4beb34cc6
Parents: 2e44c9e
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Thu Apr 2 12:19:13 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Thu Apr 2 12:19:13 2015 -0400

----------------------------------------------------------------------
 flux-examples/README.md | 28 ++++++++++++++++++++++++++++
 1 file changed, 28 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/8b690e63/flux-examples/README.md
----------------------------------------------------------------------
diff --git a/flux-examples/README.md b/flux-examples/README.md
new file mode 100644
index 0000000..2f107e7
--- /dev/null
+++ b/flux-examples/README.md
@@ -0,0 +1,28 @@
+# Flux Examples
+A collection of examples illustrating various capabilities.
+
+## Building From Source and Running
+
+Check out the project's source and perform a top-level Maven build (i.e. from the `flux` directory):
+
+```bash
+git clone https://github.com/ptgoetz/flux.git
+cd flux
+mvn install
+```
+
+This will create a shaded (i.e. "fat" or "uber") jar in the `flux-examples/target` directory that can be run/deployed with
+the `storm` command:
+
+```bash
+cd flux-examples
+storm jar ./target/flux-examples-0.2.3-SNAPSHOT.jar org.apache.storm.flux.Flux --local ./src/main/resources/simple_wordcount.yaml
+```
+
+The example YAML files are also packaged in the examples jar, so they can also be referenced with Flux's `--resource`
+command line switch:
+
+```bash
+storm jar ./target/flux-examples-0.2.3-SNAPSHOT.jar org.apache.storm.flux.Flux --local --resource /simple_wordcount.yaml
+```
+


[49/50] [abbrv] storm git commit: Merge branch 'master' into 0.10.x-branch

Posted by pt...@apache.org.
Merge branch 'master' into 0.10.x-branch


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/c3cc4dc8
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/c3cc4dc8
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/c3cc4dc8

Branch: refs/heads/0.10.x-branch
Commit: c3cc4dc893d3a3886e1c28745960df5268509d22
Parents: ea81f96 2154048
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Wed Jun 3 20:39:38 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Wed Jun 3 20:39:38 2015 -0400

----------------------------------------------------------------------
 .travis.yml                                     |  11 +
 LICENSE                                         |   4 +-
 conf/jaas_kerberos.conf                         |  17 ++
 dev-tools/test-ns.py                            |  15 ++
 doap_Storm.rdf                                  |   6 +-
 external/flux/LICENSE                           | 202 -------------------
 .../apache/storm/flux/test/SimpleTopology.java  |  17 ++
 .../storm/flux/test/SimpleTopologySource.java   |  17 ++
 .../test/SimpleTopologyWithConfigParam.java     |  17 ++
 .../org/apache/storm/flux/test/TestBolt.java    |  17 ++
 .../storm/flux/test/TridentTopologySource.java  |  17 ++
 .../existing-topology-method-override.yaml      |  15 ++
 .../existing-topology-reflection-config.yaml    |  15 ++
 .../configs/existing-topology-reflection.yaml   |  15 ++
 .../configs/existing-topology-trident.yaml      |  15 ++
 .../resources/configs/existing-topology.yaml    |  15 ++
 .../configs/invalid-existing-topology.yaml      |  16 ++
 .../src/test/resources/configs/test.properties  |  16 ++
 .../src/main/resources/config.properties        |  15 ++
 external/storm-hbase/LICENSE                    | 202 -------------------
 .../storm/hive/trident/HiveStateFactory.java    |  17 ++
 .../apache/storm/hive/trident/HiveUpdater.java  |  17 ++
 external/storm-jdbc/LICENSE                     | 202 -------------------
 .../storm/jdbc/common/ConnectionProvider.java   |  17 ++
 .../jdbc/common/HikariCPConnectionProvider.java |  17 ++
 .../storm/jdbc/mapper/JdbcLookupMapper.java     |  17 ++
 .../jdbc/mapper/SimpleJdbcLookupMapper.java     |  17 ++
 external/storm-jdbc/src/test/sql/test.sql       |  17 ++
 .../ExponentialBackoffMsgRetryManagerTest.java  |  17 ++
 external/storm-redis/LICENSE                    | 202 -------------------
 .../redis/trident/WordCountLookupMapper.java    |  17 ++
 .../redis/trident/WordCountStoreMapper.java     |  17 ++
 pom.xml                                         |  17 +-
 storm-core/src/clj/backtype/storm/converter.clj |  15 ++
 .../src/dev/drpc-simple-acl-test-scenario.yaml  |  17 ++
 .../storm/messaging/ConnectionWithStatus.java   |  17 ++
 .../auth/authorizer/DRPCAuthorizerBase.java     |  17 ++
 .../authorizer/DRPCSimpleACLAuthorizer.java     |  18 ++
 .../authorizer/ImpersonationAuthorizer.java     |  17 ++
 .../auth/kerberos/jaas_kerberos_cluster.conf    |  20 +-
 .../auth/kerberos/jaas_kerberos_launcher.conf   |  19 ++
 .../worker-launcher/.deps/worker-launcher.Po    |  16 ++
 .../auth/DefaultHttpCredentialsPlugin_test.clj  |  15 ++
 .../authorizer/DRPCSimpleACLAuthorizer_test.clj |  15 ++
 .../storm/security/auth/drpc-auth-alice.jaas    |  17 ++
 .../storm/security/auth/drpc-auth-bob.jaas      |  17 ++
 .../storm/security/auth/drpc-auth-charlie.jaas  |  17 ++
 .../storm/security/auth/drpc-auth-server.jaas   |  17 ++
 48 files changed, 692 insertions(+), 817 deletions(-)
----------------------------------------------------------------------



[44/50] [abbrv] storm git commit: Merge branch 'STORM-561'

Posted by pt...@apache.org.
Merge branch 'STORM-561'


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/cb370a99
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/cb370a99
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/cb370a99

Branch: refs/heads/0.10.x-branch
Commit: cb370a99a12a8466b4acd644bfe4eba00ee26107
Parents: 2313775 b90ec78
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Wed Jun 3 13:21:16 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Wed Jun 3 13:21:16 2015 -0400

----------------------------------------------------------------------
 external/flux/.gitignore                        |  15 +
 external/flux/LICENSE                           | 202 +++++
 external/flux/README.md                         | 834 +++++++++++++++++++
 external/flux/flux-core/pom.xml                 |  92 ++
 .../main/java/org/apache/storm/flux/Flux.java   | 263 ++++++
 .../java/org/apache/storm/flux/FluxBuilder.java | 591 +++++++++++++
 .../apache/storm/flux/api/TopologySource.java   |  39 +
 .../org/apache/storm/flux/model/BeanDef.java    |  39 +
 .../apache/storm/flux/model/BeanReference.java  |  39 +
 .../org/apache/storm/flux/model/BoltDef.java    |  24 +
 .../storm/flux/model/ConfigMethodDef.java       |  62 ++
 .../storm/flux/model/ExecutionContext.java      |  77 ++
 .../apache/storm/flux/model/GroupingDef.java    |  77 ++
 .../org/apache/storm/flux/model/IncludeDef.java |  54 ++
 .../org/apache/storm/flux/model/ObjectDef.java  |  90 ++
 .../apache/storm/flux/model/PropertyDef.java    |  58 ++
 .../org/apache/storm/flux/model/SpoutDef.java   |  24 +
 .../org/apache/storm/flux/model/StreamDef.java  |  64 ++
 .../apache/storm/flux/model/TopologyDef.java    | 216 +++++
 .../storm/flux/model/TopologySourceDef.java     |  36 +
 .../org/apache/storm/flux/model/VertexDef.java  |  36 +
 .../apache/storm/flux/parser/FluxParser.java    | 202 +++++
 .../flux-core/src/main/resources/splash.txt     |   9 +
 .../org/apache/storm/flux/FluxBuilderTest.java  |  31 +
 .../org/apache/storm/flux/IntegrationTest.java  |  39 +
 .../java/org/apache/storm/flux/TCKTest.java     | 234 ++++++
 .../multilang/MultilangEnvirontmentTest.java    |  89 ++
 .../apache/storm/flux/test/SimpleTopology.java  |  42 +
 .../storm/flux/test/SimpleTopologySource.java   |  35 +
 .../test/SimpleTopologyWithConfigParam.java     |  38 +
 .../org/apache/storm/flux/test/TestBolt.java    |  63 ++
 .../storm/flux/test/TridentTopologySource.java  |  54 ++
 .../src/test/resources/configs/bad_hbase.yaml   |  98 +++
 .../resources/configs/config-methods-test.yaml  |  70 ++
 .../existing-topology-method-override.yaml      |  10 +
 .../existing-topology-reflection-config.yaml    |   9 +
 .../configs/existing-topology-reflection.yaml   |   9 +
 .../configs/existing-topology-trident.yaml      |   9 +
 .../resources/configs/existing-topology.yaml    |   8 +
 .../src/test/resources/configs/hdfs_test.yaml   |  97 +++
 .../test/resources/configs/include_test.yaml    |  25 +
 .../configs/invalid-existing-topology.yaml      |  17 +
 .../src/test/resources/configs/kafka_test.yaml  | 126 +++
 .../src/test/resources/configs/shell_test.yaml  | 104 +++
 .../test/resources/configs/simple_hbase.yaml    | 120 +++
 .../resources/configs/substitution-test.yaml    | 106 +++
 .../src/test/resources/configs/tck.yaml         |  95 +++
 .../src/test/resources/configs/test.properties  |   2 +
 .../flux-core/src/test/resources/logback.xml    |  30 +
 external/flux/flux-examples/README.md           |  66 ++
 external/flux/flux-examples/pom.xml             | 105 +++
 .../storm/flux/examples/WordCountClient.java    |  74 ++
 .../apache/storm/flux/examples/WordCounter.java |  71 ++
 .../src/main/resources/hbase_bolt.properties    |  18 +
 .../src/main/resources/hdfs_bolt.properties     |  26 +
 .../src/main/resources/kafka_spout.yaml         | 136 +++
 .../src/main/resources/multilang.yaml           |  89 ++
 .../src/main/resources/simple_hbase.yaml        |  92 ++
 .../src/main/resources/simple_hdfs.yaml         | 105 +++
 .../src/main/resources/simple_wordcount.yaml    |  68 ++
 external/flux/flux-ui/README.md                 |   3 +
 external/flux/flux-wrappers/pom.xml             |  51 ++
 .../flux/wrappers/bolts/FluxShellBolt.java      |  56 ++
 .../storm/flux/wrappers/bolts/LogInfoBolt.java  |  44 +
 .../flux/wrappers/spouts/FluxShellSpout.java    |  55 ++
 .../main/resources/resources/randomsentence.js  |  93 +++
 .../main/resources/resources/splitsentence.py   |  24 +
 external/flux/pom.xml                           | 119 +++
 pom.xml                                         |   1 +
 storm-dist/binary/src/main/assembly/binary.xml  |  44 +
 70 files changed, 6043 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/cb370a99/pom.xml
----------------------------------------------------------------------


[31/50] [abbrv] storm git commit: add missing kafka dependency to flux-examples

Posted by pt...@apache.org.
add missing kafka dependency to flux-examples


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/8c9e6cee
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/8c9e6cee
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/8c9e6cee

Branch: refs/heads/0.10.x-branch
Commit: 8c9e6ceeafc1f50eebc519c8f03d5821813ee99b
Parents: 9fad816
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Fri May 8 15:35:11 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Fri May 8 15:35:11 2015 -0400

----------------------------------------------------------------------
 .../org/apache/storm/flux/IntegrationTest.java  |  2 --
 external/flux/flux-examples/pom.xml             | 20 ++++++++++++++++++++
 2 files changed, 20 insertions(+), 2 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/8c9e6cee/external/flux/flux-core/src/test/java/org/apache/storm/flux/IntegrationTest.java
----------------------------------------------------------------------
diff --git a/external/flux/flux-core/src/test/java/org/apache/storm/flux/IntegrationTest.java b/external/flux/flux-core/src/test/java/org/apache/storm/flux/IntegrationTest.java
index 5e17f5e..c5807f8 100644
--- a/external/flux/flux-core/src/test/java/org/apache/storm/flux/IntegrationTest.java
+++ b/external/flux/flux-core/src/test/java/org/apache/storm/flux/IntegrationTest.java
@@ -30,8 +30,6 @@ public class IntegrationTest {
         }
     }
 
-
-
     @Test
     public void testRunTopologySource() throws Exception {
         if(!skipTest) {

http://git-wip-us.apache.org/repos/asf/storm/blob/8c9e6cee/external/flux/flux-examples/pom.xml
----------------------------------------------------------------------
diff --git a/external/flux/flux-examples/pom.xml b/external/flux/flux-examples/pom.xml
index 571f302..e3996e5 100644
--- a/external/flux/flux-examples/pom.xml
+++ b/external/flux/flux-examples/pom.xml
@@ -52,6 +52,26 @@
             <artifactId>storm-hbase</artifactId>
             <version>${project.version}</version>
         </dependency>
+        <dependency>
+            <groupId>org.apache.storm</groupId>
+            <artifactId>storm-kafka</artifactId>
+            <version>${project.version}</version>
+        </dependency>
+        <dependency>
+            <groupId>org.apache.kafka</groupId>
+            <artifactId>kafka_2.10</artifactId>
+            <version>0.8.1.1</version>
+            <exclusions>
+                <exclusion>
+                    <groupId>org.apache.zookeeper</groupId>
+                    <artifactId>zookeeper</artifactId>
+                </exclusion>
+                <exclusion>
+                    <groupId>log4j</groupId>
+                    <artifactId>log4j</artifactId>
+                </exclusion>
+            </exclusions>
+        </dependency>
     </dependencies>
 
     <build>


[46/50] [abbrv] storm git commit: Merge branch 'master' into 0.10.x-branch

Posted by pt...@apache.org.
Merge branch 'master' into 0.10.x-branch


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/ea81f96d
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/ea81f96d
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/ea81f96d

Branch: refs/heads/0.10.x-branch
Commit: ea81f96df6eedffe668b44201c441095760c74da
Parents: efdf949 285d943
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Wed Jun 3 13:38:51 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Wed Jun 3 13:38:51 2015 -0400

----------------------------------------------------------------------
 CHANGELOG.md                                    |   3 +
 external/flux/.gitignore                        |  15 +
 external/flux/LICENSE                           | 202 +++++
 external/flux/README.md                         | 834 +++++++++++++++++++
 external/flux/flux-core/pom.xml                 |  92 ++
 .../main/java/org/apache/storm/flux/Flux.java   | 263 ++++++
 .../java/org/apache/storm/flux/FluxBuilder.java | 591 +++++++++++++
 .../apache/storm/flux/api/TopologySource.java   |  39 +
 .../org/apache/storm/flux/model/BeanDef.java    |  39 +
 .../apache/storm/flux/model/BeanReference.java  |  39 +
 .../org/apache/storm/flux/model/BoltDef.java    |  24 +
 .../storm/flux/model/ConfigMethodDef.java       |  62 ++
 .../storm/flux/model/ExecutionContext.java      |  77 ++
 .../apache/storm/flux/model/GroupingDef.java    |  77 ++
 .../org/apache/storm/flux/model/IncludeDef.java |  54 ++
 .../org/apache/storm/flux/model/ObjectDef.java  |  90 ++
 .../apache/storm/flux/model/PropertyDef.java    |  58 ++
 .../org/apache/storm/flux/model/SpoutDef.java   |  24 +
 .../org/apache/storm/flux/model/StreamDef.java  |  64 ++
 .../apache/storm/flux/model/TopologyDef.java    | 216 +++++
 .../storm/flux/model/TopologySourceDef.java     |  36 +
 .../org/apache/storm/flux/model/VertexDef.java  |  36 +
 .../apache/storm/flux/parser/FluxParser.java    | 202 +++++
 .../flux-core/src/main/resources/splash.txt     |   9 +
 .../org/apache/storm/flux/FluxBuilderTest.java  |  31 +
 .../org/apache/storm/flux/IntegrationTest.java  |  39 +
 .../java/org/apache/storm/flux/TCKTest.java     | 234 ++++++
 .../multilang/MultilangEnvirontmentTest.java    |  89 ++
 .../apache/storm/flux/test/SimpleTopology.java  |  42 +
 .../storm/flux/test/SimpleTopologySource.java   |  35 +
 .../test/SimpleTopologyWithConfigParam.java     |  38 +
 .../org/apache/storm/flux/test/TestBolt.java    |  63 ++
 .../storm/flux/test/TridentTopologySource.java  |  54 ++
 .../src/test/resources/configs/bad_hbase.yaml   |  98 +++
 .../resources/configs/config-methods-test.yaml  |  70 ++
 .../existing-topology-method-override.yaml      |  10 +
 .../existing-topology-reflection-config.yaml    |   9 +
 .../configs/existing-topology-reflection.yaml   |   9 +
 .../configs/existing-topology-trident.yaml      |   9 +
 .../resources/configs/existing-topology.yaml    |   8 +
 .../src/test/resources/configs/hdfs_test.yaml   |  97 +++
 .../test/resources/configs/include_test.yaml    |  25 +
 .../configs/invalid-existing-topology.yaml      |  17 +
 .../src/test/resources/configs/kafka_test.yaml  | 126 +++
 .../src/test/resources/configs/shell_test.yaml  | 104 +++
 .../test/resources/configs/simple_hbase.yaml    | 120 +++
 .../resources/configs/substitution-test.yaml    | 106 +++
 .../src/test/resources/configs/tck.yaml         |  95 +++
 .../src/test/resources/configs/test.properties  |   2 +
 .../flux-core/src/test/resources/logback.xml    |  30 +
 external/flux/flux-examples/README.md           |  66 ++
 external/flux/flux-examples/pom.xml             | 105 +++
 .../storm/flux/examples/WordCountClient.java    |  74 ++
 .../apache/storm/flux/examples/WordCounter.java |  71 ++
 .../src/main/resources/hbase_bolt.properties    |  18 +
 .../src/main/resources/hdfs_bolt.properties     |  26 +
 .../src/main/resources/kafka_spout.yaml         | 136 +++
 .../src/main/resources/multilang.yaml           |  89 ++
 .../src/main/resources/simple_hbase.yaml        |  92 ++
 .../src/main/resources/simple_hdfs.yaml         | 105 +++
 .../src/main/resources/simple_wordcount.yaml    |  68 ++
 external/flux/flux-ui/README.md                 |   3 +
 external/flux/flux-wrappers/pom.xml             |  51 ++
 .../flux/wrappers/bolts/FluxShellBolt.java      |  56 ++
 .../storm/flux/wrappers/bolts/LogInfoBolt.java  |  44 +
 .../flux/wrappers/spouts/FluxShellSpout.java    |  55 ++
 .../main/resources/resources/randomsentence.js  |  93 +++
 .../main/resources/resources/splitsentence.py   |  24 +
 external/flux/pom.xml                           | 119 +++
 external/storm-eventhubs/pom.xml                |  38 +-
 .../eventhubs/bolt/DefaultEventDataFormat.java  |  47 ++
 .../storm/eventhubs/bolt/EventHubBolt.java      |  56 +-
 .../eventhubs/bolt/EventHubBoltConfig.java      | 109 +++
 .../storm/eventhubs/bolt/IEventDataFormat.java  |  28 +
 .../client/ConnectionStringBuilder.java         | 116 ---
 .../storm/eventhubs/client/Constants.java       |  32 -
 .../storm/eventhubs/client/EventHubClient.java  |  92 --
 .../eventhubs/client/EventHubConsumerGroup.java |  72 --
 .../eventhubs/client/EventHubException.java     |  37 -
 .../eventhubs/client/EventHubReceiver.java      | 139 ----
 .../eventhubs/client/EventHubSendClient.java    |  70 --
 .../storm/eventhubs/client/EventHubSender.java  |  95 ---
 .../storm/eventhubs/client/SelectorFilter.java  |  38 -
 .../eventhubs/client/SelectorFilterWriter.java  |  64 --
 .../storm/eventhubs/samples/EventCount.java     |   5 +-
 .../storm/eventhubs/samples/EventHubLoop.java   |   9 +-
 .../eventhubs/spout/EventHubReceiverFilter.java |  56 --
 .../eventhubs/spout/EventHubReceiverImpl.java   |  49 +-
 .../storm/eventhubs/spout/EventHubSpout.java    |   5 +
 .../eventhubs/spout/EventHubSpoutConfig.java    | 126 +--
 .../eventhubs/spout/IEventHubReceiver.java      |   5 +-
 .../spout/IEventHubReceiverFilter.java          |  35 -
 .../eventhubs/spout/SimplePartitionManager.java |  11 +-
 .../spout/StaticPartitionCoordinator.java       |   2 +-
 .../TransactionalTridentEventHubEmitter.java    |   2 +-
 .../trident/TridentPartitionManager.java        |  12 +-
 .../src/main/resources/config.properties        |   5 +-
 .../eventhubs/spout/EventHubReceiverMock.java   |  18 +-
 .../eventhubs/spout/TestEventHubSpout.java      |   4 +-
 pom.xml                                         |   5 +-
 storm-dist/binary/src/main/assembly/binary.xml  |  44 +
 101 files changed, 6424 insertions(+), 1003 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/ea81f96d/CHANGELOG.md
----------------------------------------------------------------------
diff --cc CHANGELOG.md
index 09f2e94,aa390f1..34b2e57
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@@ -1,4 -1,7 +1,5 @@@
 -## 0.11.0
 -
  ## 0.10.0
+  * STORM-842: Drop Support for Java 1.6
   * STORM-835: Netty Client hold batch object until io operation complete
   * STORM-827: Allow AutoTGT to work with storm-hdfs too.
   * STORM-821: Adding connection provider interface to decouple jdbc connector from a single connection pooling implementation.


[33/50] [abbrv] storm git commit: storm-eventhubs improvement

Posted by pt...@apache.org.
storm-eventhubs improvement

EventHubBolt: add an event formatter to format tuples into bytes
Refactor EventHubSpoutConfig
Add support for specifying the consumer group name
Workaround for a Qpid issue where, in rare cases, messages cannot be received

Signed-off-by: Shanyu Zhao <sh...@microsoft.com>


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/e5154928
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/e5154928
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/e5154928

Branch: refs/heads/0.10.x-branch
Commit: e515492864092c16c2ecc86577a3be73ec79fa56
Parents: 847958c
Author: Shanyu Zhao <sh...@microsoft.com>
Authored: Wed May 13 12:26:53 2015 -0700
Committer: Shanyu Zhao <sh...@microsoft.com>
Committed: Wed May 13 12:26:53 2015 -0700

----------------------------------------------------------------------
 external/storm-eventhubs/pom.xml                | 239 +++++++++----------
 .../eventhubs/bolt/DefaultEventDataFormat.java  |  47 ++++
 .../storm/eventhubs/bolt/EventHubBolt.java      | 182 +++++++-------
 .../eventhubs/bolt/EventHubBoltConfig.java      | 107 +++++++++
 .../storm/eventhubs/bolt/IEventDataFormat.java  |  28 +++
 .../storm/eventhubs/client/EventHubClient.java  | 187 ++++++++-------
 .../storm/eventhubs/client/EventHubSender.java  | 194 +++++++--------
 .../storm/eventhubs/samples/EventCount.java     |   5 +-
 .../storm/eventhubs/samples/EventHubLoop.java   | 103 ++++----
 .../eventhubs/spout/EventHubReceiverImpl.java   |  20 +-
 .../storm/eventhubs/spout/EventHubSpout.java    |   5 +
 .../eventhubs/spout/EventHubSpoutConfig.java    | 105 +++++---
 .../src/main/resources/config.properties        |   5 +-
 .../eventhubs/spout/TestEventHubSpout.java      |   4 +-
 14 files changed, 750 insertions(+), 481 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/e5154928/external/storm-eventhubs/pom.xml
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/pom.xml b/external/storm-eventhubs/pom.xml
index 5ed65c7..2ceed09 100755
--- a/external/storm-eventhubs/pom.xml
+++ b/external/storm-eventhubs/pom.xml
@@ -1,122 +1,119 @@
-<?xml version="1.0" encoding="UTF-8"?>
-<!--
- Licensed to the Apache Software Foundation (ASF) under one or more
- contributor license agreements.  See the NOTICE file distributed with
- this work for additional information regarding copyright ownership.
- The ASF licenses this file to You under the Apache License, Version 2.0
- (the "License"); you may not use this file except in compliance with
- the License.  You may obtain a copy of the License at
-
-     http://www.apache.org/licenses/LICENSE-2.0
-
- Unless required by applicable law or agreed to in writing, software
- distributed under the License is distributed on an "AS IS" BASIS,
- WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- See the License for the specific language governing permissions and
- limitations under the License.
--->
-<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
-    <modelVersion>4.0.0</modelVersion>
-    
-    <parent>
-        <artifactId>storm</artifactId>
-        <groupId>org.apache.storm</groupId>
-        <version>0.11.0-SNAPSHOT</version>
-        <relativePath>../../pom.xml</relativePath>
-    </parent>
-    
-    <artifactId>storm-eventhubs</artifactId>
-    <version>0.11.0-SNAPSHOT</version>
-    <packaging>jar</packaging>
-    <name>storm-eventhubs</name>
-    <description>EventHubs Storm Spout</description>
-
-    <properties>
-        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
-        <qpid.version>0.28</qpid.version>
-    </properties>
-    <build>
-        <plugins>
-            <plugin>
-                <groupId>org.apache.maven.plugins</groupId>
-                <artifactId>maven-assembly-plugin</artifactId>
-                <version>2.4.1</version>
-                <executions>
-                    <execution>
-                        <goals>
-                            <goal>attached</goal>
-                        </goals>
-                        <phase>package</phase>
-                        <configuration>
-                            <descriptorRefs>
-                                <descriptorRef>jar-with-dependencies</descriptorRef>
-                            </descriptorRefs>
-                            <archive>
-                                <manifest>
-                                    <mainClass>org.apache.storm.eventhubs.samples.EventCount</mainClass>
-                                </manifest>
-                            </archive>
-                        </configuration>
-                    </execution>
-                </executions>
-            </plugin>
-            <plugin>
-		        <artifactId>maven-antrun-plugin</artifactId>
-		        <executions>
-		          <execution>
-		            <phase>package</phase>
-		            <configuration>
-		              <tasks>
-		                <copy file="src/main/resources/config.properties" tofile="target/eventhubs-config.properties"/>
-                    </tasks>
-		            </configuration>
-		            <goals>
-		              <goal>run</goal>
-		            </goals>
-		          </execution>
-		        </executions>
-	        </plugin>
-        </plugins>
-    </build>
-    <dependencies>
-        <dependency>
-            <groupId>org.apache.qpid</groupId>
-            <artifactId>qpid-client</artifactId>
-            <version>${qpid.version}</version>
-        </dependency>
-        <dependency>
-            <groupId>org.apache.qpid</groupId>
-            <artifactId>qpid-amqp-1-0-client-jms</artifactId>
-            <version>${qpid.version}</version>
-        </dependency>
-        <dependency>
-            <groupId>org.apache.storm</groupId>
-            <artifactId>storm-core</artifactId>
-            <version>${project.version}</version>
-            <!-- keep storm out of the jar-with-dependencies -->
-            <type>jar</type>
-            <scope>provided</scope>
-        </dependency>
-        <dependency>
-            <groupId>org.apache.curator</groupId>
-            <artifactId>curator-framework</artifactId>
-            <version>${curator.version}</version>
-            <exclusions>
-                <exclusion>
-                    <groupId>log4j</groupId>
-                    <artifactId>log4j</artifactId>
-                </exclusion>
-                <exclusion>
-                    <groupId>org.slf4j</groupId>
-                    <artifactId>slf4j-log4j12</artifactId>
-                </exclusion>
-            </exclusions>
-        </dependency>
-        <dependency>
-            <groupId>junit</groupId>
-            <artifactId>junit</artifactId>
-            <version>4.11</version>
-            <scope>test</scope>
-        </dependency>
-    </dependencies> 
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+ Licensed to the Apache Software Foundation (ASF) under one or more
+ contributor license agreements.  See the NOTICE file distributed with
+ this work for additional information regarding copyright ownership.
+ The ASF licenses this file to You under the Apache License, Version 2.0
+ (the "License"); you may not use this file except in compliance with
+ the License.  You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+    <modelVersion>4.0.0</modelVersion>
+    
+    <parent>
+        <artifactId>storm</artifactId>
+        <groupId>org.apache.storm</groupId>
+        <version>0.11.0-SNAPSHOT</version>
+        <relativePath>../../pom.xml</relativePath>
+    </parent>
+    
+    <artifactId>storm-eventhubs</artifactId>
+    <version>0.11.0-SNAPSHOT</version>
+    <packaging>jar</packaging>
+    <name>storm-eventhubs</name>
+    <description>EventHubs Storm Spout</description>
+
+    <properties>
+        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
+        <qpid.version>0.32</qpid.version>
+    </properties>
+    <build>
+        <plugins>
+            <plugin>
+                <groupId>org.apache.maven.plugins</groupId>
+                <artifactId>maven-shade-plugin</artifactId>
+                <version>2.3</version>
+                <executions>
+                    <execution>
+                        <goals>
+                            <goal>shade</goal>
+                        </goals>
+                        <phase>package</phase>
+                    </execution>
+                </executions>
+                <configuration>
+                    <transformers>
+                        <transformer implementation="org.apache.maven.plugins.shade.resource.ApacheLicenseResourceTransformer">
+                        </transformer>
+                    </transformers>
+                    <outputFile>target/${project.artifactId}-${project.version}-jar-with-dependencies.jar</outputFile>
+                </configuration>
+	        </plugin>
+            <plugin>
+		        <artifactId>maven-antrun-plugin</artifactId>
+		        <executions>
+		          <execution>
+		            <phase>package</phase>
+		            <configuration>
+		              <tasks>
+		                <copy file="src/main/resources/config.properties" tofile="target/eventhubs-config.properties"/>
+                    </tasks>
+		            </configuration>
+		            <goals>
+		              <goal>run</goal>
+		            </goals>
+		          </execution>
+		        </executions>
+	        </plugin>
+        </plugins>
+    </build>
+    <dependencies>
+        <dependency>
+            <groupId>org.apache.qpid</groupId>
+            <artifactId>qpid-client</artifactId>
+            <version>${qpid.version}</version>
+        </dependency>
+        <dependency>
+            <groupId>org.apache.qpid</groupId>
+            <artifactId>qpid-amqp-1-0-client-jms</artifactId>
+            <version>${qpid.version}</version>
+        </dependency>
+        <dependency>
+            <groupId>org.apache.storm</groupId>
+            <artifactId>storm-core</artifactId>
+            <version>${project.version}</version>
+            <!-- keep storm out of the jar-with-dependencies -->
+            <type>jar</type>
+            <scope>provided</scope>
+        </dependency>
+        <dependency>
+            <groupId>org.apache.curator</groupId>
+            <artifactId>curator-framework</artifactId>
+            <version>${curator.version}</version>
+            <exclusions>
+                <exclusion>
+                    <groupId>log4j</groupId>
+                    <artifactId>log4j</artifactId>
+                </exclusion>
+                <exclusion>
+                    <groupId>org.slf4j</groupId>
+                    <artifactId>slf4j-log4j12</artifactId>
+                </exclusion>
+            </exclusions>
+        </dependency>
+        <dependency>
+            <groupId>junit</groupId>
+            <artifactId>junit</artifactId>
+            <version>4.11</version>
+            <scope>test</scope>
+        </dependency>
+    </dependencies> 
 </project>
\ No newline at end of file

http://git-wip-us.apache.org/repos/asf/storm/blob/e5154928/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/DefaultEventDataFormat.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/DefaultEventDataFormat.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/DefaultEventDataFormat.java
new file mode 100644
index 0000000..1bd8288
--- /dev/null
+++ b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/DefaultEventDataFormat.java
@@ -0,0 +1,47 @@
+/*******************************************************************************
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *******************************************************************************/
+package org.apache.storm.eventhubs.bolt;
+
+import backtype.storm.tuple.Tuple;
+
+/**
+ * A default implementation of IEventDataFormat that converts the tuple
+ * into a delimited string.
+ */
+public class DefaultEventDataFormat implements IEventDataFormat {
+  private static final long serialVersionUID = 1L;
+  private String delimiter = ",";
+  
+  public DefaultEventDataFormat withFieldDelimiter(String delimiter) {
+    this.delimiter = delimiter;
+    return this;
+  }
+
+  @Override
+  public byte[] serialize(Tuple tuple) {
+    StringBuilder sb = new StringBuilder();
+    for(Object obj : tuple.getValues()) {
+      if(sb.length() != 0) {
+        sb.append(delimiter);
+      }
+      sb.append(obj.toString());
+    }
+    return sb.toString().getBytes();
+  }
+
+}

http://git-wip-us.apache.org/repos/asf/storm/blob/e5154928/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/EventHubBolt.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/EventHubBolt.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/EventHubBolt.java
index 8016be3..09f90b1 100755
--- a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/EventHubBolt.java
+++ b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/EventHubBolt.java
@@ -1,81 +1,101 @@
-/*******************************************************************************
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- *******************************************************************************/
-package org.apache.storm.eventhubs.bolt;
-
-import java.util.Map;
-
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
-import org.apache.storm.eventhubs.client.EventHubClient;
-import org.apache.storm.eventhubs.client.EventHubException;
-import org.apache.storm.eventhubs.client.EventHubSender;
-
-import backtype.storm.task.TopologyContext;
-import backtype.storm.topology.BasicOutputCollector;
-import backtype.storm.topology.OutputFieldsDeclarer;
-import backtype.storm.topology.base.BaseBasicBolt;
-import backtype.storm.tuple.Tuple;
-
-/**
- * A bolt that writes message to EventHub.
- * We assume the incoming tuple has only one field which is a string.
- */
-public class EventHubBolt extends BaseBasicBolt {
-  private static final long serialVersionUID = 1L;
-  private static final Logger logger = LoggerFactory
-      .getLogger(EventHubBolt.class);
-  
-  private EventHubSender sender;
-  private String connectionString;
-  private String entityPath;
-  
-  public EventHubBolt(String connectionString, String entityPath) {
-    this.connectionString = connectionString;
-    this.entityPath = entityPath;
-  }
-  
-  @Override
-  public void prepare(Map config, TopologyContext context) {
-    try {
-      EventHubClient eventHubClient = EventHubClient.create(connectionString, entityPath);
-      sender = eventHubClient.createPartitionSender(null);
-    }
-    catch(Exception ex) {
-      logger.error(ex.getMessage());
-      throw new RuntimeException(ex);
-    }
-
-  }
-
-  @Override
-  public void execute(Tuple tuple, BasicOutputCollector collector) {
-    try {
-      sender.send((String)tuple.getValue(0));
-    }
-    catch(EventHubException ex) {
-      logger.error(ex.getMessage());
-    }
-  }
-
-  @Override
-  public void declareOutputFields(OutputFieldsDeclarer declarer) {
-    
-  }
-
-}
+/*******************************************************************************
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *******************************************************************************/
+package org.apache.storm.eventhubs.bolt;
+
+import java.util.Map;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import org.apache.storm.eventhubs.client.EventHubClient;
+import org.apache.storm.eventhubs.client.EventHubException;
+import org.apache.storm.eventhubs.client.EventHubSender;
+
+import backtype.storm.task.OutputCollector;
+import backtype.storm.task.TopologyContext;
+import backtype.storm.topology.OutputFieldsDeclarer;
+import backtype.storm.topology.base.BaseRichBolt;
+import backtype.storm.tuple.Tuple;
+
+/**
+ * A bolt that writes event messages to EventHub.
+ */
+public class EventHubBolt extends BaseRichBolt {
+  private static final long serialVersionUID = 1L;
+  private static final Logger logger = LoggerFactory
+      .getLogger(EventHubBolt.class);
+  
+  protected OutputCollector collector;
+  protected EventHubSender sender;
+  protected EventHubBoltConfig boltConfig;
+  
+  
+  public EventHubBolt(String connectionString, String entityPath) {
+    boltConfig = new EventHubBoltConfig(connectionString, entityPath);
+  }
+
+  public EventHubBolt(String userName, String password, String namespace,
+      String entityPath, boolean partitionMode) {
+    boltConfig = new EventHubBoltConfig(userName, password, namespace,
+        entityPath, partitionMode);
+  }
+  
+  public EventHubBolt(EventHubBoltConfig config) {
+    boltConfig = config;
+  }
+
+  @Override
+  public void prepare(Map config, TopologyContext context, OutputCollector collector) {
+    this.collector = collector;
+    String myPartitionId = null;
+    if(boltConfig.getPartitionMode()) {
+      //We can use the task index (starting from 0) as the partition ID
+      myPartitionId = "" + context.getThisTaskIndex();
+    }
+    logger.info("creating sender: " + boltConfig.getConnectionString()
+        + ", " + boltConfig.getEntityPath() + ", " + myPartitionId);
+    try {
+      EventHubClient eventHubClient = EventHubClient.create(
+          boltConfig.getConnectionString(), boltConfig.getEntityPath());
+      sender = eventHubClient.createPartitionSender(myPartitionId);
+    }
+    catch(Exception ex) {
+      logger.error(ex.getMessage());
+      throw new RuntimeException(ex);
+    }
+
+  }
+
+  @Override
+  public void execute(Tuple tuple) {
+    try {
+      sender.send(boltConfig.getEventDataFormat().serialize(tuple));
+      collector.ack(tuple);
+    }
+    catch(EventHubException ex) {
+      logger.error(ex.getMessage());
+      collector.fail(tuple);
+    }
+  }
+
+  @Override
+  public void declareOutputFields(OutputFieldsDeclarer declarer) {
+    
+  }
+
+}
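
The new config-driven constructor makes it straightforward to drop the bolt into a topology. Below is a minimal sketch built only from the constructors and classes visible in the diffs in this commit; the class name `EventHubSinkExample`, the component ids, and the parallelism hints are illustrative, and the spout is supplied by the caller:

```java
import backtype.storm.generated.StormTopology;
import backtype.storm.topology.IRichSpout;
import backtype.storm.topology.TopologyBuilder;

import org.apache.storm.eventhubs.bolt.DefaultEventDataFormat;
import org.apache.storm.eventhubs.bolt.EventHubBolt;
import org.apache.storm.eventhubs.bolt.EventHubBoltConfig;

public class EventHubSinkExample {

  // Wires an arbitrary spout to an EventHubBolt sink.
  // connectionString example: amqps://user:password@mynamespace.servicebus.windows.net
  public static StormTopology build(IRichSpout eventSpout,
                                    String connectionString,
                                    String entityPath,
                                    int partitionCount) {
    // partitionMode=true: run one bolt task per EventHubs partition; EventHubBolt.prepare()
    // uses the task index as the partition id. The formatter joins tuple fields with "|".
    EventHubBoltConfig boltConfig = new EventHubBoltConfig(
        connectionString, entityPath, true,
        new DefaultEventDataFormat().withFieldDelimiter("|"));

    TopologyBuilder builder = new TopologyBuilder();
    builder.setSpout("events", eventSpout, 1);
    builder.setBolt("eventhub-sink", new EventHubBolt(boltConfig), partitionCount)
           .shuffleGrouping("events");
    return builder.createTopology();
  }
}
```

Because the bolt now extends `BaseRichBolt` and acks or fails each tuple itself, failed sends are reported back to the spout when tuples are anchored.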

http://git-wip-us.apache.org/repos/asf/storm/blob/e5154928/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/EventHubBoltConfig.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/EventHubBoltConfig.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/EventHubBoltConfig.java
new file mode 100644
index 0000000..909e8ac
--- /dev/null
+++ b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/EventHubBoltConfig.java
@@ -0,0 +1,107 @@
+/*******************************************************************************
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *******************************************************************************/
+package org.apache.storm.eventhubs.bolt;
+
+import java.io.Serializable;
+
+import org.apache.storm.eventhubs.spout.EventHubSpoutConfig;
+
+/*
+ * EventHubs bolt configurations
+ *
+ * Partition mode:
+ * With partitionMode=true you need to create the same number of tasks as the number of 
+ * EventHubs partitions, and each bolt task will only send data to one partition.
+ * The partition ID is the task ID of the bolt.
+ * 
+ * Event format:
+ * The formatter to convert tuple to bytes for EventHubs.
+ * If null, the default format is comma-delimited tuple fields.
+ */
+public class EventHubBoltConfig implements Serializable {
+  private static final long serialVersionUID = 1L;
+  
+  private String connectionString;
+  private final String entityPath;
+  protected boolean partitionMode;
+  protected IEventDataFormat dataFormat;
+  
+  public EventHubBoltConfig(String connectionString, String entityPath) {
+    this(connectionString, entityPath, false, null);
+  }
+  
+  public EventHubBoltConfig(String connectionString, String entityPath,
+      boolean partitionMode) {
+    this(connectionString, entityPath, partitionMode, null);
+  }
+  
+  public EventHubBoltConfig(String userName, String password, String namespace,
+      String entityPath, boolean partitionMode) {
+    this(userName, password, namespace,
+        EventHubSpoutConfig.EH_SERVICE_FQDN_SUFFIX, entityPath, partitionMode);
+  }
+  
+  public EventHubBoltConfig(String connectionString, String entityPath,
+      boolean partitionMode, IEventDataFormat dataFormat) {
+    this.connectionString = connectionString;
+    this.entityPath = entityPath;
+    this.partitionMode = partitionMode;
+    this.dataFormat = dataFormat;
+    if(this.dataFormat == null) {
+      this.dataFormat = new DefaultEventDataFormat();
+    }
+  }
+  
+  public EventHubBoltConfig(String userName, String password, String namespace,
+      String targetFqnAddress, String entityPath) {
+    this(userName, password, namespace, targetFqnAddress, entityPath, false, null);
+  }
+  
+  public EventHubBoltConfig(String userName, String password, String namespace,
+      String targetFqnAddress, String entityPath, boolean partitionMode) {
+    this(userName, password, namespace, targetFqnAddress, entityPath, partitionMode, null);
+  }
+  
+  public EventHubBoltConfig(String userName, String password, String namespace,
+      String targetFqnAddress, String entityPath, boolean partitionMode,
+      IEventDataFormat dataFormat) {
+    this.connectionString = EventHubSpoutConfig.buildConnectionString(userName, password, namespace, targetFqnAddress);
+    this.entityPath = entityPath;
+    this.partitionMode = partitionMode;
+    this.dataFormat = dataFormat;
+    if(this.dataFormat == null) {
+      this.dataFormat = new DefaultEventDataFormat();
+    }
+  }
+  
+  public String getConnectionString() {
+    return connectionString;
+  }
+  
+  public String getEntityPath() {
+    return entityPath;
+  }
+  
+  public boolean getPartitionMode() {
+    return partitionMode;
+  }
+  
+  public IEventDataFormat getEventDataFormat() {
+    return dataFormat;
+  }
+}

http://git-wip-us.apache.org/repos/asf/storm/blob/e5154928/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/IEventDataFormat.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/IEventDataFormat.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/IEventDataFormat.java
new file mode 100644
index 0000000..cb05c0f
--- /dev/null
+++ b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/IEventDataFormat.java
@@ -0,0 +1,28 @@
+/*******************************************************************************
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *******************************************************************************/
+package org.apache.storm.eventhubs.bolt;
+
+import java.io.Serializable;
+import backtype.storm.tuple.Tuple;
+
+/**
+ * Serialize a tuple to a byte array to be sent to EventHubs
+ */
+public interface IEventDataFormat extends Serializable {
+  public byte[] serialize(Tuple tuple);
+}
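
Anything that can turn a tuple into a byte array can be plugged in through this interface. Below is a hedged sketch of a custom implementation; the class name `KeyValueEventDataFormat` is hypothetical, and only the `Tuple` accessors it calls come from the Storm API:

```java
import java.nio.charset.StandardCharsets;

import backtype.storm.tuple.Tuple;
import org.apache.storm.eventhubs.bolt.IEventDataFormat;

// Hypothetical formatter that emits "field=value" pairs separated by semicolons.
public class KeyValueEventDataFormat implements IEventDataFormat {
  private static final long serialVersionUID = 1L;

  @Override
  public byte[] serialize(Tuple tuple) {
    StringBuilder sb = new StringBuilder();
    for (String field : tuple.getFields()) {
      if (sb.length() > 0) {
        sb.append(';');
      }
      sb.append(field).append('=').append(tuple.getValueByField(field));
    }
    return sb.toString().getBytes(StandardCharsets.UTF_8);
  }
}
```

An instance can be passed to the `EventHubBoltConfig` constructors above that take an `IEventDataFormat`; when none is supplied, `DefaultEventDataFormat` (comma-delimited fields) is used.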

http://git-wip-us.apache.org/repos/asf/storm/blob/e5154928/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubClient.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubClient.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubClient.java
index e06091d..2afe5b4 100755
--- a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubClient.java
+++ b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubClient.java
@@ -1,92 +1,95 @@
-/*******************************************************************************
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- *******************************************************************************/
-package org.apache.storm.eventhubs.client;
-
-import org.apache.qpid.amqp_1_0.client.Connection;
-import org.apache.qpid.amqp_1_0.client.ConnectionErrorException;
-import org.apache.qpid.amqp_1_0.client.ConnectionException;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
-public class EventHubClient {
-
-  private static final String DefaultConsumerGroupName = "$default";
-  private static final Logger logger = LoggerFactory.getLogger(EventHubClient.class);
-  private static final long ConnectionSyncTimeout = 60000L;
-
-  private final String connectionString;
-  private final String entityPath;
-  private final Connection connection;
-
-  private EventHubClient(String connectionString, String entityPath) throws EventHubException {
-    this.connectionString = connectionString;
-    this.entityPath = entityPath;
-    this.connection = this.createConnection();
-  }
-
-  /**
-   * creates a new instance of EventHubClient using the supplied connection string and entity path.
-   *
-   * @param connectionString connection string to the namespace of event hubs. connection string format:
-   * amqps://{userId}:{password}@{namespaceName}.servicebus.windows.net
-   * @param entityPath the name of event hub entity.
-   *
-   * @return EventHubClient
-   * @throws org.apache.storm.eventhubs.client.EventHubException
-   */
-  public static EventHubClient create(String connectionString, String entityPath) throws EventHubException {
-    return new EventHubClient(connectionString, entityPath);
-  }
-
-  public EventHubSender createPartitionSender(String partitionId) throws Exception {
-    return new EventHubSender(this.connection.createSession(), this.entityPath, partitionId);
-  }
-
-  public EventHubConsumerGroup getDefaultConsumerGroup() {
-    return new EventHubConsumerGroup(this.connection, this.entityPath, DefaultConsumerGroupName);
-  }
-
-  public void close() {
-    try {
-      this.connection.close();
-    } catch (ConnectionErrorException e) {
-      logger.error(e.toString());
-    }
-  }
-
-  private Connection createConnection() throws EventHubException {
-    ConnectionStringBuilder connectionStringBuilder = new ConnectionStringBuilder(this.connectionString);
-    Connection clientConnection;
-
-    try {
-      clientConnection = new Connection(
-        connectionStringBuilder.getHost(),
-        connectionStringBuilder.getPort(),
-        connectionStringBuilder.getUserName(),
-        connectionStringBuilder.getPassword(),
-        connectionStringBuilder.getHost(),
-        connectionStringBuilder.getSsl());
-    } catch (ConnectionException e) {
-      logger.error(e.toString());
-      throw new EventHubException(e);
-    }
-    clientConnection.getEndpoint().setSyncTimeout(ConnectionSyncTimeout);
-    SelectorFilterWriter.register(clientConnection.getEndpoint().getDescribedTypeRegistry());
-    return clientConnection;
-  }
-}
+/*******************************************************************************
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *******************************************************************************/
+package org.apache.storm.eventhubs.client;
+
+import org.apache.qpid.amqp_1_0.client.Connection;
+import org.apache.qpid.amqp_1_0.client.ConnectionErrorException;
+import org.apache.qpid.amqp_1_0.client.ConnectionException;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+public class EventHubClient {
+
+  private static final String DefaultConsumerGroupName = "$default";
+  private static final Logger logger = LoggerFactory.getLogger(EventHubClient.class);
+  private static final long ConnectionSyncTimeout = 60000L;
+
+  private final String connectionString;
+  private final String entityPath;
+  private final Connection connection;
+
+  private EventHubClient(String connectionString, String entityPath) throws EventHubException {
+    this.connectionString = connectionString;
+    this.entityPath = entityPath;
+    this.connection = this.createConnection();
+  }
+
+  /**
+   * creates a new instance of EventHubClient using the supplied connection string and entity path.
+   *
+   * @param connectionString connection string to the namespace of event hubs. connection string format:
+   * amqps://{userId}:{password}@{namespaceName}.servicebus.windows.net
+   * @param entityPath the name of event hub entity.
+   *
+   * @return EventHubClient
+   * @throws org.apache.storm.eventhubs.client.EventHubException
+   */
+  public static EventHubClient create(String connectionString, String entityPath) throws EventHubException {
+    return new EventHubClient(connectionString, entityPath);
+  }
+
+  public EventHubSender createPartitionSender(String partitionId) throws Exception {
+    return new EventHubSender(this.connection.createSession(), this.entityPath, partitionId);
+  }
+
+  public EventHubConsumerGroup getConsumerGroup(String cgName) {
+    if(cgName == null || cgName.length() == 0) {
+      cgName = DefaultConsumerGroupName;
+    }
+    return new EventHubConsumerGroup(connection, entityPath, cgName);
+  }
+
+  public void close() {
+    try {
+      this.connection.close();
+    } catch (ConnectionErrorException e) {
+      logger.error(e.toString());
+    }
+  }
+
+  private Connection createConnection() throws EventHubException {
+    ConnectionStringBuilder connectionStringBuilder = new ConnectionStringBuilder(this.connectionString);
+    Connection clientConnection;
+
+    try {
+      clientConnection = new Connection(
+        connectionStringBuilder.getHost(),
+        connectionStringBuilder.getPort(),
+        connectionStringBuilder.getUserName(),
+        connectionStringBuilder.getPassword(),
+        connectionStringBuilder.getHost(),
+        connectionStringBuilder.getSsl());
+    } catch (ConnectionException e) {
+      logger.error(e.toString());
+      throw new EventHubException(e);
+    }
+    clientConnection.getEndpoint().setSyncTimeout(ConnectionSyncTimeout);
+    SelectorFilterWriter.register(clientConnection.getEndpoint().getDescribedTypeRegistry());
+    return clientConnection;
+  }
+}
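
The change above replaces getDefaultConsumerGroup() with getConsumerGroup(String), which falls back to the "$default" group when passed null or an empty string. A minimal sketch of the new call pattern (connection string, entity path and consumer group name are placeholders, not values from this commit, and EventHubConsumerGroup is assumed to live in the same client package):

    import org.apache.storm.eventhubs.client.EventHubClient;
    import org.apache.storm.eventhubs.client.EventHubConsumerGroup;

    public class ConsumerGroupSketch {
      public static void main(String[] args) throws Exception {
        EventHubClient client = EventHubClient.create(
            "amqps://user:secret@mynamespace.servicebus.windows.net", "myeventhub");
        // null or "" selects the "$default" consumer group
        EventHubConsumerGroup group = client.getConsumerGroup("myconsumergroup");
        System.out.println("Consumer group handle: " + group);
        client.close();
      }
    }

Creating the client opens the underlying AMQP connection, so close() should always be called, even in a short-lived tool like this sketch.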

http://git-wip-us.apache.org/repos/asf/storm/blob/e5154928/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubSender.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubSender.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubSender.java
index 41b1d97..7c45578 100755
--- a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubSender.java
+++ b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubSender.java
@@ -1,95 +1,99 @@
-/*******************************************************************************
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- *******************************************************************************/
-package org.apache.storm.eventhubs.client;
-
-import java.util.concurrent.TimeoutException;
-import org.apache.qpid.amqp_1_0.client.LinkDetachedException;
-import org.apache.qpid.amqp_1_0.client.Message;
-import org.apache.qpid.amqp_1_0.client.Sender;
-import org.apache.qpid.amqp_1_0.client.Session;
-import org.apache.qpid.amqp_1_0.type.Binary;
-import org.apache.qpid.amqp_1_0.type.messaging.Data;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
-public class EventHubSender {
-
-  private static final Logger logger = LoggerFactory.getLogger(EventHubSender.class);
-
-  private final Session session;
-  private final String entityPath;
-  private final String partitionId;
-  private final String destinationAddress;
-
-  private Sender sender;
-
-  public EventHubSender(Session session, String entityPath, String partitionId) {
-    this.session = session;
-    this.entityPath = entityPath;
-    this.partitionId = partitionId;
-    this.destinationAddress = this.getDestinationAddress();
-  }
-
-  public void send(String data) throws EventHubException {
-    try {
-      if (this.sender == null) {
-        this.ensureSenderCreated();
-      }
-
-      //For interop with other language, convert string to bytes
-      Binary bin = new Binary(data.getBytes());
-      Message message = new Message(new Data(bin));
-      this.sender.send(message);
-
-    } catch (LinkDetachedException e) {
-      logger.error(e.getMessage());
-
-      EventHubException eventHubException = new EventHubException("Sender has been closed");
-      throw eventHubException;
-    } catch (TimeoutException e) {
-      logger.error(e.getMessage());
-
-      EventHubException eventHubException = new EventHubException("Timed out while waiting to get credit to send");
-      throw eventHubException;
-    } catch (Exception e) {
-      logger.error(e.getMessage());
-    }
-  }
-
-  public void close() {
-    try {
-      this.sender.close();
-    } catch (Sender.SenderClosingException e) {
-      logger.error("Closing a sender encountered error: " + e.getMessage());
-    }
-  }
-
-  private String getDestinationAddress() {
-    if (this.partitionId == null || this.partitionId.equals("")) {
-      return this.entityPath;
-    } else {
-      return String.format(Constants.DestinationAddressFormatString, this.entityPath, this.partitionId);
-    }
-  }
-
-  private synchronized void ensureSenderCreated() throws Exception {
-    if (this.sender == null) {
-      this.sender = this.session.createSender(this.destinationAddress);
-    }
-  }
-}
+/*******************************************************************************
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *******************************************************************************/
+package org.apache.storm.eventhubs.client;
+
+import java.util.concurrent.TimeoutException;
+import org.apache.qpid.amqp_1_0.client.LinkDetachedException;
+import org.apache.qpid.amqp_1_0.client.Message;
+import org.apache.qpid.amqp_1_0.client.Sender;
+import org.apache.qpid.amqp_1_0.client.Session;
+import org.apache.qpid.amqp_1_0.type.Binary;
+import org.apache.qpid.amqp_1_0.type.messaging.Data;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+public class EventHubSender {
+
+  private static final Logger logger = LoggerFactory.getLogger(EventHubSender.class);
+
+  private final Session session;
+  private final String entityPath;
+  private final String partitionId;
+  private final String destinationAddress;
+
+  private Sender sender;
+
+  public EventHubSender(Session session, String entityPath, String partitionId) {
+    this.session = session;
+    this.entityPath = entityPath;
+    this.partitionId = partitionId;
+    this.destinationAddress = this.getDestinationAddress();
+  }
+  
+  public void send(byte[] data) throws EventHubException {
+    try {
+      if (this.sender == null) {
+        this.ensureSenderCreated();
+      }
+
+      Binary bin = new Binary(data);
+      Message message = new Message(new Data(bin));
+      this.sender.send(message);
+
+    } catch (LinkDetachedException e) {
+      logger.error(e.getMessage());
+
+      EventHubException eventHubException = new EventHubException("Sender has been closed");
+      throw eventHubException;
+    } catch (TimeoutException e) {
+      logger.error(e.getMessage());
+
+      EventHubException eventHubException = new EventHubException("Timed out while waiting to get credit to send");
+      throw eventHubException;
+    } catch (Exception e) {
+      logger.error(e.getMessage());
+    }
+  }
+
+  public void send(String data) throws EventHubException {
+    //For interop with other languages, convert the string to bytes
+    send(data.getBytes());
+  }
+
+  public void close() {
+    try {
+      this.sender.close();
+    } catch (Sender.SenderClosingException e) {
+      logger.error("Closing a sender encountered error: " + e.getMessage());
+    }
+  }
+
+  private String getDestinationAddress() {
+    if (this.partitionId == null || this.partitionId.equals("")) {
+      return this.entityPath;
+    } else {
+      return String.format(Constants.DestinationAddressFormatString, this.entityPath, this.partitionId);
+    }
+  }
+
+  private synchronized void ensureSenderCreated() throws Exception {
+    if (this.sender == null) {
+      this.sender = this.session.createSender(this.destinationAddress);
+    }
+  }
+}
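
The new send(byte[]) overload accepts raw payloads directly, and send(String) now simply delegates to it. A hedged usage sketch building on the client API from the previous file (the partition id and payload are placeholders):

    import org.apache.storm.eventhubs.client.EventHubClient;
    import org.apache.storm.eventhubs.client.EventHubSender;

    public class SenderSketch {
      public static void main(String[] args) throws Exception {
        EventHubClient client = EventHubClient.create(
            "amqps://user:secret@mynamespace.servicebus.windows.net", "myeventhub");
        EventHubSender sender = client.createPartitionSender("0");
        sender.send("hello storm".getBytes()); // raw bytes, new in this change
        sender.send("hello storm");            // String overload, converted to bytes internally
        sender.close();
        client.close();
      }
    }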

http://git-wip-us.apache.org/repos/asf/storm/blob/e5154928/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/samples/EventCount.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/samples/EventCount.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/samples/EventCount.java
index dd53e42..94fdb49 100755
--- a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/samples/EventCount.java
+++ b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/samples/EventCount.java
@@ -77,6 +77,7 @@ public class EventCount {
     if(enqueueTimeDiff != 0) {
       enqueueTimeFilter = System.currentTimeMillis() - enqueueTimeDiff*1000;
     }
+    String consumerGroupName = properties.getProperty("eventhubspout.consumer.group.name");
     
     System.out.println("Eventhub spout config: ");
     System.out.println("  partition count: " + partitionCount);
@@ -84,12 +85,14 @@ public class EventCount {
     System.out.println("  receiver credits: " + receiverCredits);
     spoutConfig = new EventHubSpoutConfig(username, password,
       namespaceName, entityPath, partitionCount, zkEndpointAddress,
-      checkpointIntervalInSeconds, receiverCredits, maxPendingMsgsPerPartition, enqueueTimeFilter);
+      checkpointIntervalInSeconds, receiverCredits, maxPendingMsgsPerPartition,
+      enqueueTimeFilter);
 
     if(targetFqnAddress != null)
     {
       spoutConfig.setTargetAddress(targetFqnAddress);      
     }
+    spoutConfig.setConsumerGroupName(consumerGroupName);
 
     //set the number of workers to be the same as partition number.
     //the idea is to have a spout and a partial count bolt co-exist in one
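
Because Properties.getProperty() returns null for a missing key, and setConsumerGroupName(null) keeps the default consumer group (see EventHubSpoutConfig below), leaving the new property commented out in config.properties preserves the old behavior. A small sketch of that flow, assuming the sample's usual properties-file loading (the file name is a placeholder):

    // Fragment: runs wherever the sample loads its properties file.
    Properties properties = new Properties();
    properties.load(new FileInputStream("config.properties"));
    // null when the property is absent or commented out
    String consumerGroupName = properties.getProperty("eventhubspout.consumer.group.name");
    spoutConfig.setConsumerGroupName(consumerGroupName); // null falls back to "$default"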

http://git-wip-us.apache.org/repos/asf/storm/blob/e5154928/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/samples/EventHubLoop.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/samples/EventHubLoop.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/samples/EventHubLoop.java
index cae0573..c908f9d 100755
--- a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/samples/EventHubLoop.java
+++ b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/samples/EventHubLoop.java
@@ -1,51 +1,52 @@
-/*******************************************************************************
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- *******************************************************************************/
-package org.apache.storm.eventhubs.samples;
-
-import backtype.storm.generated.StormTopology;
-import backtype.storm.topology.TopologyBuilder;
-
-import org.apache.storm.eventhubs.bolt.EventHubBolt;
-import org.apache.storm.eventhubs.spout.EventHubSpout;
-
-/**
- * A sample topology that loops message back to EventHub
- */
-public class EventHubLoop extends EventCount {
-
-  @Override
-  protected StormTopology buildTopology(EventHubSpout eventHubSpout) {
-    TopologyBuilder topologyBuilder = new TopologyBuilder();
-
-    topologyBuilder.setSpout("EventHubsSpout", eventHubSpout, spoutConfig.getPartitionCount())
-      .setNumTasks(spoutConfig.getPartitionCount());
-    
-    EventHubBolt eventHubBolt = new EventHubBolt(spoutConfig.getConnectionString(),
-        spoutConfig.getEntityPath());
-    //For every spout, let's create multiple bolts because send is much slower
-    int boltTasks = spoutConfig.getPartitionCount() * 50;
-    topologyBuilder.setBolt("EventHubsBolt", eventHubBolt, boltTasks)
-      .localOrShuffleGrouping("EventHubsSpout").setNumTasks(boltTasks);
-    return topologyBuilder.createTopology();
-  }
-  
-  public static void main(String[] args) throws Exception {
-    EventHubLoop scenario = new EventHubLoop();
-    scenario.runScenario(args);
-  }
-}
+/*******************************************************************************
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *******************************************************************************/
+package org.apache.storm.eventhubs.samples;
+
+import backtype.storm.generated.StormTopology;
+import backtype.storm.topology.TopologyBuilder;
+
+import org.apache.storm.eventhubs.bolt.EventHubBolt;
+import org.apache.storm.eventhubs.bolt.EventHubBoltConfig;
+import org.apache.storm.eventhubs.spout.EventHubSpout;
+
+/**
+ * A sample topology that loops messages back to EventHub
+ */
+public class EventHubLoop extends EventCount {
+
+  @Override
+  protected StormTopology buildTopology(EventHubSpout eventHubSpout) {
+    TopologyBuilder topologyBuilder = new TopologyBuilder();
+
+    topologyBuilder.setSpout("EventHubsSpout", eventHubSpout, spoutConfig.getPartitionCount())
+      .setNumTasks(spoutConfig.getPartitionCount());
+    EventHubBoltConfig boltConfig = new EventHubBoltConfig(spoutConfig.getConnectionString(),
+        spoutConfig.getEntityPath(), true);
+    
+    EventHubBolt eventHubBolt = new EventHubBolt(boltConfig);
+    int boltTasks = spoutConfig.getPartitionCount();
+    topologyBuilder.setBolt("EventHubsBolt", eventHubBolt, boltTasks)
+      .localOrShuffleGrouping("EventHubsSpout").setNumTasks(boltTasks);
+    return topologyBuilder.createTopology();
+  }
+  
+  public static void main(String[] args) throws Exception {
+    EventHubLoop scenario = new EventHubLoop();
+    scenario.runScenario(args);
+  }
+}

http://git-wip-us.apache.org/repos/asf/storm/blob/e5154928/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/EventHubReceiverImpl.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/EventHubReceiverImpl.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/EventHubReceiverImpl.java
index 5600873..68302af 100755
--- a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/EventHubReceiverImpl.java
+++ b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/EventHubReceiverImpl.java
@@ -46,6 +46,7 @@ public class EventHubReceiverImpl implements IEventHubReceiver {
   private final String entityName;
   private final String partitionId;
   private final int defaultCredits;
+  private final String consumerGroupName;
 
   private EventHubReceiver receiver;
   private String lastOffset = null;
@@ -58,6 +59,7 @@ public class EventHubReceiverImpl implements IEventHubReceiver {
     this.entityName = config.getEntityPath();
     this.defaultCredits = config.getReceiverCredits();
     this.partitionId = partitionId;
+    this.consumerGroupName = config.getConsumerGroupName();
     receiveApiLatencyMean = new ReducedMetric(new MeanReducer());
     receiveApiCallCount = new CountMetric();
     receiveMessageCount = new CountMetric();
@@ -70,14 +72,20 @@ public class EventHubReceiverImpl implements IEventHubReceiver {
     long start = System.currentTimeMillis();
     EventHubClient eventHubClient = EventHubClient.create(connectionString, entityName);
     if(filter.getOffset() != null) {
-      receiver = eventHubClient.getDefaultConsumerGroup().createReceiver(partitionId, filter.getOffset(), defaultCredits);
+      receiver = eventHubClient
+          .getConsumerGroup(consumerGroupName)
+          .createReceiver(partitionId, filter.getOffset(), defaultCredits);
     }
     else if(filter.getEnqueueTime() != 0) {
-      receiver = eventHubClient.getDefaultConsumerGroup().createReceiver(partitionId, filter.getEnqueueTime(), defaultCredits);
+      receiver = eventHubClient
+          .getConsumerGroup(consumerGroupName)
+          .createReceiver(partitionId, filter.getEnqueueTime(), defaultCredits);
     }
     else {
       logger.error("Invalid IEventHubReceiverFilter, use default offset as filter");
-      receiver = eventHubClient.getDefaultConsumerGroup().createReceiver(partitionId, Constants.DefaultStartingOffset, defaultCredits);
+      receiver = eventHubClient
+          .getConsumerGroup(consumerGroupName)
+          .createReceiver(partitionId, Constants.DefaultStartingOffset, defaultCredits);
     }
     long end = System.currentTimeMillis();
     logger.info("created eventhub receiver, time taken(ms): " + (end-start));
@@ -107,6 +115,12 @@ public class EventHubReceiverImpl implements IEventHubReceiver {
     receiveApiCallCount.incr();
 
     if (message == null) {
+      //Temporary workaround for AMQP/EH bug of failing to receive messages
+      if(timeoutInMilliseconds > 100 && millis < timeoutInMilliseconds/2) {
+        throw new RuntimeException(
+            "Restart EventHubSpout due to failure of receiving messages in "
+            + millis + " millisecond");
+      }
       return null;
     }
     receiveMessageCount.incr();

http://git-wip-us.apache.org/repos/asf/storm/blob/e5154928/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/EventHubSpout.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/EventHubSpout.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/EventHubSpout.java
index 9290e6e..d08ec3a 100755
--- a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/EventHubSpout.java
+++ b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/EventHubSpout.java
@@ -50,6 +50,11 @@ public class EventHubSpout extends BaseRichSpout {
   private long lastCheckpointTime;
   private int currentPartitionIndex = -1;
 
+  public EventHubSpout(String username, String password, String namespace,
+      String entityPath, int partitionCount) {
+    this(new EventHubSpoutConfig(username, password, namespace, entityPath, partitionCount));
+  }
+
   public EventHubSpout(EventHubSpoutConfig spoutConfig) {
     this(spoutConfig, null, null, null);
   }
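
This convenience constructor lets a topology create the spout from just the five mandatory connection parameters, with everything else taking the defaults now defined in EventHubSpoutConfig. A sketch with placeholder values:

    // Fragment -- placeholder credentials; the partition count must match the Event Hub.
    EventHubSpout spout = new EventHubSpout(
        "username", "password", "mynamespace", "myeventhub", 8);
    TopologyBuilder builder = new TopologyBuilder();
    builder.setSpout("EventHubsSpout", spout, 8);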

http://git-wip-us.apache.org/repos/asf/storm/blob/e5154928/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/EventHubSpoutConfig.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/EventHubSpoutConfig.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/EventHubSpoutConfig.java
index ae11680..0238e40 100755
--- a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/EventHubSpoutConfig.java
+++ b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/EventHubSpoutConfig.java
@@ -24,28 +24,41 @@ import java.util.ArrayList;
 import java.util.List;
 
 public class EventHubSpoutConfig implements Serializable {
+  private static final long serialVersionUID = 1L; 
 
-  private static final long serialVersionUID = 1L;
+  public static final String EH_SERVICE_FQDN_SUFFIX = "servicebus.windows.net";
   private final String userName;
   private final String password;
   private final String namespace;
   private final String entityPath;
-  private final String zkConnectionString;
   private final int partitionCount;
-  private final int checkpointIntervalInSeconds;
-  private final int receiverCredits;
-  private final int maxPendingMsgsPerPartition;
-  private final long enqueueTimeFilter; //timestamp in millisecond
 
+  private String zkConnectionString = null; //if null then use zookeeper used by Storm
+  private int checkpointIntervalInSeconds = 10;
+  private int receiverCredits = 1024;
+  private int maxPendingMsgsPerPartition = 1024;
+  private long enqueueTimeFilter = 0; //timestamp in millisecond, 0 means disabling filter
   private String connectionString;
-  private String targetFqnAddress;
   private String topologyName;
-  private IEventDataScheme scheme;
+  private IEventDataScheme scheme = new EventDataScheme();
+  private String consumerGroupName = null; //if null then use default consumer group
 
+  //These are mandatory parameters
+  public EventHubSpoutConfig(String username, String password, String namespace,
+      String entityPath, int partitionCount) {
+    this.userName = username;
+    this.password = password;
+    this.connectionString = buildConnectionString(username, password, namespace);
+    this.namespace = namespace;
+    this.entityPath = entityPath;
+    this.partitionCount = partitionCount;
+  }
+
+  //Keep this constructor for backward compatibility
   public EventHubSpoutConfig(String username, String password, String namespace,
       String entityPath, int partitionCount, String zkConnectionString) {
-    this(username, password, namespace, entityPath, partitionCount,
-        zkConnectionString, 10, 1024, 1024, 0);
+    this(username, password, namespace, entityPath, partitionCount);
+    setZkConnectionString(zkConnectionString);
   }
   
   //Keep this constructor for backward compatibility
@@ -53,28 +66,20 @@ public class EventHubSpoutConfig implements Serializable {
       String entityPath, int partitionCount, String zkConnectionString,
       int checkpointIntervalInSeconds, int receiverCredits) {
     this(username, password, namespace, entityPath, partitionCount,
-        zkConnectionString, checkpointIntervalInSeconds, receiverCredits, 1024, 0);
+        zkConnectionString);
+    setCheckpointIntervalInSeconds(checkpointIntervalInSeconds);
+    setReceiverCredits(receiverCredits);
   }
-      
+
+  //Keep this constructor for backward compatibility
   public EventHubSpoutConfig(String username, String password, String namespace,
     String entityPath, int partitionCount, String zkConnectionString,
     int checkpointIntervalInSeconds, int receiverCredits, int maxPendingMsgsPerPartition, long enqueueTimeFilter) {
-    this.userName = username;
-    this.password = password;
-    this.connectionString = buildConnectionString(username, password, namespace);
-    this.namespace = namespace;
-    this.entityPath = entityPath;
-    this.partitionCount = partitionCount;
-    this.zkConnectionString = zkConnectionString;
-    this.checkpointIntervalInSeconds = checkpointIntervalInSeconds;
-    this.receiverCredits = receiverCredits;
-    this.maxPendingMsgsPerPartition = maxPendingMsgsPerPartition;
-    this.enqueueTimeFilter = enqueueTimeFilter;
-    this.scheme = new EventDataScheme();
-  }
-
-  public String getConnectionString() {
-    return connectionString;
+    
+    this(username, password, namespace, entityPath, partitionCount,
+        zkConnectionString, checkpointIntervalInSeconds, receiverCredits);
+    setMaxPendingMsgsPerPartition(maxPendingMsgsPerPartition);
+    setEnqueueTimeFilter(enqueueTimeFilter);
   }
 
   public String getNamespace() {
@@ -85,30 +90,50 @@ public class EventHubSpoutConfig implements Serializable {
     return entityPath;
   }
 
+  public int getPartitionCount() {
+    return partitionCount;
+  }
+
   public String getZkConnectionString() {
     return zkConnectionString;
   }
 
+  public void setZkConnectionString(String value) {
+    zkConnectionString = value;
+  }
+
   public int getCheckpointIntervalInSeconds() {
     return checkpointIntervalInSeconds;
   }
 
-  public int getPartitionCount() {
-    return partitionCount;
+  public void setCheckpointIntervalInSeconds(int value) {
+    checkpointIntervalInSeconds = value;
   }
   
   public int getReceiverCredits() {
     return receiverCredits;
   }
+
+  public void setReceiverCredits(int value) {
+    receiverCredits = value;
+  }
   
   public int getMaxPendingMsgsPerPartition() {
     return maxPendingMsgsPerPartition;
   }
+
+  public void setMaxPendingMsgsPerPartition(int value) {
+    maxPendingMsgsPerPartition = value;
+  }
   
   public long getEnqueueTimeFilter() {
     return enqueueTimeFilter;
   }
 
+  public void setEnqueueTimeFilter(long value) {
+    enqueueTimeFilter = value;
+  }
+
   public String getTopologyName() {
     return topologyName;
   }
@@ -125,6 +150,14 @@ public class EventHubSpoutConfig implements Serializable {
     this.scheme = scheme;
   }
 
+  public String getConsumerGroupName() {
+    return consumerGroupName;
+  }
+
+  public void setConsumerGroupName(String value) {
+    consumerGroupName = value;
+  }
+
   public List<String> getPartitionList() {
     List<String> partitionList = new ArrayList<String>();
 
@@ -134,16 +167,18 @@ public class EventHubSpoutConfig implements Serializable {
 
     return partitionList;
   }
-  
+
+  public String getConnectionString() {
+    return connectionString;
+  }
+
   public void setTargetAddress(String targetFqnAddress) {
-    this.targetFqnAddress = targetFqnAddress;
     this.connectionString = buildConnectionString(
-        this.userName, this.password, this.namespace, this.targetFqnAddress);
+        userName, password, namespace, targetFqnAddress);
   }
 
   public static String buildConnectionString(String username, String password, String namespace) {
-    String targetFqnAddress = "servicebus.windows.net";
-    return buildConnectionString(username, password, namespace, targetFqnAddress);
+    return buildConnectionString(username, password, namespace, EH_SERVICE_FQDN_SUFFIX);
   }
 
   public static String buildConnectionString(String username, String password,
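
After this refactor only the five connection parameters remain constructor arguments; everything else is set through setters with the defaults shown above (Storm's own ZooKeeper, 10-second checkpoints, 1024 receiver credits, 1024 max pending messages per partition, enqueue-time filter disabled, "$default" consumer group). A configuration sketch using only placeholder values:

    import org.apache.storm.eventhubs.spout.EventHubSpout;
    import org.apache.storm.eventhubs.spout.EventHubSpoutConfig;

    public class SpoutConfigSketch {
      public static EventHubSpout buildSpout() {
        EventHubSpoutConfig spoutConfig = new EventHubSpoutConfig(
            "username", "password", "mynamespace", "myeventhub", 8);
        spoutConfig.setZkConnectionString("zk1:2181,zk2:2181"); // omit to use Storm's ZooKeeper
        spoutConfig.setCheckpointIntervalInSeconds(30);
        spoutConfig.setReceiverCredits(1024);
        spoutConfig.setMaxPendingMsgsPerPartition(1024);
        spoutConfig.setEnqueueTimeFilter(0);                 // 0 disables time-based filtering
        spoutConfig.setConsumerGroupName("myconsumergroup"); // or leave unset for "$default"
        return new EventHubSpout(spoutConfig);
      }
    }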

http://git-wip-us.apache.org/repos/asf/storm/blob/e5154928/external/storm-eventhubs/src/main/resources/config.properties
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/resources/config.properties b/external/storm-eventhubs/src/main/resources/config.properties
index 82abb48..a8a520e 100755
--- a/external/storm-eventhubs/src/main/resources/config.properties
+++ b/external/storm-eventhubs/src/main/resources/config.properties
@@ -24,4 +24,7 @@ eventhubspout.max.pending.messages.per.partition = 1024
 # the EventHubs entity when we first create the Storm topology. If offsets
 # have been saved in Zookeeper, we'll ignore this configuration.
 # A value of 0 means disable time based filtering when creating the receiver.
-eventhub.receiver.filter.timediff = 0
\ No newline at end of file
+eventhub.receiver.filter.timediff = 0
+
+# Uncomment to specify consumer group name here, or use the default
+#eventhubspout.consumer.group.name = yourconsumergroupname
\ No newline at end of file

http://git-wip-us.apache.org/repos/asf/storm/blob/e5154928/external/storm-eventhubs/src/test/java/org/apache/storm/eventhubs/spout/TestEventHubSpout.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/test/java/org/apache/storm/eventhubs/spout/TestEventHubSpout.java b/external/storm-eventhubs/src/test/java/org/apache/storm/eventhubs/spout/TestEventHubSpout.java
index 6a0d163..49e544b 100755
--- a/external/storm-eventhubs/src/test/java/org/apache/storm/eventhubs/spout/TestEventHubSpout.java
+++ b/external/storm-eventhubs/src/test/java/org/apache/storm/eventhubs/spout/TestEventHubSpout.java
@@ -37,7 +37,9 @@ public class TestEventHubSpout {
   @Test
   public void testSpoutConfig() {
     EventHubSpoutConfig conf = new EventHubSpoutConfig("username", "pas\\s+w/ord",
-        "namespace", "entityname", 16, "zookeeper");
+        "namespace", "entityname", 16);
+    conf.setZkConnectionString("zookeeper");
+    conf.setCheckpointIntervalInSeconds(1);
     assertEquals(conf.getConnectionString(), "amqps://username:pas%5Cs%2Bw%2Ford@namespace.servicebus.windows.net");
   }
 


[03/50] [abbrv] storm git commit: [maven-release-plugin] prepare for next development iteration

Posted by pt...@apache.org.
[maven-release-plugin] prepare for next development iteration


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/df349303
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/df349303
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/df349303

Branch: refs/heads/0.10.x-branch
Commit: df349303e1e2635ac4e6bcab5d872348040a36f4
Parents: 54f5fb7
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Wed Apr 1 13:27:35 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Wed Apr 1 13:27:35 2015 -0400

----------------------------------------------------------------------
 flux-core/pom.xml     | 2 +-
 flux-examples/pom.xml | 2 +-
 flux-wrappers/pom.xml | 2 +-
 pom.xml               | 2 +-
 4 files changed, 4 insertions(+), 4 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/df349303/flux-core/pom.xml
----------------------------------------------------------------------
diff --git a/flux-core/pom.xml b/flux-core/pom.xml
index 1ea353a..0d72ead 100644
--- a/flux-core/pom.xml
+++ b/flux-core/pom.xml
@@ -21,7 +21,7 @@
     <parent>
         <groupId>com.github.ptgoetz</groupId>
         <artifactId>flux</artifactId>
-        <version>0.2.2</version>
+        <version>0.2.3-SNAPSHOT</version>
         <relativePath>../pom.xml</relativePath>
     </parent>
 

http://git-wip-us.apache.org/repos/asf/storm/blob/df349303/flux-examples/pom.xml
----------------------------------------------------------------------
diff --git a/flux-examples/pom.xml b/flux-examples/pom.xml
index ac19e41..29a2e62 100644
--- a/flux-examples/pom.xml
+++ b/flux-examples/pom.xml
@@ -21,7 +21,7 @@
     <parent>
         <groupId>com.github.ptgoetz</groupId>
         <artifactId>flux</artifactId>
-        <version>0.2.2</version>
+        <version>0.2.3-SNAPSHOT</version>
         <relativePath>../pom.xml</relativePath>
     </parent>
 

http://git-wip-us.apache.org/repos/asf/storm/blob/df349303/flux-wrappers/pom.xml
----------------------------------------------------------------------
diff --git a/flux-wrappers/pom.xml b/flux-wrappers/pom.xml
index 7884af2..e571d56 100644
--- a/flux-wrappers/pom.xml
+++ b/flux-wrappers/pom.xml
@@ -21,7 +21,7 @@
     <parent>
         <groupId>com.github.ptgoetz</groupId>
         <artifactId>flux</artifactId>
-        <version>0.2.2</version>
+        <version>0.2.3-SNAPSHOT</version>
         <relativePath>../pom.xml</relativePath>
     </parent>
 

http://git-wip-us.apache.org/repos/asf/storm/blob/df349303/pom.xml
----------------------------------------------------------------------
diff --git a/pom.xml b/pom.xml
index b06103b..e79e25a 100644
--- a/pom.xml
+++ b/pom.xml
@@ -20,7 +20,7 @@
 
     <groupId>com.github.ptgoetz</groupId>
     <artifactId>flux</artifactId>
-    <version>0.2.2</version>
+    <version>0.2.3-SNAPSHOT</version>
     <packaging>pom</packaging>
     <name>flux</name>
     <url>https://github.com/ptgoetz/flux</url>


[10/50] [abbrv] storm git commit: add HBase example

Posted by pt...@apache.org.
add HBase example


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/4a1db96f
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/4a1db96f
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/4a1db96f

Branch: refs/heads/0.10.x-branch
Commit: 4a1db96fc38fb438c3d4433381e719ca16d63bc8
Parents: a791604
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Tue Apr 7 23:23:05 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Tue Apr 7 23:23:05 2015 -0400

----------------------------------------------------------------------
 flux-core/pom.xml                               |   6 +
 .../java/org/apache/storm/flux/TCKTest.java     |  10 ++
 .../test/resources/configs/simple_hbase.yaml    | 120 +++++++++++++++++++
 flux-examples/pom.xml                           |   5 +
 .../storm/flux/examples/WordCountClient.java    |  63 ++++++++++
 .../apache/storm/flux/examples/WordCounter.java |  71 +++++++++++
 flux-examples/src/main/resources/hbase-site.xml |  36 ++++++
 .../src/main/resources/simple_hbase.yaml        |  91 ++++++++++++++
 8 files changed, 402 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/4a1db96f/flux-core/pom.xml
----------------------------------------------------------------------
diff --git a/flux-core/pom.xml b/flux-core/pom.xml
index 0d72ead..fe2e301 100644
--- a/flux-core/pom.xml
+++ b/flux-core/pom.xml
@@ -50,6 +50,12 @@
             <version>${storm.version}</version>
             <scope>test</scope>
         </dependency>
+        <dependency>
+            <groupId>org.apache.storm</groupId>
+            <artifactId>storm-hbase</artifactId>
+            <version>0.11.0-SNAPSHOT</version>
+            <scope>test</scope>
+        </dependency>
     </dependencies>
     <build>
         <resources>

http://git-wip-us.apache.org/repos/asf/storm/blob/4a1db96f/flux-core/src/test/java/org/apache/storm/flux/TCKTest.java
----------------------------------------------------------------------
diff --git a/flux-core/src/test/java/org/apache/storm/flux/TCKTest.java b/flux-core/src/test/java/org/apache/storm/flux/TCKTest.java
index 6580ef7..27abfbe 100644
--- a/flux-core/src/test/java/org/apache/storm/flux/TCKTest.java
+++ b/flux-core/src/test/java/org/apache/storm/flux/TCKTest.java
@@ -82,6 +82,16 @@ public class TCKTest {
     }
 
     @Test
+    public void testHbase() throws Exception {
+        TopologyDef topologyDef = FluxParser.parseResource("/configs/simple_hbase.yaml", false, true, null, false);
+        Config conf = FluxBuilder.buildConfig(topologyDef);
+        ExecutionContext context = new ExecutionContext(topologyDef, conf);
+        StormTopology topology = FluxBuilder.buildTopology(context);
+        assertNotNull(topology);
+        topology.validate();
+    }
+
+    @Test
     public void testIncludes() throws Exception {
         TopologyDef topologyDef = FluxParser.parseResource("/configs/include_test.yaml", false, true, null, false);
         Config conf = FluxBuilder.buildConfig(topologyDef);

http://git-wip-us.apache.org/repos/asf/storm/blob/4a1db96f/flux-core/src/test/resources/configs/simple_hbase.yaml
----------------------------------------------------------------------
diff --git a/flux-core/src/test/resources/configs/simple_hbase.yaml b/flux-core/src/test/resources/configs/simple_hbase.yaml
new file mode 100644
index 0000000..e407bd9
--- /dev/null
+++ b/flux-core/src/test/resources/configs/simple_hbase.yaml
@@ -0,0 +1,120 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+# Test ability to wire together an HBase word count topology
+---
+
+# topology definition
+# name to be used when submitting
+name: "hbase-wordcount"
+
+# Components
+# Components are analogous to Spring beans. They are meant to be used as constructor,
+# property(setter), and builder arguments.
+#
+# for the time being, components must be declared in the order they are referenced
+
+#        WordSpout spout = new WordSpout();
+#        WordCounter bolt = new WordCounter();
+#
+#        SimpleHBaseMapper mapper = new SimpleHBaseMapper()
+#                .withRowKeyField("word")
+#                .withColumnFields(new Fields("word"))
+#                .withCounterFields(new Fields("count"))
+#                .withColumnFamily("cf");
+#
+#        HBaseBolt hbase = new HBaseBolt("WordCount", mapper)
+#                .withConfigKey("hbase.conf");
+#
+#
+#        // wordSpout ==> countBolt ==> HBaseBolt
+#        TopologyBuilder builder = new TopologyBuilder();
+#
+#        builder.setSpout(WORD_SPOUT, spout, 1);
+#        builder.setBolt(COUNT_BOLT, bolt, 1).shuffleGrouping(WORD_SPOUT);
+#        builder.setBolt(HBASE_BOLT, hbase, 1).fieldsGrouping(COUNT_BOLT, new Fields("word"));
+
+
+
+
+components:
+  - id: "columnFields"
+    className: "backtype.storm.tuple.Fields"
+    constructorArgs:
+      - ["word"]
+
+  - id: "counterFields"
+    className: "backtype.storm.tuple.Fields"
+    constructorArgs:
+      - ["count"]
+
+  - id: "mapper"
+    className: "org.apache.storm.hbase.bolt.mapper.SimpleHBaseMapper"
+    configMethods:
+      - name: "withRowKeyField"
+        args: ["word"]
+      - name: "withColumnFields"
+        args: [ref: "columnFields"]
+      - name: "withCounterFields"
+        args: [ref: "counterFields"]
+      - name: "withColumnFamily"
+        args: ["cf"]
+
+# topology configuration
+# this will be passed to the submitter as a map of config options
+#
+config:
+  topology.workers: 1
+  hbase.conf:
+    hbase.rootdir: "hdfs://hadoop:54310/hbase"
+    hbase.zookeeper.quorum: "hadoop"
+
+# spout definitions
+spouts:
+  - id: "word-spout"
+    className: "backtype.storm.testing.TestWordSpout"
+    parallelism: 1
+
+# bolt definitions
+
+bolts:
+  - id: "count-bolt"
+    className: "backtype.storm.testing.TestWordCounter"
+
+  - id: "hbase-bolt"
+    className: "org.apache.storm.hbase.bolt.HBaseBolt"
+    constructorArgs:
+      - "WordCount" # HBase table name
+      - ref: "mapper"
+    configMethods:
+      - name: "withConfigKey"
+        args: ["hbase.conf"]
+    parallelism: 1
+
+
+streams:
+  - name: "" # name isn't used (placeholder for logging, UI, etc.)
+    from: "word-spout"
+    to: "count-bolt"
+    grouping:
+      type: SHUFFLE
+
+  - name: "" # name isn't used (placeholder for logging, UI, etc.)
+    from: "count-bolt"
+    to: "hbase-bolt"
+    grouping:
+      type: FIELDS
+      args: ["word"]
\ No newline at end of file

http://git-wip-us.apache.org/repos/asf/storm/blob/4a1db96f/flux-examples/pom.xml
----------------------------------------------------------------------
diff --git a/flux-examples/pom.xml b/flux-examples/pom.xml
index 09db717..2321074 100644
--- a/flux-examples/pom.xml
+++ b/flux-examples/pom.xml
@@ -50,6 +50,11 @@
             <artifactId>storm-hdfs</artifactId>
             <version>${storm.version}</version>
         </dependency>
+        <dependency>
+            <groupId>org.apache.storm</groupId>
+            <artifactId>storm-hbase</artifactId>
+            <version>${storm.version}</version>
+        </dependency>
     </dependencies>
 
     <build>

http://git-wip-us.apache.org/repos/asf/storm/blob/4a1db96f/flux-examples/src/main/java/org/apache/storm/flux/examples/WordCountClient.java
----------------------------------------------------------------------
diff --git a/flux-examples/src/main/java/org/apache/storm/flux/examples/WordCountClient.java b/flux-examples/src/main/java/org/apache/storm/flux/examples/WordCountClient.java
new file mode 100644
index 0000000..55873d5
--- /dev/null
+++ b/flux-examples/src/main/java/org/apache/storm/flux/examples/WordCountClient.java
@@ -0,0 +1,63 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.storm.flux.examples;
+
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.hbase.HBaseConfiguration;
+import org.apache.hadoop.hbase.client.Get;
+import org.apache.hadoop.hbase.client.HTable;
+import org.apache.hadoop.hbase.client.Result;
+import org.apache.hadoop.hbase.util.Bytes;
+
+/**
+ * Connects to the 'WordCount' HBase table and prints counts for each word.
+ *
+ * Assumes you have run (or are running) the YAML topology definition in
+ * <code>simple_hbase.yaml</code>
+ *
+ * You will also need to modify `src/main/resources/hbase-site.xml`
+ * to point to your HBase instance, and then repackage with `mvn package`.
+ * This is a known issue.
+ *
+ */
+public class WordCountClient {
+
+    public static void main(String[] args) throws Exception {
+        Configuration config = HBaseConfiguration.create();
+        if(args.length > 0){
+            config.set("hbase.rootdir", args[0]);
+        }
+
+        HTable table = new HTable(config, "WordCount");
+        String[] words = new String[] {"nathan", "mike", "jackson", "golda", "bertels"};
+
+        for (String word : words) {
+            Get get = new Get(Bytes.toBytes(word));
+            Result result = table.get(get);
+
+            byte[] countBytes = result.getValue(Bytes.toBytes("cf"), Bytes.toBytes("count"));
+            byte[] wordBytes = result.getValue(Bytes.toBytes("cf"), Bytes.toBytes("word"));
+
+            String wordStr = Bytes.toString(wordBytes);
+            System.out.println(wordStr);
+            long count = Bytes.toLong(countBytes);
+            System.out.println("Word: '" + wordStr + "', Count: " + count);
+        }
+
+    }
+}
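
Both the topology and this client assume an HBase table named "WordCount" with a "cf" column family already exists. A rough sketch of creating it with the same generation of HBase client API used here (HBaseAdmin and friends; untested, and class availability depends on your HBase version):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.HColumnDescriptor;
    import org.apache.hadoop.hbase.HTableDescriptor;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.HBaseAdmin;

    public class CreateWordCountTable {
        public static void main(String[] args) throws Exception {
            // Reads hbase-site.xml from the classpath, like WordCountClient above.
            Configuration config = HBaseConfiguration.create();
            HBaseAdmin admin = new HBaseAdmin(config);
            if (!admin.tableExists("WordCount")) {
                HTableDescriptor table = new HTableDescriptor(TableName.valueOf("WordCount"));
                table.addFamily(new HColumnDescriptor("cf"));
                admin.createTable(table);
            }
            admin.close();
        }
    }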

http://git-wip-us.apache.org/repos/asf/storm/blob/4a1db96f/flux-examples/src/main/java/org/apache/storm/flux/examples/WordCounter.java
----------------------------------------------------------------------
diff --git a/flux-examples/src/main/java/org/apache/storm/flux/examples/WordCounter.java b/flux-examples/src/main/java/org/apache/storm/flux/examples/WordCounter.java
new file mode 100644
index 0000000..f7c80c7
--- /dev/null
+++ b/flux-examples/src/main/java/org/apache/storm/flux/examples/WordCounter.java
@@ -0,0 +1,71 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.storm.flux.examples;
+
+import backtype.storm.task.TopologyContext;
+import backtype.storm.topology.BasicOutputCollector;
+import backtype.storm.topology.IBasicBolt;
+import backtype.storm.topology.OutputFieldsDeclarer;
+import backtype.storm.topology.base.BaseBasicBolt;
+import backtype.storm.tuple.Fields;
+import backtype.storm.tuple.Tuple;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.util.Map;
+
+import static backtype.storm.utils.Utils.tuple;
+
+/**
+ * This bolt is used by the HBase example. It simply emits the first field
+ * found in the incoming tuple as "word", with a "count" of `1`.
+ *
+ * In this case, the downstream HBase bolt handles the counting, so a value
+ * of `1` will just increment the HBase counter by one.
+ */
+public class WordCounter extends BaseBasicBolt {
+    private static final Logger LOG = LoggerFactory.getLogger(WordCounter.class);
+
+
+
+    @SuppressWarnings("rawtypes")
+    public void prepare(Map stormConf, TopologyContext context) {
+    }
+
+    /*
+     * Just output the word value with a count of 1.
+     * The HBaseBolt will handle incrementing the counter.
+     */
+    public void execute(Tuple input, BasicOutputCollector collector) {
+        collector.emit(tuple(input.getValues().get(0), 1));
+    }
+
+    public void cleanup() {
+
+    }
+
+    public void declareOutputFields(OutputFieldsDeclarer declarer) {
+        declarer.declare(new Fields("word", "count"));
+    }
+
+    @Override
+    public Map<String, Object> getComponentConfiguration() {
+        return null;
+    }
+
+}
\ No newline at end of file

http://git-wip-us.apache.org/repos/asf/storm/blob/4a1db96f/flux-examples/src/main/resources/hbase-site.xml
----------------------------------------------------------------------
diff --git a/flux-examples/src/main/resources/hbase-site.xml b/flux-examples/src/main/resources/hbase-site.xml
new file mode 100644
index 0000000..06c3031
--- /dev/null
+++ b/flux-examples/src/main/resources/hbase-site.xml
@@ -0,0 +1,36 @@
+<?xml version="1.0"?>
+<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
+<!--
+/**
+ *
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+-->
+<configuration>
+	<property>
+	  <name>hbase.cluster.distributed</name>
+	  <value>true</value>
+	</property>
+	<property>
+	  <name>hbase.rootdir</name>
+	  <value>hdfs://hadoop:54310/hbase</value>
+	</property>
+	<property>
+	  <name>hbase.zookeeper.quorum</name>
+	  <value>hadoop</value>
+	</property>
+</configuration>
\ No newline at end of file

http://git-wip-us.apache.org/repos/asf/storm/blob/4a1db96f/flux-examples/src/main/resources/simple_hbase.yaml
----------------------------------------------------------------------
diff --git a/flux-examples/src/main/resources/simple_hbase.yaml b/flux-examples/src/main/resources/simple_hbase.yaml
new file mode 100644
index 0000000..5eb70ed
--- /dev/null
+++ b/flux-examples/src/main/resources/simple_hbase.yaml
@@ -0,0 +1,91 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+---
+# NOTE: To use this example, you will need to modify `src/main/resources/hbase-site.xml`
+# to point to your HBase instance, and then repackage with `mvn package`.
+# This is a known issue.
+
+# topology definition
+# name to be used when submitting
+name: "hbase-persistent-wordcount"
+
+# Components
+components:
+  - id: "columnFields"
+    className: "backtype.storm.tuple.Fields"
+    constructorArgs:
+      - ["word"]
+
+  - id: "counterFields"
+    className: "backtype.storm.tuple.Fields"
+    constructorArgs:
+      - ["count"]
+
+  - id: "mapper"
+    className: "org.apache.storm.hbase.bolt.mapper.SimpleHBaseMapper"
+    configMethods:
+      - name: "withRowKeyField"
+        args: ["word"]
+      - name: "withColumnFields"
+        args: [ref: "columnFields"]
+      - name: "withCounterFields"
+        args: [ref: "counterFields"]
+      - name: "withColumnFamily"
+        args: ["cf"]
+
+# topology configuration
+# this will be passed to the submitter as a map of config options
+config:
+  topology.workers: 1
+  hbase.conf:
+    hbase.rootdir: "hdfs://hadoop:54310/hbase"
+
+# spout definitions
+spouts:
+  - id: "word-spout"
+    className: "backtype.storm.testing.TestWordSpout"
+    parallelism: 1
+
+# bolt definitions
+
+bolts:
+  - id: "count-bolt"
+    className: "org.apache.storm.flux.examples.WordCounter"
+    parallelism: 1
+
+  - id: "hbase-bolt"
+    className: "org.apache.storm.hbase.bolt.HBaseBolt"
+    constructorArgs:
+      - "WordCount" # HBase table name
+      - ref: "mapper"
+    configMethods:
+      - name: "withConfigKey"
+        args: ["hbase.conf"]
+    parallelism: 1
+
+streams:
+  - name: "" # name isn't used (placeholder for logging, UI, etc.)
+    from: "word-spout"
+    to: "count-bolt"
+    grouping:
+      type: SHUFFLE
+
+  - name: "" # name isn't used (placeholder for logging, UI, etc.)
+    from: "count-bolt"
+    to: "hbase-bolt"
+    grouping:
+      type: FIELDS
+      args: ["word"]
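
For comparison, the plain-Java equivalent of the YAML above (the same wiring sketched in comments in the flux-core test config) looks roughly like this; it assumes the WordCounter bolt from this commit and the storm-hbase classes named in the YAML:

    import backtype.storm.Config;
    import backtype.storm.generated.StormTopology;
    import backtype.storm.testing.TestWordSpout;
    import backtype.storm.topology.TopologyBuilder;
    import backtype.storm.tuple.Fields;
    import org.apache.storm.flux.examples.WordCounter;
    import org.apache.storm.hbase.bolt.HBaseBolt;
    import org.apache.storm.hbase.bolt.mapper.SimpleHBaseMapper;

    import java.util.HashMap;
    import java.util.Map;

    public class HBaseWordCountWiring {
        public static StormTopology buildTopology() {
            SimpleHBaseMapper mapper = new SimpleHBaseMapper()
                    .withRowKeyField("word")
                    .withColumnFields(new Fields("word"))
                    .withCounterFields(new Fields("count"))
                    .withColumnFamily("cf");
            HBaseBolt hbase = new HBaseBolt("WordCount", mapper).withConfigKey("hbase.conf");

            TopologyBuilder builder = new TopologyBuilder();
            builder.setSpout("word-spout", new TestWordSpout(), 1);
            builder.setBolt("count-bolt", new WordCounter(), 1).shuffleGrouping("word-spout");
            builder.setBolt("hbase-bolt", hbase, 1).fieldsGrouping("count-bolt", new Fields("word"));
            return builder.createTopology();
        }

        public static Config buildConfig() {
            // Mirrors the "hbase.conf" map in the YAML; the HDFS URL is environment-specific.
            Map<String, Object> hbaseConf = new HashMap<String, Object>();
            hbaseConf.put("hbase.rootdir", "hdfs://hadoop:54310/hbase");
            Config conf = new Config();
            conf.setNumWorkers(1);
            conf.put("hbase.conf", hbaseConf);
            return conf;
        }
    }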


[21/50] [abbrv] storm git commit: [maven-release-plugin] prepare release flux-0.3.0

Posted by pt...@apache.org.
[maven-release-plugin] prepare release flux-0.3.0


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/20a30116
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/20a30116
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/20a30116

Branch: refs/heads/0.10.x-branch
Commit: 20a30116935eb3bcc4a255c1617bcb8e50331c5e
Parents: 8e0f167
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Tue May 5 16:54:01 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Tue May 5 16:54:01 2015 -0400

----------------------------------------------------------------------
 flux-core/pom.xml     |  2 +-
 flux-examples/pom.xml | 11 ++++-------
 flux-wrappers/pom.xml |  2 +-
 pom.xml               |  2 +-
 4 files changed, 7 insertions(+), 10 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/20a30116/flux-core/pom.xml
----------------------------------------------------------------------
diff --git a/flux-core/pom.xml b/flux-core/pom.xml
index 2d03ea4..12312f5 100644
--- a/flux-core/pom.xml
+++ b/flux-core/pom.xml
@@ -21,7 +21,7 @@
     <parent>
         <groupId>com.github.ptgoetz</groupId>
         <artifactId>flux</artifactId>
-        <version>0.2.3-SNAPSHOT</version>
+        <version>0.3.0</version>
         <relativePath>../pom.xml</relativePath>
     </parent>
 

http://git-wip-us.apache.org/repos/asf/storm/blob/20a30116/flux-examples/pom.xml
----------------------------------------------------------------------
diff --git a/flux-examples/pom.xml b/flux-examples/pom.xml
index 2321074..e186b9c 100644
--- a/flux-examples/pom.xml
+++ b/flux-examples/pom.xml
@@ -15,14 +15,13 @@
  See the License for the specific language governing permissions and
  limitations under the License.
 -->
-<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
-         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
     <modelVersion>4.0.0</modelVersion>
 
     <parent>
         <groupId>com.github.ptgoetz</groupId>
         <artifactId>flux</artifactId>
-        <version>0.2.3-SNAPSHOT</version>
+        <version>0.3.0</version>
         <relativePath>../pom.xml</relativePath>
     </parent>
 
@@ -74,10 +73,8 @@
                         </goals>
                         <configuration>
                             <transformers>
-                                <transformer
-                                        implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
-                                <transformer
-                                        implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
+                                <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer" />
+                                <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                                     <mainClass>org.apache.storm.flux.Flux</mainClass>
                                 </transformer>
                             </transformers>

http://git-wip-us.apache.org/repos/asf/storm/blob/20a30116/flux-wrappers/pom.xml
----------------------------------------------------------------------
diff --git a/flux-wrappers/pom.xml b/flux-wrappers/pom.xml
index e571d56..532da15 100644
--- a/flux-wrappers/pom.xml
+++ b/flux-wrappers/pom.xml
@@ -21,7 +21,7 @@
     <parent>
         <groupId>com.github.ptgoetz</groupId>
         <artifactId>flux</artifactId>
-        <version>0.2.3-SNAPSHOT</version>
+        <version>0.3.0</version>
         <relativePath>../pom.xml</relativePath>
     </parent>
 

http://git-wip-us.apache.org/repos/asf/storm/blob/20a30116/pom.xml
----------------------------------------------------------------------
diff --git a/pom.xml b/pom.xml
index de48f7b..942cbeb 100644
--- a/pom.xml
+++ b/pom.xml
@@ -20,7 +20,7 @@
 
     <groupId>com.github.ptgoetz</groupId>
     <artifactId>flux</artifactId>
-    <version>0.2.3-SNAPSHOT</version>
+    <version>0.3.0</version>
     <packaging>pom</packaging>
     <name>flux</name>
     <url>https://github.com/ptgoetz/flux</url>


[36/50] [abbrv] storm git commit: update EventHubClient library version to 0.9.1

Posted by pt...@apache.org.
update EventHubClient library version to 0.9.1

Signed-off-by: Shanyu Zhao <sh...@microsoft.com>


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/86f326a3
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/86f326a3
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/86f326a3

Branch: refs/heads/0.10.x-branch
Commit: 86f326a3c0b40fc1b0eab65fcb17b438e433cfd6
Parents: 85aeb3d
Author: Shanyu Zhao <sh...@microsoft.com>
Authored: Fri May 29 14:05:28 2015 -0700
Committer: Shanyu Zhao <sh...@microsoft.com>
Committed: Fri May 29 14:05:28 2015 -0700

----------------------------------------------------------------------
 external/storm-eventhubs/pom.xml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/86f326a3/external/storm-eventhubs/pom.xml
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/pom.xml b/external/storm-eventhubs/pom.xml
index 6d4a47b..5de412f 100755
--- a/external/storm-eventhubs/pom.xml
+++ b/external/storm-eventhubs/pom.xml
@@ -33,7 +33,7 @@
 
     <properties>
         <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
-        <eventhubs.client.version>0.9</eventhubs.client.version>
+        <eventhubs.client.version>0.9.1</eventhubs.client.version>
     </properties>
     <build>
         <plugins>


[47/50] [abbrv] storm git commit: add missing license headers and clean up RAT report

Posted by pt...@apache.org.
http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/external/storm-redis/src/test/java/org/apache/storm/redis/trident/WordCountStoreMapper.java
----------------------------------------------------------------------
diff --git a/external/storm-redis/src/test/java/org/apache/storm/redis/trident/WordCountStoreMapper.java b/external/storm-redis/src/test/java/org/apache/storm/redis/trident/WordCountStoreMapper.java
index 6521302..b930998 100644
--- a/external/storm-redis/src/test/java/org/apache/storm/redis/trident/WordCountStoreMapper.java
+++ b/external/storm-redis/src/test/java/org/apache/storm/redis/trident/WordCountStoreMapper.java
@@ -1,3 +1,20 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
 package org.apache.storm.redis.trident;
 
 import backtype.storm.tuple.ITuple;

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/pom.xml
----------------------------------------------------------------------
diff --git a/pom.xml b/pom.xml
index 74f539e..abae7fb 100644
--- a/pom.xml
+++ b/pom.xml
@@ -762,9 +762,17 @@
                         <exclude>**/src/py/**</exclude>
 
                         <!-- the following are in the LICENSE file -->
-                        <exclude>**/src/ui/public/js/jquery-1.6.2.min.js</exclude>
+                        <exclude>**/src/ui/public/js/jquery.dataTables.1.10.4.min.js</exclude>
+                        <exclude>**/src/ui/public/css/jquery.dataTables.1.10.4.min.css</exclude>
+                        <exclude>**/src/ui/public/images/*</exclude>
+                        <exclude>**/src/ui/public/js/bootstrap-3.3.1.min.js</exclude>
+                        <exclude>**/src/ui/public/css/bootstrap-3.3.1.min.css</exclude>
+                        <exclude>**/src/ui/public/js/dataTables.bootstrap.min.js</exclude>
+                        <exclude>**/src/ui/public/css/dataTables.bootstrap.css</exclude>
+                        <exclude>**/src/ui/public/js/jsonFormatter.min.js</exclude>
+                        <exclude>**/src/ui/public/css/jsonFormatter.min.css</exclude>
+                        <exclude>**/src/ui/public/js/jquery-1.11.1.min.js</exclude>
                         <exclude>**/src/ui/public/js/jquery.cookies.2.2.0.min.js</exclude>
-                        <exclude>**/src/ui/public/js/jquery.tablesorter.min.js</exclude>
                         <exclude>**/src/ui/public/js/moment.min.js</exclude>
                         <exclude>**/src/ui/public/js/jquery.blockUI.min.js</exclude>
                         <exclude>**/src/ui/public/js/url.min.js</exclude>
@@ -772,10 +780,13 @@
                         <exclude>**/src/ui/public/js/arbor-graphics.js</exclude>
                         <exclude>**/src/ui/public/js/arbor-tween.js</exclude>
                         <exclude>**/src/ui/public/js/jquery.mustache.js</exclude>
-                        <exclude>**/src/ui/public/js/purl.js</exclude>
 
                         <!-- generated by shade plugin -->
                         <exclude>**/dependency-reduced-pom.xml</exclude>
+
+                        <exclude>**/docs/**</exclude>
+                        <exclude>**/.git/**</exclude>
+                        <exclude>**/derby.log</exclude>
                     </excludes>
                 </configuration>
             </plugin>

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/storm-core/src/clj/backtype/storm/converter.clj
----------------------------------------------------------------------
diff --git a/storm-core/src/clj/backtype/storm/converter.clj b/storm-core/src/clj/backtype/storm/converter.clj
index 45e3033..7bfc14b 100644
--- a/storm-core/src/clj/backtype/storm/converter.clj
+++ b/storm-core/src/clj/backtype/storm/converter.clj
@@ -1,3 +1,18 @@
+;; Licensed to the Apache Software Foundation (ASF) under one
+;; or more contributor license agreements.  See the NOTICE file
+;; distributed with this work for additional information
+;; regarding copyright ownership.  The ASF licenses this file
+;; to you under the Apache License, Version 2.0 (the
+;; "License"); you may not use this file except in compliance
+;; with the License.  You may obtain a copy of the License at
+;;
+;; http://www.apache.org/licenses/LICENSE-2.0
+;;
+;; Unless required by applicable law or agreed to in writing, software
+;; distributed under the License is distributed on an "AS IS" BASIS,
+;; WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+;; See the License for the specific language governing permissions and
+;; limitations under the License.
 (ns backtype.storm.converter
   (:import [backtype.storm.generated SupervisorInfo NodeInfo Assignment
             StormBase TopologyStatus ClusterWorkerHeartbeat ExecutorInfo ErrorInfo Credentials RebalanceOptions KillOptions TopologyActionOptions])

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/storm-core/src/dev/drpc-simple-acl-test-scenario.yaml
----------------------------------------------------------------------
diff --git a/storm-core/src/dev/drpc-simple-acl-test-scenario.yaml b/storm-core/src/dev/drpc-simple-acl-test-scenario.yaml
index 82c03c0..b72b026 100644
--- a/storm-core/src/dev/drpc-simple-acl-test-scenario.yaml
+++ b/storm-core/src/dev/drpc-simple-acl-test-scenario.yaml
@@ -1,3 +1,20 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+
 # For the function "jump", alice can perform client operations, and bob can
 # perform invocation operations.
 drpc.authorizer.acl:

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/storm-core/src/jvm/backtype/storm/messaging/ConnectionWithStatus.java
----------------------------------------------------------------------
diff --git a/storm-core/src/jvm/backtype/storm/messaging/ConnectionWithStatus.java b/storm-core/src/jvm/backtype/storm/messaging/ConnectionWithStatus.java
index 38abc19..37981ca 100644
--- a/storm-core/src/jvm/backtype/storm/messaging/ConnectionWithStatus.java
+++ b/storm-core/src/jvm/backtype/storm/messaging/ConnectionWithStatus.java
@@ -1,3 +1,20 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
 package backtype.storm.messaging;
 
 public abstract class ConnectionWithStatus implements IConnection {

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/storm-core/src/jvm/backtype/storm/security/auth/authorizer/DRPCAuthorizerBase.java
----------------------------------------------------------------------
diff --git a/storm-core/src/jvm/backtype/storm/security/auth/authorizer/DRPCAuthorizerBase.java b/storm-core/src/jvm/backtype/storm/security/auth/authorizer/DRPCAuthorizerBase.java
index 8951edd..e1bb077 100644
--- a/storm-core/src/jvm/backtype/storm/security/auth/authorizer/DRPCAuthorizerBase.java
+++ b/storm-core/src/jvm/backtype/storm/security/auth/authorizer/DRPCAuthorizerBase.java
@@ -1,3 +1,20 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
 package backtype.storm.security.auth.authorizer;
 
 import java.util.Map;

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/storm-core/src/jvm/backtype/storm/security/auth/authorizer/DRPCSimpleACLAuthorizer.java
----------------------------------------------------------------------
diff --git a/storm-core/src/jvm/backtype/storm/security/auth/authorizer/DRPCSimpleACLAuthorizer.java b/storm-core/src/jvm/backtype/storm/security/auth/authorizer/DRPCSimpleACLAuthorizer.java
index 45eaea5..d747502 100644
--- a/storm-core/src/jvm/backtype/storm/security/auth/authorizer/DRPCSimpleACLAuthorizer.java
+++ b/storm-core/src/jvm/backtype/storm/security/auth/authorizer/DRPCSimpleACLAuthorizer.java
@@ -1,3 +1,21 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
 package backtype.storm.security.auth.authorizer;
 
 import java.lang.reflect.Field;

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/storm-core/src/jvm/backtype/storm/security/auth/authorizer/ImpersonationAuthorizer.java
----------------------------------------------------------------------
diff --git a/storm-core/src/jvm/backtype/storm/security/auth/authorizer/ImpersonationAuthorizer.java b/storm-core/src/jvm/backtype/storm/security/auth/authorizer/ImpersonationAuthorizer.java
index d6431be..df9e83a 100644
--- a/storm-core/src/jvm/backtype/storm/security/auth/authorizer/ImpersonationAuthorizer.java
+++ b/storm-core/src/jvm/backtype/storm/security/auth/authorizer/ImpersonationAuthorizer.java
@@ -1,3 +1,20 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
 package backtype.storm.security.auth.authorizer;
 
 import backtype.storm.Config;

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/storm-core/src/jvm/backtype/storm/security/auth/kerberos/jaas_kerberos_cluster.conf
----------------------------------------------------------------------
diff --git a/storm-core/src/jvm/backtype/storm/security/auth/kerberos/jaas_kerberos_cluster.conf b/storm-core/src/jvm/backtype/storm/security/auth/kerberos/jaas_kerberos_cluster.conf
index 92a1399..e8ea8b0 100644
--- a/storm-core/src/jvm/backtype/storm/security/auth/kerberos/jaas_kerberos_cluster.conf
+++ b/storm-core/src/jvm/backtype/storm/security/auth/kerberos/jaas_kerberos_cluster.conf
@@ -1,4 +1,22 @@
-/* 
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+/*
 This is a sample JAAS configuration for Storm servers to handle Kerberos authentication
 */
 

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/storm-core/src/jvm/backtype/storm/security/auth/kerberos/jaas_kerberos_launcher.conf
----------------------------------------------------------------------
diff --git a/storm-core/src/jvm/backtype/storm/security/auth/kerberos/jaas_kerberos_launcher.conf b/storm-core/src/jvm/backtype/storm/security/auth/kerberos/jaas_kerberos_launcher.conf
index 138e1f3..2a7e029 100644
--- a/storm-core/src/jvm/backtype/storm/security/auth/kerberos/jaas_kerberos_launcher.conf
+++ b/storm-core/src/jvm/backtype/storm/security/auth/kerberos/jaas_kerberos_launcher.conf
@@ -1,4 +1,23 @@
 /*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+
+/*
  This is a sample JAAS configuration for Storm topology launcher/submitter.
  Since launcher machines are typically accessible by many folks, we 
  encourage you to leverage "kinit", instead of keytab.  

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/storm-core/src/native/worker-launcher/.deps/worker-launcher.Po
----------------------------------------------------------------------
diff --git a/storm-core/src/native/worker-launcher/.deps/worker-launcher.Po b/storm-core/src/native/worker-launcher/.deps/worker-launcher.Po
index 9ce06a8..1bc5de9 100644
--- a/storm-core/src/native/worker-launcher/.deps/worker-launcher.Po
+++ b/storm-core/src/native/worker-launcher/.deps/worker-launcher.Po
@@ -1 +1,17 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
 # dummy

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/storm-core/test/clj/backtype/storm/security/auth/DefaultHttpCredentialsPlugin_test.clj
----------------------------------------------------------------------
diff --git a/storm-core/test/clj/backtype/storm/security/auth/DefaultHttpCredentialsPlugin_test.clj b/storm-core/test/clj/backtype/storm/security/auth/DefaultHttpCredentialsPlugin_test.clj
index 6e214a2..e7b44cf 100644
--- a/storm-core/test/clj/backtype/storm/security/auth/DefaultHttpCredentialsPlugin_test.clj
+++ b/storm-core/test/clj/backtype/storm/security/auth/DefaultHttpCredentialsPlugin_test.clj
@@ -1,3 +1,18 @@
+;; Licensed to the Apache Software Foundation (ASF) under one
+;; or more contributor license agreements.  See the NOTICE file
+;; distributed with this work for additional information
+;; regarding copyright ownership.  The ASF licenses this file
+;; to you under the Apache License, Version 2.0 (the
+;; "License"); you may not use this file except in compliance
+;; with the License.  You may obtain a copy of the License at
+;;
+;; http://www.apache.org/licenses/LICENSE-2.0
+;;
+;; Unless required by applicable law or agreed to in writing, software
+;; distributed under the License is distributed on an "AS IS" BASIS,
+;; WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+;; See the License for the specific language governing permissions and
+;; limitations under the License.
 (ns backtype.storm.security.auth.DefaultHttpCredentialsPlugin-test
   (:use [clojure test])
   (:import [javax.security.auth Subject])

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/storm-core/test/clj/backtype/storm/security/auth/authorizer/DRPCSimpleACLAuthorizer_test.clj
----------------------------------------------------------------------
diff --git a/storm-core/test/clj/backtype/storm/security/auth/authorizer/DRPCSimpleACLAuthorizer_test.clj b/storm-core/test/clj/backtype/storm/security/auth/authorizer/DRPCSimpleACLAuthorizer_test.clj
index c70fa2a..6768210 100644
--- a/storm-core/test/clj/backtype/storm/security/auth/authorizer/DRPCSimpleACLAuthorizer_test.clj
+++ b/storm-core/test/clj/backtype/storm/security/auth/authorizer/DRPCSimpleACLAuthorizer_test.clj
@@ -1,3 +1,18 @@
+;; Licensed to the Apache Software Foundation (ASF) under one
+;; or more contributor license agreements.  See the NOTICE file
+;; distributed with this work for additional information
+;; regarding copyright ownership.  The ASF licenses this file
+;; to you under the Apache License, Version 2.0 (the
+;; "License"); you may not use this file except in compliance
+;; with the License.  You may obtain a copy of the License at
+;;
+;; http://www.apache.org/licenses/LICENSE-2.0
+;;
+;; Unless required by applicable law or agreed to in writing, software
+;; distributed under the License is distributed on an "AS IS" BASIS,
+;; WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+;; See the License for the specific language governing permissions and
+;; limitations under the License.
 (ns backtype.storm.security.auth.authorizer.DRPCSimpleACLAuthorizer-test
   (:use [clojure test])
   (:import [org.mockito Mockito])

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/storm-core/test/clj/backtype/storm/security/auth/drpc-auth-alice.jaas
----------------------------------------------------------------------
diff --git a/storm-core/test/clj/backtype/storm/security/auth/drpc-auth-alice.jaas b/storm-core/test/clj/backtype/storm/security/auth/drpc-auth-alice.jaas
index cd691ae..0e68056 100644
--- a/storm-core/test/clj/backtype/storm/security/auth/drpc-auth-alice.jaas
+++ b/storm-core/test/clj/backtype/storm/security/auth/drpc-auth-alice.jaas
@@ -1,3 +1,20 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
 StormClient {
        org.apache.zookeeper.server.auth.DigestLoginModule required
        username="alice"

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/storm-core/test/clj/backtype/storm/security/auth/drpc-auth-bob.jaas
----------------------------------------------------------------------
diff --git a/storm-core/test/clj/backtype/storm/security/auth/drpc-auth-bob.jaas b/storm-core/test/clj/backtype/storm/security/auth/drpc-auth-bob.jaas
index e4ca097..bea9a0c 100644
--- a/storm-core/test/clj/backtype/storm/security/auth/drpc-auth-bob.jaas
+++ b/storm-core/test/clj/backtype/storm/security/auth/drpc-auth-bob.jaas
@@ -1,3 +1,20 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
 StormClient {
        org.apache.zookeeper.server.auth.DigestLoginModule required
        username="bob"

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/storm-core/test/clj/backtype/storm/security/auth/drpc-auth-charlie.jaas
----------------------------------------------------------------------
diff --git a/storm-core/test/clj/backtype/storm/security/auth/drpc-auth-charlie.jaas b/storm-core/test/clj/backtype/storm/security/auth/drpc-auth-charlie.jaas
index 3473d6d..31a2505 100644
--- a/storm-core/test/clj/backtype/storm/security/auth/drpc-auth-charlie.jaas
+++ b/storm-core/test/clj/backtype/storm/security/auth/drpc-auth-charlie.jaas
@@ -1,3 +1,20 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
 StormClient {
        org.apache.zookeeper.server.auth.DigestLoginModule required
        username="charlie"

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/storm-core/test/clj/backtype/storm/security/auth/drpc-auth-server.jaas
----------------------------------------------------------------------
diff --git a/storm-core/test/clj/backtype/storm/security/auth/drpc-auth-server.jaas b/storm-core/test/clj/backtype/storm/security/auth/drpc-auth-server.jaas
index 3b22d21..8ba9fc4 100644
--- a/storm-core/test/clj/backtype/storm/security/auth/drpc-auth-server.jaas
+++ b/storm-core/test/clj/backtype/storm/security/auth/drpc-auth-server.jaas
@@ -1,3 +1,20 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
 StormServer {
        org.apache.zookeeper.server.auth.DigestLoginModule required
        user_alice="poorpasswordforalice"


[39/50] [abbrv] storm git commit: fix docs for hbase example

Posted by pt...@apache.org.
fix docs for hbase example


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/b90ec781
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/b90ec781
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/b90ec781

Branch: refs/heads/0.10.x-branch
Commit: b90ec781c7a7ba77f6598fe20e7842fa4a228098
Parents: 91369ac
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Tue Jun 2 17:14:34 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Tue Jun 2 17:14:34 2015 -0400

----------------------------------------------------------------------
 external/flux/flux-examples/README.md | 10 ++++------
 1 file changed, 4 insertions(+), 6 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/b90ec781/external/flux/flux-examples/README.md
----------------------------------------------------------------------
diff --git a/external/flux/flux-examples/README.md b/external/flux/flux-examples/README.md
index fceebd8..0a7085e 100644
--- a/external/flux/flux-examples/README.md
+++ b/external/flux/flux-examples/README.md
@@ -58,11 +58,9 @@ storm jar ./target/flux-examples-*.jar org.apache.storm.flux.Flux --local ./src/
 
 This example illustrates how to use Flux to setup a storm-hbase bolt to write to HBase.
 
-In order to use this example, you will need to edit the `src/main resrouces/hbase-site.xml` file to reflect your HBase
-environment, and then rebuild the topology jar.
-
-You can do so by running the following Maven command in the `flux-examples` directory:
+To run the `simple_hbase.yaml` example, copy the `hbase_bolt.properties` file to a convenient location and change the properties
+ `hbase.rootdir` and `hbase.zookeeper.quorum`. Then you can run the example with something like:
 
 ```bash
-mvn clean install
-```
\ No newline at end of file
+storm jar ./target/flux-examples-*.jar org.apache.storm.flux.Flux --local ./src/main/resources/simple_hbase.yaml --filter my_hbase_bolt.properties
+```
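
The properties file referenced above is not part of this diff; a hypothetical `my_hbase_bolt.properties` could look like the following, with `hbase.rootdir` and `hbase.zookeeper.quorum` adjusted to the target cluster (the hostnames below are placeholders):

```properties
# placeholder values -- point these at your own HBase/ZooKeeper hosts
hbase.rootdir=hdfs://hadoop:54310/hbase
hbase.zookeeper.quorum=hadoop:2181
```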


[37/50] [abbrv] storm git commit: STORM-842: Drop Support for Java 1.6

Posted by pt...@apache.org.
STORM-842: Drop Support for Java 1.6


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/fc736002
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/fc736002
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/fc736002

Branch: refs/heads/0.10.x-branch
Commit: fc73600228c156a68327a342bf2c2da514620bbb
Parents: ad98824
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Mon Jun 1 17:15:06 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Mon Jun 1 17:15:06 2015 -0400

----------------------------------------------------------------------
 pom.xml | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/fc736002/pom.xml
----------------------------------------------------------------------
diff --git a/pom.xml b/pom.xml
index bfb9069..4729554 100644
--- a/pom.xml
+++ b/pom.xml
@@ -658,8 +658,8 @@
                 <groupId>org.apache.maven.plugins</groupId>
                 <artifactId>maven-compiler-plugin</artifactId>
                 <configuration>
-                    <source>1.6</source>
-                    <target>1.6</target>
+                    <source>1.7</source>
+                    <target>1.7</target>
                 </configuration>
             </plugin>
             <plugin>


[24/50] [abbrv] storm git commit: merge flux into external/flux/

Posted by pt...@apache.org.
http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/test/java/org/apache/storm/flux/TCKTest.java
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/test/java/org/apache/storm/flux/TCKTest.java
index 0000000,0000000..9456d1b
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/test/java/org/apache/storm/flux/TCKTest.java
@@@ -1,0 -1,0 +1,234 @@@
++/*
++ * Licensed to the Apache Software Foundation (ASF) under one
++ * or more contributor license agreements.  See the NOTICE file
++ * distributed with this work for additional information
++ * regarding copyright ownership.  The ASF licenses this file
++ * to you under the Apache License, Version 2.0 (the
++ * "License"); you may not use this file except in compliance
++ * with the License.  You may obtain a copy of the License at
++ *
++ * http://www.apache.org/licenses/LICENSE-2.0
++ *
++ * Unless required by applicable law or agreed to in writing, software
++ * distributed under the License is distributed on an "AS IS" BASIS,
++ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++ * See the License for the specific language governing permissions and
++ * limitations under the License.
++ */
++package org.apache.storm.flux;
++
++import backtype.storm.Config;
++import backtype.storm.generated.StormTopology;
++import org.apache.storm.flux.model.ExecutionContext;
++import org.apache.storm.flux.model.TopologyDef;
++import org.apache.storm.flux.parser.FluxParser;
++import org.apache.storm.flux.test.TestBolt;
++import org.junit.Test;
++
++import java.io.File;
++
++import static org.junit.Assert.*;
++
++public class TCKTest {
++    @Test
++    public void testTCK() throws Exception {
++        TopologyDef topologyDef = FluxParser.parseResource("/configs/tck.yaml", false, true, null, false);
++        Config conf = FluxBuilder.buildConfig(topologyDef);
++        ExecutionContext context = new ExecutionContext(topologyDef, conf);
++        StormTopology topology = FluxBuilder.buildTopology(context);
++        assertNotNull(topology);
++        topology.validate();
++    }
++
++    @Test
++    public void testShellComponents() throws Exception {
++        TopologyDef topologyDef = FluxParser.parseResource("/configs/shell_test.yaml", false, true, null, false);
++        Config conf = FluxBuilder.buildConfig(topologyDef);
++        ExecutionContext context = new ExecutionContext(topologyDef, conf);
++        StormTopology topology = FluxBuilder.buildTopology(context);
++        assertNotNull(topology);
++        topology.validate();
++    }
++
++    @Test
++    public void testKafkaSpoutConfig() throws Exception {
++        TopologyDef topologyDef = FluxParser.parseResource("/configs/kafka_test.yaml", false, true, null, false);
++        Config conf = FluxBuilder.buildConfig(topologyDef);
++        ExecutionContext context = new ExecutionContext(topologyDef, conf);
++        StormTopology topology = FluxBuilder.buildTopology(context);
++        assertNotNull(topology);
++        topology.validate();
++    }
++
++    @Test
++    public void testLoadFromResource() throws Exception {
++        TopologyDef topologyDef = FluxParser.parseResource("/configs/kafka_test.yaml", false, true, null, false);
++        Config conf = FluxBuilder.buildConfig(topologyDef);
++        ExecutionContext context = new ExecutionContext(topologyDef, conf);
++        StormTopology topology = FluxBuilder.buildTopology(context);
++        assertNotNull(topology);
++        topology.validate();
++    }
++
++
++    @Test
++    public void testHdfs() throws Exception {
++        TopologyDef topologyDef = FluxParser.parseResource("/configs/hdfs_test.yaml", false, true, null, false);
++        Config conf = FluxBuilder.buildConfig(topologyDef);
++        ExecutionContext context = new ExecutionContext(topologyDef, conf);
++        StormTopology topology = FluxBuilder.buildTopology(context);
++        assertNotNull(topology);
++        topology.validate();
++    }
++
++    @Test
++    public void testHbase() throws Exception {
++        TopologyDef topologyDef = FluxParser.parseResource("/configs/simple_hbase.yaml", false, true, null, false);
++        Config conf = FluxBuilder.buildConfig(topologyDef);
++        ExecutionContext context = new ExecutionContext(topologyDef, conf);
++        StormTopology topology = FluxBuilder.buildTopology(context);
++        assertNotNull(topology);
++        topology.validate();
++    }
++
++    @Test(expected = IllegalArgumentException.class)
++    public void testBadHbase() throws Exception {
++        TopologyDef topologyDef = FluxParser.parseResource("/configs/bad_hbase.yaml", false, true, null, false);
++        Config conf = FluxBuilder.buildConfig(topologyDef);
++        ExecutionContext context = new ExecutionContext(topologyDef, conf);
++        StormTopology topology = FluxBuilder.buildTopology(context);
++        assertNotNull(topology);
++        topology.validate();
++    }
++
++    @Test
++    public void testIncludes() throws Exception {
++        TopologyDef topologyDef = FluxParser.parseResource("/configs/include_test.yaml", false, true, null, false);
++        Config conf = FluxBuilder.buildConfig(topologyDef);
++        ExecutionContext context = new ExecutionContext(topologyDef, conf);
++        StormTopology topology = FluxBuilder.buildTopology(context);
++        assertNotNull(topology);
++        assertTrue(topologyDef.getName().equals("include-topology"));
++        assertTrue(topologyDef.getBolts().size() > 0);
++        assertTrue(topologyDef.getSpouts().size() > 0);
++        topology.validate();
++    }
++
++    @Test
++    public void testTopologySource() throws Exception {
++        TopologyDef topologyDef = FluxParser.parseResource("/configs/existing-topology.yaml", false, true, null, false);
++        assertTrue(topologyDef.validate());
++        Config conf = FluxBuilder.buildConfig(topologyDef);
++        ExecutionContext context = new ExecutionContext(topologyDef, conf);
++        StormTopology topology = FluxBuilder.buildTopology(context);
++        assertNotNull(topology);
++        topology.validate();
++    }
++
++    @Test
++    public void testTopologySourceWithReflection() throws Exception {
++        TopologyDef topologyDef = FluxParser.parseResource("/configs/existing-topology-reflection.yaml", false, true, null, false);
++        assertTrue(topologyDef.validate());
++        Config conf = FluxBuilder.buildConfig(topologyDef);
++        ExecutionContext context = new ExecutionContext(topologyDef, conf);
++        StormTopology topology = FluxBuilder.buildTopology(context);
++        assertNotNull(topology);
++        topology.validate();
++    }
++
++    @Test
++    public void testTopologySourceWithConfigParam() throws Exception {
++        TopologyDef topologyDef = FluxParser.parseResource("/configs/existing-topology-reflection-config.yaml", false, true, null, false);
++        assertTrue(topologyDef.validate());
++        Config conf = FluxBuilder.buildConfig(topologyDef);
++        ExecutionContext context = new ExecutionContext(topologyDef, conf);
++        StormTopology topology = FluxBuilder.buildTopology(context);
++        assertNotNull(topology);
++        topology.validate();
++    }
++
++    @Test
++    public void testTopologySourceWithMethodName() throws Exception {
++        TopologyDef topologyDef = FluxParser.parseResource("/configs/existing-topology-method-override.yaml", false, true, null, false);
++        assertTrue(topologyDef.validate());
++        Config conf = FluxBuilder.buildConfig(topologyDef);
++        ExecutionContext context = new ExecutionContext(topologyDef, conf);
++        StormTopology topology = FluxBuilder.buildTopology(context);
++        assertNotNull(topology);
++        topology.validate();
++    }
++
++
++    @Test
++    public void testTridentTopologySource() throws Exception {
++        TopologyDef topologyDef = FluxParser.parseResource("/configs/existing-topology-trident.yaml", false, true, null, false);
++        assertTrue(topologyDef.validate());
++        Config conf = FluxBuilder.buildConfig(topologyDef);
++        ExecutionContext context = new ExecutionContext(topologyDef, conf);
++        StormTopology topology = FluxBuilder.buildTopology(context);
++        assertNotNull(topology);
++        topology.validate();
++    }
++
++    @Test(expected = IllegalArgumentException.class)
++    public void testInvalidTopologySource() throws Exception {
++        TopologyDef topologyDef = FluxParser.parseResource("/configs/invalid-existing-topology.yaml", false, true, null, false);
++        assertFalse("Topology config is invalid.", topologyDef.validate());
++        Config conf = FluxBuilder.buildConfig(topologyDef);
++        ExecutionContext context = new ExecutionContext(topologyDef, conf);
++        StormTopology topology = FluxBuilder.buildTopology(context);
++    }
++
++
++    @Test
++    public void testTopologySourceWithGetMethodName() throws Exception {
++        TopologyDef topologyDef = FluxParser.parseResource("/configs/existing-topology-reflection.yaml", false, true, null, false);
++        assertTrue(topologyDef.validate());
++        Config conf = FluxBuilder.buildConfig(topologyDef);
++        ExecutionContext context = new ExecutionContext(topologyDef, conf);
++        StormTopology topology = FluxBuilder.buildTopology(context);
++        assertNotNull(topology);
++        topology.validate();
++    }
++
++    @Test
++    public void testTopologySourceWithConfigMethods() throws Exception {
++        TopologyDef topologyDef = FluxParser.parseResource("/configs/config-methods-test.yaml", false, true, null, false);
++        assertTrue(topologyDef.validate());
++        Config conf = FluxBuilder.buildConfig(topologyDef);
++        ExecutionContext context = new ExecutionContext(topologyDef, conf);
++        StormTopology topology = FluxBuilder.buildTopology(context);
++        assertNotNull(topology);
++        topology.validate();
++
++        // make sure the property was actually set
++        TestBolt bolt = (TestBolt)context.getBolt("bolt-1");
++        assertTrue(bolt.getFoo().equals("foo"));
++        assertTrue(bolt.getBar().equals("bar"));
++        assertTrue(bolt.getFooBar().equals("foobar"));
++    }
++
++    @Test
++    public void testVariableSubstitution() throws Exception {
++        TopologyDef topologyDef = FluxParser.parseResource("/configs/substitution-test.yaml", false, true, "src/test/resources/configs/test.properties", true);
++        assertTrue(topologyDef.validate());
++        Config conf = FluxBuilder.buildConfig(topologyDef);
++        ExecutionContext context = new ExecutionContext(topologyDef, conf);
++        StormTopology topology = FluxBuilder.buildTopology(context);
++        assertNotNull(topology);
++        topology.validate();
++
++        // test basic substitution
++        assertEquals("Property not replaced.",
++                "substitution-topology",
++                context.getTopologyDef().getName());
++
++        // test environment variable substitution
++        // $PATH should be defined on most systems
++        String envPath = System.getenv().get("PATH");
++        assertEquals("ENV variable not replaced.",
++                envPath,
++                context.getTopologyDef().getConfig().get("test.env.value"));
++
++    }
++}
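
For context, `testTopologySourceWithConfigMethods` above expects a bolt whose `withFoo`/`withBar`/`withFooBar` setters have been invoked by Flux. A hypothetical sketch of such a `config-methods-test.yaml` bolt entry, reusing the `configMethods` syntax from the HBase example earlier in this series (the actual test resource may differ in detail, and the `FOO` constructor argument assumes Flux coerces the string to `TestBolt.TestEnum`):

```yaml
bolts:
  - id: "bolt-1"
    className: "org.apache.storm.flux.test.TestBolt"
    constructorArgs:
      - FOO               # assumed enum coercion to TestBolt.TestEnum
    configMethods:
      - name: "withFoo"
        args: ["foo"]
      - name: "withBar"
        args: ["bar"]
      - name: "withFooBar"
        args: ["foo", "bar"]
    parallelism: 1
```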

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/test/java/org/apache/storm/flux/multilang/MultilangEnvirontmentTest.java
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/test/java/org/apache/storm/flux/multilang/MultilangEnvirontmentTest.java
index 0000000,0000000..dcded17
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/test/java/org/apache/storm/flux/multilang/MultilangEnvirontmentTest.java
@@@ -1,0 -1,0 +1,89 @@@
++/*
++ * Licensed to the Apache Software Foundation (ASF) under one
++ * or more contributor license agreements.  See the NOTICE file
++ * distributed with this work for additional information
++ * regarding copyright ownership.  The ASF licenses this file
++ * to you under the Apache License, Version 2.0 (the
++ * "License"); you may not use this file except in compliance
++ * with the License.  You may obtain a copy of the License at
++ *
++ * http://www.apache.org/licenses/LICENSE-2.0
++ *
++ * Unless required by applicable law or agreed to in writing, software
++ * distributed under the License is distributed on an "AS IS" BASIS,
++ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++ * See the License for the specific language governing permissions and
++ * limitations under the License.
++ */
++package org.apache.storm.flux.multilang;
++
++
++import org.junit.Test;
++import org.slf4j.Logger;
++import org.slf4j.LoggerFactory;
++
++import java.io.ByteArrayOutputStream;
++import java.io.InputStream;
++import java.io.OutputStream;
++
++import static org.junit.Assert.assertEquals;
++
++/**
++ * Sanity checks to make sure we can at least invoke the shells used.
++ */
++public class MultilangEnvirontmentTest {
++    private static final Logger LOG = LoggerFactory.getLogger(MultilangEnvirontmentTest.class);
++
++    @Test
++    public void testInvokePython() throws Exception {
++        String[] command = new String[]{"python", "--version"};
++        int exitVal = invokeCommand(command);
++        assertEquals("Exit value for python is 0.", 0, exitVal);
++    }
++
++    @Test
++    public void testInvokeNode() throws Exception {
++        String[] command = new String[]{"node", "--version"};
++        int exitVal = invokeCommand(command);
++        assertEquals("Exit value for node is 0.", 0, exitVal);
++    }
++
++    private static class StreamRedirect implements Runnable {
++        private InputStream in;
++        private OutputStream out;
++
++        public StreamRedirect(InputStream in, OutputStream out) {
++            this.in = in;
++            this.out = out;
++        }
++
++        @Override
++        public void run() {
++            try {
++                int i = -1;
++                while ((i = this.in.read()) != -1) {
++                    out.write(i);
++                }
++                this.in.close();
++                this.out.close();
++            } catch (Exception e) {
++                e.printStackTrace();
++            }
++        }
++    }
++
++    private int invokeCommand(String[] args) throws Exception {
++        LOG.debug("Invoking command: {}", args);
++
++        ProcessBuilder pb = new ProcessBuilder(args);
++        pb.redirectErrorStream(true);
++        final Process proc = pb.start();
++
++        ByteArrayOutputStream out = new ByteArrayOutputStream();
++        Thread t = new Thread(new StreamRedirect(proc.getInputStream(), out));
++        t.start();
++        int exitVal = proc.waitFor();
++        LOG.debug("Command result: {}", out.toString());
++        return exitVal;
++    }
++}

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/test/java/org/apache/storm/flux/test/SimpleTopology.java
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/test/java/org/apache/storm/flux/test/SimpleTopology.java
index 0000000,0000000..0d37997
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/test/java/org/apache/storm/flux/test/SimpleTopology.java
@@@ -1,0 -1,0 +1,42 @@@
++package org.apache.storm.flux.test;
++
++import backtype.storm.generated.StormTopology;
++import backtype.storm.topology.TopologyBuilder;
++import org.apache.storm.flux.api.TopologySource;
++import org.apache.storm.flux.wrappers.bolts.LogInfoBolt;
++import org.apache.storm.flux.wrappers.spouts.FluxShellSpout;
++
++import java.util.Map;
++
++/**
++ * Test topology source that does not implement TopologySource, but has the same
++ * `getTopology()` method.
++ */
++public class SimpleTopology{
++
++
++    public SimpleTopology(){}
++
++    public SimpleTopology(String foo, String bar){}
++
++    public StormTopology getTopologyWithDifferentMethodName(Map<String, Object> config){
++        return getTopology(config);
++    }
++
++
++    public StormTopology getTopology(Map<String, Object> config) {
++        TopologyBuilder builder = new TopologyBuilder();
++
++        // spouts
++        FluxShellSpout spout = new FluxShellSpout(
++                new String[]{"node", "randomsentence.js"},
++                new String[]{"word"});
++        builder.setSpout("sentence-spout", spout, 1);
++
++        // bolts
++        builder.setBolt("log-bolt", new LogInfoBolt(), 1)
++                .shuffleGrouping("sentence-spout");
++
++        return builder.createTopology();
++    }
++}
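
Classes like `SimpleTopology` are wired into Flux through the existing-topology YAML files exercised by `TCKTest` (e.g. `existing-topology-method-override.yaml`). A minimal hedged sketch, assuming Flux's `topologySource`/`methodName` keys for existing topologies:

```yaml
# hypothetical sketch; the real test resources may differ
name: "existing-topology"
topologySource:
  className: "org.apache.storm.flux.test.SimpleTopology"
  methodName: "getTopologyWithDifferentMethodName"   # optional; defaults to getTopology()
```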

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/test/java/org/apache/storm/flux/test/SimpleTopologySource.java
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/test/java/org/apache/storm/flux/test/SimpleTopologySource.java
index 0000000,0000000..2007082
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/test/java/org/apache/storm/flux/test/SimpleTopologySource.java
@@@ -1,0 -1,0 +1,35 @@@
++package org.apache.storm.flux.test;
++
++import backtype.storm.generated.StormTopology;
++import backtype.storm.topology.TopologyBuilder;
++import org.apache.storm.flux.api.TopologySource;
++import org.apache.storm.flux.wrappers.bolts.LogInfoBolt;
++import org.apache.storm.flux.wrappers.spouts.FluxShellSpout;
++
++import java.util.Map;
++
++public class SimpleTopologySource implements TopologySource {
++
++
++    public SimpleTopologySource(){}
++
++    public SimpleTopologySource(String foo, String bar){}
++
++
++    @Override
++    public StormTopology getTopology(Map<String, Object> config) {
++        TopologyBuilder builder = new TopologyBuilder();
++
++        // spouts
++        FluxShellSpout spout = new FluxShellSpout(
++                new String[]{"node", "randomsentence.js"},
++                new String[]{"word"});
++        builder.setSpout("sentence-spout", spout, 1);
++
++        // bolts
++        builder.setBolt("log-bolt", new LogInfoBolt(), 1)
++                .shuffleGrouping("sentence-spout");
++
++        return builder.createTopology();
++    }
++}

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/test/java/org/apache/storm/flux/test/SimpleTopologyWithConfigParam.java
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/test/java/org/apache/storm/flux/test/SimpleTopologyWithConfigParam.java
index 0000000,0000000..f29b543
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/test/java/org/apache/storm/flux/test/SimpleTopologyWithConfigParam.java
@@@ -1,0 -1,0 +1,38 @@@
++package org.apache.storm.flux.test;
++
++import backtype.storm.Config;
++import backtype.storm.generated.StormTopology;
++import backtype.storm.topology.TopologyBuilder;
++import org.apache.storm.flux.wrappers.bolts.LogInfoBolt;
++import org.apache.storm.flux.wrappers.spouts.FluxShellSpout;
++
++import java.util.Map;
++
++/**
++ * Test topology source that does not implement TopologySource, but has the same
++ * `getTopology()` method.
++ */
++public class SimpleTopologyWithConfigParam {
++
++
++    public SimpleTopologyWithConfigParam(){}
++
++    public SimpleTopologyWithConfigParam(String foo, String bar){}
++
++
++    public StormTopology getTopology(Config config) {
++        TopologyBuilder builder = new TopologyBuilder();
++
++        // spouts
++        FluxShellSpout spout = new FluxShellSpout(
++                new String[]{"node", "randomsentence.js"},
++                new String[]{"word"});
++        builder.setSpout("sentence-spout", spout, 1);
++
++        // bolts
++        builder.setBolt("log-bolt", new LogInfoBolt(), 1)
++                .shuffleGrouping("sentence-spout");
++
++        return builder.createTopology();
++    }
++}

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/test/java/org/apache/storm/flux/test/TestBolt.java
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/test/java/org/apache/storm/flux/test/TestBolt.java
index 0000000,0000000..e88f2cf
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/test/java/org/apache/storm/flux/test/TestBolt.java
@@@ -1,0 -1,0 +1,63 @@@
++package org.apache.storm.flux.test;
++
++import backtype.storm.topology.BasicOutputCollector;
++import backtype.storm.topology.OutputFieldsDeclarer;
++import backtype.storm.topology.base.BaseBasicBolt;
++import backtype.storm.tuple.Tuple;
++import org.slf4j.Logger;
++import org.slf4j.LoggerFactory;
++
++
++public class TestBolt extends BaseBasicBolt {
++    private static final Logger LOG = LoggerFactory.getLogger(TestBolt.class);
++
++    private String foo;
++    private String bar;
++    private String fooBar;
++
++    public static enum TestEnum {
++        FOO,
++        BAR
++    }
++
++    public TestBolt(TestEnum te){
++
++    }
++
++    public TestBolt(TestEnum te, float f){
++
++    }
++
++    @Override
++    public void execute(Tuple tuple, BasicOutputCollector basicOutputCollector) {
++        LOG.info("{}", tuple);
++    }
++
++    @Override
++    public void declareOutputFields(OutputFieldsDeclarer outputFieldsDeclarer) {
++
++    }
++
++    // config methods
++    public void withFoo(String foo){
++        this.foo = foo;
++    }
++    public void withBar(String bar){
++        this.bar = bar;
++    }
++
++    public void withFooBar(String foo, String bar){
++        this.fooBar = foo + bar;
++    }
++
++    public String getFoo(){
++        return this.foo;
++    }
++    public String getBar(){
++        return this.bar;
++    }
++
++    public String getFooBar(){
++        return this.fooBar;
++    }
++}

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/test/java/org/apache/storm/flux/test/TridentTopologySource.java
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/test/java/org/apache/storm/flux/test/TridentTopologySource.java
index 0000000,0000000..3cb6634
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/test/java/org/apache/storm/flux/test/TridentTopologySource.java
@@@ -1,0 -1,0 +1,54 @@@
++package org.apache.storm.flux.test;
++
++import backtype.storm.Config;
++import backtype.storm.generated.StormTopology;
++import backtype.storm.tuple.Fields;
++import backtype.storm.tuple.Values;
++import storm.kafka.StringScheme;
++import storm.trident.TridentTopology;
++import storm.trident.operation.BaseFunction;
++import storm.trident.operation.TridentCollector;
++import storm.trident.operation.builtin.Count;
++import storm.trident.testing.FixedBatchSpout;
++import storm.trident.testing.MemoryMapState;
++import storm.trident.tuple.TridentTuple;
++
++/**
++ * Basic Trident example that returns a `StormTopology` from a `getTopology(Config)` method.
++ */
++public class TridentTopologySource {
++
++    private FixedBatchSpout spout;
++
++    public StormTopology getTopology(Config config) {
++
++        this.spout = new FixedBatchSpout(new Fields("sentence"), 20,
++                new Values("one two"),
++                new Values("two three"),
++                new Values("three four"),
++                new Values("four five"),
++                new Values("five six")
++        );
++
++
++        TridentTopology trident = new TridentTopology();
++
++        trident.newStream("wordcount", spout).name("sentence").parallelismHint(1).shuffle()
++                .each(new Fields("sentence"), new Split(), new Fields("word"))
++                .parallelismHint(1)
++                .groupBy(new Fields("word"))
++                .persistentAggregate(new MemoryMapState.Factory(), new Count(), new Fields("count"))
++                .parallelismHint(1);
++        return trident.build();
++    }
++
++    public static class Split extends BaseFunction {
++        @Override
++        public void execute(TridentTuple tuple, TridentCollector collector) {
++            String sentence = tuple.getString(0);
++            for (String word : sentence.split(" ")) {
++                collector.emit(new Values(word));
++            }
++        }
++    }
++}

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/test/resources/configs/bad_hbase.yaml
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/test/resources/configs/bad_hbase.yaml
index 0000000,0000000..5d91400
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/test/resources/configs/bad_hbase.yaml
@@@ -1,0 -1,0 +1,98 @@@
++# Licensed to the Apache Software Foundation (ASF) under one
++# or more contributor license agreements.  See the NOTICE file
++# distributed with this work for additional information
++# regarding copyright ownership.  The ASF licenses this file
++# to you under the Apache License, Version 2.0 (the
++# "License"); you may not use this file except in compliance
++# with the License.  You may obtain a copy of the License at
++#
++# http://www.apache.org/licenses/LICENSE-2.0
++#
++# Unless required by applicable law or agreed to in writing, software
++# distributed under the License is distributed on an "AS IS" BASIS,
++# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++# See the License for the specific language governing permissions and
++# limitations under the License.
++
++# Test ability to wire together shell spouts/bolts
++---
++
++# topology definition
++# name to be used when submitting
++name: "hbase-wordcount"
++
++# Components
++# Components are analogous to Spring beans. They are meant to be used as constructor,
++# property(setter), and builder arguments.
++#
++# for the time being, components must be declared in the order they are referenced
++
++components:
++  - id: "columnFields"
++    className: "backtype.storm.tuple.Fields"
++    constructorArgs:
++      - ["word"]
++
++  - id: "counterFields"
++    className: "backtype.storm.tuple.Fields"
++    constructorArgs:
++      # !!! the following won't work, and should throw an IllegalArgumentException...
++      - "count"
++
++  - id: "mapper"
++    className: "org.apache.storm.hbase.bolt.mapper.SimpleHBaseMapper"
++    configMethods:
++      - name: "withRowKeyField"
++        args: ["word"]
++      - name: "withColumnFields"
++        args: [ref: "columnFields"]
++      - name: "withCounterFields"
++        args: [ref: "counterFields"]
++      - name: "withColumnFamily"
++        args: ["cf"]
++
++# topology configuration
++# this will be passed to the submitter as a map of config options
++#
++config:
++  topology.workers: 1
++  hbase.conf:
++    hbase.rootdir: "hdfs://hadoop:54310/hbase"
++    hbase.zookeeper.quorum: "hadoop"
++
++# spout definitions
++spouts:
++  - id: "word-spout"
++    className: "backtype.storm.testing.TestWordSpout"
++    parallelism: 1
++
++# bolt definitions
++
++bolts:
++  - id: "count-bolt"
++    className: "backtype.storm.testing.TestWordCounter"
++
++  - id: "hbase-bolt"
++    className: "org.apache.storm.hbase.bolt.HBaseBolt"
++    constructorArgs:
++      - "WordCount" # HBase table name
++      - ref: "mapper"
++    configMethods:
++      - name: "withConfigKey"
++        args: ["hbase.conf"]
++    parallelism: 1
++
++
++streams:
++  - name: "" # name isn't used (placeholder for logging, UI, etc.)
++    from: "word-spout"
++    to: "count-bolt"
++    grouping:
++      type: SHUFFLE
++
++  - name: "" # name isn't used (placeholder for logging, UI, etc.)
++    from: "count-bolt"
++    to: "hbase-bolt"
++    grouping:
++      type: FIELDS
++      args: ["word"]

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/test/resources/configs/config-methods-test.yaml
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/test/resources/configs/config-methods-test.yaml
index 0000000,0000000..65211ff
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/test/resources/configs/config-methods-test.yaml
@@@ -1,0 -1,0 +1,70 @@@
++# Licensed to the Apache Software Foundation (ASF) under one
++# or more contributor license agreements.  See the NOTICE file
++# distributed with this work for additional information
++# regarding copyright ownership.  The ASF licenses this file
++# to you under the Apache License, Version 2.0 (the
++# "License"); you may not use this file except in compliance
++# with the License.  You may obtain a copy of the License at
++#
++# http://www.apache.org/licenses/LICENSE-2.0
++#
++# Unless required by applicable law or agreed to in writing, software
++# distributed under the License is distributed on an "AS IS" BASIS,
++# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++# See the License for the specific language governing permissions and
++# limitations under the License.
++---
++name: "yaml-topology"
++
++#
++config:
++  topology.workers: 1
++  # ...
++
++# spout definitions
++spouts:
++  - id: "spout-1"
++    className: "backtype.storm.testing.TestWordSpout"
++    parallelism: 1
++    # ...
++
++# bolt definitions
++bolts:
++  - id: "bolt-1"
++    className: "org.apache.storm.flux.test.TestBolt"
++    parallelism: 1
++    constructorArgs:
++      - FOO # enum class
++      - 1.0
++    configMethods:
++      - name: "withFoo"
++        args:
++          - "foo"
++      - name: "withBar"
++        args:
++          - "bar"
++      - name: "withFooBar"
++        args:
++          - "foo"
++          - "bar"
++
++
++
++#stream definitions
++# stream definitions define connections between spouts and bolts.
++# note that such connections can be cyclical
++streams:
++  - name: "spout-1 --> bolt-1" # name isn't used (placeholder for logging, UI, etc.)
++#    id: "connection-1"
++    from: "spout-1"
++    to: "bolt-1"
++    grouping:
++      type: SHUFFLE
++
++
++
++
++
++
++
++

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/test/resources/configs/existing-topology-method-override.yaml
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/test/resources/configs/existing-topology-method-override.yaml
index 0000000,0000000..6f3c88a
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/test/resources/configs/existing-topology-method-override.yaml
@@@ -1,0 -1,0 +1,10 @@@
++---
++
++# configuration that uses an existing topology that does not implement TopologySource
++name: "existing-topology"
++topologySource:
++  className: "org.apache.storm.flux.test.SimpleTopology"
++  methodName: "getTopologyWithDifferentMethodName"
++  constructorArgs:
++    - "foo"
++    - "bar"

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/test/resources/configs/existing-topology-reflection-config.yaml
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/test/resources/configs/existing-topology-reflection-config.yaml
index 0000000,0000000..8af8a84
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/test/resources/configs/existing-topology-reflection-config.yaml
@@@ -1,0 -1,0 +1,9 @@@
++---
++
++# configuration that uses an existing topology that does not implement TopologySource
++name: "existing-topology"
++topologySource:
++  className: "org.apache.storm.flux.test.SimpleTopologyWithConfigParam"
++  constructorArgs:
++    - "foo"
++    - "bar"

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/test/resources/configs/existing-topology-reflection.yaml
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/test/resources/configs/existing-topology-reflection.yaml
index 0000000,0000000..dd3e1e8
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/test/resources/configs/existing-topology-reflection.yaml
@@@ -1,0 -1,0 +1,9 @@@
++---
++
++# configuration that uses an existing topology that does not implement TopologySource
++name: "existing-topology"
++topologySource:
++  className: "org.apache.storm.flux.test.SimpleTopology"
++  constructorArgs:
++    - "foo"
++    - "bar"

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/test/resources/configs/existing-topology-trident.yaml
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/test/resources/configs/existing-topology-trident.yaml
index 0000000,0000000..5ac682c
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/test/resources/configs/existing-topology-trident.yaml
@@@ -1,0 -1,0 +1,9 @@@
++---
++
++name: "existing-topology"
++
++config:
++  topology.workers: 1
++
++topologySource:
++  className: "org.apache.storm.flux.test.TridentTopologySource"

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/test/resources/configs/existing-topology.yaml
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/test/resources/configs/existing-topology.yaml
index 0000000,0000000..fa6a0b3
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/test/resources/configs/existing-topology.yaml
@@@ -1,0 -1,0 +1,8 @@@
++---
++
++name: "existing-topology"
++topologySource:
++  className: "org.apache.storm.flux.test.SimpleTopologySource"
++  constructorArgs:
++    - "foo"
++    - "bar"

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/test/resources/configs/hdfs_test.yaml
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/test/resources/configs/hdfs_test.yaml
index 0000000,0000000..8fe0a9a
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/test/resources/configs/hdfs_test.yaml
@@@ -1,0 -1,0 +1,97 @@@
++# Licensed to the Apache Software Foundation (ASF) under one
++# or more contributor license agreements.  See the NOTICE file
++# distributed with this work for additional information
++# regarding copyright ownership.  The ASF licenses this file
++# to you under the Apache License, Version 2.0 (the
++# "License"); you may not use this file except in compliance
++# with the License.  You may obtain a copy of the License at
++#
++# http://www.apache.org/licenses/LICENSE-2.0
++#
++# Unless required by applicable law or agreed to in writing, software
++# distributed under the License is distributed on an "AS IS" BASIS,
++# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++# See the License for the specific language governing permissions and
++# limitations under the License.
++
++# Test ability to wire together shell spouts/bolts
++---
++
++# topology definition
++# name to be used when submitting
++name: "hdfs-topology"
++
++# Components
++# Components are analogous to Spring beans. They are meant to be used as constructor,
++# property(setter), and builder arguments.
++#
++# for the time being, components must be declared in the order they are referenced
++components:
++  - id: "syncPolicy"
++    className: "org.apache.storm.hdfs.bolt.sync.CountSyncPolicy"
++    constructorArgs:
++      - 1000
++  - id: "rotationPolicy"
++    className: "org.apache.storm.hdfs.bolt.rotation.FileSizeRotationPolicy"
++    constructorArgs:
++      - 5.0
++      - MB
++
++  - id: "fileNameFormat"
++    className: "org.apache.storm.hdfs.bolt.format.DefaultFileNameFormat"
++    configMethods:
++      - name: "withPath"
++        args: ["/tmp/foo/"]
++      - name: "withExtension"
++        args: [".txt"]
++
++  - id: "recordFormat"
++    className: "org.apache.storm.hdfs.bolt.format.DelimitedRecordFormat"
++    configMethods:
++      - name: "withFieldDelimiter"
++        args: ["|"]
++
++  - id: "rotationAction"
++    className: "org.apache.storm.hdfs.common.rotation.MoveFileAction"
++    configMethods:
++      - name: "toDestination"
++        args: ["/tmp/dest2"]
++
++# spout definitions
++spouts:
++  - id: "spout-1"
++    className: "backtype.storm.testing.TestWordSpout"
++    parallelism: 1
++    # ...
++
++# bolt definitions
++
++#        HdfsBolt bolt = new HdfsBolt()
++#                .withConfigKey("hdfs.config")
++#                .withFsUrl(args[0])
++#                .withFileNameFormat(fileNameFormat)
++#                .withRecordFormat(format)
++#                .withRotationPolicy(rotationPolicy)
++#                .withSyncPolicy(syncPolicy)
++#                .addRotationAction(new MoveFileAction().toDestination("/tmp/dest2/"));
++bolts:
++  - id: "bolt-1"
++    className: "org.apache.storm.hdfs.bolt.HdfsBolt"
++    configMethods:
++      - name: "withConfigKey"
++        args: ["hdfs.config"]
++      - name: "withFsUrl"
++        args: ["hdfs://hadoop:54310"]
++      - name: "withFileNameFormat"
++        args: [ref: "fileNameFormat"]
++      - name: "withRecordFormat"
++        args: [ref: "recordFormat"]
++      - name: "withRotationPolicy"
++        args: [ref: "rotationPolicy"]
++      - name: "withSyncPolicy"
++        args: [ref: "syncPolicy"]
++      - name: "addRotationAction"
++        args: [ref: "rotationAction"]
++    parallelism: 1
++    # ...
++

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/test/resources/configs/include_test.yaml
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/test/resources/configs/include_test.yaml
index 0000000,0000000..702f590
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/test/resources/configs/include_test.yaml
@@@ -1,0 -1,0 +1,25 @@@
++# Licensed to the Apache Software Foundation (ASF) under one
++# or more contributor license agreements.  See the NOTICE file
++# distributed with this work for additional information
++# regarding copyright ownership.  The ASF licenses this file
++# to you under the Apache License, Version 2.0 (the
++# "License"); you may not use this file except in compliance
++# with the License.  You may obtain a copy of the License at
++#
++# http://www.apache.org/licenses/LICENSE-2.0
++#
++# Unless required by applicable law or agreed to in writing, software
++# distributed under the License is distributed on an "AS IS" BASIS,
++# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++# See the License for the specific language governing permissions and
++# limitations under the License.
++
++# Test includes by defining nothing, and simply override the topology name
++---
++
++name: "include-topology"
++
++includes:
++  - resource: true
++    file: "/configs/shell_test.yaml"
++    override: false #otherwise subsequent includes that define 'name' would override

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/test/resources/configs/invalid-existing-topology.yaml
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/test/resources/configs/invalid-existing-topology.yaml
index 0000000,0000000..72128df
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/test/resources/configs/invalid-existing-topology.yaml
@@@ -1,0 -1,0 +1,17 @@@
++# This is an invalid config. It defines both a topologySource and a list of spouts.
++---
++
++name: "existing-topology"
++topologySource:
++  className: "org.apache.storm.flux.test.SimpleTopologySource"
++
++spouts:
++  - id: "sentence-spout"
++    className: "org.apache.storm.flux.wrappers.spouts.FluxShellSpout"
++    # shell spout constructor takes 2 arguments: String[], String[]
++    constructorArgs:
++      # command line
++      - ["node", "randomsentence.js"]
++      # output fields
++      - ["word"]
++    parallelism: 1

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/test/resources/configs/kafka_test.yaml
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/test/resources/configs/kafka_test.yaml
index 0000000,0000000..17cd8e2
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/test/resources/configs/kafka_test.yaml
@@@ -1,0 -1,0 +1,126 @@@
++# Licensed to the Apache Software Foundation (ASF) under one
++# or more contributor license agreements.  See the NOTICE file
++# distributed with this work for additional information
++# regarding copyright ownership.  The ASF licenses this file
++# to you under the Apache License, Version 2.0 (the
++# "License"); you may not use this file except in compliance
++# with the License.  You may obtain a copy of the License at
++#
++# http://www.apache.org/licenses/LICENSE-2.0
++#
++# Unless required by applicable law or agreed to in writing, software
++# distributed under the License is distributed on an "AS IS" BASIS,
++# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++# See the License for the specific language governing permissions and
++# limitations under the License.
++---
++
++# topology definition
++# name to be used when submitting
++name: "kafka-topology"
++
++# Components
++# Components are analogous to Spring beans. They are meant to be used as constructor,
++# property(setter), and builder arguments.
++#
++# for the time being, components must be declared in the order they are referenced
++components:
++  - id: "stringScheme"
++    className: "storm.kafka.StringScheme"
++
++  - id: "stringMultiScheme"
++    className: "backtype.storm.spout.SchemeAsMultiScheme"
++    constructorArgs:
++      - ref: "stringScheme"
++
++  - id: "zkHosts"
++    className: "storm.kafka.ZkHosts"
++    constructorArgs:
++      - "localhost:2181"
++
++# Alternative kafka config
++#  - id: "kafkaConfig"
++#    className: "storm.kafka.KafkaConfig"
++#    constructorArgs:
++#      # brokerHosts
++#      - ref: "zkHosts"
++#      # topic
++#      - "myKafkaTopic"
++#      # clientId (optional)
++#      - "myKafkaClientId"
++
++  - id: "spoutConfig"
++    className: "storm.kafka.SpoutConfig"
++    constructorArgs:
++      # brokerHosts
++      - ref: "zkHosts"
++      # topic
++      - "myKafkaTopic"
++      # zkRoot
++      - "/kafkaSpout"
++      # id
++      - "myId"
++    properties:
++      - name: "forceFromStart"
++        value: true
++      - name: "scheme"
++        ref: "stringMultiScheme"
++
++# topology configuration
++# this will be passed to the submitter as a map of config options
++#
++config:
++  topology.workers: 1
++  # ...
++
++# spout definitions
++spouts:
++  - id: "kafka-spout"
++    className: "storm.kafka.KafkaSpout"
++    constructorArgs:
++      - ref: "spoutConfig"
++
++# bolt definitions
++bolts:
++  - id: "splitsentence"
++    className: "org.apache.storm.flux.wrappers.bolts.FluxShellBolt"
++    constructorArgs:
++      # command line
++      - ["python", "splitsentence.py"]
++      # output fields
++      - ["word"]
++    parallelism: 1
++
++  - id: "log"
++    className: "org.apache.storm.flux.wrappers.bolts.LogInfoBolt"
++    parallelism: 1
++    # ...
++
++  - id: "count"
++    className: "backtype.storm.testing.TestWordCounter"
++    parallelism: 1
++
++#stream definitions
++# stream definitions define connections between spouts and bolts.
++# note that such connections can be cyclical
++# custom stream groupings are also supported
++
++streams:
++  - name: "kafka --> split" # name isn't used (placeholder for logging, UI, etc.)
++    from: "kafka-spout"
++    to: "splitsentence"
++    grouping:
++      type: SHUFFLE
++
++  - name: "split --> count"
++    from: "splitsentence"
++    to: "count"
++    grouping:
++      type: FIELDS
++      args: ["word"]
++
++  - name: "count --> log"
++    from: "count"
++    to: "log"
++    grouping:
++      type: SHUFFLE

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/test/resources/configs/shell_test.yaml
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/test/resources/configs/shell_test.yaml
index 0000000,0000000..b473fa7
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/test/resources/configs/shell_test.yaml
@@@ -1,0 -1,0 +1,104 @@@
++# Licensed to the Apache Software Foundation (ASF) under one
++# or more contributor license agreements.  See the NOTICE file
++# distributed with this work for additional information
++# regarding copyright ownership.  The ASF licenses this file
++# to you under the Apache License, Version 2.0 (the
++# "License"); you may not use this file except in compliance
++# with the License.  You may obtain a copy of the License at
++#
++# http://www.apache.org/licenses/LICENSE-2.0
++#
++# Unless required by applicable law or agreed to in writing, software
++# distributed under the License is distributed on an "AS IS" BASIS,
++# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++# See the License for the specific language governing permissions and
++# limitations under the License.
++
++# Test ability to wire together shell spouts/bolts
++---
++
++# topology definition
++# name to be used when submitting
++name: "shell-topology"
++
++# Components
++# Components are analogous to Spring beans. They are meant to be used as constructor,
++# property(setter), and builder arguments.
++#components:
++#  - id: "myComponent"
++#    className: "com.foo.bar.MyComponent"
++#    constructorArgs:
++#      - ...
++#    properties:
++#      foo: "bar"
++#      bar: "foo"
++
++# NOTE: We may want to consider some level of spring integration. For example, allowing component references
++# to a spring `ApplicationContext`.
++
++# topology configuration
++# this will be passed to the submitter as a map of config options
++#
++config:
++  topology.workers: 1
++  # ...
++
++# spout definitions
++spouts:
++  - id: "sentence-spout"
++    className: "org.apache.storm.flux.wrappers.spouts.FluxShellSpout"
++    # shell spout constructor takes 2 arguments: String[], String[]
++    constructorArgs:
++      # command line
++      - ["node", "randomsentence.js"]
++      # output fields
++      - ["word"]
++    parallelism: 1
++    # ...
++
++# bolt definitions
++bolts:
++  - id: "splitsentence"
++    className: "org.apache.storm.flux.wrappers.bolts.FluxShellBolt"
++    constructorArgs:
++      # command line
++      - ["python", "splitsentence.py"]
++      # output fields
++      - ["word"]
++    parallelism: 1
++    # ...
++
++  - id: "log"
++    className: "org.apache.storm.flux.wrappers.bolts.LogInfoBolt"
++    parallelism: 1
++    # ...
++
++  - id: "count"
++    className: "backtype.storm.testing.TestWordCounter"
++    parallelism: 1
++    # ...
++
++#stream definitions
++# stream definitions define connections between spouts and bolts.
++# note that such connections can be cyclical
++# custom stream groupings are also supported
++
++streams:
++  - name: "spout --> split" # name isn't used (placeholder for logging, UI, etc.)
++    from: "sentence-spout"
++    to: "splitsentence"
++    grouping:
++      type: SHUFFLE
++
++  - name: "split --> count"
++    from: "splitsentence"
++    to: "count"
++    grouping:
++      type: FIELDS
++      args: ["word"]
++
++  - name: "count --> log"
++    from: "count"
++    to: "log"
++    grouping:
++      type: SHUFFLE

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/test/resources/configs/simple_hbase.yaml
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/test/resources/configs/simple_hbase.yaml
index 0000000,0000000..e407bd9
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/test/resources/configs/simple_hbase.yaml
@@@ -1,0 -1,0 +1,120 @@@
++# Licensed to the Apache Software Foundation (ASF) under one
++# or more contributor license agreements.  See the NOTICE file
++# distributed with this work for additional information
++# regarding copyright ownership.  The ASF licenses this file
++# to you under the Apache License, Version 2.0 (the
++# "License"); you may not use this file except in compliance
++# with the License.  You may obtain a copy of the License at
++#
++# http://www.apache.org/licenses/LICENSE-2.0
++#
++# Unless required by applicable law or agreed to in writing, software
++# distributed under the License is distributed on an "AS IS" BASIS,
++# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++# See the License for the specific language governing permissions and
++# limitations under the License.
++
++# Test ability to wire together shell spouts/bolts
++---
++
++# topology definition
++# name to be used when submitting
++name: "hbase-wordcount"
++
++# Components
++# Components are analogous to Spring beans. They are meant to be used as constructor,
++# property(setter), and builder arguments.
++#
++# for the time being, components must be declared in the order they are referenced
++
++#        WordSpout spout = new WordSpout();
++#        WordCounter bolt = new WordCounter();
++#
++#        SimpleHBaseMapper mapper = new SimpleHBaseMapper()
++#                .withRowKeyField("word")
++#                .withColumnFields(new Fields("word"))
++#                .withCounterFields(new Fields("count"))
++#                .withColumnFamily("cf");
++#
++#        HBaseBolt hbase = new HBaseBolt("WordCount", mapper)
++#                .withConfigKey("hbase.conf");
++#
++#
++#        // wordSpout ==> countBolt ==> HBaseBolt
++#        TopologyBuilder builder = new TopologyBuilder();
++#
++#        builder.setSpout(WORD_SPOUT, spout, 1);
++#        builder.setBolt(COUNT_BOLT, bolt, 1).shuffleGrouping(WORD_SPOUT);
++#        builder.setBolt(HBASE_BOLT, hbase, 1).fieldsGrouping(COUNT_BOLT, new Fields("word"));
++
++
++
++
++components:
++  - id: "columnFields"
++    className: "backtype.storm.tuple.Fields"
++    constructorArgs:
++      - ["word"]
++
++  - id: "counterFields"
++    className: "backtype.storm.tuple.Fields"
++    constructorArgs:
++      - ["count"]
++
++  - id: "mapper"
++    className: "org.apache.storm.hbase.bolt.mapper.SimpleHBaseMapper"
++    configMethods:
++      - name: "withRowKeyField"
++        args: ["word"]
++      - name: "withColumnFields"
++        args: [ref: "columnFields"]
++      - name: "withCounterFields"
++        args: [ref: "counterFields"]
++      - name: "withColumnFamily"
++        args: ["cf"]
++
++# topology configuration
++# this will be passed to the submitter as a map of config options
++#
++config:
++  topology.workers: 1
++  hbase.conf:
++    hbase.rootdir: "hdfs://hadoop:54310/hbase"
++    hbase.zookeeper.quorum: "hadoop"
++
++# spout definitions
++spouts:
++  - id: "word-spout"
++    className: "backtype.storm.testing.TestWordSpout"
++    parallelism: 1
++
++# bolt definitions
++
++bolts:
++  - id: "count-bolt"
++    className: "backtype.storm.testing.TestWordCounter"
++
++  - id: "hbase-bolt"
++    className: "org.apache.storm.hbase.bolt.HBaseBolt"
++    constructorArgs:
++      - "WordCount" # HBase table name
++      - ref: "mapper"
++    configMethods:
++      - name: "withConfigKey"
++        args: ["hbase.conf"]
++    parallelism: 1
++
++
++streams:
++  - name: "" # name isn't used (placeholder for logging, UI, etc.)
++    from: "word-spout"
++    to: "count-bolt"
++    grouping:
++      type: SHUFFLE
++
++  - name: "" # name isn't used (placeholder for logging, UI, etc.)
++    from: "count-bolt"
++    to: "hbase-bolt"
++    grouping:
++      type: FIELDS
++      args: ["word"]

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/test/resources/configs/substitution-test.yaml
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/test/resources/configs/substitution-test.yaml
index 0000000,0000000..13f1960
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/test/resources/configs/substitution-test.yaml
@@@ -1,0 -1,0 +1,106 @@@
++# Licensed to the Apache Software Foundation (ASF) under one
++# or more contributor license agreements.  See the NOTICE file
++# distributed with this work for additional information
++# regarding copyright ownership.  The ASF licenses this file
++# to you under the Apache License, Version 2.0 (the
++# "License"); you may not use this file except in compliance
++# with the License.  You may obtain a copy of the License at
++#
++# http://www.apache.org/licenses/LICENSE-2.0
++#
++# Unless required by applicable law or agreed to in writing, software
++# distributed under the License is distributed on an "AS IS" BASIS,
++# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++# See the License for the specific language governing permissions and
++# limitations under the License.
++
++# Test ability to wire together shell spouts/bolts
++---
++
++# topology definition
++# name to be used when submitting
++name: "${topology.name}"
++
++# Components
++# Components are analogous to Spring beans. They are meant to be used as constructor,
++# property(setter), and builder arguments.
++#components:
++#  - id: "myComponent"
++#    className: "com.foo.bar.MyComponent"
++#    constructorArgs:
++#      - ...
++#    properties:
++#      foo: "bar"
++#      bar: "foo"
++
++# NOTE: We may want to consider some level of spring integration. For example, allowing component references
++# to a spring `ApplicationContext`.
++
++# topology configuration
++# this will be passed to the submitter as a map of config options
++#
++config:
++  topology.workers: 1
++  # test environment variable substitution
++  test.env.value: ${ENV-PATH}
++  # ...
++
++# spout definitions
++spouts:
++  - id: "sentence-spout"
++    className: "org.apache.storm.flux.wrappers.spouts.FluxShellSpout"
++    # shell spout constructor takes 2 arguments: String[], String[]
++    constructorArgs:
++      # command line
++      - ["node", "randomsentence.js"]
++      # output fields
++      - ["word"]
++    parallelism: 1
++    # ...
++
++# bolt definitions
++bolts:
++  - id: "splitsentence"
++    className: "org.apache.storm.flux.wrappers.bolts.FluxShellBolt"
++    constructorArgs:
++      # command line
++      - ["python", "splitsentence.py"]
++      # output fields
++      - ["word"]
++    parallelism: 1
++    # ...
++
++  - id: "log"
++    className: "org.apache.storm.flux.wrappers.bolts.LogInfoBolt"
++    parallelism: 1
++    # ...
++
++  - id: "count"
++    className: "backtype.storm.testing.TestWordCounter"
++    parallelism: 1
++    # ...
++
++#stream definitions
++# stream definitions define connections between spouts and bolts.
++# note that such connections can be cyclical
++# custom stream groupings are also supported
++
++streams:
++  - name: "spout --> split" # name isn't used (placeholder for logging, UI, etc.)
++    from: "sentence-spout"
++    to: "splitsentence"
++    grouping:
++      type: SHUFFLE
++
++  - name: "split --> count"
++    from: "splitsentence"
++    to: "count"
++    grouping:
++      type: FIELDS
++      args: ["word"]
++
++  - name: "count --> log"
++    from: "count"
++    to: "log"
++    grouping:
++      type: SHUFFLE

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/test/resources/configs/tck.yaml
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/test/resources/configs/tck.yaml
index 0000000,0000000..7e9b614
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/test/resources/configs/tck.yaml
@@@ -1,0 -1,0 +1,95 @@@
++# Licensed to the Apache Software Foundation (ASF) under one
++# or more contributor license agreements.  See the NOTICE file
++# distributed with this work for additional information
++# regarding copyright ownership.  The ASF licenses this file
++# to you under the Apache License, Version 2.0 (the
++# "License"); you may not use this file except in compliance
++# with the License.  You may obtain a copy of the License at
++#
++# http://www.apache.org/licenses/LICENSE-2.0
++#
++# Unless required by applicable law or agreed to in writing, software
++# distributed under the License is distributed on an "AS IS" BASIS,
++# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++# See the License for the specific language governing permissions and
++# limitations under the License.
++
++
++# YAML configuration to serve as a basic smoke test for what is supported.
++#
++# We should support comments, so if we've failed so far, things aren't good.
++
++# we shouldn't choke if we see a document separator...
++---
++
++# topology definition
++# name to be used when submitting
++name: "yaml-topology"
++
++# Components
++# Components are analogous to Spring beans. They are meant to be used as constructor,
++# property(setter), and builder arguments.
++#components:
++#  - id: "myComponent"
++#    className: "com.foo.bar.MyComponent"
++#    properties:
++#      foo: "bar"
++#      bar: "foo"
++
++# NOTE: We may want to consider some level of spring integration. For example, allowing component references
++# to a spring `ApplicationContext`.
++
++# topology configuration
++# this will be passed to the submitter as a map of config options
++#
++config:
++  topology.workers: 1
++  # ...
++
++# spout definitions
++spouts:
++  - id: "spout-1"
++    className: "backtype.storm.testing.TestWordSpout"
++    parallelism: 1
++    # ...
++
++# bolt definitions
++bolts:
++  - id: "bolt-1"
++    className: "backtype.storm.testing.TestWordCounter"
++    parallelism: 1
++    # ...
++
++  - id: "bolt-2"
++    className: "org.apache.storm.flux.wrappers.bolts.LogInfoBolt"
++    parallelism: 1
++    # ...
++
++#stream definitions
++# stream definitions define connections between spouts and bolts.
++# note that such connections can be cyclical
++streams:
++  - name: "spout-1 --> bolt-1" # name isn't used (placeholder for logging, UI, etc.)
++#    id: "connection-1"
++    from: "spout-1"
++    to: "bolt-1"
++    grouping:
++      type: FIELDS
++      args: ["word"]
++
++  - name: "bolt-1 --> bolt2"
++    from: "bolt-1"
++    to: "bolt-2"
++    grouping:
++      type: CUSTOM
++      customClass:
++        className: "backtype.storm.testing.NGrouping"
++        constructorArgs:
++          - 1
++
++
++
++
++
++
++

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/test/resources/configs/test.properties
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/test/resources/configs/test.properties
index 0000000,0000000..0730d5f
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/test/resources/configs/test.properties
@@@ -1,0 -1,0 +1,2 @@@
++topology.name: substitution-topology
++some.other.property: foo bar

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/test/resources/logback.xml
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/test/resources/logback.xml
index 0000000,0000000..1853b8a
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/test/resources/logback.xml
@@@ -1,0 -1,0 +1,30 @@@
++<?xml version="1.0"?>
++<!--
++ Licensed to the Apache Software Foundation (ASF) under one or more
++ contributor license agreements.  See the NOTICE file distributed with
++ this work for additional information regarding copyright ownership.
++ The ASF licenses this file to You under the Apache License, Version 2.0
++ (the "License"); you may not use this file except in compliance with
++ the License.  You may obtain a copy of the License at
++
++     http://www.apache.org/licenses/LICENSE-2.0
++
++ Unless required by applicable law or agreed to in writing, software
++ distributed under the License is distributed on an "AS IS" BASIS,
++ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++ See the License for the specific language governing permissions and
++ limitations under the License.
++-->
++<configuration scan="true" scanPeriod="30 seconds">
++  <appender name="A1" class="ch.qos.logback.core.ConsoleAppender">
++    <encoder>
++      <pattern>%-4r [%t] %-5p %c - %m%n</pattern>
++    </encoder>
++  </appender>
++  <logger name="org.apache.storm.zookeeper" level="WARN"/>
++    <logger name="org.apache.storm.curator" level="WARN"/>
++    <logger name="org.apache.storm.flux" level="DEBUG"/>
++  <root level="DEBUG">
++    <appender-ref ref="A1"/>
++  </root>
++</configuration>

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-examples/README.md
----------------------------------------------------------------------
diff --cc external/flux/flux-examples/README.md
index 0000000,0000000..b3798a6
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-examples/README.md
@@@ -1,0 -1,0 +1,68 @@@
++# Flux Examples
++A collection of examples illustrating various Flux capabilities.
++
++## Building From Source and Running
++
++Check out the project's source and perform a top-level Maven build (i.e. from the `flux` directory):
++
++```bash
++git clone https://github.com/ptgoetz/flux.git
++cd flux
++mvn install
++```
++
++This will create a shaded (i.e. "fat" or "uber") jar in the `flux-examples/target` directory that can be run/deployed with
++the `storm` command:
++
++```bash
++cd flux-examples
++storm jar ./target/flux-examples-0.2.3-SNAPSHOT.jar org.apache.storm.flux.Flux --local ./src/main/resources/simple_wordcount.yaml
++```
++
++The example YAML files are also packaged in the examples jar, so they can be referenced with Flux's `--resource`
++command line switch:
++
++```bash
++storm jar ./target/flux-examples-0.2.3-SNAPSHOT.jar org.apache.storm.flux.Flux --local --resource /simple_wordcount.yaml
++```
++
++## Available Examples
++
++### [simple_wordcount.yaml](src/main/resources/simple_wordcount.yaml)
++
++This is a very basic wordcount example using Java spouts and bolts. It simply logs the running count of each word
++received.
++
++### [multilang.yaml](src/main/resources/multilang.yaml)
++
++Another wordcount example that uses a spout written in JavaScript (node.js), a bolt written in Python, and two bolts
++written in Java.
++
++### [kafka_spout.yaml](src/main/resources/kafka_spout.yaml)
++This example illustrates how to configure Storm's `storm-kafka` spout using the Flux YAML DSL's `components`, `references`,
++and `constructor arguments` constructs.
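++
++A minimal sketch of that pattern (excerpted from the example itself): a component is declared once and then
++referenced by `ref` from another component's or the spout's constructor arguments:
++
++```yaml
++components:
++  - id: "zkHosts"
++    className: "storm.kafka.ZkHosts"
++    constructorArgs:
++      - "localhost:2181"
++
++  - id: "spoutConfig"
++    className: "storm.kafka.SpoutConfig"
++    constructorArgs:
++      - ref: "zkHosts"   # reference to the component declared above
++      - "myKafkaTopic"   # topic
++      - "/kafkaSpout"    # zkRoot
++      - "myId"           # id
++
++spouts:
++  - id: "kafka-spout"
++    className: "storm.kafka.KafkaSpout"
++    constructorArgs:
++      - ref: "spoutConfig"
++```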
++
++### [simple_hdfs.yaml](src/main/resources/simple_hdfs.yaml)
++
++This example demonstrates using Flux to set up a storm-hdfs bolt to write to an HDFS cluster. It also demonstrates Flux's
++variable substitution/filtering feature.
++
++To run the `simple_hdfs.yaml` example, copy the `hdfs_bolt.properties` file to a convenient location and change, at
++least, the property `hdfs.url` to point to an HDFS cluster. Then you can run the example with something like:
++
++```bash
++storm jar ./target/flux-examples-0.2.3-SNAPSHOT.jar org.apache.storm.flux.Flux --local ./src/main/resources/simple_hdfs.yaml --filter my_hdfs_bolt.properties
++```
++
++### [simple_hbase.yaml](src/main/resources/simple_hbase.yaml)
++
++This example illustrates how to use Flux to set up a storm-hbase bolt to write to HBase.
++
++In order to use this example, you will need to edit the `src/main/resources/hbase-site.xml` file to reflect your HBase
++environment, and then rebuild the topology jar.
++
++You can do so by running the following Maven command in the `flux-examples` directory:
++
++```bash
++mvn clean install
++```
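++
++After rebuilding, the HBase example can be run the same way as the others (assuming the jar name shown in the
++commands above):
++
++```bash
++storm jar ./target/flux-examples-0.2.3-SNAPSHOT.jar org.apache.storm.flux.Flux --local ./src/main/resources/simple_hbase.yaml
++```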

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-examples/pom.xml
----------------------------------------------------------------------
diff --cc external/flux/flux-examples/pom.xml
index 0000000,0000000..0b9796e
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-examples/pom.xml
@@@ -1,0 -1,0 +1,87 @@@
++<?xml version="1.0" encoding="UTF-8"?>
++<!--
++ Licensed to the Apache Software Foundation (ASF) under one or more
++ contributor license agreements.  See the NOTICE file distributed with
++ this work for additional information regarding copyright ownership.
++ The ASF licenses this file to You under the Apache License, Version 2.0
++ (the "License"); you may not use this file except in compliance with
++ the License.  You may obtain a copy of the License at
++
++     http://www.apache.org/licenses/LICENSE-2.0
++
++ Unless required by applicable law or agreed to in writing, software
++ distributed under the License is distributed on an "AS IS" BASIS,
++ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++ See the License for the specific language governing permissions and
++ limitations under the License.
++-->
++<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
++    <modelVersion>4.0.0</modelVersion>
++
++    <parent>
++        <groupId>com.github.ptgoetz</groupId>
++        <artifactId>flux</artifactId>
++        <version>0.3.1-SNAPSHOT</version>
++        <relativePath>../pom.xml</relativePath>
++    </parent>
++
++    <groupId>com.github.ptgoetz</groupId>
++    <artifactId>flux-examples</artifactId>
++    <packaging>jar</packaging>
++
++    <name>flux-examples</name>
++    <url>https://github.com/ptgoetz/flux</url>
++
++    <dependencies>
++        <dependency>
++            <groupId>com.github.ptgoetz</groupId>
++            <artifactId>flux-core</artifactId>
++            <version>${project.version}</version>
++        </dependency>
++        <dependency>
++            <groupId>com.github.ptgoetz</groupId>
++            <artifactId>flux-wrappers</artifactId>
++            <version>${project.version}</version>
++        </dependency>
++
++        <dependency>
++            <groupId>org.apache.storm</groupId>
++            <artifactId>storm-hdfs</artifactId>
++            <version>${storm.version}</version>
++        </dependency>
++        <dependency>
++            <groupId>org.apache.storm</groupId>
++            <artifactId>storm-hbase</artifactId>
++            <version>${storm.version}</version>
++        </dependency>
++    </dependencies>
++
++    <build>
++        <plugins>
++            <plugin>
++                <groupId>org.apache.maven.plugins</groupId>
++                <artifactId>maven-shade-plugin</artifactId>
++                <version>1.4</version>
++                <configuration>
++                    <createDependencyReducedPom>true</createDependencyReducedPom>
++                </configuration>
++                <executions>
++                    <execution>
++                        <phase>package</phase>
++                        <goals>
++                            <goal>shade</goal>
++                        </goals>
++                        <configuration>
++                            <transformers>
++                                <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer" />
++                                <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
++                                    <mainClass>org.apache.storm.flux.Flux</mainClass>
++                                </transformer>
++                            </transformers>
++                        </configuration>
++                    </execution>
++                </executions>
++            </plugin>
++        </plugins>
++    </build>
++</project>

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-examples/src/main/java/org/apache/storm/flux/examples/WordCountClient.java
----------------------------------------------------------------------
diff --cc external/flux/flux-examples/src/main/java/org/apache/storm/flux/examples/WordCountClient.java
index 0000000,0000000..eb4fb7a
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-examples/src/main/java/org/apache/storm/flux/examples/WordCountClient.java
@@@ -1,0 -1,0 +1,74 @@@
++/**
++ * Licensed to the Apache Software Foundation (ASF) under one
++ * or more contributor license agreements.  See the NOTICE file
++ * distributed with this work for additional information
++ * regarding copyright ownership.  The ASF licenses this file
++ * to you under the Apache License, Version 2.0 (the
++ * "License"); you may not use this file except in compliance
++ * with the License.  You may obtain a copy of the License at
++ *
++ * http://www.apache.org/licenses/LICENSE-2.0
++ *
++ * Unless required by applicable law or agreed to in writing, software
++ * distributed under the License is distributed on an "AS IS" BASIS,
++ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++ * See the License for the specific language governing permissions and
++ * limitations under the License.
++ */
++package org.apache.storm.flux.examples;
++
++import org.apache.hadoop.conf.Configuration;
++import org.apache.hadoop.hbase.HBaseConfiguration;
++import org.apache.hadoop.hbase.client.Get;
++import org.apache.hadoop.hbase.client.HTable;
++import org.apache.hadoop.hbase.client.Result;
++import org.apache.hadoop.hbase.util.Bytes;
++
++import java.io.FileInputStream;
++import java.util.Properties;
++
++/**
++ * Connects to the 'WordCount' HBase table and prints counts for each word.
++ *
++ * Assumes you have run (or are running) the YAML topology definition in
++ * <code>simple_hbase.yaml</code>
++ *
++ * You will also need to modify `src/main/resources/hbase-site.xml`
++ * to point to your HBase instance, and then repackage with `mvn package`.
++ * This is a known issue.
++ *
++ */
++public class WordCountClient {
++
++    public static void main(String[] args) throws Exception {
++        Configuration config = HBaseConfiguration.create();
++        if(args.length == 1){
++            Properties props = new Properties();
++            props.load(new FileInputStream(args[0]));
++            System.out.println("HBase configuration:");
++            for(Object key : props.keySet()) {
++                System.out.println(key + "=" + props.get(key));
++                config.set((String)key, props.getProperty((String)key));
++            }
++        } else {
++            System.out.println("Usage: WordCountClient <hbase_config.properties>");
++            System.exit(1);
++        }
++
++        HTable table = new HTable(config, "WordCount");
++        String[] words = new String[] {"nathan", "mike", "jackson", "golda", "bertels"};
++
++        for (String word : words) {
++            Get get = new Get(Bytes.toBytes(word));
++            Result result = table.get(get);
++
++            byte[] countBytes = result.getValue(Bytes.toBytes("cf"), Bytes.toBytes("count"));
++            byte[] wordBytes = result.getValue(Bytes.toBytes("cf"), Bytes.toBytes("word"));
++
++            String wordStr = Bytes.toString(wordBytes);
++            long count = Bytes.toLong(countBytes);
++            System.out.println("Word: '" + wordStr + "', Count: " + count);
++        }
++
++    }
++}

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-examples/src/main/java/org/apache/storm/flux/examples/WordCounter.java
----------------------------------------------------------------------
diff --cc external/flux/flux-examples/src/main/java/org/apache/storm/flux/examples/WordCounter.java
index 0000000,0000000..f7c80c7
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-examples/src/main/java/org/apache/storm/flux/examples/WordCounter.java
@@@ -1,0 -1,0 +1,71 @@@
++/**
++ * Licensed to the Apache Software Foundation (ASF) under one
++ * or more contributor license agreements.  See the NOTICE file
++ * distributed with this work for additional information
++ * regarding copyright ownership.  The ASF licenses this file
++ * to you under the Apache License, Version 2.0 (the
++ * "License"); you may not use this file except in compliance
++ * with the License.  You may obtain a copy of the License at
++ *
++ * http://www.apache.org/licenses/LICENSE-2.0
++ *
++ * Unless required by applicable law or agreed to in writing, software
++ * distributed under the License is distributed on an "AS IS" BASIS,
++ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++ * See the License for the specific language governing permissions and
++ * limitations under the License.
++ */
++package org.apache.storm.flux.examples;
++
++import backtype.storm.task.TopologyContext;
++import backtype.storm.topology.BasicOutputCollector;
++import backtype.storm.topology.IBasicBolt;
++import backtype.storm.topology.OutputFieldsDeclarer;
++import backtype.storm.topology.base.BaseBasicBolt;
++import backtype.storm.tuple.Fields;
++import backtype.storm.tuple.Tuple;
++import org.slf4j.Logger;
++import org.slf4j.LoggerFactory;
++
++import java.util.Map;
++
++import static backtype.storm.utils.Utils.tuple;
++
++/**
++ * This bolt is used by the HBase example. It simply emits the first field
++ * found in the incoming tuple as "word", with a "count" of `1`.
++ *
++ * In this case, the downstream HBase bolt handles the counting, so a value
++ * of `1` will just increment the HBase counter by one.
++ */
++public class WordCounter extends BaseBasicBolt {
++    private static final Logger LOG = LoggerFactory.getLogger(WordCounter.class);
++
++
++
++    @SuppressWarnings("rawtypes")
++    public void prepare(Map stormConf, TopologyContext context) {
++    }
++
++    /*
++     * Just output the word value with a count of 1.
++     * The HBaseBolt will handle incrementing the counter.
++     */
++    public void execute(Tuple input, BasicOutputCollector collector) {
++        collector.emit(tuple(input.getValues().get(0), 1));
++    }
++
++    public void cleanup() {
++
++    }
++
++    public void declareOutputFields(OutputFieldsDeclarer declarer) {
++        declarer.declare(new Fields("word", "count"));
++    }
++
++    @Override
++    public Map<String, Object> getComponentConfiguration() {
++        return null;
++    }
++
++}

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-examples/src/main/resources/hbase_bolt.properties
----------------------------------------------------------------------
diff --cc external/flux/flux-examples/src/main/resources/hbase_bolt.properties
index 0000000,0000000..f8ed50c
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-examples/src/main/resources/hbase_bolt.properties
@@@ -1,0 -1,0 +1,18 @@@
++# Licensed to the Apache Software Foundation (ASF) under one
++# or more contributor license agreements.  See the NOTICE file
++# distributed with this work for additional information
++# regarding copyright ownership.  The ASF licenses this file
++# to you under the Apache License, Version 2.0 (the
++# "License"); you may not use this file except in compliance
++# with the License.  You may obtain a copy of the License at
++#
++# http://www.apache.org/licenses/LICENSE-2.0
++#
++# Unless required by applicable law or agreed to in writing, software
++# distributed under the License is distributed on an "AS IS" BASIS,
++# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++# See the License for the specific language governing permissions and
++# limitations under the License.
++
++hbase.rootdir=hdfs://hadoop:54310/hbase
++hbase.zookeeper.quorum=hadoop

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-examples/src/main/resources/hdfs_bolt.properties
----------------------------------------------------------------------
diff --cc external/flux/flux-examples/src/main/resources/hdfs_bolt.properties
index 0000000,0000000..7bcbe7a
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-examples/src/main/resources/hdfs_bolt.properties
@@@ -1,0 -1,0 +1,26 @@@
++# Licensed to the Apache Software Foundation (ASF) under one
++# or more contributor license agreements.  See the NOTICE file
++# distributed with this work for additional information
++# regarding copyright ownership.  The ASF licenses this file
++# to you under the Apache License, Version 2.0 (the
++# "License"); you may not use this file except in compliance
++# with the License.  You may obtain a copy of the License at
++#
++# http://www.apache.org/licenses/LICENSE-2.0
++#
++# Unless required by applicable law or agreed to in writing, software
++# distributed under the License is distributed on an "AS IS" BASIS,
++# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++# See the License for the specific language governing permissions and
++# limitations under the License.
++
++
++# The HDFS url
++hdfs.url=hdfs://hadoop:54310
++
++# The HDFS directory where the bolt will write incoming data
++hdfs.write.dir=/incoming
++
++# The HDFS directory where files will be moved once the bolt has
++# finished writing to it.
++hdfs.dest.dir=/complete

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-examples/src/main/resources/kafka_spout.yaml
----------------------------------------------------------------------
diff --cc external/flux/flux-examples/src/main/resources/kafka_spout.yaml
index 0000000,0000000..8ffddc5
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-examples/src/main/resources/kafka_spout.yaml
@@@ -1,0 -1,0 +1,136 @@@
++# Licensed to the Apache Software Foundation (ASF) under one
++# or more contributor license agreements.  See the NOTICE file
++# distributed with this work for additional information
++# regarding copyright ownership.  The ASF licenses this file
++# to you under the Apache License, Version 2.0 (the
++# "License"); you may not use this file except in compliance
++# with the License.  You may obtain a copy of the License at
++#
++# http://www.apache.org/licenses/LICENSE-2.0
++#
++# Unless required by applicable law or agreed to in writing, software
++# distributed under the License is distributed on an "AS IS" BASIS,
++# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++# See the License for the specific language governing permissions and
++# limitations under the License.
++
++
++# Test ability to wire together shell spouts/bolts
++---
++
++# topology definition
++# name to be used when submitting
++name: "kafka-topology"
++
++# Components
++# Components are analogous to Spring beans. They are meant to be used as constructor,
++# property (setter), and builder arguments.
++#
++# for the time being, components must be declared in the order they are referenced
++components:
++  - id: "stringScheme"
++    className: "storm.kafka.StringScheme"
++
++  - id: "stringMultiScheme"
++    className: "backtype.storm.spout.SchemeAsMultiScheme"
++    constructorArgs:
++      - ref: "stringScheme"
++
++  - id: "zkHosts"
++    className: "storm.kafka.ZkHosts"
++    constructorArgs:
++      - "localhost:2181"
++
++# Alternative kafka config
++#  - id: "kafkaConfig"
++#    className: "storm.kafka.KafkaConfig"
++#    constructorArgs:
++#      # brokerHosts
++#      - ref: "zkHosts"
++#      # topic
++#      - "myKafkaTopic"
++#      # clientId (optional)
++#      - "myKafkaClientId"
++
++  - id: "spoutConfig"
++    className: "storm.kafka.SpoutConfig"
++    constructorArgs:
++      # brokerHosts
++      - ref: "zkHosts"
++      # topic
++      - "myKafkaTopic"
++      # zkRoot
++      - "/kafkaSpout"
++      # id
++      - "myId"
++    properties:
++      - name: "forceFromStart"
++        value: true
++      - name: "scheme"
++        ref: "stringMultiScheme"
++
++
++
++# NOTE: We may want to consider some level of spring integration. For example, allowing component references
++# to a spring `ApplicationContext`.
++
++# topology configuration
++# this will be passed to the submitter as a map of config options
++#
++config:
++  topology.workers: 1
++  # ...
++
++# spout definitions
++spouts:
++  - id: "kafka-spout"
++    className: "storm.kafka.KafkaSpout"
++    constructorArgs:
++      - ref: "spoutConfig"
++
++# bolt definitions
++bolts:
++  - id: "splitsentence"
++    className: "org.apache.storm.flux.wrappers.bolts.FluxShellBolt"
++    constructorArgs:
++      # command line
++      - ["python", "splitsentence.py"]
++      # output fields
++      - ["word"]
++    parallelism: 1
++    # ...
++
++  - id: "log"
++    className: "org.apache.storm.flux.wrappers.bolts.LogInfoBolt"
++    parallelism: 1
++    # ...
++
++  - id: "count"
++    className: "backtype.storm.testing.TestWordCounter"
++    parallelism: 1
++    # ...
++
++# stream definitions
++# Stream definitions define connections between spouts and bolts.
++# Note that such connections can be cyclical.
++# Custom stream groupings are also supported.
++
++streams:
++  - name: "kafka --> split" # name isn't used (placeholder for logging, UI, etc.)
++    from: "kafka-spout"
++    to: "splitsentence"
++    grouping:
++      type: SHUFFLE
++
++  - name: "split --> count"
++    from: "splitsentence"
++    to: "count"
++    grouping:
++      type: FIELDS
++      args: ["word"]
++
++  - name: "count --> log"
++    from: "count"
++    to: "log"
++    grouping:
++      type: SHUFFLE

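For readers unfamiliar with Flux, the `components`, `constructorArgs`, `ref`, and `properties` entries in kafka_spout.yaml above map directly onto ordinary Java object construction. A rough hand-written equivalent of that wiring (illustrative only, not part of this commit):

    import backtype.storm.spout.SchemeAsMultiScheme;
    import backtype.storm.topology.TopologyBuilder;
    import storm.kafka.KafkaSpout;
    import storm.kafka.SpoutConfig;
    import storm.kafka.StringScheme;
    import storm.kafka.ZkHosts;

    public class KafkaSpoutWiringSketch {
        public static TopologyBuilder build() {
            // components: stringScheme wrapped by stringMultiScheme; zkHosts (wired via `ref` in the YAML)
            SchemeAsMultiScheme stringMultiScheme = new SchemeAsMultiScheme(new StringScheme());
            ZkHosts zkHosts = new ZkHosts("localhost:2181");

            // spoutConfig constructorArgs: brokerHosts, topic, zkRoot, id
            SpoutConfig spoutConfig = new SpoutConfig(zkHosts, "myKafkaTopic", "/kafkaSpout", "myId");
            // properties: applied via a setter when one exists, otherwise via a public field
            spoutConfig.forceFromStart = true;
            spoutConfig.scheme = stringMultiScheme;

            TopologyBuilder builder = new TopologyBuilder();
            builder.setSpout("kafka-spout", new KafkaSpout(spoutConfig));
            return builder;
        }
    }

The bolt and stream definitions translate into `setBolt(...)` and `*Grouping(...)` calls on the same builder in the same way.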
http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-examples/src/main/resources/multilang.yaml
----------------------------------------------------------------------
diff --cc external/flux/flux-examples/src/main/resources/multilang.yaml
index 0000000,0000000..4f80667
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-examples/src/main/resources/multilang.yaml
@@@ -1,0 -1,0 +1,89 @@@
++# Licensed to the Apache Software Foundation (ASF) under one
++# or more contributor license agreements.  See the NOTICE file
++# distributed with this work for additional information
++# regarding copyright ownership.  The ASF licenses this file
++# to you under the Apache License, Version 2.0 (the
++# "License"); you may not use this file except in compliance
++# with the License.  You may obtain a copy of the License at
++#
++# http://www.apache.org/licenses/LICENSE-2.0
++#
++# Unless required by applicable law or agreed to in writing, software
++# distributed under the License is distributed on an "AS IS" BASIS,
++# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++# See the License for the specific language governing permissions and
++# limitations under the License.
++
++# Test ability to wire together shell spouts/bolts
++---
++
++# topology definition
++# name to be used when submitting
++name: "shell-topology"
++
++# topology configuration
++# this will be passed to the submitter as a map of config options
++#
++config:
++  topology.workers: 1
++  # ...
++
++# spout definitions
++spouts:
++  - id: "sentence-spout"
++    className: "org.apache.storm.flux.wrappers.spouts.FluxShellSpout"
++    # shell spout constructor takes 2 arguments: String[], String[]
++    constructorArgs:
++      # command line
++      - ["node", "randomsentence.js"]
++      # output fields
++      - ["word"]
++    parallelism: 1
++    # ...
++
++# bolt definitions
++bolts:
++  - id: "splitsentence"
++    className: "org.apache.storm.flux.wrappers.bolts.FluxShellBolt"
++    constructorArgs:
++      # command line
++      - ["python", "splitsentence.py"]
++      # output fields
++      - ["word"]
++    parallelism: 1
++    # ...
++
++  - id: "log"
++    className: "org.apache.storm.flux.wrappers.bolts.LogInfoBolt"
++    parallelism: 1
++    # ...
++
++  - id: "count"
++    className: "backtype.storm.testing.TestWordCounter"
++    parallelism: 1
++    # ...
++
++# stream definitions
++# Stream definitions define connections between spouts and bolts.
++# Note that such connections can be cyclical.
++# Custom stream groupings are also supported.
++
++streams:
++  - name: "spout --> split" # name isn't used (placeholder for logging, UI, etc.)
++    from: "sentence-spout"
++    to: "splitsentence"
++    grouping:
++      type: SHUFFLE
++
++  - name: "split --> count"
++    from: "splitsentence"
++    to: "count"
++    grouping:
++      type: FIELDS
++      args: ["word"]
++
++  - name: "count --> log"
++    from: "count"
++    to: "log"
++    grouping:
++      type: SHUFFLE

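The multilang wrappers referenced above take two constructor arguments, the command line to run and the output fields to declare, as the YAML comments note. Assuming `FluxShellSpout` and `FluxShellBolt` expose those `(String[], String[])` constructors and implement the usual spout/bolt interfaces, the equivalent direct wiring is roughly:

    import backtype.storm.topology.TopologyBuilder;
    import org.apache.storm.flux.wrappers.bolts.FluxShellBolt;
    import org.apache.storm.flux.wrappers.spouts.FluxShellSpout;

    public class MultilangWiringSketch {
        public static TopologyBuilder build() {
            TopologyBuilder builder = new TopologyBuilder();
            // sentence-spout: command line + declared output fields
            builder.setSpout("sentence-spout", new FluxShellSpout(
                    new String[]{"node", "randomsentence.js"},
                    new String[]{"word"}), 1);
            // splitsentence: command line + declared output fields
            builder.setBolt("splitsentence", new FluxShellBolt(
                    new String[]{"python", "splitsentence.py"},
                    new String[]{"word"}), 1)
                   .shuffleGrouping("sentence-spout");
            // the count and log bolts follow the same pattern per the `streams` section above
            return builder;
        }
    }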
http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-examples/src/main/resources/simple_hbase.yaml
----------------------------------------------------------------------
diff --cc external/flux/flux-examples/src/main/resources/simple_hbase.yaml
index 0000000,0000000..62686d0
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-examples/src/main/resources/simple_hbase.yaml
@@@ -1,0 -1,0 +1,92 @@@
++# Licensed to the Apache Software Foundation (ASF) under one
++# or more contributor license agreements.  See the NOTICE file
++# distributed with this work for additional information
++# regarding copyright ownership.  The ASF licenses this file
++# to you under the Apache License, Version 2.0 (the
++# "License"); you may not use this file except in compliance
++# with the License.  You may obtain a copy of the License at
++#
++# http://www.apache.org/licenses/LICENSE-2.0
++#
++# Unless required by applicable law or agreed to in writing, software
++# distributed under the License is distributed on an "AS IS" BASIS,
++# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++# See the License for the specific language governing permissions and
++# limitations under the License.
++---
++# NOTE: To use this example, you will need to modify `src/main/resources/hbase-site.xml`
++# to point to your HBase instance, and then repackage with `mvn package`.
++# This is a known issue.
++
++# topology definition
++# name to be used when submitting
++name: "hbase-persistent-wordcount"
++
++# Components
++components:
++  - id: "columnFields"
++    className: "backtype.storm.tuple.Fields"
++    constructorArgs:
++      - ["word"]
++
++  - id: "counterFields"
++    className: "backtype.storm.tuple.Fields"
++    constructorArgs:
++      - ["count"]
++
++  - id: "mapper"
++    className: "org.apache.storm.hbase.bolt.mapper.SimpleHBaseMapper"
++    configMethods:
++      - name: "withRowKeyField"
++        args: ["word"]
++      - name: "withColumnFields"
++        args: [ref: "columnFields"]
++      - name: "withCounterFields"
++        args: [ref: "counterFields"]
++      - name: "withColumnFamily"
++        args: ["cf"]
++
++# topology configuration
++# this will be passed to the submitter as a map of config options
++config:
++  topology.workers: 1
++  hbase.conf:
++    hbase.rootdir: "${hbase.rootdir}"
++    hbase.zookeeper.quorum: "${hbase.zookeeper.quorum}"
++
++# spout definitions
++spouts:
++  - id: "word-spout"
++    className: "backtype.storm.testing.TestWordSpout"
++    parallelism: 1
++
++# bolt definitions
++
++bolts:
++  - id: "count-bolt"
++    className: "org.apache.storm.flux.examples.WordCounter"
++    parallelism: 1
++
++  - id: "hbase-bolt"
++    className: "org.apache.storm.hbase.bolt.HBaseBolt"
++    constructorArgs:
++      - "WordCount" # HBase table name
++      - ref: "mapper"
++    configMethods:
++      - name: "withConfigKey"
++        args: ["hbase.conf"]
++    parallelism: 1
++
++streams:
++  - name: "" # name isn't used (placeholder for logging, UI, etc.)
++    from: "word-spout"
++    to: "count-bolt"
++    grouping:
++      type: SHUFFLE
++
++  - name: "" # name isn't used (placeholder for logging, UI, etc.)
++    from: "count-bolt"
++    to: "hbase-bolt"
++    grouping:
++      type: FIELDS
++      args: ["word"]

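The `configMethods` entries in simple_hbase.yaml correspond to fluent setter calls on the mapper and bolt, and the `${hbase.rootdir}` / `${hbase.zookeeper.quorum}` placeholders are presumably filled in from a properties file such as the hbase_bolt.properties shown earlier in this commit. A rough Java equivalent of the HBase wiring (illustrative only, not part of this commit):

    import backtype.storm.tuple.Fields;
    import org.apache.storm.hbase.bolt.HBaseBolt;
    import org.apache.storm.hbase.bolt.mapper.SimpleHBaseMapper;

    public class HBaseWiringSketch {
        public static HBaseBolt build() {
            // components + configMethods: each configMethod becomes a fluent setter call
            SimpleHBaseMapper mapper = new SimpleHBaseMapper()
                    .withRowKeyField("word")
                    .withColumnFields(new Fields("word"))
                    .withCounterFields(new Fields("count"))
                    .withColumnFamily("cf");

            // constructorArgs: table name and mapper; withConfigKey points at the
            // hbase.conf map declared in the topology `config` section
            return new HBaseBolt("WordCount", mapper).withConfigKey("hbase.conf");
        }
    }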

[25/50] [abbrv] storm git commit: merge flux into external/flux/

Posted by pt...@apache.org.
http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/main/java/org/apache/storm/flux/FluxBuilder.java
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/main/java/org/apache/storm/flux/FluxBuilder.java
index 0000000,0000000..57237b6
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/main/java/org/apache/storm/flux/FluxBuilder.java
@@@ -1,0 -1,0 +1,591 @@@
++/*
++ * Licensed to the Apache Software Foundation (ASF) under one
++ * or more contributor license agreements.  See the NOTICE file
++ * distributed with this work for additional information
++ * regarding copyright ownership.  The ASF licenses this file
++ * to you under the Apache License, Version 2.0 (the
++ * "License"); you may not use this file except in compliance
++ * with the License.  You may obtain a copy of the License at
++ *
++ * http://www.apache.org/licenses/LICENSE-2.0
++ *
++ * Unless required by applicable law or agreed to in writing, software
++ * distributed under the License is distributed on an "AS IS" BASIS,
++ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++ * See the License for the specific language governing permissions and
++ * limitations under the License.
++ */
++package org.apache.storm.flux;
++
++import backtype.storm.Config;
++import backtype.storm.generated.StormTopology;
++import backtype.storm.grouping.CustomStreamGrouping;
++import backtype.storm.topology.*;
++import backtype.storm.tuple.Fields;
++import backtype.storm.utils.Utils;
++import org.apache.storm.flux.api.TopologySource;
++import org.apache.storm.flux.model.*;
++import org.slf4j.Logger;
++import org.slf4j.LoggerFactory;
++
++import java.lang.reflect.*;
++import java.util.ArrayList;
++import java.util.Collection;
++import java.util.List;
++import java.util.Map;
++
++public class FluxBuilder {
++    private static Logger LOG = LoggerFactory.getLogger(FluxBuilder.class);
++
++    /**
++     * Given a topology definition, return a populated `backtype.storm.Config` instance.
++     *
++     * @param topologyDef the parsed topology definition
++     * @return a populated Storm `Config` for the topology
++     */
++    public static Config buildConfig(TopologyDef topologyDef) {
++        // merge contents of `config` into topology config
++        Config conf = new Config();
++        conf.putAll(topologyDef.getConfig());
++        return conf;
++    }
++
++    /**
++     * Given a topology definition, return a Storm topology that can be run either locally or remotely.
++     *
++     * @param context execution context holding the parsed topology definition
++     * @return the constructed `StormTopology`
++     * @throws IllegalAccessException
++     * @throws InstantiationException
++     * @throws ClassNotFoundException
++     * @throws NoSuchMethodException
++     * @throws InvocationTargetException
++     */
++    static StormTopology buildTopology(ExecutionContext context) throws IllegalAccessException,
++            InstantiationException, ClassNotFoundException, NoSuchMethodException, InvocationTargetException {
++
++        StormTopology topology = null;
++        TopologyDef topologyDef = context.getTopologyDef();
++
++        if(!topologyDef.validate()){
++            throw new IllegalArgumentException("Invalid topology config. Spouts, bolts and streams cannot be " +
++                    "defined in the same configuration as a topologySource.");
++        }
++
++        // build components that may be referenced by spouts, bolts, etc.
++        // the map will be a String --> Object where the object is a fully
++        // constructed class instance
++        buildComponents(context);
++
++        if(topologyDef.isDslTopology()) {
++            // This is a DSL (YAML, etc.) topology...
++            LOG.info("Detected DSL topology...");
++
++            TopologyBuilder builder = new TopologyBuilder();
++
++            // create spouts
++            buildSpouts(context, builder);
++
++            // we need to be able to lookup bolts by id, then switch based
++            // on whether they are IBasicBolt or IRichBolt instances
++            buildBolts(context);
++
++            // process stream definitions
++            buildStreamDefinitions(context, builder);
++
++            topology = builder.createTopology();
++        } else {
++            // user class supplied...
++            // this also provides a bridge to Trident...
++            LOG.info("A topology source has been specified...");
++            ObjectDef def = topologyDef.getTopologySource();
++            topology = buildExternalTopology(def, context);
++        }
++        return topology;
++    }
++
++    /**
++     * Given a `java.lang.Object` instance and a method name, attempt to find a method that matches the input
++     * parameter: `java.util.Map` or `backtype.storm.Config`.
++     *
++     * @param topologySource object to inspect for the specified method
++     * @param methodName name of the method to look for
++     * @return
++     * @throws NoSuchMethodException
++     */
++    private static Method findGetTopologyMethod(Object topologySource, String methodName) throws NoSuchMethodException {
++        Class clazz = topologySource.getClass();
++        Method[] methods =  clazz.getMethods();
++        ArrayList<Method> candidates = new ArrayList<Method>();
++        for(Method method : methods){
++            if(!method.getName().equals(methodName)){
++                continue;
++            }
++            if(!method.getReturnType().equals(StormTopology.class)){
++                continue;
++            }
++            Class[] paramTypes = method.getParameterTypes();
++            if(paramTypes.length != 1){
++                continue;
++            }
++            if(paramTypes[0].isAssignableFrom(Map.class) || paramTypes[0].isAssignableFrom(Config.class)){
++                candidates.add(method);
++            }
++        }
++
++        if(candidates.size() == 0){
++            throw new IllegalArgumentException("Unable to find method '" + methodName + "' method in class: " + clazz.getName());
++        } else if (candidates.size() > 1){
++            LOG.warn("Found multiple candidate methods in class '" + clazz.getName() + "'. Using the first one found");
++        }
++
++        return candidates.get(0);
++    }
++
++    /**
++     * @param context
++     * @param builder
++     */
++    private static void buildStreamDefinitions(ExecutionContext context, TopologyBuilder builder)
++            throws ClassNotFoundException, NoSuchMethodException, InvocationTargetException, InstantiationException,
++            IllegalAccessException {
++        TopologyDef topologyDef = context.getTopologyDef();
++        // process stream definitions
++        for (StreamDef stream : topologyDef.getStreams()) {
++            Object boltObj = context.getBolt(stream.getTo());
++            BoltDeclarer declarer = null;
++            if (boltObj instanceof IRichBolt) {
++                declarer = builder.setBolt(stream.getTo(),
++                        (IRichBolt) boltObj,
++                        topologyDef.parallelismForBolt(stream.getTo()));
++            } else if (boltObj instanceof IBasicBolt) {
++                declarer = builder.setBolt(
++                        stream.getTo(),
++                        (IBasicBolt) boltObj,
++                        topologyDef.parallelismForBolt(stream.getTo()));
++            } else {
++                throw new IllegalArgumentException("Class does not appear to be a bolt: " +
++                        boltObj.getClass().getName());
++            }
++
++            GroupingDef grouping = stream.getGrouping();
++            // if the streamId is defined, use it for the grouping, otherwise assume storm's default stream
++            String streamId = (grouping.getStreamId() == null ? Utils.DEFAULT_STREAM_ID : grouping.getStreamId());
++
++
++            switch (grouping.getType()) {
++                case SHUFFLE:
++                    declarer.shuffleGrouping(stream.getFrom(), streamId);
++                    break;
++                case FIELDS:
++                    //TODO check for null grouping args
++                    declarer.fieldsGrouping(stream.getFrom(), streamId, new Fields(grouping.getArgs()));
++                    break;
++                case ALL:
++                    declarer.allGrouping(stream.getFrom(), streamId);
++                    break;
++                case DIRECT:
++                    declarer.directGrouping(stream.getFrom(), streamId);
++                    break;
++                case GLOBAL:
++                    declarer.globalGrouping(stream.getFrom(), streamId);
++                    break;
++                case LOCAL_OR_SHUFFLE:
++                    declarer.localOrShuffleGrouping(stream.getFrom(), streamId);
++                    break;
++                case NONE:
++                    declarer.noneGrouping(stream.getFrom(), streamId);
++                    break;
++                case CUSTOM:
++                    declarer.customGrouping(stream.getFrom(), streamId,
++                            buildCustomStreamGrouping(stream.getGrouping().getCustomClass(), context));
++                    break;
++                default:
++                    throw new UnsupportedOperationException("unsupported grouping type: " + grouping);
++            }
++        }
++    }
++
++    private static void applyProperties(ObjectDef bean, Object instance, ExecutionContext context) throws
++            IllegalAccessException, InvocationTargetException {
++        List<PropertyDef> props = bean.getProperties();
++        Class clazz = instance.getClass();
++        if (props != null) {
++            for (PropertyDef prop : props) {
++                Object value = prop.isReference() ? context.getComponent(prop.getRef()) : prop.getValue();
++                Method setter = findSetter(clazz, prop.getName(), value);
++                if (setter != null) {
++                    LOG.debug("found setter, attempting to invoke");
++                    // invoke setter
++                    setter.invoke(instance, new Object[]{value});
++                } else {
++                    // look for a public instance variable
++                    LOG.debug("no setter found. Looking for a public instance variable...");
++                    Field field = findPublicField(clazz, prop.getName(), value);
++                    if (field != null) {
++                        field.set(instance, value);
++                    }
++                }
++            }
++        }
++    }
++
++    private static Field findPublicField(Class clazz, String property, Object arg) {
++        Field field = null;
++        try {
++            field = clazz.getField(property);
++        } catch (NoSuchFieldException e) {
++            LOG.warn("Could not find setter or public variable for property: " + property, e);
++        }
++        return field;
++    }
++
++    private static Method findSetter(Class clazz, String property, Object arg) {
++        String setterName = toSetterName(property);
++        Method retval = null;
++        Method[] methods = clazz.getMethods();
++        for (Method method : methods) {
++            if (setterName.equals(method.getName())) {
++                LOG.debug("Found setter method: " + method.getName());
++                retval = method;
++            }
++        }
++        return retval;
++    }
++
++    private static String toSetterName(String name) {
++        return "set" + name.substring(0, 1).toUpperCase() + name.substring(1, name.length());
++    }
++
++    private static List<Object> resolveReferences(List<Object> args, ExecutionContext context) {
++        LOG.debug("Checking arguments for references.");
++        List<Object> cArgs = new ArrayList<Object>();
++        // resolve references
++        for (Object arg : args) {
++            if (arg instanceof BeanReference) {
++                cArgs.add(context.getComponent(((BeanReference) arg).getId()));
++            } else {
++                cArgs.add(arg);
++            }
++        }
++        return cArgs;
++    }
++
++    private static Object buildObject(ObjectDef def, ExecutionContext context) throws ClassNotFoundException,
++            IllegalAccessException, InstantiationException, NoSuchMethodException, InvocationTargetException {
++        Class clazz = Class.forName(def.getClassName());
++        Object obj = null;
++        if (def.hasConstructorArgs()) {
++            LOG.debug("Found constructor arguments in definition: " + def.getConstructorArgs().getClass().getName());
++            List<Object> cArgs = def.getConstructorArgs();
++            if(def.hasReferences()){
++                cArgs = resolveReferences(cArgs, context);
++            }
++            Constructor con = findCompatibleConstructor(cArgs, clazz);
++            if (con != null) {
++                LOG.debug("Found something seemingly compatible, attempting invocation...");
++                obj = con.newInstance(getArgsWithListCoercian(cArgs, con.getParameterTypes()));
++            } else {
++                String msg = String.format("Couldn't find a suitable constructor for class '%s' with arguments '%s'.",
++                        clazz.getName(),
++                        cArgs);
++                throw new IllegalArgumentException(msg);
++            }
++        } else {
++            obj = clazz.newInstance();
++        }
++        applyProperties(def, obj, context);
++        invokeConfigMethods(def, obj, context);
++        return obj;
++    }
++
++    private static StormTopology buildExternalTopology(ObjectDef def, ExecutionContext context)
++            throws ClassNotFoundException, IllegalAccessException, InstantiationException, NoSuchMethodException,
++            InvocationTargetException {
++
++        Object topologySource = buildObject(def, context);
++
++        String methodName = context.getTopologyDef().getTopologySource().getMethodName();
++        Method getTopology = findGetTopologyMethod(topologySource, methodName);
++        if(getTopology.getParameterTypes()[0].equals(Config.class)){
++            Config config = new Config();
++            config.putAll(context.getTopologyDef().getConfig());
++            return (StormTopology) getTopology.invoke(topologySource, config);
++        } else {
++            return (StormTopology) getTopology.invoke(topologySource, context.getTopologyDef().getConfig());
++        }
++    }
++
++    private static CustomStreamGrouping buildCustomStreamGrouping(ObjectDef def, ExecutionContext context)
++            throws ClassNotFoundException,
++            IllegalAccessException, InstantiationException, NoSuchMethodException, InvocationTargetException {
++        Object grouping = buildObject(def, context);
++        return (CustomStreamGrouping)grouping;
++    }
++
++    /**
++     * Given a topology definition, resolve and instantiate all components found and return a map
++     * keyed by the component id.
++     */
++    private static void buildComponents(ExecutionContext context) throws ClassNotFoundException, NoSuchMethodException,
++            IllegalAccessException, InvocationTargetException, InstantiationException {
++        Collection<BeanDef> cDefs = context.getTopologyDef().getComponents();
++        if (cDefs != null) {
++            for (BeanDef bean : cDefs) {
++                Object obj = buildObject(bean, context);
++                context.addComponent(bean.getId(), obj);
++            }
++        }
++    }
++
++
++    private static void buildSpouts(ExecutionContext context, TopologyBuilder builder) throws ClassNotFoundException,
++            NoSuchMethodException, InvocationTargetException, InstantiationException, IllegalAccessException {
++        for (SpoutDef sd : context.getTopologyDef().getSpouts()) {
++            IRichSpout spout = buildSpout(sd, context);
++            builder.setSpout(sd.getId(), spout, sd.getParallelism());
++            context.addSpout(sd.getId(), spout);
++        }
++    }
++
++    /**
++     * Given a spout definition, return a Storm spout implementation by attempting to find a matching constructor
++     * in the given spout class. Perform list to array conversion as necessary.
++     */
++    private static IRichSpout buildSpout(SpoutDef def, ExecutionContext context) throws ClassNotFoundException,
++            IllegalAccessException, InstantiationException, NoSuchMethodException, InvocationTargetException {
++        return (IRichSpout)buildObject(def, context);
++    }
++
++    /**
++     * Given a list of bolt definitions, build a map of Storm bolts with the bolt definition id as the key.
++     * Attempt to coerce the given constructor arguments to a matching bolt constructor as much as possible.
++     */
++    private static void buildBolts(ExecutionContext context) throws ClassNotFoundException, IllegalAccessException,
++            InstantiationException, NoSuchMethodException, InvocationTargetException {
++        for (BoltDef def : context.getTopologyDef().getBolts()) {
++            Object bolt = buildObject(def, context);
++            context.addBolt(def.getId(), bolt);
++        }
++    }
++
++    /**
++     * Given a list of constructor arguments, and a target class, attempt to find a suitable constructor.
++     *
++     */
++    private static Constructor findCompatibleConstructor(List<Object> args, Class target) throws NoSuchMethodException {
++        Constructor retval = null;
++        int eligibleCount = 0;
++
++        LOG.debug("Target class: {}", target.getName());
++        Constructor[] cons = target.getDeclaredConstructors();
++
++        for (Constructor con : cons) {
++            Class[] paramClasses = con.getParameterTypes();
++            if (paramClasses.length == args.size()) {
++                LOG.debug("found constructor with same number of args..");
++                boolean invokable = canInvokeWithArgs(args, con.getParameterTypes());
++                if (invokable) {
++                    retval = con;
++                    eligibleCount++;
++                }
++                LOG.debug("** invokable --> {}", invokable);
++            } else {
++                LOG.debug("Skipping constructor with wrong number of arguments.");
++            }
++        }
++        if (eligibleCount > 1) {
++            LOG.warn("Found multiple invokable constructors for class {}, given arguments {}. Using the last one found.",
++                    target, args);
++        }
++        return retval;
++    }
++
++
++    public static void invokeConfigMethods(ObjectDef bean, Object instance, ExecutionContext context)
++            throws InvocationTargetException, IllegalAccessException {
++
++        List<ConfigMethodDef> methodDefs = bean.getConfigMethods();
++        if(methodDefs == null || methodDefs.size() == 0){
++            return;
++        }
++        Class clazz = instance.getClass();
++        for(ConfigMethodDef methodDef : methodDefs){
++            List<Object> args = methodDef.getArgs();
++            if(methodDef.hasReferences()){
++                args = resolveReferences(args, context);
++            }
++            String methodName = methodDef.getName();
++            Method method = findCompatibleMethod(args, clazz, methodName);
++            if(method != null) {
++                Object[] methodArgs = getArgsWithListCoercian(args, method.getParameterTypes());
++                method.invoke(instance, methodArgs);
++            } else {
++                String msg = String.format("Unable to find configuration method '%s' in class '%s' with arguments %s.",
++                        new Object[]{methodName, clazz.getName(), args});
++                throw new IllegalArgumentException(msg);
++            }
++        }
++    }
++
++    private static Method findCompatibleMethod(List<Object> args, Class target, String methodName){
++        Method retval = null;
++        int eligibleCount = 0;
++
++        LOG.debug("Target class: {}", target.getName());
++        Method[] methods = target.getMethods();
++
++        for (Method method : methods) {
++            Class[] paramClasses = method.getParameterTypes();
++            if (paramClasses.length == args.size() && method.getName().equals(methodName)) {
++                LOG.debug("found constructor with same number of args..");
++                boolean invokable = canInvokeWithArgs(args, method.getParameterTypes());
++                if (invokable) {
++                    retval = method;
++                    eligibleCount++;
++                }
++                LOG.debug("** invokable --> {}", invokable);
++            } else {
++                LOG.debug("Skipping method with wrong number of arguments.");
++            }
++        }
++        if (eligibleCount > 1) {
++            LOG.warn("Found multiple invokable methods for class {}, method {}, given arguments {}. " +
++                            "Using the last one found.",
++                            new Object[]{target, methodName, args});
++        }
++        return retval;
++    }
++
++    /**
++     * Given a java.util.List of constructor/method arguments and a list of parameter types, attempt to convert the
++     * list to a java.lang.Object array that can be used to invoke the constructor or method. If an argument needs
++     * to be coerced from a List to an Array, do so.
++     */
++    private static Object[] getArgsWithListCoercian(List<Object> args, Class[] parameterTypes) {
++//        Class[] parameterTypes = constructor.getParameterTypes();
++        if (parameterTypes.length != args.size()) {
++            throw new IllegalArgumentException("Contructor parameter count does not egual argument size.");
++        }
++        Object[] constructorParams = new Object[args.size()];
++
++        // loop through the arguments; if we hit a list that has to be converted to an array,
++        // perform the conversion
++        for (int i = 0; i < args.size(); i++) {
++            Object obj = args.get(i);
++            Class paramType = parameterTypes[i];
++            Class objectType = obj.getClass();
++            LOG.debug("Comparing parameter class {} to object class {} to see if assignment is possible.",
++                    paramType, objectType);
++            if (paramType.equals(objectType)) {
++                LOG.debug("They are the same class.");
++                constructorParams[i] = args.get(i);
++                continue;
++            }
++            if (paramType.isAssignableFrom(objectType)) {
++                LOG.debug("Assignment is possible.");
++                constructorParams[i] = args.get(i);
++                continue;
++            }
++            if(isPrimitiveNumber(paramType) && Number.class.isAssignableFrom(objectType)){
++                LOG.debug("Its a primitive number.");
++                Number num = (Number)args.get(i);
++                if(paramType == Float.TYPE){
++                    constructorParams[i] = num.floatValue();
++                } else if (paramType == Double.TYPE) {
++                    constructorParams[i] = num.doubleValue();
++                } else if (paramType == Long.TYPE) {
++                    constructorParams[i] = num.longValue();
++                } else if (paramType == Integer.TYPE) {
++                    constructorParams[i] = num.intValue();
++                } else if (paramType == Short.TYPE) {
++                    constructorParams[i] = num.shortValue();
++                } else if (paramType == Byte.TYPE) {
++                    constructorParams[i] = num.byteValue();
++                } else {
++                    constructorParams[i] = args.get(i);
++                }
++                continue;
++            }
++
++            // enum conversion
++            if(paramType.isEnum() && objectType.equals(String.class)){
++                LOG.debug("Yes, will convert a String to enum");
++                constructorParams[i] = Enum.valueOf(paramType, (String)args.get(i));
++                continue;
++            }
++
++            // List to array conversion
++            if (paramType.isArray() && List.class.isAssignableFrom(objectType)) {
++                // TODO more collection content type checking
++                LOG.debug("Conversion appears possible...");
++                List list = (List) obj;
++                LOG.debug("Array Type: {}, List type: {}", paramType.getComponentType(), list.get(0).getClass());
++
++                // create an array of the right type
++                Object newArrayObj = Array.newInstance(paramType.getComponentType(), list.size());
++                for (int j = 0; j < list.size(); j++) {
++                    Array.set(newArrayObj, j, list.get(j));
++
++                }
++                constructorParams[i] = newArrayObj;
++                LOG.debug("After conversion: {}", constructorParams[i]);
++            }
++        }
++        return constructorParams;
++    }
++
++
++    /**
++     * Determine if the given constructor/method parameter types are compatible with the given argument list.
++     * Consider whether list coercion can make invocation possible.
++     *
++     * @param args the proposed arguments
++     * @param parameterTypes the constructor/method parameter types
++     * @return true if the arguments can satisfy the parameter types
++     */
++    private static boolean canInvokeWithArgs(List<Object> args, Class[] parameterTypes) {
++        if (parameterTypes.length != args.size()) {
++            LOG.warn("parameter types were the wrong size");
++            return false;
++        }
++
++        for (int i = 0; i < args.size(); i++) {
++            Object obj = args.get(i);
++            Class paramType = parameterTypes[i];
++            Class objectType = obj.getClass();
++            LOG.debug("Comparing parameter class {} to object class {} to see if assignment is possible.",
++                    paramType, objectType);
++            if (paramType.equals(objectType)) {
++                LOG.debug("Yes, they are the same class.");
++                continue;
++            }
++            if (paramType.isAssignableFrom(objectType)) {
++                LOG.debug("Yes, assignment is possible.");
++                continue;
++            }
++            if(isPrimitiveNumber(paramType) && Number.class.isAssignableFrom(objectType)){
++                continue;
++            }
++            if(paramType.isEnum() && objectType.equals(String.class)){
++                LOG.debug("Yes, will convert a String to enum");
++                continue;
++            }
++            if (paramType.isArray() && List.class.isAssignableFrom(objectType)) {
++                // TODO more collection content type checking
++                LOG.debug("Assignment is possible if we convert a List to an array.");
++                LOG.debug("Array Type: {}, List type: {}", paramType.getComponentType(), ((List) obj).get(0).getClass());
++
++                continue;
++            }
++            // this argument cannot be matched to the corresponding parameter type
++            return false;
++        }
++        // every argument is compatible with its corresponding parameter type
++        return true;
++    }
++
++    public static boolean isPrimitiveNumber(Class clazz){
++        return clazz.isPrimitive() && !clazz.equals(boolean.class);
++    }
++}
++

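A detail worth calling out in `applyProperties()` above: Flux first looks for a JavaBean-style setter and, failing that, falls back to a public instance field. That fallback is what lets a YAML property such as `forceFromStart` in kafka_spout.yaml be applied to classes that expose plain public fields. A minimal standalone sketch of the same pattern (illustrative, not FluxBuilder's actual API):

    import java.lang.reflect.Field;
    import java.lang.reflect.Method;

    /** Minimal sketch of the setter-then-public-field injection used by applyProperties(). */
    public class PropertyInjectionSketch {
        public static void setProperty(Object target, String name, Object value) throws Exception {
            String setterName = "set" + name.substring(0, 1).toUpperCase() + name.substring(1);
            for (Method method : target.getClass().getMethods()) {
                if (method.getName().equals(setterName) && method.getParameterTypes().length == 1) {
                    method.invoke(target, value);   // found a setter; use it
                    return;
                }
            }
            // no setter found; fall back to a public instance variable of the same name
            Field field = target.getClass().getField(name);
            field.set(target, value);
        }
    }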
http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/main/java/org/apache/storm/flux/api/TopologySource.java
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/main/java/org/apache/storm/flux/api/TopologySource.java
index 0000000,0000000..fbccfb7
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/main/java/org/apache/storm/flux/api/TopologySource.java
@@@ -1,0 -1,0 +1,39 @@@
++/*
++ * Licensed to the Apache Software Foundation (ASF) under one
++ * or more contributor license agreements.  See the NOTICE file
++ * distributed with this work for additional information
++ * regarding copyright ownership.  The ASF licenses this file
++ * to you under the Apache License, Version 2.0 (the
++ * "License"); you may not use this file except in compliance
++ * with the License.  You may obtain a copy of the License at
++ *
++ * http://www.apache.org/licenses/LICENSE-2.0
++ *
++ * Unless required by applicable law or agreed to in writing, software
++ * distributed under the License is distributed on an "AS IS" BASIS,
++ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++ * See the License for the specific language governing permissions and
++ * limitations under the License.
++ */
++package org.apache.storm.flux.api;
++
++
++import backtype.storm.generated.StormTopology;
++
++import java.util.Map;
++
++/**
++ * Interface for objects that can produce `StormTopology` objects.
++ *
++ * If a `topology-source` class implements the `getTopology()` method, Flux will
++ * call that method. Otherwise, it will introspect the given class and look for a
++ * similar method that produces a `StormTopology` instance.
++ *
++ * Note that it is not strictly necessary for a class to implement this interface.
++ * If a class defines a method with a similar signature, Flux should be able to find
++ * and invoke it.
++ *
++ */
++public interface TopologySource {
++    public StormTopology getTopology(Map<String, Object> config);
++}

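A hypothetical topology source that `findGetTopologyMethod()` in FluxBuilder could resolve, whether or not it implements this interface, might look like the following (class and component names are illustrative, not part of this commit):

    import java.util.Map;

    import backtype.storm.generated.StormTopology;
    import backtype.storm.testing.TestWordCounter;
    import backtype.storm.testing.TestWordSpout;
    import backtype.storm.topology.TopologyBuilder;
    import backtype.storm.tuple.Fields;
    import org.apache.storm.flux.api.TopologySource;

    public class WordCountTopologySource implements TopologySource {
        @Override
        public StormTopology getTopology(Map<String, Object> config) {
            TopologyBuilder builder = new TopologyBuilder();
            builder.setSpout("word-spout", new TestWordSpout(), 1);
            builder.setBolt("count-bolt", new TestWordCounter(), 1)
                   .fieldsGrouping("word-spout", new Fields("word"));
            return builder.createTopology();
        }
    }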
http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/main/java/org/apache/storm/flux/model/BeanDef.java
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/main/java/org/apache/storm/flux/model/BeanDef.java
index 0000000,0000000..72ca5ae
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/main/java/org/apache/storm/flux/model/BeanDef.java
@@@ -1,0 -1,0 +1,39 @@@
++/*
++ * Licensed to the Apache Software Foundation (ASF) under one
++ * or more contributor license agreements.  See the NOTICE file
++ * distributed with this work for additional information
++ * regarding copyright ownership.  The ASF licenses this file
++ * to you under the Apache License, Version 2.0 (the
++ * "License"); you may not use this file except in compliance
++ * with the License.  You may obtain a copy of the License at
++ *
++ * http://www.apache.org/licenses/LICENSE-2.0
++ *
++ * Unless required by applicable law or agreed to in writing, software
++ * distributed under the License is distributed on an "AS IS" BASIS,
++ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++ * See the License for the specific language governing permissions and
++ * limitations under the License.
++ */
++package org.apache.storm.flux.model;
++
++import java.util.ArrayList;
++import java.util.LinkedHashMap;
++import java.util.List;
++import java.util.Map;
++
++/**
++ * A representation of a Java object that is uniquely identifiable, and given a className, constructor arguments,
++ * and properties, can be instantiated.
++ */
++public class BeanDef extends ObjectDef {
++    private String id;
++
++    public String getId() {
++        return id;
++    }
++
++    public void setId(String id) {
++        this.id = id;
++    }
++}

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/main/java/org/apache/storm/flux/model/BeanReference.java
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/main/java/org/apache/storm/flux/model/BeanReference.java
index 0000000,0000000..bd236f1
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/main/java/org/apache/storm/flux/model/BeanReference.java
@@@ -1,0 -1,0 +1,39 @@@
++/*
++ * Licensed to the Apache Software Foundation (ASF) under one
++ * or more contributor license agreements.  See the NOTICE file
++ * distributed with this work for additional information
++ * regarding copyright ownership.  The ASF licenses this file
++ * to you under the Apache License, Version 2.0 (the
++ * "License"); you may not use this file except in compliance
++ * with the License.  You may obtain a copy of the License at
++ *
++ * http://www.apache.org/licenses/LICENSE-2.0
++ *
++ * Unless required by applicable law or agreed to in writing, software
++ * distributed under the License is distributed on an "AS IS" BASIS,
++ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++ * See the License for the specific language governing permissions and
++ * limitations under the License.
++ */
++package org.apache.storm.flux.model;
++
++/**
++ * A bean reference is simply a string pointer to another id.
++ */
++public class BeanReference {
++    public String id;
++
++    public BeanReference(){}
++
++    public BeanReference(String id){
++        this.id = id;
++    }
++
++    public String getId() {
++        return id;
++    }
++
++    public void setId(String id) {
++        this.id = id;
++    }
++}

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/main/java/org/apache/storm/flux/model/BoltDef.java
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/main/java/org/apache/storm/flux/model/BoltDef.java
index 0000000,0000000..362abf1
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/main/java/org/apache/storm/flux/model/BoltDef.java
@@@ -1,0 -1,0 +1,24 @@@
++/*
++ * Licensed to the Apache Software Foundation (ASF) under one
++ * or more contributor license agreements.  See the NOTICE file
++ * distributed with this work for additional information
++ * regarding copyright ownership.  The ASF licenses this file
++ * to you under the Apache License, Version 2.0 (the
++ * "License"); you may not use this file except in compliance
++ * with the License.  You may obtain a copy of the License at
++ *
++ * http://www.apache.org/licenses/LICENSE-2.0
++ *
++ * Unless required by applicable law or agreed to in writing, software
++ * distributed under the License is distributed on an "AS IS" BASIS,
++ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++ * See the License for the specific language governing permissions and
++ * limitations under the License.
++ */
++package org.apache.storm.flux.model;
++
++/**
++ * Bean representation of a Storm bolt.
++ */
++public class BoltDef extends VertexDef {
++}

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/main/java/org/apache/storm/flux/model/ConfigMethodDef.java
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/main/java/org/apache/storm/flux/model/ConfigMethodDef.java
index 0000000,0000000..6f7e4d4
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/main/java/org/apache/storm/flux/model/ConfigMethodDef.java
@@@ -1,0 -1,0 +1,62 @@@
++/*
++ * Licensed to the Apache Software Foundation (ASF) under one
++ * or more contributor license agreements.  See the NOTICE file
++ * distributed with this work for additional information
++ * regarding copyright ownership.  The ASF licenses this file
++ * to you under the Apache License, Version 2.0 (the
++ * "License"); you may not use this file except in compliance
++ * with the License.  You may obtain a copy of the License at
++ *
++ * http://www.apache.org/licenses/LICENSE-2.0
++ *
++ * Unless required by applicable law or agreed to in writing, software
++ * distributed under the License is distributed on an "AS IS" BASIS,
++ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++ * See the License for the specific language governing permissions and
++ * limitations under the License.
++ */
++package org.apache.storm.flux.model;
++
++import java.util.ArrayList;
++import java.util.LinkedHashMap;
++import java.util.List;
++import java.util.Map;
++
++public class ConfigMethodDef {
++    private String name;
++    private List<Object> args;
++    private boolean hasReferences = false;
++
++    public String getName() {
++        return name;
++    }
++
++    public void setName(String name) {
++        this.name = name;
++    }
++
++    public List<Object> getArgs() {
++        return args;
++    }
++
++    public void setArgs(List<Object> args) {
++
++        List<Object> newVal = new ArrayList<Object>();
++        for(Object obj : args){
++            if(obj instanceof LinkedHashMap){
++                Map map = (Map)obj;
++                if(map.containsKey("ref") && map.size() == 1){
++                    newVal.add(new BeanReference((String)map.get("ref")));
++                    this.hasReferences = true;
++                } else {
++                    // not a bean reference; keep the map argument as-is
++                    newVal.add(obj);
++                }
++            } else {
++                newVal.add(obj);
++            }
++        }
++        this.args = newVal;
++    }
++
++    public boolean hasReferences(){
++        return this.hasReferences;
++    }
++}

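For reference, `setArgs()` above converts any single-entry `{ref: ...}` map produced by the YAML parser into a `BeanReference`, which FluxBuilder later resolves against the component map. A small illustrative usage (not part of this commit):

    import java.util.Arrays;
    import java.util.LinkedHashMap;
    import java.util.Map;

    import org.apache.storm.flux.model.ConfigMethodDef;

    public class ConfigMethodDefSketch {
        public static ConfigMethodDef columnFieldsMethod() {
            // YAML equivalent:
            //   - name: "withColumnFields"
            //     args: [ref: "columnFields"]
            Map<String, Object> ref = new LinkedHashMap<String, Object>();
            ref.put("ref", "columnFields");

            ConfigMethodDef def = new ConfigMethodDef();
            def.setName("withColumnFields");
            def.setArgs(Arrays.<Object>asList(ref));
            // def.hasReferences() is now true and the single argument is a BeanReference("columnFields")
            return def;
        }
    }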
http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/main/java/org/apache/storm/flux/model/ExecutionContext.java
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/main/java/org/apache/storm/flux/model/ExecutionContext.java
index 0000000,0000000..e94b887
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/main/java/org/apache/storm/flux/model/ExecutionContext.java
@@@ -1,0 -1,0 +1,77 @@@
++/*
++ * Licensed to the Apache Software Foundation (ASF) under one
++ * or more contributor license agreements.  See the NOTICE file
++ * distributed with this work for additional information
++ * regarding copyright ownership.  The ASF licenses this file
++ * to you under the Apache License, Version 2.0 (the
++ * "License"); you may not use this file except in compliance
++ * with the License.  You may obtain a copy of the License at
++ *
++ * http://www.apache.org/licenses/LICENSE-2.0
++ *
++ * Unless required by applicable law or agreed to in writing, software
++ * distributed under the License is distributed on an "AS IS" BASIS,
++ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++ * See the License for the specific language governing permissions and
++ * limitations under the License.
++ */
++package org.apache.storm.flux.model;
++
++import backtype.storm.Config;
++import backtype.storm.task.IBolt;
++import backtype.storm.topology.IRichSpout;
++
++import java.util.HashMap;
++import java.util.List;
++import java.util.Map;
++
++/**
++ * Container for all the objects required to instantiate a topology.
++ */
++public class ExecutionContext {
++    // parsed Topology definition
++    TopologyDef topologyDef;
++
++    // Storm config
++    private Config config;
++
++    // components
++    private List<Object> compontents;
++    // indexed by id
++    private Map<String, Object> componentMap = new HashMap<String, Object>();
++
++    private Map<String, IRichSpout> spoutMap = new HashMap<String, IRichSpout>();
++
++    private List<IBolt> bolts;
++    private Map<String, Object> boltMap = new HashMap<String, Object>();
++
++    public ExecutionContext(TopologyDef topologyDef, Config config){
++        this.topologyDef = topologyDef;
++        this.config = config;
++    }
++
++    public TopologyDef getTopologyDef(){
++        return this.topologyDef;
++    }
++
++    public void addSpout(String id, IRichSpout spout){
++        this.spoutMap.put(id, spout);
++    }
++
++    public void addBolt(String id, Object bolt){
++        this.boltMap.put(id, bolt);
++    }
++
++    public Object getBolt(String id){
++        return this.boltMap.get(id);
++    }
++
++    public void addComponent(String id, Object value){
++        this.componentMap.put(id, value);
++    }
++
++    public Object getComponent(String id){
++        return this.componentMap.get(id);
++    }
++
++}

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/main/java/org/apache/storm/flux/model/GroupingDef.java
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/main/java/org/apache/storm/flux/model/GroupingDef.java
index 0000000,0000000..e4fac8e
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/main/java/org/apache/storm/flux/model/GroupingDef.java
@@@ -1,0 -1,0 +1,77 @@@
++/*
++ * Licensed to the Apache Software Foundation (ASF) under one
++ * or more contributor license agreements.  See the NOTICE file
++ * distributed with this work for additional information
++ * regarding copyright ownership.  The ASF licenses this file
++ * to you under the Apache License, Version 2.0 (the
++ * "License"); you may not use this file except in compliance
++ * with the License.  You may obtain a copy of the License at
++ *
++ * http://www.apache.org/licenses/LICENSE-2.0
++ *
++ * Unless required by applicable law or agreed to in writing, software
++ * distributed under the License is distributed on an "AS IS" BASIS,
++ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++ * See the License for the specific language governing permissions and
++ * limitations under the License.
++ */
++package org.apache.storm.flux.model;
++
++import java.util.List;
++
++/**
++ * Bean representation of a Storm stream grouping.
++ */
++public class GroupingDef {
++
++    /**
++     * Types of stream groupings Storm allows
++     */
++    public static enum Type {
++        ALL,
++        CUSTOM,
++        DIRECT,
++        SHUFFLE,
++        LOCAL_OR_SHUFFLE,
++        FIELDS,
++        GLOBAL,
++        NONE
++    }
++
++    private Type type;
++    private String streamId;
++    private List<String> args;
++    private ObjectDef customClass;
++
++    public List<String> getArgs() {
++        return args;
++    }
++
++    public void setArgs(List<String> args) {
++        this.args = args;
++    }
++
++    public Type getType() {
++        return type;
++    }
++
++    public void setType(Type type) {
++        this.type = type;
++    }
++
++    public String getStreamId() {
++        return streamId;
++    }
++
++    public void setStreamId(String streamId) {
++        this.streamId = streamId;
++    }
++
++    public ObjectDef getCustomClass() {
++        return customClass;
++    }
++
++    public void setCustomClass(ObjectDef customClass) {
++        this.customClass = customClass;
++    }
++}

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/main/java/org/apache/storm/flux/model/IncludeDef.java
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/main/java/org/apache/storm/flux/model/IncludeDef.java
index 0000000,0000000..23fd9d2
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/main/java/org/apache/storm/flux/model/IncludeDef.java
@@@ -1,0 -1,0 +1,54 @@@
++/*
++ * Licensed to the Apache Software Foundation (ASF) under one
++ * or more contributor license agreements.  See the NOTICE file
++ * distributed with this work for additional information
++ * regarding copyright ownership.  The ASF licenses this file
++ * to you under the Apache License, Version 2.0 (the
++ * "License"); you may not use this file except in compliance
++ * with the License.  You may obtain a copy of the License at
++ *
++ * http://www.apache.org/licenses/LICENSE-2.0
++ *
++ * Unless required by applicable law or agreed to in writing, software
++ * distributed under the License is distributed on an "AS IS" BASIS,
++ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++ * See the License for the specific language governing permissions and
++ * limitations under the License.
++ */
++package org.apache.storm.flux.model;
++
++/**
++ * Represents an include. Includes can be either a file or a classpath resource.
++ *
++ * If an include is marked as `override=true` then existing properties will be replaced.
++ *
++ */
++public class IncludeDef {
++    private boolean resource = false;
++    boolean override = false;
++    private String file;
++
++    public boolean isResource() {
++        return resource;
++    }
++
++    public void setResource(boolean resource) {
++        this.resource = resource;
++    }
++
++    public String getFile() {
++        return file;
++    }
++
++    public void setFile(String file) {
++        this.file = file;
++    }
++
++    public boolean isOverride() {
++        return override;
++    }
++
++    public void setOverride(boolean override) {
++        this.override = override;
++    }
++}

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/main/java/org/apache/storm/flux/model/ObjectDef.java
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/main/java/org/apache/storm/flux/model/ObjectDef.java
index 0000000,0000000..7386900
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/main/java/org/apache/storm/flux/model/ObjectDef.java
@@@ -1,0 -1,0 +1,90 @@@
++/*
++ * Licensed to the Apache Software Foundation (ASF) under one
++ * or more contributor license agreements.  See the NOTICE file
++ * distributed with this work for additional information
++ * regarding copyright ownership.  The ASF licenses this file
++ * to you under the Apache License, Version 2.0 (the
++ * "License"); you may not use this file except in compliance
++ * with the License.  You may obtain a copy of the License at
++ *
++ * http://www.apache.org/licenses/LICENSE-2.0
++ *
++ * Unless required by applicable law or agreed to in writing, software
++ * distributed under the License is distributed on an "AS IS" BASIS,
++ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++ * See the License for the specific language governing permissions and
++ * limitations under the License.
++ */
++package org.apache.storm.flux.model;
++
++import backtype.storm.Config;
++
++import java.util.ArrayList;
++import java.util.LinkedHashMap;
++import java.util.List;
++import java.util.Map;
++
++/**
++ * A representation of a Java object that given a className, constructor arguments,
++ * and properties, can be instantiated.
++ */
++public class ObjectDef {
++    private String className;
++    private List<Object> constructorArgs;
++    private boolean hasReferences;
++    private List<PropertyDef> properties;
++    private List<ConfigMethodDef> configMethods;
++
++    public String getClassName() {
++        return className;
++    }
++
++    public void setClassName(String className) {
++        this.className = className;
++    }
++
++    public List<Object> getConstructorArgs() {
++        return constructorArgs;
++    }
++
++    public void setConstructorArgs(List<Object> constructorArgs) {
++
++        List<Object> newVal = new ArrayList<Object>();
++        for(Object obj : constructorArgs){
++            if(obj instanceof LinkedHashMap){
++                Map map = (Map)obj;
++                if(map.containsKey("ref") && map.size() == 1){
++                    newVal.add(new BeanReference((String)map.get("ref")));
++                    this.hasReferences = true;
++                } else {
++                    // not a bean reference; keep the map argument as-is
++                    newVal.add(obj);
++                }
++            } else {
++                newVal.add(obj);
++            }
++        }
++        this.constructorArgs = newVal;
++    }
++
++    public boolean hasConstructorArgs(){
++        return this.constructorArgs != null && this.constructorArgs.size() > 0;
++    }
++
++    public boolean hasReferences(){
++        return this.hasReferences;
++    }
++
++    public List<PropertyDef> getProperties() {
++        return properties;
++    }
++
++    public void setProperties(List<PropertyDef> properties) {
++        this.properties = properties;
++    }
++
++    public List<ConfigMethodDef> getConfigMethods() {
++        return configMethods;
++    }
++
++    public void setConfigMethods(List<ConfigMethodDef> configMethods) {
++        this.configMethods = configMethods;
++    }
++}

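As a rough illustration of what `ObjectDef` models, a component declared in YAML might look like the sketch below. The keys mirror the bean property names (`className`, `constructorArgs`, `properties`, `configMethods`); the `components` list, ids, and classes shown are assumptions made for the example:

```yaml
components:
  - id: "someSerializer"                        # hypothetical component id
    className: "com.example.MySerializer"       # hypothetical class
  - id: "someClient"
    className: "com.example.MyClient"
    constructorArgs:
      - "localhost:1234"                        # plain constructor argument
      - ref: "someSerializer"                   # single-entry map -> BeanReference (hasReferences = true)
    properties:
      - name: "timeoutMs"
        value: 5000
```
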
http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/main/java/org/apache/storm/flux/model/PropertyDef.java
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/main/java/org/apache/storm/flux/model/PropertyDef.java
index 0000000,0000000..f3d7704
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/main/java/org/apache/storm/flux/model/PropertyDef.java
@@@ -1,0 -1,0 +1,58 @@@
++/*
++ * Licensed to the Apache Software Foundation (ASF) under one
++ * or more contributor license agreements.  See the NOTICE file
++ * distributed with this work for additional information
++ * regarding copyright ownership.  The ASF licenses this file
++ * to you under the Apache License, Version 2.0 (the
++ * "License"); you may not use this file except in compliance
++ * with the License.  You may obtain a copy of the License at
++ *
++ * http://www.apache.org/licenses/LICENSE-2.0
++ *
++ * Unless required by applicable law or agreed to in writing, software
++ * distributed under the License is distributed on an "AS IS" BASIS,
++ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++ * See the License for the specific language governing permissions and
++ * limitations under the License.
++ */
++package org.apache.storm.flux.model;
++
++public class PropertyDef {
++    private String name;
++    private Object value;
++    private String ref;
++
++    public String getName() {
++        return name;
++    }
++
++    public void setName(String name) {
++        this.name = name;
++    }
++
++    public Object getValue() {
++        return value;
++    }
++
++    public void setValue(Object value) {
++        if(this.ref != null){
++            throw new IllegalStateException("A property can only have a value OR a reference, not both.");
++        }
++        this.value = value;
++    }
++
++    public String getRef() {
++        return ref;
++    }
++
++    public void setRef(String ref) {
++        if(this.value != null){
++            throw new IllegalStateException("A property can only have a value OR a reference, not both.");
++        }
++        this.ref = ref;
++    }
++
++    public boolean isReference(){
++        return this.ref != null;
++    }
++}

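A short, hedged sketch of the value-versus-reference distinction `PropertyDef` enforces; the property names and referenced id are made up:

```yaml
properties:
  - name: "batchSize"        # set with a literal value
    value: 100
  - name: "serializer"       # set by reference to another component's id
    ref: "someSerializer"    # setting both 'value' and 'ref' would throw IllegalStateException
```
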
http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/main/java/org/apache/storm/flux/model/SpoutDef.java
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/main/java/org/apache/storm/flux/model/SpoutDef.java
index 0000000,0000000..277c601
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/main/java/org/apache/storm/flux/model/SpoutDef.java
@@@ -1,0 -1,0 +1,24 @@@
++/*
++ * Licensed to the Apache Software Foundation (ASF) under one
++ * or more contributor license agreements.  See the NOTICE file
++ * distributed with this work for additional information
++ * regarding copyright ownership.  The ASF licenses this file
++ * to you under the Apache License, Version 2.0 (the
++ * "License"); you may not use this file except in compliance
++ * with the License.  You may obtain a copy of the License at
++ *
++ * http://www.apache.org/licenses/LICENSE-2.0
++ *
++ * Unless required by applicable law or agreed to in writing, software
++ * distributed under the License is distributed on an "AS IS" BASIS,
++ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++ * See the License for the specific language governing permissions and
++ * limitations under the License.
++ */
++package org.apache.storm.flux.model;
++
++/**
++ * Bean representation of a Storm spout.
++ */
++public class SpoutDef extends VertexDef {
++}

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/main/java/org/apache/storm/flux/model/StreamDef.java
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/main/java/org/apache/storm/flux/model/StreamDef.java
index 0000000,0000000..da80f1c
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/main/java/org/apache/storm/flux/model/StreamDef.java
@@@ -1,0 -1,0 +1,64 @@@
++/*
++ * Licensed to the Apache Software Foundation (ASF) under one
++ * or more contributor license agreements.  See the NOTICE file
++ * distributed with this work for additional information
++ * regarding copyright ownership.  The ASF licenses this file
++ * to you under the Apache License, Version 2.0 (the
++ * "License"); you may not use this file except in compliance
++ * with the License.  You may obtain a copy of the License at
++ *
++ * http://www.apache.org/licenses/LICENSE-2.0
++ *
++ * Unless required by applicable law or agreed to in writing, software
++ * distributed under the License is distributed on an "AS IS" BASIS,
++ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++ * See the License for the specific language governing permissions and
++ * limitations under the License.
++ */
++package org.apache.storm.flux.model;
++
++/**
++ * Represents a stream of tuples from one Storm component (Spout or Bolt) to another (an edge in the topology DAG).
++ *
++ * Required fields are `from` and `to`, which define the source and destination, and the stream `grouping`.
++ *
++ */
++public class StreamDef {
++
++    private String name; // not used, placeholder for GUI, etc.
++    private String from;
++    private String to;
++    private GroupingDef grouping;
++
++    public String getTo() {
++        return to;
++    }
++
++    public void setTo(String to) {
++        this.to = to;
++    }
++
++    public String getName() {
++        return name;
++    }
++
++    public void setName(String name) {
++        this.name = name;
++    }
++
++    public String getFrom() {
++        return from;
++    }
++
++    public void setFrom(String from) {
++        this.from = from;
++    }
++
++    public GroupingDef getGrouping() {
++        return grouping;
++    }
++
++    public void setGrouping(GroupingDef grouping) {
++        this.grouping = grouping;
++    }
++}

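To illustrate the fields above, a stream definition in YAML might look like this sketch. The `name`, `from`, and `to` keys mirror the bean properties; the component ids are hypothetical, and the shape of the `grouping` block is assumed since `GroupingDef` is defined elsewhere:

```yaml
streams:
  - name: "spout --> bolt"    # optional; placeholder for GUIs, etc.
    from: "someSpout"         # id of the source component (hypothetical)
    to: "someBolt"            # id of the destination component (hypothetical)
    grouping:
      type: SHUFFLE           # grouping details assumed; see GroupingDef
```
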
http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/main/java/org/apache/storm/flux/model/TopologyDef.java
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/main/java/org/apache/storm/flux/model/TopologyDef.java
index 0000000,0000000..a6ae450
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/main/java/org/apache/storm/flux/model/TopologyDef.java
@@@ -1,0 -1,0 +1,216 @@@
++/*
++ * Licensed to the Apache Software Foundation (ASF) under one
++ * or more contributor license agreements.  See the NOTICE file
++ * distributed with this work for additional information
++ * regarding copyright ownership.  The ASF licenses this file
++ * to you under the Apache License, Version 2.0 (the
++ * "License"); you may not use this file except in compliance
++ * with the License.  You may obtain a copy of the License at
++ *
++ * http://www.apache.org/licenses/LICENSE-2.0
++ *
++ * Unless required by applicable law or agreed to in writing, software
++ * distributed under the License is distributed on an "AS IS" BASIS,
++ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++ * See the License for the specific language governing permissions and
++ * limitations under the License.
++ */
++package org.apache.storm.flux.model;
++
++import org.slf4j.Logger;
++import org.slf4j.LoggerFactory;
++
++import java.util.*;
++
++/**
++ * Bean representation of a topology.
++ *
++ * It consists of the following:
++ *   1. The topology name
++ *   2. A `java.util.Map` representing the `backtype.storm.config` for the topology
++ *   3. A list of spout definitions
++ *   4. A list of bolt definitions
++ *   5. A list of stream definitions that define the flow between spouts and bolts.
++ *
++ */
++public class TopologyDef {
++    private static Logger LOG = LoggerFactory.getLogger(TopologyDef.class);
++
++    private String name;
++    private Map<String, BeanDef> componentMap = new LinkedHashMap<String, BeanDef>(); // not required
++    private List<IncludeDef> includes; // not required
++    private Map<String, Object> config = new HashMap<String, Object>();
++
++    // a "topology source" is a class that can produce a `StormTopology` thrift object.
++    private TopologySourceDef topologySource;
++
++    // the following are required if we're defining a core storm topology DAG in YAML, etc.
++    private Map<String, BoltDef> boltMap = new LinkedHashMap<String, BoltDef>();
++    private Map<String, SpoutDef> spoutMap = new LinkedHashMap<String, SpoutDef>();
++    private List<StreamDef> streams = new ArrayList<StreamDef>();
++
++
++    public String getName() {
++        return name;
++    }
++
++    public void setName(String name) {
++        this.name = name;
++    }
++
++    public void setName(String name, boolean override){
++        if(this.name == null || override){
++            this.name = name;
++        } else {
++            LOG.warn("Ignoring attempt to set property 'name' with override == false.");
++        }
++    }
++
++    public List<SpoutDef> getSpouts() {
++        ArrayList<SpoutDef> retval = new ArrayList<SpoutDef>();
++        retval.addAll(this.spoutMap.values());
++        return retval;
++    }
++
++    public void setSpouts(List<SpoutDef> spouts) {
++        this.spoutMap = new LinkedHashMap<String, SpoutDef>();
++        for(SpoutDef spout : spouts){
++            this.spoutMap.put(spout.getId(), spout);
++        }
++    }
++
++    public List<BoltDef> getBolts() {
++        ArrayList<BoltDef> retval = new ArrayList<BoltDef>();
++        retval.addAll(this.boltMap.values());
++        return retval;
++    }
++
++    public void setBolts(List<BoltDef> bolts) {
++        this.boltMap = new LinkedHashMap<String, BoltDef>();
++        for(BoltDef bolt : bolts){
++            this.boltMap.put(bolt.getId(), bolt);
++        }
++    }
++
++    public List<StreamDef> getStreams() {
++        return streams;
++    }
++
++    public void setStreams(List<StreamDef> streams) {
++        this.streams = streams;
++    }
++
++    public Map<String, Object> getConfig() {
++        return config;
++    }
++
++    public void setConfig(Map<String, Object> config) {
++        this.config = config;
++    }
++
++    public List<BeanDef> getComponents() {
++        ArrayList<BeanDef> retval = new ArrayList<BeanDef>();
++        retval.addAll(this.componentMap.values());
++        return retval;
++    }
++
++    public void setComponents(List<BeanDef> components) {
++        this.componentMap = new LinkedHashMap<String, BeanDef>();
++        for(BeanDef component : components){
++            this.componentMap.put(component.getId(), component);
++        }
++    }
++
++    public List<IncludeDef> getIncludes() {
++        return includes;
++    }
++
++    public void setIncludes(List<IncludeDef> includes) {
++        this.includes = includes;
++    }
++
++    // utility methods
++    public int parallelismForBolt(String boltId){
++        return this.boltMap.get(boltId).getParallelism();
++    }
++
++    public BoltDef getBoltDef(String id){
++        return this.boltMap.get(id);
++    }
++
++    public SpoutDef getSpoutDef(String id){
++        return this.spoutMap.get(id);
++    }
++
++    public BeanDef getComponent(String id){
++        return this.componentMap.get(id);
++    }
++
++    // used by includes implementation
++    public void addAllBolts(List<BoltDef> bolts, boolean override){
++        for(BoltDef bolt : bolts){
++            String id = bolt.getId();
++            if(this.boltMap.get(id) == null || override) {
++                this.boltMap.put(bolt.getId(), bolt);
++            } else {
++                LOG.warn("Ignoring attempt to create bolt '{}' with override == false.", id);
++            }
++        }
++    }
++
++    public void addAllSpouts(List<SpoutDef> spouts, boolean override){
++        for(SpoutDef spout : spouts){
++            String id = spout.getId();
++            if(this.spoutMap.get(id) == null || override) {
++                this.spoutMap.put(spout.getId(), spout);
++            } else {
++                LOG.warn("Ignoring attempt to create spout '{}' with override == false.", id);
++            }
++        }
++    }
++
++    public void addAllComponents(List<BeanDef> components, boolean override) {
++        for(BeanDef bean : components){
++            String id = bean.getId();
++            if(this.componentMap.get(id) == null || override) {
++                this.componentMap.put(bean.getId(), bean);
++            } else {
++                LOG.warn("Ignoring attempt to create component '{}' with override == false.", id);
++            }
++        }
++    }
++
++    public void addAllStreams(List<StreamDef> streams, boolean override) {
++        //TODO figure out how we want to deal with overrides. Users may want to add streams even when overriding other
++        // properties. For now we just add them blindly which could lead to a potentially invalid topology.
++        this.streams.addAll(streams);
++    }
++
++    public TopologySourceDef getTopologySource() {
++        return topologySource;
++    }
++
++    public void setTopologySource(TopologySourceDef topologySource) {
++        this.topologySource = topologySource;
++    }
++
++    public boolean isDslTopology(){
++        return this.topologySource == null;
++    }
++
++
++    public boolean validate(){
++        boolean hasSpouts = this.spoutMap != null && this.spoutMap.size() > 0;
++        boolean hasBolts = this.boltMap != null && this.boltMap.size() > 0;
++        boolean hasStreams = this.streams != null && this.streams.size() > 0;
++        // you can't define a topologySource and a DSL topology at the same time...
++        if (!isDslTopology() && (hasSpouts || hasBolts || hasStreams)) {
++            return false;
++        }
++        return true;
++    }
++}

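Putting the pieces together, a minimal DSL-style topology that would satisfy `validate()` might look like the following sketch. The top-level keys correspond to the bean properties (`name`, `config`, `spouts`, `bolts`, `streams`); the class names and ids are illustrative only:

```yaml
name: "example-topology"                    # topology name
config:
  topology.workers: 1                       # copied into the topology's storm config map
spouts:
  - id: "someSpout"
    className: "com.example.ExampleSpout"   # hypothetical spout class
    parallelism: 1                          # VertexDef parallelism; defaults to 1 if omitted
bolts:
  - id: "someBolt"
    className: "com.example.ExampleBolt"    # hypothetical bolt class
    parallelism: 2
streams:
  - from: "someSpout"
    to: "someBolt"
    grouping:
      type: SHUFFLE                         # grouping shape assumed; see GroupingDef
```
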
http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/main/java/org/apache/storm/flux/model/TopologySourceDef.java
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/main/java/org/apache/storm/flux/model/TopologySourceDef.java
index 0000000,0000000..d6a2f57
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/main/java/org/apache/storm/flux/model/TopologySourceDef.java
@@@ -1,0 -1,0 +1,36 @@@
++/*
++ * Licensed to the Apache Software Foundation (ASF) under one
++ * or more contributor license agreements.  See the NOTICE file
++ * distributed with this work for additional information
++ * regarding copyright ownership.  The ASF licenses this file
++ * to you under the Apache License, Version 2.0 (the
++ * "License"); you may not use this file except in compliance
++ * with the License.  You may obtain a copy of the License at
++ *
++ * http://www.apache.org/licenses/LICENSE-2.0
++ *
++ * Unless required by applicable law or agreed to in writing, software
++ * distributed under the License is distributed on an "AS IS" BASIS,
++ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++ * See the License for the specific language governing permissions and
++ * limitations under the License.
++ */
++package org.apache.storm.flux.model;
++
++public class TopologySourceDef extends ObjectDef {
++    public static final String DEFAULT_METHOD_NAME = "getTopology";
++
++    private String methodName;
++
++    public TopologySourceDef(){
++        this.methodName = DEFAULT_METHOD_NAME;
++    }
++
++    public String getMethodName() {
++        return methodName;
++    }
++
++    public void setMethodName(String methodName) {
++        this.methodName = methodName;
++    }
++}

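For the non-DSL path, here is a hedged sketch of pointing Flux at an existing topology source; the `topologySource` key is assumed from the `TopologyDef` property name, and the class is one of the test sources added elsewhere in this change set:

```yaml
name: "existing-topology"
topologySource:
  className: "org.apache.storm.flux.test.SimpleTopologySource"
  methodName: "getTopology"   # optional; defaults to TopologySourceDef.DEFAULT_METHOD_NAME
```
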
http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/main/java/org/apache/storm/flux/model/VertexDef.java
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/main/java/org/apache/storm/flux/model/VertexDef.java
index 0000000,0000000..e71bcc2
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/main/java/org/apache/storm/flux/model/VertexDef.java
@@@ -1,0 -1,0 +1,36 @@@
++/*
++ * Licensed to the Apache Software Foundation (ASF) under one
++ * or more contributor license agreements.  See the NOTICE file
++ * distributed with this work for additional information
++ * regarding copyright ownership.  The ASF licenses this file
++ * to you under the Apache License, Version 2.0 (the
++ * "License"); you may not use this file except in compliance
++ * with the License.  You may obtain a copy of the License at
++ *
++ * http://www.apache.org/licenses/LICENSE-2.0
++ *
++ * Unless required by applicable law or agreed to in writing, software
++ * distributed under the License is distributed on an "AS IS" BASIS,
++ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++ * See the License for the specific language governing permissions and
++ * limitations under the License.
++ */
++package org.apache.storm.flux.model;
++
++/**
++ * Abstract parent class of component definitions
++ * (spouts/bolts)
++ */
++public abstract class VertexDef extends BeanDef {
++
++    // default parallelism to 1 so that if it's omitted, the topology will still function.
++    private int parallelism = 1;
++
++    public int getParallelism() {
++        return parallelism;
++    }
++
++    public void setParallelism(int parallelism) {
++        this.parallelism = parallelism;
++    }
++}

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/main/java/org/apache/storm/flux/parser/FluxParser.java
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/main/java/org/apache/storm/flux/parser/FluxParser.java
index 0000000,0000000..72f8a8e
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/main/java/org/apache/storm/flux/parser/FluxParser.java
@@@ -1,0 -1,0 +1,202 @@@
++/*
++ * Licensed to the Apache Software Foundation (ASF) under one
++ * or more contributor license agreements.  See the NOTICE file
++ * distributed with this work for additional information
++ * regarding copyright ownership.  The ASF licenses this file
++ * to you under the Apache License, Version 2.0 (the
++ * "License"); you may not use this file except in compliance
++ * with the License.  You may obtain a copy of the License at
++ *
++ * http://www.apache.org/licenses/LICENSE-2.0
++ *
++ * Unless required by applicable law or agreed to in writing, software
++ * distributed under the License is distributed on an "AS IS" BASIS,
++ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++ * See the License for the specific language governing permissions and
++ * limitations under the License.
++ */
++package org.apache.storm.flux.parser;
++
++import org.apache.storm.flux.api.TopologySource;
++import org.apache.storm.flux.model.BoltDef;
++import org.apache.storm.flux.model.IncludeDef;
++import org.apache.storm.flux.model.SpoutDef;
++import org.apache.storm.flux.model.TopologyDef;
++import org.slf4j.Logger;
++import org.slf4j.LoggerFactory;
++import org.yaml.snakeyaml.TypeDescription;
++import org.yaml.snakeyaml.Yaml;
++import org.yaml.snakeyaml.constructor.Constructor;
++
++import java.io.ByteArrayOutputStream;
++import java.io.FileInputStream;
++import java.io.IOException;
++import java.io.InputStream;
++import java.nio.ByteBuffer;
++import java.util.Map;
++import java.util.Properties;
++
++public class FluxParser {
++    private static final Logger LOG = LoggerFactory.getLogger(FluxParser.class);
++
++    private FluxParser(){}
++
++    // TODO refactor input stream processing (see parseResource() method).
++    public static TopologyDef parseFile(String inputFile, boolean dumpYaml, boolean processIncludes,
++                                        String propertiesFile, boolean envSub) throws IOException {
++        Yaml yaml = yaml();
++        FileInputStream in = new FileInputStream(inputFile);
++        // TODO process properties, etc.
++        TopologyDef topology = loadYaml(yaml, in, propertiesFile, envSub);
++        in.close();
++        if(dumpYaml){
++            dumpYaml(topology, yaml);
++        }
++        if(processIncludes) {
++            return processIncludes(yaml, topology, propertiesFile, envSub);
++        } else {
++            return topology;
++        }
++    }
++
++    public static TopologyDef parseResource(String resource, boolean dumpYaml, boolean processIncludes,
++                                            String propertiesFile, boolean envSub) throws IOException {
++        Yaml yaml = yaml();
++        InputStream in = FluxParser.class.getResourceAsStream(resource);
++        if(in == null){
++            LOG.error("Unable to load classpath resource: " + resource);
++            System.exit(1);
++        }
++        TopologyDef topology = loadYaml(yaml, in, propertiesFile, envSub);
++        in.close();
++        if(dumpYaml){
++            dumpYaml(topology, yaml);
++        }
++        if(processIncludes) {
++            return processIncludes(yaml, topology, propertiesFile, envSub);
++        } else {
++            return topology;
++        }
++    }
++
++    private static TopologyDef loadYaml(Yaml yaml, InputStream in, String propsFile, boolean envSubstitution) throws IOException {
++        ByteArrayOutputStream bos = new ByteArrayOutputStream();
++        LOG.info("loading YAML from input stream...");
++        int b = -1;
++        while((b = in.read()) != -1){
++            bos.write(b);
++        }
++
++        // TODO substitution implementation is not exactly efficient or kind to memory...
++        String str = bos.toString();
++        // properties file substitution
++        if(propsFile != null){
++            LOG.info("Performing property substitution.");
++            InputStream propsIn = new FileInputStream(propsFile);
++            Properties props = new Properties();
++            props.load(propsIn);
++            for(Object key : props.keySet()){
++                str = str.replace("${" + key + "}", props.getProperty((String)key));
++            }
++        } else {
++            LOG.info("Not performing property substitution.");
++        }
++
++        // environment variable substitution
++        if(envSubstitution){
++            LOG.info("Performing environment variable substitution...");
++            Map<String, String> envs = System.getenv();
++            for(String key : envs.keySet()){
++                str = str.replace("${ENV-" + key + "}", envs.get(key));
++            }
++        } else {
++            LOG.info("Not performing environment variable substitution.");
++        }
++        return (TopologyDef)yaml.load(str);
++    }
++
++    private static void dumpYaml(TopologyDef topology, Yaml yaml){
++        System.out.println("Configuration (interpreted): \n" + yaml.dump(topology));
++    }
++
++    private static Yaml yaml(){
++        Constructor constructor = new Constructor(TopologyDef.class);
++
++        TypeDescription topologyDescription = new TypeDescription(TopologyDef.class);
++        topologyDescription.putListPropertyType("spouts", SpoutDef.class);
++        topologyDescription.putListPropertyType("bolts", BoltDef.class);
++        topologyDescription.putListPropertyType("includes", IncludeDef.class);
++        constructor.addTypeDescription(topologyDescription);
++
++        Yaml yaml = new Yaml(constructor);
++        return yaml;
++    }
++
++    /**
++     * Resolve the includes of a topology definition by parsing each included file or resource and merging it in.
++     *
++     * @param yaml the yaml parser for parsing the include file(s)
++     * @param topologyDef the topology definition containing (possibly zero) includes
++     * @param propsFile optional properties file used for property substitution in the included files
++     * @param envSub whether to perform environment variable substitution in the included files
++     * @return The TopologyDef with includes resolved.
++     */
++    private static TopologyDef processIncludes(Yaml yaml, TopologyDef topologyDef, String propsFile, boolean envSub)
++            throws IOException {
++        //TODO support multiple levels of includes
++        if(topologyDef.getIncludes() != null) {
++            for (IncludeDef include : topologyDef.getIncludes()){
++                TopologyDef includeTopologyDef = null;
++                if (include.isResource()) {
++                    LOG.info("Loading includes from resource: {}", include.getFile());
++                    includeTopologyDef = parseResource(include.getFile(), true, false, propsFile, envSub);
++                } else {
++                    LOG.info("Loading includes from file: {}", include.getFile());
++                    includeTopologyDef = parseFile(include.getFile(), true, false, propsFile, envSub);
++                }
++
++                // if overrides are disabled, we won't replace anything that already exists
++                boolean override = include.isOverride();
++                // name
++                if(includeTopologyDef.getName() != null){
++                    topologyDef.setName(includeTopologyDef.getName(), override);
++                }
++
++                // config
++                if(includeTopologyDef.getConfig() != null) {
++                    //TODO move this logic to the model class
++                    Map<String, Object> config = topologyDef.getConfig();
++                    Map<String, Object> includeConfig = includeTopologyDef.getConfig();
++                    if(override) {
++                        config.putAll(includeTopologyDef.getConfig());
++                    } else {
++                        for(String key : includeConfig.keySet()){
++                            if(config.containsKey(key)){
++                                LOG.warn("Ignoring attempt to set topology config property '{}' with override == false", key);
++                            }
++                            else {
++                                config.put(key, includeConfig.get(key));
++                            }
++                        }
++                    }
++                }
++
++                //component overrides
++                if(includeTopologyDef.getComponents() != null){
++                    topologyDef.addAllComponents(includeTopologyDef.getComponents(), override);
++                }
++                //bolt overrides
++                if(includeTopologyDef.getBolts() != null){
++                    topologyDef.addAllBolts(includeTopologyDef.getBolts(), override);
++                }
++                //spout overrides
++                if(includeTopologyDef.getSpouts() != null) {
++                    topologyDef.addAllSpouts(includeTopologyDef.getSpouts(), override);
++                }
++                //stream overrides
++                //TODO streams should be uniquely identifiable
++                if(includeTopologyDef.getStreams() != null) {
++                    topologyDef.addAllStreams(includeTopologyDef.getStreams(), override);
++                }
++            } // end include processing
++        }
++        return topologyDef;
++    }
++}

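The two substitution mechanisms in `loadYaml()` operate on the raw YAML text before parsing: `${key}` tokens are replaced from the properties file (passed on the command line with `--filter`), and `${ENV-NAME}` tokens from environment variables. A hedged sketch, with hypothetical keys:

```yaml
# Fragment of a topology YAML relying on substitution (illustrative keys only).
config:
  hdfs.url: "${hdfs.url}"            # replaced from the properties file, e.g. hdfs://namenode:8020
  hbase.root: "${ENV-HBASE_HOME}"    # replaced from the HBASE_HOME environment variable
```
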
http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/main/resources/splash.txt
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/main/resources/splash.txt
index 0000000,0000000..337931a
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/main/resources/splash.txt
@@@ -1,0 -1,0 +1,9 @@@
++███████╗██╗     ██╗   ██╗██╗  ██╗
++██╔════╝██║     ██║   ██║╚██╗██╔╝
++█████╗  ██║     ██║   ██║ ╚███╔╝
++██╔══╝  ██║     ██║   ██║ ██╔██╗
++██║     ███████╗╚██████╔╝██╔╝ ██╗
++╚═╝     ╚══════╝ ╚═════╝ ╚═╝  ╚═╝
+++-         Apache Storm        -+
+++-  data FLow User eXperience  -+
++Version: ${project.version}

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/test/java/org/apache/storm/flux/FluxBuilderTest.java
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/test/java/org/apache/storm/flux/FluxBuilderTest.java
index 0000000,0000000..ff67867
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/test/java/org/apache/storm/flux/FluxBuilderTest.java
@@@ -1,0 -1,0 +1,31 @@@
++/*
++ * Licensed to the Apache Software Foundation (ASF) under one
++ * or more contributor license agreements.  See the NOTICE file
++ * distributed with this work for additional information
++ * regarding copyright ownership.  The ASF licenses this file
++ * to you under the Apache License, Version 2.0 (the
++ * "License"); you may not use this file except in compliance
++ * with the License.  You may obtain a copy of the License at
++ *
++ * http://www.apache.org/licenses/LICENSE-2.0
++ *
++ * Unless required by applicable law or agreed to in writing, software
++ * distributed under the License is distributed on an "AS IS" BASIS,
++ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++ * See the License for the specific language governing permissions and
++ * limitations under the License.
++ */
++package org.apache.storm.flux;
++
++import org.junit.Test;
++import static org.junit.Assert.*;
++
++public class FluxBuilderTest {
++
++    @Test
++    public void testIsPrimitiveNumber() throws Exception {
++        assertTrue(FluxBuilder.isPrimitiveNumber(int.class));
++        assertFalse(FluxBuilder.isPrimitiveNumber(boolean.class));
++        assertFalse(FluxBuilder.isPrimitiveNumber(String.class));
++    }
++}

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/test/java/org/apache/storm/flux/IntegrationTest.java
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/test/java/org/apache/storm/flux/IntegrationTest.java
index 0000000,0000000..5e17f5e
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/test/java/org/apache/storm/flux/IntegrationTest.java
@@@ -1,0 -1,0 +1,41 @@@
++/*
++ * Licensed to the Apache Software Foundation (ASF) under one
++ * or more contributor license agreements.  See the NOTICE file
++ * distributed with this work for additional information
++ * regarding copyright ownership.  The ASF licenses this file
++ * to you under the Apache License, Version 2.0 (the
++ * "License"); you may not use this file except in compliance
++ * with the License.  You may obtain a copy of the License at
++ *
++ * http://www.apache.org/licenses/LICENSE-2.0
++ *
++ * Unless required by applicable law or agreed to in writing, software
++ * distributed under the License is distributed on an "AS IS" BASIS,
++ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++ * See the License for the specific language governing permissions and
++ * limitations under the License.
++ */
++package org.apache.storm.flux;
++
++import org.junit.Test;
++
++public class IntegrationTest {
++
++    private static boolean skipTest = true;
++
++    static {
++        String skipStr = System.getProperty("skipIntegration");
++        if(skipStr != null && skipStr.equalsIgnoreCase("false")){
++            skipTest = false;
++        }
++    }
++
++
++
++    @Test
++    public void testRunTopologySource() throws Exception {
++        if(!skipTest) {
++            Flux.main(new String[]{"-s", "30000", "src/test/resources/configs/existing-topology.yaml"});
++        }
++    }
++}

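A hedged note on running this guarded test: `skipTest` flips only when the `skipIntegration` system property is set to `false`, so something like the command below should enable it, assuming the build forwards system properties to the test JVM (e.g. via Surefire):

```bash
# Run the flux-core tests with the integration test enabled (property forwarding assumed).
mvn test -DskipIntegration=false
```
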

[11/50] [abbrv] storm git commit: add basic docs for HBase example

Posted by pt...@apache.org.
add basic docs for HBase example


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/edc5744c
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/edc5744c
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/edc5744c

Branch: refs/heads/0.10.x-branch
Commit: edc5744c48b65c3d7298ec5a61affb6164fc4d96
Parents: 4a1db96
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Tue Apr 7 23:57:53 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Tue Apr 7 23:57:53 2015 -0400

----------------------------------------------------------------------
 flux-examples/README.md | 12 ++++++++++++
 1 file changed, 12 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/edc5744c/flux-examples/README.md
----------------------------------------------------------------------
diff --git a/flux-examples/README.md b/flux-examples/README.md
index 425ad98..b3798a6 100644
--- a/flux-examples/README.md
+++ b/flux-examples/README.md
@@ -54,3 +54,15 @@ least, the property `hdfs.url` to point to a HDFS cluster. Then you can run the
 storm jar ./target/flux-examples-0.2.3-SNAPSHOT.jar org.apache.storm.flux.Flux --local ./src/main/resources/simple_hdfs.yaml --filter my_hdfs_bolt.properties
 ```
 
+### [simple_hbase.yaml](src/main/resources/simple_hbase.yaml)
+
+This example illustrates how to use Flux to set up a storm-hbase bolt to write to HBase.
+
+In order to use this example, you will need to edit the `src/main/resources/hbase-site.xml` file to reflect your HBase
+environment, and then rebuild the topology jar.
+
+You can do so by running the following Maven command in the `flux-examples` directory:
+
+```bash
+mvn clean install
+```
\ No newline at end of file

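After rebuilding, the HBase example can presumably be launched the same way as the HDFS example shown earlier in this README; the jar version and local-mode flag below are carried over from that example and may need adjusting:

```bash
storm jar ./target/flux-examples-0.2.3-SNAPSHOT.jar org.apache.storm.flux.Flux --local ./src/main/resources/simple_hbase.yaml
```
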

[48/50] [abbrv] storm git commit: add missing license headers and clean up RAT report

Posted by pt...@apache.org.
add missing license headers and clean up RAT report


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/2154048f
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/2154048f
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/2154048f

Branch: refs/heads/0.10.x-branch
Commit: 2154048fd123a0960a4f5854ec4dd436bb775329
Parents: 285d943
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Wed Jun 3 17:38:14 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Wed Jun 3 17:38:14 2015 -0400

----------------------------------------------------------------------
 .travis.yml                                     |  11 +
 LICENSE                                         |   4 +-
 conf/jaas_kerberos.conf                         |  17 ++
 dev-tools/test-ns.py                            |  15 ++
 doap_Storm.rdf                                  |   6 +-
 external/flux/LICENSE                           | 202 -------------------
 .../apache/storm/flux/test/SimpleTopology.java  |  17 ++
 .../storm/flux/test/SimpleTopologySource.java   |  17 ++
 .../test/SimpleTopologyWithConfigParam.java     |  17 ++
 .../org/apache/storm/flux/test/TestBolt.java    |  17 ++
 .../storm/flux/test/TridentTopologySource.java  |  17 ++
 .../existing-topology-method-override.yaml      |  15 ++
 .../existing-topology-reflection-config.yaml    |  15 ++
 .../configs/existing-topology-reflection.yaml   |  15 ++
 .../configs/existing-topology-trident.yaml      |  15 ++
 .../resources/configs/existing-topology.yaml    |  15 ++
 .../configs/invalid-existing-topology.yaml      |  16 ++
 .../src/test/resources/configs/test.properties  |  16 ++
 .../src/main/resources/config.properties        |  15 ++
 external/storm-hbase/LICENSE                    | 202 -------------------
 .../storm/hive/trident/HiveStateFactory.java    |  17 ++
 .../apache/storm/hive/trident/HiveUpdater.java  |  17 ++
 external/storm-jdbc/LICENSE                     | 202 -------------------
 .../storm/jdbc/common/ConnectionProvider.java   |  17 ++
 .../jdbc/common/HikariCPConnectionProvider.java |  17 ++
 .../storm/jdbc/mapper/JdbcLookupMapper.java     |  17 ++
 .../jdbc/mapper/SimpleJdbcLookupMapper.java     |  17 ++
 external/storm-jdbc/src/test/sql/test.sql       |  17 ++
 .../ExponentialBackoffMsgRetryManagerTest.java  |  17 ++
 external/storm-redis/LICENSE                    | 202 -------------------
 .../redis/trident/WordCountLookupMapper.java    |  17 ++
 .../redis/trident/WordCountStoreMapper.java     |  17 ++
 pom.xml                                         |  17 +-
 storm-core/src/clj/backtype/storm/converter.clj |  15 ++
 .../src/dev/drpc-simple-acl-test-scenario.yaml  |  17 ++
 .../storm/messaging/ConnectionWithStatus.java   |  17 ++
 .../auth/authorizer/DRPCAuthorizerBase.java     |  17 ++
 .../authorizer/DRPCSimpleACLAuthorizer.java     |  18 ++
 .../authorizer/ImpersonationAuthorizer.java     |  17 ++
 .../auth/kerberos/jaas_kerberos_cluster.conf    |  20 +-
 .../auth/kerberos/jaas_kerberos_launcher.conf   |  19 ++
 .../worker-launcher/.deps/worker-launcher.Po    |  16 ++
 .../auth/DefaultHttpCredentialsPlugin_test.clj  |  15 ++
 .../authorizer/DRPCSimpleACLAuthorizer_test.clj |  15 ++
 .../storm/security/auth/drpc-auth-alice.jaas    |  17 ++
 .../storm/security/auth/drpc-auth-bob.jaas      |  17 ++
 .../storm/security/auth/drpc-auth-charlie.jaas  |  17 ++
 .../storm/security/auth/drpc-auth-server.jaas   |  17 ++
 48 files changed, 692 insertions(+), 817 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/.travis.yml
----------------------------------------------------------------------
diff --git a/.travis.yml b/.travis.yml
index 99da952..27484eb 100644
--- a/.travis.yml
+++ b/.travis.yml
@@ -1,3 +1,14 @@
+#  Licensed under the Apache License, Version 2.0 (the "License");
+#  you may not use this file except in compliance with the License.
+#  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+#  limitations under the License.
 language: java
 jdk:
   - oraclejdk7

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/LICENSE
----------------------------------------------------------------------
diff --git a/LICENSE b/LICENSE
index f468287..8755d1b 100644
--- a/LICENSE
+++ b/LICENSE
@@ -477,8 +477,8 @@ THE SOFTWARE.
 
 For jquery dataTables bootstrap integration
 
-(storm-core/src/ui/public/js/dataTables.bootstrap.min.js
-storm-core/src/ui/public/css/dataTables.bootstrap.css)
+(storm-core/src/ui/public/js/jquery.dataTables.1.10.4.min.js
+storm-core/src/ui/public/css/dataTables.bootstrap.css)
 
 Copyright (c) 2013-2014 SpryMedia Limited
 http://datatables.net/license

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/conf/jaas_kerberos.conf
----------------------------------------------------------------------
diff --git a/conf/jaas_kerberos.conf b/conf/jaas_kerberos.conf
index 5861df2..87e8d0d 100644
--- a/conf/jaas_kerberos.conf
+++ b/conf/jaas_kerberos.conf
@@ -1,3 +1,20 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
 StormServer {
        com.sun.security.auth.module.Krb5LoginModule required
        useKeyTab=true

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/dev-tools/test-ns.py
----------------------------------------------------------------------
diff --git a/dev-tools/test-ns.py b/dev-tools/test-ns.py
index 87eca14..2fd1421 100755
--- a/dev-tools/test-ns.py
+++ b/dev-tools/test-ns.py
@@ -1,4 +1,19 @@
 #!/usr/bin/env python
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
 
 from subprocess import Popen, PIPE
 import sys

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/doap_Storm.rdf
----------------------------------------------------------------------
diff --git a/doap_Storm.rdf b/doap_Storm.rdf
index 9be2d2a..407f28e 100644
--- a/doap_Storm.rdf
+++ b/doap_Storm.rdf
@@ -36,9 +36,9 @@
     <category rdf:resource="http://projects.apache.org/category/big-data" />
     <release>
       <Version>
-        <name>Storm 0.9.2-incubating</name>
-        <created>2014-06-25</created>
-        <revision>0.9.2-incubating</revision>
+        <name>Apache Storm 0.9.5</name>
+        <created>2015-06-03</created>
+        <revision>0.9.5</revision>
       </Version>
     </release>
     <repository>

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/external/flux/LICENSE
----------------------------------------------------------------------
diff --git a/external/flux/LICENSE b/external/flux/LICENSE
deleted file mode 100644
index e06d208..0000000
--- a/external/flux/LICENSE
+++ /dev/null
@@ -1,202 +0,0 @@
-Apache License
-                           Version 2.0, January 2004
-                        http://www.apache.org/licenses/
-
-   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
-
-   1. Definitions.
-
-      "License" shall mean the terms and conditions for use, reproduction,
-      and distribution as defined by Sections 1 through 9 of this document.
-
-      "Licensor" shall mean the copyright owner or entity authorized by
-      the copyright owner that is granting the License.
-
-      "Legal Entity" shall mean the union of the acting entity and all
-      other entities that control, are controlled by, or are under common
-      control with that entity. For the purposes of this definition,
-      "control" means (i) the power, direct or indirect, to cause the
-      direction or management of such entity, whether by contract or
-      otherwise, or (ii) ownership of fifty percent (50%) or more of the
-      outstanding shares, or (iii) beneficial ownership of such entity.
-
-      "You" (or "Your") shall mean an individual or Legal Entity
-      exercising permissions granted by this License.
-
-      "Source" form shall mean the preferred form for making modifications,
-      including but not limited to software source code, documentation
-      source, and configuration files.
-
-      "Object" form shall mean any form resulting from mechanical
-      transformation or translation of a Source form, including but
-      not limited to compiled object code, generated documentation,
-      and conversions to other media types.
-
-      "Work" shall mean the work of authorship, whether in Source or
-      Object form, made available under the License, as indicated by a
-      copyright notice that is included in or attached to the work
-      (an example is provided in the Appendix below).
-
-      "Derivative Works" shall mean any work, whether in Source or Object
-      form, that is based on (or derived from) the Work and for which the
-      editorial revisions, annotations, elaborations, or other modifications
-      represent, as a whole, an original work of authorship. For the purposes
-      of this License, Derivative Works shall not include works that remain
-      separable from, or merely link (or bind by name) to the interfaces of,
-      the Work and Derivative Works thereof.
-
-      "Contribution" shall mean any work of authorship, including
-      the original version of the Work and any modifications or additions
-      to that Work or Derivative Works thereof, that is intentionally
-      submitted to Licensor for inclusion in the Work by the copyright owner
-      or by an individual or Legal Entity authorized to submit on behalf of
-      the copyright owner. For the purposes of this definition, "submitted"
-      means any form of electronic, verbal, or written communication sent
-      to the Licensor or its representatives, including but not limited to
-      communication on electronic mailing lists, source code control systems,
-      and issue tracking systems that are managed by, or on behalf of, the
-      Licensor for the purpose of discussing and improving the Work, but
-      excluding communication that is conspicuously marked or otherwise
-      designated in writing by the copyright owner as "Not a Contribution."
-
-      "Contributor" shall mean Licensor and any individual or Legal Entity
-      on behalf of whom a Contribution has been received by Licensor and
-      subsequently incorporated within the Work.
-
-   2. Grant of Copyright License. Subject to the terms and conditions of
-      this License, each Contributor hereby grants to You a perpetual,
-      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
-      copyright license to reproduce, prepare Derivative Works of,
-      publicly display, publicly perform, sublicense, and distribute the
-      Work and such Derivative Works in Source or Object form.
-
-   3. Grant of Patent License. Subject to the terms and conditions of
-      this License, each Contributor hereby grants to You a perpetual,
-      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
-      (except as stated in this section) patent license to make, have made,
-      use, offer to sell, sell, import, and otherwise transfer the Work,
-      where such license applies only to those patent claims licensable
-      by such Contributor that are necessarily infringed by their
-      Contribution(s) alone or by combination of their Contribution(s)
-      with the Work to which such Contribution(s) was submitted. If You
-      institute patent litigation against any entity (including a
-      cross-claim or counterclaim in a lawsuit) alleging that the Work
-      or a Contribution incorporated within the Work constitutes direct
-      or contributory patent infringement, then any patent licenses
-      granted to You under this License for that Work shall terminate
-      as of the date such litigation is filed.
-
-   4. Redistribution. You may reproduce and distribute copies of the
-      Work or Derivative Works thereof in any medium, with or without
-      modifications, and in Source or Object form, provided that You
-      meet the following conditions:
-
-      (a) You must give any other recipients of the Work or
-          Derivative Works a copy of this License; and
-
-      (b) You must cause any modified files to carry prominent notices
-          stating that You changed the files; and
-
-      (c) You must retain, in the Source form of any Derivative Works
-          that You distribute, all copyright, patent, trademark, and
-          attribution notices from the Source form of the Work,
-          excluding those notices that do not pertain to any part of
-          the Derivative Works; and
-
-      (d) If the Work includes a "NOTICE" text file as part of its
-          distribution, then any Derivative Works that You distribute must
-          include a readable copy of the attribution notices contained
-          within such NOTICE file, excluding those notices that do not
-          pertain to any part of the Derivative Works, in at least one
-          of the following places: within a NOTICE text file distributed
-          as part of the Derivative Works; within the Source form or
-          documentation, if provided along with the Derivative Works; or,
-          within a display generated by the Derivative Works, if and
-          wherever such third-party notices normally appear. The contents
-          of the NOTICE file are for informational purposes only and
-          do not modify the License. You may add Your own attribution
-          notices within Derivative Works that You distribute, alongside
-          or as an addendum to the NOTICE text from the Work, provided
-          that such additional attribution notices cannot be construed
-          as modifying the License.
-
-      You may add Your own copyright statement to Your modifications and
-      may provide additional or different license terms and conditions
-      for use, reproduction, or distribution of Your modifications, or
-      for any such Derivative Works as a whole, provided Your use,
-      reproduction, and distribution of the Work otherwise complies with
-      the conditions stated in this License.
-
-   5. Submission of Contributions. Unless You explicitly state otherwise,
-      any Contribution intentionally submitted for inclusion in the Work
-      by You to the Licensor shall be under the terms and conditions of
-      this License, without any additional terms or conditions.
-      Notwithstanding the above, nothing herein shall supersede or modify
-      the terms of any separate license agreement you may have executed
-      with Licensor regarding such Contributions.
-
-   6. Trademarks. This License does not grant permission to use the trade
-      names, trademarks, service marks, or product names of the Licensor,
-      except as required for reasonable and customary use in describing the
-      origin of the Work and reproducing the content of the NOTICE file.
-
-   7. Disclaimer of Warranty. Unless required by applicable law or
-      agreed to in writing, Licensor provides the Work (and each
-      Contributor provides its Contributions) on an "AS IS" BASIS,
-      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
-      implied, including, without limitation, any warranties or conditions
-      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
-      PARTICULAR PURPOSE. You are solely responsible for determining the
-      appropriateness of using or redistributing the Work and assume any
-      risks associated with Your exercise of permissions under this License.
-
-   8. Limitation of Liability. In no event and under no legal theory,
-      whether in tort (including negligence), contract, or otherwise,
-      unless required by applicable law (such as deliberate and grossly
-      negligent acts) or agreed to in writing, shall any Contributor be
-      liable to You for damages, including any direct, indirect, special,
-      incidental, or consequential damages of any character arising as a
-      result of this License or out of the use or inability to use the
-      Work (including but not limited to damages for loss of goodwill,
-      work stoppage, computer failure or malfunction, or any and all
-      other commercial damages or losses), even if such Contributor
-      has been advised of the possibility of such damages.
-
-   9. Accepting Warranty or Additional Liability. While redistributing
-      the Work or Derivative Works thereof, You may choose to offer,
-      and charge a fee for, acceptance of support, warranty, indemnity,
-      or other liability obligations and/or rights consistent with this
-      License. However, in accepting such obligations, You may act only
-      on Your own behalf and on Your sole responsibility, not on behalf
-      of any other Contributor, and only if You agree to indemnify,
-      defend, and hold each Contributor harmless for any liability
-      incurred by, or claims asserted against, such Contributor by reason
-      of your accepting any such warranty or additional liability.
-
-   END OF TERMS AND CONDITIONS
-
-   APPENDIX: How to apply the Apache License to your work.
-
-      To apply the Apache License to your work, attach the following
-      boilerplate notice, with the fields enclosed by brackets "{}"
-      replaced with your own identifying information. (Don't include
-      the brackets!)  The text should be enclosed in the appropriate
-      comment syntax for the file format. We also recommend that a
-      file or class name and description of purpose be included on the
-      same "printed page" as the copyright notice for easier
-      identification within third-party archives.
-
-   Copyright {yyyy} {name of copyright owner}
-
-   Licensed under the Apache License, Version 2.0 (the "License");
-   you may not use this file except in compliance with the License.
-   You may obtain a copy of the License at
-
-       http://www.apache.org/licenses/LICENSE-2.0
-
-   Unless required by applicable law or agreed to in writing, software
-   distributed under the License is distributed on an "AS IS" BASIS,
-   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-   See the License for the specific language governing permissions and
-   limitations under the License.
-

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/external/flux/flux-core/src/test/java/org/apache/storm/flux/test/SimpleTopology.java
----------------------------------------------------------------------
diff --git a/external/flux/flux-core/src/test/java/org/apache/storm/flux/test/SimpleTopology.java b/external/flux/flux-core/src/test/java/org/apache/storm/flux/test/SimpleTopology.java
index 0d37997..981d6b0 100644
--- a/external/flux/flux-core/src/test/java/org/apache/storm/flux/test/SimpleTopology.java
+++ b/external/flux/flux-core/src/test/java/org/apache/storm/flux/test/SimpleTopology.java
@@ -1,3 +1,20 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
 package org.apache.storm.flux.test;
 
 import backtype.storm.generated.StormTopology;

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/external/flux/flux-core/src/test/java/org/apache/storm/flux/test/SimpleTopologySource.java
----------------------------------------------------------------------
diff --git a/external/flux/flux-core/src/test/java/org/apache/storm/flux/test/SimpleTopologySource.java b/external/flux/flux-core/src/test/java/org/apache/storm/flux/test/SimpleTopologySource.java
index 2007082..61eb113 100644
--- a/external/flux/flux-core/src/test/java/org/apache/storm/flux/test/SimpleTopologySource.java
+++ b/external/flux/flux-core/src/test/java/org/apache/storm/flux/test/SimpleTopologySource.java
@@ -1,3 +1,20 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
 package org.apache.storm.flux.test;
 
 import backtype.storm.generated.StormTopology;

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/external/flux/flux-core/src/test/java/org/apache/storm/flux/test/SimpleTopologyWithConfigParam.java
----------------------------------------------------------------------
diff --git a/external/flux/flux-core/src/test/java/org/apache/storm/flux/test/SimpleTopologyWithConfigParam.java b/external/flux/flux-core/src/test/java/org/apache/storm/flux/test/SimpleTopologyWithConfigParam.java
index f29b543..39e2e3d 100644
--- a/external/flux/flux-core/src/test/java/org/apache/storm/flux/test/SimpleTopologyWithConfigParam.java
+++ b/external/flux/flux-core/src/test/java/org/apache/storm/flux/test/SimpleTopologyWithConfigParam.java
@@ -1,3 +1,20 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
 package org.apache.storm.flux.test;
 
 import backtype.storm.Config;

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/external/flux/flux-core/src/test/java/org/apache/storm/flux/test/TestBolt.java
----------------------------------------------------------------------
diff --git a/external/flux/flux-core/src/test/java/org/apache/storm/flux/test/TestBolt.java b/external/flux/flux-core/src/test/java/org/apache/storm/flux/test/TestBolt.java
index e88f2cf..7f11460 100644
--- a/external/flux/flux-core/src/test/java/org/apache/storm/flux/test/TestBolt.java
+++ b/external/flux/flux-core/src/test/java/org/apache/storm/flux/test/TestBolt.java
@@ -1,3 +1,20 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
 package org.apache.storm.flux.test;
 
 import backtype.storm.topology.BasicOutputCollector;

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/external/flux/flux-core/src/test/java/org/apache/storm/flux/test/TridentTopologySource.java
----------------------------------------------------------------------
diff --git a/external/flux/flux-core/src/test/java/org/apache/storm/flux/test/TridentTopologySource.java b/external/flux/flux-core/src/test/java/org/apache/storm/flux/test/TridentTopologySource.java
index 3cb6634..24cee7d 100644
--- a/external/flux/flux-core/src/test/java/org/apache/storm/flux/test/TridentTopologySource.java
+++ b/external/flux/flux-core/src/test/java/org/apache/storm/flux/test/TridentTopologySource.java
@@ -1,3 +1,20 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
 package org.apache.storm.flux.test;
 
 import backtype.storm.Config;

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/external/flux/flux-core/src/test/resources/configs/existing-topology-method-override.yaml
----------------------------------------------------------------------
diff --git a/external/flux/flux-core/src/test/resources/configs/existing-topology-method-override.yaml b/external/flux/flux-core/src/test/resources/configs/existing-topology-method-override.yaml
index 6f3c88a..fceeeed 100644
--- a/external/flux/flux-core/src/test/resources/configs/existing-topology-method-override.yaml
+++ b/external/flux/flux-core/src/test/resources/configs/existing-topology-method-override.yaml
@@ -1,3 +1,18 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
 ---
 
 # configuration that uses an existing topology that does not implement TopologySource

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/external/flux/flux-core/src/test/resources/configs/existing-topology-reflection-config.yaml
----------------------------------------------------------------------
diff --git a/external/flux/flux-core/src/test/resources/configs/existing-topology-reflection-config.yaml b/external/flux/flux-core/src/test/resources/configs/existing-topology-reflection-config.yaml
index 8af8a84..440fe4d 100644
--- a/external/flux/flux-core/src/test/resources/configs/existing-topology-reflection-config.yaml
+++ b/external/flux/flux-core/src/test/resources/configs/existing-topology-reflection-config.yaml
@@ -1,3 +1,18 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
 ---
 
 # configuration that uses an existing topology that does not implement TopologySource

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/external/flux/flux-core/src/test/resources/configs/existing-topology-reflection.yaml
----------------------------------------------------------------------
diff --git a/external/flux/flux-core/src/test/resources/configs/existing-topology-reflection.yaml b/external/flux/flux-core/src/test/resources/configs/existing-topology-reflection.yaml
index dd3e1e8..975885b 100644
--- a/external/flux/flux-core/src/test/resources/configs/existing-topology-reflection.yaml
+++ b/external/flux/flux-core/src/test/resources/configs/existing-topology-reflection.yaml
@@ -1,3 +1,18 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
 ---
 
 # configuration that uses an existing topology that does not implement TopologySource

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/external/flux/flux-core/src/test/resources/configs/existing-topology-trident.yaml
----------------------------------------------------------------------
diff --git a/external/flux/flux-core/src/test/resources/configs/existing-topology-trident.yaml b/external/flux/flux-core/src/test/resources/configs/existing-topology-trident.yaml
index 5ac682c..978181b 100644
--- a/external/flux/flux-core/src/test/resources/configs/existing-topology-trident.yaml
+++ b/external/flux/flux-core/src/test/resources/configs/existing-topology-trident.yaml
@@ -1,3 +1,18 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
 ---
 
 name: "existing-topology"

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/external/flux/flux-core/src/test/resources/configs/existing-topology.yaml
----------------------------------------------------------------------
diff --git a/external/flux/flux-core/src/test/resources/configs/existing-topology.yaml b/external/flux/flux-core/src/test/resources/configs/existing-topology.yaml
index fa6a0b3..e112c0f 100644
--- a/external/flux/flux-core/src/test/resources/configs/existing-topology.yaml
+++ b/external/flux/flux-core/src/test/resources/configs/existing-topology.yaml
@@ -1,3 +1,18 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
 ---
 
 name: "existing-topology"

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/external/flux/flux-core/src/test/resources/configs/invalid-existing-topology.yaml
----------------------------------------------------------------------
diff --git a/external/flux/flux-core/src/test/resources/configs/invalid-existing-topology.yaml b/external/flux/flux-core/src/test/resources/configs/invalid-existing-topology.yaml
index 72128df..c2dfac0 100644
--- a/external/flux/flux-core/src/test/resources/configs/invalid-existing-topology.yaml
+++ b/external/flux/flux-core/src/test/resources/configs/invalid-existing-topology.yaml
@@ -1,3 +1,19 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
 # This is an invalid config. It defines both a topologySource and a list of spouts.
 ---
 

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/external/flux/flux-core/src/test/resources/configs/test.properties
----------------------------------------------------------------------
diff --git a/external/flux/flux-core/src/test/resources/configs/test.properties b/external/flux/flux-core/src/test/resources/configs/test.properties
index 0730d5f..ecd89d9 100644
--- a/external/flux/flux-core/src/test/resources/configs/test.properties
+++ b/external/flux/flux-core/src/test/resources/configs/test.properties
@@ -1,2 +1,18 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
 topology.name: substitution-topology
 some.other.property: foo bar
\ No newline at end of file

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/external/storm-eventhubs/src/main/resources/config.properties
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/resources/config.properties b/external/storm-eventhubs/src/main/resources/config.properties
index a8a520e..2062e24 100755
--- a/external/storm-eventhubs/src/main/resources/config.properties
+++ b/external/storm-eventhubs/src/main/resources/config.properties
@@ -1,3 +1,18 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
 eventhubspout.username = [username]
 
 eventhubspout.password = [password]

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/external/storm-hbase/LICENSE
----------------------------------------------------------------------
diff --git a/external/storm-hbase/LICENSE b/external/storm-hbase/LICENSE
deleted file mode 100644
index e06d208..0000000
--- a/external/storm-hbase/LICENSE
+++ /dev/null
@@ -1,202 +0,0 @@
-Apache License
-                           Version 2.0, January 2004
-                        http://www.apache.org/licenses/
-
-   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
-
-   1. Definitions.
-
-      "License" shall mean the terms and conditions for use, reproduction,
-      and distribution as defined by Sections 1 through 9 of this document.
-
-      "Licensor" shall mean the copyright owner or entity authorized by
-      the copyright owner that is granting the License.
-
-      "Legal Entity" shall mean the union of the acting entity and all
-      other entities that control, are controlled by, or are under common
-      control with that entity. For the purposes of this definition,
-      "control" means (i) the power, direct or indirect, to cause the
-      direction or management of such entity, whether by contract or
-      otherwise, or (ii) ownership of fifty percent (50%) or more of the
-      outstanding shares, or (iii) beneficial ownership of such entity.
-
-      "You" (or "Your") shall mean an individual or Legal Entity
-      exercising permissions granted by this License.
-
-      "Source" form shall mean the preferred form for making modifications,
-      including but not limited to software source code, documentation
-      source, and configuration files.
-
-      "Object" form shall mean any form resulting from mechanical
-      transformation or translation of a Source form, including but
-      not limited to compiled object code, generated documentation,
-      and conversions to other media types.
-
-      "Work" shall mean the work of authorship, whether in Source or
-      Object form, made available under the License, as indicated by a
-      copyright notice that is included in or attached to the work
-      (an example is provided in the Appendix below).
-
-      "Derivative Works" shall mean any work, whether in Source or Object
-      form, that is based on (or derived from) the Work and for which the
-      editorial revisions, annotations, elaborations, or other modifications
-      represent, as a whole, an original work of authorship. For the purposes
-      of this License, Derivative Works shall not include works that remain
-      separable from, or merely link (or bind by name) to the interfaces of,
-      the Work and Derivative Works thereof.
-
-      "Contribution" shall mean any work of authorship, including
-      the original version of the Work and any modifications or additions
-      to that Work or Derivative Works thereof, that is intentionally
-      submitted to Licensor for inclusion in the Work by the copyright owner
-      or by an individual or Legal Entity authorized to submit on behalf of
-      the copyright owner. For the purposes of this definition, "submitted"
-      means any form of electronic, verbal, or written communication sent
-      to the Licensor or its representatives, including but not limited to
-      communication on electronic mailing lists, source code control systems,
-      and issue tracking systems that are managed by, or on behalf of, the
-      Licensor for the purpose of discussing and improving the Work, but
-      excluding communication that is conspicuously marked or otherwise
-      designated in writing by the copyright owner as "Not a Contribution."
-
-      "Contributor" shall mean Licensor and any individual or Legal Entity
-      on behalf of whom a Contribution has been received by Licensor and
-      subsequently incorporated within the Work.
-
-   2. Grant of Copyright License. Subject to the terms and conditions of
-      this License, each Contributor hereby grants to You a perpetual,
-      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
-      copyright license to reproduce, prepare Derivative Works of,
-      publicly display, publicly perform, sublicense, and distribute the
-      Work and such Derivative Works in Source or Object form.
-
-   3. Grant of Patent License. Subject to the terms and conditions of
-      this License, each Contributor hereby grants to You a perpetual,
-      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
-      (except as stated in this section) patent license to make, have made,
-      use, offer to sell, sell, import, and otherwise transfer the Work,
-      where such license applies only to those patent claims licensable
-      by such Contributor that are necessarily infringed by their
-      Contribution(s) alone or by combination of their Contribution(s)
-      with the Work to which such Contribution(s) was submitted. If You
-      institute patent litigation against any entity (including a
-      cross-claim or counterclaim in a lawsuit) alleging that the Work
-      or a Contribution incorporated within the Work constitutes direct
-      or contributory patent infringement, then any patent licenses
-      granted to You under this License for that Work shall terminate
-      as of the date such litigation is filed.
-
-   4. Redistribution. You may reproduce and distribute copies of the
-      Work or Derivative Works thereof in any medium, with or without
-      modifications, and in Source or Object form, provided that You
-      meet the following conditions:
-
-      (a) You must give any other recipients of the Work or
-          Derivative Works a copy of this License; and
-
-      (b) You must cause any modified files to carry prominent notices
-          stating that You changed the files; and
-
-      (c) You must retain, in the Source form of any Derivative Works
-          that You distribute, all copyright, patent, trademark, and
-          attribution notices from the Source form of the Work,
-          excluding those notices that do not pertain to any part of
-          the Derivative Works; and
-
-      (d) If the Work includes a "NOTICE" text file as part of its
-          distribution, then any Derivative Works that You distribute must
-          include a readable copy of the attribution notices contained
-          within such NOTICE file, excluding those notices that do not
-          pertain to any part of the Derivative Works, in at least one
-          of the following places: within a NOTICE text file distributed
-          as part of the Derivative Works; within the Source form or
-          documentation, if provided along with the Derivative Works; or,
-          within a display generated by the Derivative Works, if and
-          wherever such third-party notices normally appear. The contents
-          of the NOTICE file are for informational purposes only and
-          do not modify the License. You may add Your own attribution
-          notices within Derivative Works that You distribute, alongside
-          or as an addendum to the NOTICE text from the Work, provided
-          that such additional attribution notices cannot be construed
-          as modifying the License.
-
-      You may add Your own copyright statement to Your modifications and
-      may provide additional or different license terms and conditions
-      for use, reproduction, or distribution of Your modifications, or
-      for any such Derivative Works as a whole, provided Your use,
-      reproduction, and distribution of the Work otherwise complies with
-      the conditions stated in this License.
-
-   5. Submission of Contributions. Unless You explicitly state otherwise,
-      any Contribution intentionally submitted for inclusion in the Work
-      by You to the Licensor shall be under the terms and conditions of
-      this License, without any additional terms or conditions.
-      Notwithstanding the above, nothing herein shall supersede or modify
-      the terms of any separate license agreement you may have executed
-      with Licensor regarding such Contributions.
-
-   6. Trademarks. This License does not grant permission to use the trade
-      names, trademarks, service marks, or product names of the Licensor,
-      except as required for reasonable and customary use in describing the
-      origin of the Work and reproducing the content of the NOTICE file.
-
-   7. Disclaimer of Warranty. Unless required by applicable law or
-      agreed to in writing, Licensor provides the Work (and each
-      Contributor provides its Contributions) on an "AS IS" BASIS,
-      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
-      implied, including, without limitation, any warranties or conditions
-      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
-      PARTICULAR PURPOSE. You are solely responsible for determining the
-      appropriateness of using or redistributing the Work and assume any
-      risks associated with Your exercise of permissions under this License.
-
-   8. Limitation of Liability. In no event and under no legal theory,
-      whether in tort (including negligence), contract, or otherwise,
-      unless required by applicable law (such as deliberate and grossly
-      negligent acts) or agreed to in writing, shall any Contributor be
-      liable to You for damages, including any direct, indirect, special,
-      incidental, or consequential damages of any character arising as a
-      result of this License or out of the use or inability to use the
-      Work (including but not limited to damages for loss of goodwill,
-      work stoppage, computer failure or malfunction, or any and all
-      other commercial damages or losses), even if such Contributor
-      has been advised of the possibility of such damages.
-
-   9. Accepting Warranty or Additional Liability. While redistributing
-      the Work or Derivative Works thereof, You may choose to offer,
-      and charge a fee for, acceptance of support, warranty, indemnity,
-      or other liability obligations and/or rights consistent with this
-      License. However, in accepting such obligations, You may act only
-      on Your own behalf and on Your sole responsibility, not on behalf
-      of any other Contributor, and only if You agree to indemnify,
-      defend, and hold each Contributor harmless for any liability
-      incurred by, or claims asserted against, such Contributor by reason
-      of your accepting any such warranty or additional liability.
-
-   END OF TERMS AND CONDITIONS
-
-   APPENDIX: How to apply the Apache License to your work.
-
-      To apply the Apache License to your work, attach the following
-      boilerplate notice, with the fields enclosed by brackets "{}"
-      replaced with your own identifying information. (Don't include
-      the brackets!)  The text should be enclosed in the appropriate
-      comment syntax for the file format. We also recommend that a
-      file or class name and description of purpose be included on the
-      same "printed page" as the copyright notice for easier
-      identification within third-party archives.
-
-   Copyright {yyyy} {name of copyright owner}
-
-   Licensed under the Apache License, Version 2.0 (the "License");
-   you may not use this file except in compliance with the License.
-   You may obtain a copy of the License at
-
-       http://www.apache.org/licenses/LICENSE-2.0
-
-   Unless required by applicable law or agreed to in writing, software
-   distributed under the License is distributed on an "AS IS" BASIS,
-   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-   See the License for the specific language governing permissions and
-   limitations under the License.
-

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/external/storm-hive/src/main/java/org/apache/storm/hive/trident/HiveStateFactory.java
----------------------------------------------------------------------
diff --git a/external/storm-hive/src/main/java/org/apache/storm/hive/trident/HiveStateFactory.java b/external/storm-hive/src/main/java/org/apache/storm/hive/trident/HiveStateFactory.java
index 8f3b9e9..982ce03 100644
--- a/external/storm-hive/src/main/java/org/apache/storm/hive/trident/HiveStateFactory.java
+++ b/external/storm-hive/src/main/java/org/apache/storm/hive/trident/HiveStateFactory.java
@@ -1,3 +1,20 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
 package org.apache.storm.hive.trident;
 
 import backtype.storm.task.IMetricsContext;

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/external/storm-hive/src/main/java/org/apache/storm/hive/trident/HiveUpdater.java
----------------------------------------------------------------------
diff --git a/external/storm-hive/src/main/java/org/apache/storm/hive/trident/HiveUpdater.java b/external/storm-hive/src/main/java/org/apache/storm/hive/trident/HiveUpdater.java
index b0b32f1..f4c2a9a 100644
--- a/external/storm-hive/src/main/java/org/apache/storm/hive/trident/HiveUpdater.java
+++ b/external/storm-hive/src/main/java/org/apache/storm/hive/trident/HiveUpdater.java
@@ -1,3 +1,20 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
 package org.apache.storm.hive.trident;
 
 import storm.trident.operation.TridentCollector;

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/external/storm-jdbc/LICENSE
----------------------------------------------------------------------
diff --git a/external/storm-jdbc/LICENSE b/external/storm-jdbc/LICENSE
deleted file mode 100644
index e06d208..0000000
--- a/external/storm-jdbc/LICENSE
+++ /dev/null
@@ -1,202 +0,0 @@
-Apache License
-                           Version 2.0, January 2004
-                        http://www.apache.org/licenses/
-
-   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
-
-   1. Definitions.
-
-      "License" shall mean the terms and conditions for use, reproduction,
-      and distribution as defined by Sections 1 through 9 of this document.
-
-      "Licensor" shall mean the copyright owner or entity authorized by
-      the copyright owner that is granting the License.
-
-      "Legal Entity" shall mean the union of the acting entity and all
-      other entities that control, are controlled by, or are under common
-      control with that entity. For the purposes of this definition,
-      "control" means (i) the power, direct or indirect, to cause the
-      direction or management of such entity, whether by contract or
-      otherwise, or (ii) ownership of fifty percent (50%) or more of the
-      outstanding shares, or (iii) beneficial ownership of such entity.
-
-      "You" (or "Your") shall mean an individual or Legal Entity
-      exercising permissions granted by this License.
-
-      "Source" form shall mean the preferred form for making modifications,
-      including but not limited to software source code, documentation
-      source, and configuration files.
-
-      "Object" form shall mean any form resulting from mechanical
-      transformation or translation of a Source form, including but
-      not limited to compiled object code, generated documentation,
-      and conversions to other media types.
-
-      "Work" shall mean the work of authorship, whether in Source or
-      Object form, made available under the License, as indicated by a
-      copyright notice that is included in or attached to the work
-      (an example is provided in the Appendix below).
-
-      "Derivative Works" shall mean any work, whether in Source or Object
-      form, that is based on (or derived from) the Work and for which the
-      editorial revisions, annotations, elaborations, or other modifications
-      represent, as a whole, an original work of authorship. For the purposes
-      of this License, Derivative Works shall not include works that remain
-      separable from, or merely link (or bind by name) to the interfaces of,
-      the Work and Derivative Works thereof.
-
-      "Contribution" shall mean any work of authorship, including
-      the original version of the Work and any modifications or additions
-      to that Work or Derivative Works thereof, that is intentionally
-      submitted to Licensor for inclusion in the Work by the copyright owner
-      or by an individual or Legal Entity authorized to submit on behalf of
-      the copyright owner. For the purposes of this definition, "submitted"
-      means any form of electronic, verbal, or written communication sent
-      to the Licensor or its representatives, including but not limited to
-      communication on electronic mailing lists, source code control systems,
-      and issue tracking systems that are managed by, or on behalf of, the
-      Licensor for the purpose of discussing and improving the Work, but
-      excluding communication that is conspicuously marked or otherwise
-      designated in writing by the copyright owner as "Not a Contribution."
-
-      "Contributor" shall mean Licensor and any individual or Legal Entity
-      on behalf of whom a Contribution has been received by Licensor and
-      subsequently incorporated within the Work.
-
-   2. Grant of Copyright License. Subject to the terms and conditions of
-      this License, each Contributor hereby grants to You a perpetual,
-      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
-      copyright license to reproduce, prepare Derivative Works of,
-      publicly display, publicly perform, sublicense, and distribute the
-      Work and such Derivative Works in Source or Object form.
-
-   3. Grant of Patent License. Subject to the terms and conditions of
-      this License, each Contributor hereby grants to You a perpetual,
-      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
-      (except as stated in this section) patent license to make, have made,
-      use, offer to sell, sell, import, and otherwise transfer the Work,
-      where such license applies only to those patent claims licensable
-      by such Contributor that are necessarily infringed by their
-      Contribution(s) alone or by combination of their Contribution(s)
-      with the Work to which such Contribution(s) was submitted. If You
-      institute patent litigation against any entity (including a
-      cross-claim or counterclaim in a lawsuit) alleging that the Work
-      or a Contribution incorporated within the Work constitutes direct
-      or contributory patent infringement, then any patent licenses
-      granted to You under this License for that Work shall terminate
-      as of the date such litigation is filed.
-
-   4. Redistribution. You may reproduce and distribute copies of the
-      Work or Derivative Works thereof in any medium, with or without
-      modifications, and in Source or Object form, provided that You
-      meet the following conditions:
-
-      (a) You must give any other recipients of the Work or
-          Derivative Works a copy of this License; and
-
-      (b) You must cause any modified files to carry prominent notices
-          stating that You changed the files; and
-
-      (c) You must retain, in the Source form of any Derivative Works
-          that You distribute, all copyright, patent, trademark, and
-          attribution notices from the Source form of the Work,
-          excluding those notices that do not pertain to any part of
-          the Derivative Works; and
-
-      (d) If the Work includes a "NOTICE" text file as part of its
-          distribution, then any Derivative Works that You distribute must
-          include a readable copy of the attribution notices contained
-          within such NOTICE file, excluding those notices that do not
-          pertain to any part of the Derivative Works, in at least one
-          of the following places: within a NOTICE text file distributed
-          as part of the Derivative Works; within the Source form or
-          documentation, if provided along with the Derivative Works; or,
-          within a display generated by the Derivative Works, if and
-          wherever such third-party notices normally appear. The contents
-          of the NOTICE file are for informational purposes only and
-          do not modify the License. You may add Your own attribution
-          notices within Derivative Works that You distribute, alongside
-          or as an addendum to the NOTICE text from the Work, provided
-          that such additional attribution notices cannot be construed
-          as modifying the License.
-
-      You may add Your own copyright statement to Your modifications and
-      may provide additional or different license terms and conditions
-      for use, reproduction, or distribution of Your modifications, or
-      for any such Derivative Works as a whole, provided Your use,
-      reproduction, and distribution of the Work otherwise complies with
-      the conditions stated in this License.
-
-   5. Submission of Contributions. Unless You explicitly state otherwise,
-      any Contribution intentionally submitted for inclusion in the Work
-      by You to the Licensor shall be under the terms and conditions of
-      this License, without any additional terms or conditions.
-      Notwithstanding the above, nothing herein shall supersede or modify
-      the terms of any separate license agreement you may have executed
-      with Licensor regarding such Contributions.
-
-   6. Trademarks. This License does not grant permission to use the trade
-      names, trademarks, service marks, or product names of the Licensor,
-      except as required for reasonable and customary use in describing the
-      origin of the Work and reproducing the content of the NOTICE file.
-
-   7. Disclaimer of Warranty. Unless required by applicable law or
-      agreed to in writing, Licensor provides the Work (and each
-      Contributor provides its Contributions) on an "AS IS" BASIS,
-      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
-      implied, including, without limitation, any warranties or conditions
-      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
-      PARTICULAR PURPOSE. You are solely responsible for determining the
-      appropriateness of using or redistributing the Work and assume any
-      risks associated with Your exercise of permissions under this License.
-
-   8. Limitation of Liability. In no event and under no legal theory,
-      whether in tort (including negligence), contract, or otherwise,
-      unless required by applicable law (such as deliberate and grossly
-      negligent acts) or agreed to in writing, shall any Contributor be
-      liable to You for damages, including any direct, indirect, special,
-      incidental, or consequential damages of any character arising as a
-      result of this License or out of the use or inability to use the
-      Work (including but not limited to damages for loss of goodwill,
-      work stoppage, computer failure or malfunction, or any and all
-      other commercial damages or losses), even if such Contributor
-      has been advised of the possibility of such damages.
-
-   9. Accepting Warranty or Additional Liability. While redistributing
-      the Work or Derivative Works thereof, You may choose to offer,
-      and charge a fee for, acceptance of support, warranty, indemnity,
-      or other liability obligations and/or rights consistent with this
-      License. However, in accepting such obligations, You may act only
-      on Your own behalf and on Your sole responsibility, not on behalf
-      of any other Contributor, and only if You agree to indemnify,
-      defend, and hold each Contributor harmless for any liability
-      incurred by, or claims asserted against, such Contributor by reason
-      of your accepting any such warranty or additional liability.
-
-   END OF TERMS AND CONDITIONS
-
-   APPENDIX: How to apply the Apache License to your work.
-
-      To apply the Apache License to your work, attach the following
-      boilerplate notice, with the fields enclosed by brackets "{}"
-      replaced with your own identifying information. (Don't include
-      the brackets!)  The text should be enclosed in the appropriate
-      comment syntax for the file format. We also recommend that a
-      file or class name and description of purpose be included on the
-      same "printed page" as the copyright notice for easier
-      identification within third-party archives.
-
-   Copyright {yyyy} {name of copyright owner}
-
-   Licensed under the Apache License, Version 2.0 (the "License");
-   you may not use this file except in compliance with the License.
-   You may obtain a copy of the License at
-
-       http://www.apache.org/licenses/LICENSE-2.0
-
-   Unless required by applicable law or agreed to in writing, software
-   distributed under the License is distributed on an "AS IS" BASIS,
-   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-   See the License for the specific language governing permissions and
-   limitations under the License.
-

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/external/storm-jdbc/src/main/java/org/apache/storm/jdbc/common/ConnectionProvider.java
----------------------------------------------------------------------
diff --git a/external/storm-jdbc/src/main/java/org/apache/storm/jdbc/common/ConnectionProvider.java b/external/storm-jdbc/src/main/java/org/apache/storm/jdbc/common/ConnectionProvider.java
index b838e48..cdd8b6e 100644
--- a/external/storm-jdbc/src/main/java/org/apache/storm/jdbc/common/ConnectionProvider.java
+++ b/external/storm-jdbc/src/main/java/org/apache/storm/jdbc/common/ConnectionProvider.java
@@ -1,3 +1,20 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
 package org.apache.storm.jdbc.common;
 
 import java.io.Serializable;

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/external/storm-jdbc/src/main/java/org/apache/storm/jdbc/common/HikariCPConnectionProvider.java
----------------------------------------------------------------------
diff --git a/external/storm-jdbc/src/main/java/org/apache/storm/jdbc/common/HikariCPConnectionProvider.java b/external/storm-jdbc/src/main/java/org/apache/storm/jdbc/common/HikariCPConnectionProvider.java
index b523fcc..f11d14c 100644
--- a/external/storm-jdbc/src/main/java/org/apache/storm/jdbc/common/HikariCPConnectionProvider.java
+++ b/external/storm-jdbc/src/main/java/org/apache/storm/jdbc/common/HikariCPConnectionProvider.java
@@ -1,3 +1,20 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
 package org.apache.storm.jdbc.common;
 
 import com.zaxxer.hikari.HikariConfig;

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/external/storm-jdbc/src/main/java/org/apache/storm/jdbc/mapper/JdbcLookupMapper.java
----------------------------------------------------------------------
diff --git a/external/storm-jdbc/src/main/java/org/apache/storm/jdbc/mapper/JdbcLookupMapper.java b/external/storm-jdbc/src/main/java/org/apache/storm/jdbc/mapper/JdbcLookupMapper.java
index f8c79a3..0660a4c 100644
--- a/external/storm-jdbc/src/main/java/org/apache/storm/jdbc/mapper/JdbcLookupMapper.java
+++ b/external/storm-jdbc/src/main/java/org/apache/storm/jdbc/mapper/JdbcLookupMapper.java
@@ -1,3 +1,20 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
 package org.apache.storm.jdbc.mapper;
 
 import backtype.storm.topology.OutputFieldsDeclarer;

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/external/storm-jdbc/src/main/java/org/apache/storm/jdbc/mapper/SimpleJdbcLookupMapper.java
----------------------------------------------------------------------
diff --git a/external/storm-jdbc/src/main/java/org/apache/storm/jdbc/mapper/SimpleJdbcLookupMapper.java b/external/storm-jdbc/src/main/java/org/apache/storm/jdbc/mapper/SimpleJdbcLookupMapper.java
index dca1f77..5a22552 100644
--- a/external/storm-jdbc/src/main/java/org/apache/storm/jdbc/mapper/SimpleJdbcLookupMapper.java
+++ b/external/storm-jdbc/src/main/java/org/apache/storm/jdbc/mapper/SimpleJdbcLookupMapper.java
@@ -1,3 +1,20 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
 package org.apache.storm.jdbc.mapper;
 
 

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/external/storm-jdbc/src/test/sql/test.sql
----------------------------------------------------------------------
diff --git a/external/storm-jdbc/src/test/sql/test.sql b/external/storm-jdbc/src/test/sql/test.sql
index a402a68..d3ef65b 100644
--- a/external/storm-jdbc/src/test/sql/test.sql
+++ b/external/storm-jdbc/src/test/sql/test.sql
@@ -1 +1,18 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
 create table user_details (id integer, user_name varchar(100), create_date date);
\ No newline at end of file

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/external/storm-kafka/src/test/storm/kafka/ExponentialBackoffMsgRetryManagerTest.java
----------------------------------------------------------------------
diff --git a/external/storm-kafka/src/test/storm/kafka/ExponentialBackoffMsgRetryManagerTest.java b/external/storm-kafka/src/test/storm/kafka/ExponentialBackoffMsgRetryManagerTest.java
index ef30163..3dd8b38 100644
--- a/external/storm-kafka/src/test/storm/kafka/ExponentialBackoffMsgRetryManagerTest.java
+++ b/external/storm-kafka/src/test/storm/kafka/ExponentialBackoffMsgRetryManagerTest.java
@@ -1,3 +1,20 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
 package storm.kafka;
 
 import static org.junit.Assert.assertEquals;

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/external/storm-redis/LICENSE
----------------------------------------------------------------------
diff --git a/external/storm-redis/LICENSE b/external/storm-redis/LICENSE
deleted file mode 100644
index e06d208..0000000
--- a/external/storm-redis/LICENSE
+++ /dev/null
@@ -1,202 +0,0 @@
-Apache License
-                           Version 2.0, January 2004
-                        http://www.apache.org/licenses/
-
-   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
-
-   1. Definitions.
-
-      "License" shall mean the terms and conditions for use, reproduction,
-      and distribution as defined by Sections 1 through 9 of this document.
-
-      "Licensor" shall mean the copyright owner or entity authorized by
-      the copyright owner that is granting the License.
-
-      "Legal Entity" shall mean the union of the acting entity and all
-      other entities that control, are controlled by, or are under common
-      control with that entity. For the purposes of this definition,
-      "control" means (i) the power, direct or indirect, to cause the
-      direction or management of such entity, whether by contract or
-      otherwise, or (ii) ownership of fifty percent (50%) or more of the
-      outstanding shares, or (iii) beneficial ownership of such entity.
-
-      "You" (or "Your") shall mean an individual or Legal Entity
-      exercising permissions granted by this License.
-
-      "Source" form shall mean the preferred form for making modifications,
-      including but not limited to software source code, documentation
-      source, and configuration files.
-
-      "Object" form shall mean any form resulting from mechanical
-      transformation or translation of a Source form, including but
-      not limited to compiled object code, generated documentation,
-      and conversions to other media types.
-
-      "Work" shall mean the work of authorship, whether in Source or
-      Object form, made available under the License, as indicated by a
-      copyright notice that is included in or attached to the work
-      (an example is provided in the Appendix below).
-
-      "Derivative Works" shall mean any work, whether in Source or Object
-      form, that is based on (or derived from) the Work and for which the
-      editorial revisions, annotations, elaborations, or other modifications
-      represent, as a whole, an original work of authorship. For the purposes
-      of this License, Derivative Works shall not include works that remain
-      separable from, or merely link (or bind by name) to the interfaces of,
-      the Work and Derivative Works thereof.
-
-      "Contribution" shall mean any work of authorship, including
-      the original version of the Work and any modifications or additions
-      to that Work or Derivative Works thereof, that is intentionally
-      submitted to Licensor for inclusion in the Work by the copyright owner
-      or by an individual or Legal Entity authorized to submit on behalf of
-      the copyright owner. For the purposes of this definition, "submitted"
-      means any form of electronic, verbal, or written communication sent
-      to the Licensor or its representatives, including but not limited to
-      communication on electronic mailing lists, source code control systems,
-      and issue tracking systems that are managed by, or on behalf of, the
-      Licensor for the purpose of discussing and improving the Work, but
-      excluding communication that is conspicuously marked or otherwise
-      designated in writing by the copyright owner as "Not a Contribution."
-
-      "Contributor" shall mean Licensor and any individual or Legal Entity
-      on behalf of whom a Contribution has been received by Licensor and
-      subsequently incorporated within the Work.
-
-   2. Grant of Copyright License. Subject to the terms and conditions of
-      this License, each Contributor hereby grants to You a perpetual,
-      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
-      copyright license to reproduce, prepare Derivative Works of,
-      publicly display, publicly perform, sublicense, and distribute the
-      Work and such Derivative Works in Source or Object form.
-
-   3. Grant of Patent License. Subject to the terms and conditions of
-      this License, each Contributor hereby grants to You a perpetual,
-      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
-      (except as stated in this section) patent license to make, have made,
-      use, offer to sell, sell, import, and otherwise transfer the Work,
-      where such license applies only to those patent claims licensable
-      by such Contributor that are necessarily infringed by their
-      Contribution(s) alone or by combination of their Contribution(s)
-      with the Work to which such Contribution(s) was submitted. If You
-      institute patent litigation against any entity (including a
-      cross-claim or counterclaim in a lawsuit) alleging that the Work
-      or a Contribution incorporated within the Work constitutes direct
-      or contributory patent infringement, then any patent licenses
-      granted to You under this License for that Work shall terminate
-      as of the date such litigation is filed.
-
-   4. Redistribution. You may reproduce and distribute copies of the
-      Work or Derivative Works thereof in any medium, with or without
-      modifications, and in Source or Object form, provided that You
-      meet the following conditions:
-
-      (a) You must give any other recipients of the Work or
-          Derivative Works a copy of this License; and
-
-      (b) You must cause any modified files to carry prominent notices
-          stating that You changed the files; and
-
-      (c) You must retain, in the Source form of any Derivative Works
-          that You distribute, all copyright, patent, trademark, and
-          attribution notices from the Source form of the Work,
-          excluding those notices that do not pertain to any part of
-          the Derivative Works; and
-
-      (d) If the Work includes a "NOTICE" text file as part of its
-          distribution, then any Derivative Works that You distribute must
-          include a readable copy of the attribution notices contained
-          within such NOTICE file, excluding those notices that do not
-          pertain to any part of the Derivative Works, in at least one
-          of the following places: within a NOTICE text file distributed
-          as part of the Derivative Works; within the Source form or
-          documentation, if provided along with the Derivative Works; or,
-          within a display generated by the Derivative Works, if and
-          wherever such third-party notices normally appear. The contents
-          of the NOTICE file are for informational purposes only and
-          do not modify the License. You may add Your own attribution
-          notices within Derivative Works that You distribute, alongside
-          or as an addendum to the NOTICE text from the Work, provided
-          that such additional attribution notices cannot be construed
-          as modifying the License.
-
-      You may add Your own copyright statement to Your modifications and
-      may provide additional or different license terms and conditions
-      for use, reproduction, or distribution of Your modifications, or
-      for any such Derivative Works as a whole, provided Your use,
-      reproduction, and distribution of the Work otherwise complies with
-      the conditions stated in this License.
-
-   5. Submission of Contributions. Unless You explicitly state otherwise,
-      any Contribution intentionally submitted for inclusion in the Work
-      by You to the Licensor shall be under the terms and conditions of
-      this License, without any additional terms or conditions.
-      Notwithstanding the above, nothing herein shall supersede or modify
-      the terms of any separate license agreement you may have executed
-      with Licensor regarding such Contributions.
-
-   6. Trademarks. This License does not grant permission to use the trade
-      names, trademarks, service marks, or product names of the Licensor,
-      except as required for reasonable and customary use in describing the
-      origin of the Work and reproducing the content of the NOTICE file.
-
-   7. Disclaimer of Warranty. Unless required by applicable law or
-      agreed to in writing, Licensor provides the Work (and each
-      Contributor provides its Contributions) on an "AS IS" BASIS,
-      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
-      implied, including, without limitation, any warranties or conditions
-      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
-      PARTICULAR PURPOSE. You are solely responsible for determining the
-      appropriateness of using or redistributing the Work and assume any
-      risks associated with Your exercise of permissions under this License.
-
-   8. Limitation of Liability. In no event and under no legal theory,
-      whether in tort (including negligence), contract, or otherwise,
-      unless required by applicable law (such as deliberate and grossly
-      negligent acts) or agreed to in writing, shall any Contributor be
-      liable to You for damages, including any direct, indirect, special,
-      incidental, or consequential damages of any character arising as a
-      result of this License or out of the use or inability to use the
-      Work (including but not limited to damages for loss of goodwill,
-      work stoppage, computer failure or malfunction, or any and all
-      other commercial damages or losses), even if such Contributor
-      has been advised of the possibility of such damages.
-
-   9. Accepting Warranty or Additional Liability. While redistributing
-      the Work or Derivative Works thereof, You may choose to offer,
-      and charge a fee for, acceptance of support, warranty, indemnity,
-      or other liability obligations and/or rights consistent with this
-      License. However, in accepting such obligations, You may act only
-      on Your own behalf and on Your sole responsibility, not on behalf
-      of any other Contributor, and only if You agree to indemnify,
-      defend, and hold each Contributor harmless for any liability
-      incurred by, or claims asserted against, such Contributor by reason
-      of your accepting any such warranty or additional liability.
-
-   END OF TERMS AND CONDITIONS
-
-   APPENDIX: How to apply the Apache License to your work.
-
-      To apply the Apache License to your work, attach the following
-      boilerplate notice, with the fields enclosed by brackets "{}"
-      replaced with your own identifying information. (Don't include
-      the brackets!)  The text should be enclosed in the appropriate
-      comment syntax for the file format. We also recommend that a
-      file or class name and description of purpose be included on the
-      same "printed page" as the copyright notice for easier
-      identification within third-party archives.
-
-   Copyright {yyyy} {name of copyright owner}
-
-   Licensed under the Apache License, Version 2.0 (the "License");
-   you may not use this file except in compliance with the License.
-   You may obtain a copy of the License at
-
-       http://www.apache.org/licenses/LICENSE-2.0
-
-   Unless required by applicable law or agreed to in writing, software
-   distributed under the License is distributed on an "AS IS" BASIS,
-   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-   See the License for the specific language governing permissions and
-   limitations under the License.
-

http://git-wip-us.apache.org/repos/asf/storm/blob/2154048f/external/storm-redis/src/test/java/org/apache/storm/redis/trident/WordCountLookupMapper.java
----------------------------------------------------------------------
diff --git a/external/storm-redis/src/test/java/org/apache/storm/redis/trident/WordCountLookupMapper.java b/external/storm-redis/src/test/java/org/apache/storm/redis/trident/WordCountLookupMapper.java
index 5c67c8c..a445749 100644
--- a/external/storm-redis/src/test/java/org/apache/storm/redis/trident/WordCountLookupMapper.java
+++ b/external/storm-redis/src/test/java/org/apache/storm/redis/trident/WordCountLookupMapper.java
@@ -1,3 +1,20 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
 package org.apache.storm.redis.trident;
 
 import backtype.storm.topology.OutputFieldsDeclarer;


[09/50] [abbrv] storm git commit: link examples to source

Posted by pt...@apache.org.
link examples to source


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/a791604a
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/a791604a
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/a791604a

Branch: refs/heads/0.10.x-branch
Commit: a791604ac5d274cededae8a01362dac4dfd82d84
Parents: 0c1e0aa
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Tue Apr 7 00:07:49 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Tue Apr 7 00:07:49 2015 -0400

----------------------------------------------------------------------
 .../src/test/resources/configs/hdfs_test.yaml   | 97 ++++++++++++++++++++
 flux-examples/README.md                         |  8 +-
 2 files changed, 101 insertions(+), 4 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/a791604a/flux-core/src/test/resources/configs/hdfs_test.yaml
----------------------------------------------------------------------
diff --git a/flux-core/src/test/resources/configs/hdfs_test.yaml b/flux-core/src/test/resources/configs/hdfs_test.yaml
new file mode 100644
index 0000000..8fe0a9a
--- /dev/null
+++ b/flux-core/src/test/resources/configs/hdfs_test.yaml
@@ -0,0 +1,97 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+# Test ability to wire together an HDFS bolt from YAML components
+---
+
+# topology definition
+# name to be used when submitting
+name: "hdfs-topology"
+
+# Components
+# Components are analogous to Spring beans. They are meant to be used as constructor,
+# property(setter), and builder arguments.
+#
+# for the time being, components must be declared in the order they are referenced
+components:
+  - id: "syncPolicy"
+    className: "org.apache.storm.hdfs.bolt.sync.CountSyncPolicy"
+    constructorArgs:
+      - 1000
+  - id: "rotationPolicy"
+    className: "org.apache.storm.hdfs.bolt.rotation.FileSizeRotationPolicy"
+    constructorArgs:
+      - 5.0
+      - MB
+
+  - id: "fileNameFormat"
+    className: "org.apache.storm.hdfs.bolt.format.DefaultFileNameFormat"
+    configMethods:
+      - name: "withPath"
+        args: ["/tmp/foo/"]
+      - name: "withExtension"
+        args: [".txt"]
+
+  - id: "recordFormat"
+    className: "org.apache.storm.hdfs.bolt.format.DelimitedRecordFormat"
+    configMethods:
+      - name: "withFieldDelimiter"
+        args: ["|"]
+
+  - id: "rotationAction"
+    className: "org.apache.storm.hdfs.common.rotation.MoveFileAction"
+    configMethods:
+      - name: "toDestination"
+        args: ["/tmp/dest2"]
+
+# spout definitions
+spouts:
+  - id: "spout-1"
+    className: "backtype.storm.testing.TestWordSpout"
+    parallelism: 1
+    # ...
+
+# bolt definitions
+
+#        HdfsBolt bolt = new HdfsBolt()
+#                .withConfigKey("hdfs.config")
+#                .withFsUrl(args[0])
+#                .withFileNameFormat(fileNameFormat)
+#                .withRecordFormat(format)
+#                .withRotationPolicy(rotationPolicy)
+#                .withSyncPolicy(syncPolicy)
+#                .addRotationAction(new MoveFileAction().toDestination("/tmp/dest2/"));
+bolts:
+  - id: "bolt-1"
+    className: "org.apache.storm.hdfs.bolt.HdfsBolt"
+    configMethods:
+      - name: "withConfigKey"
+        args: ["hdfs.config"]
+      - name: "withFsUrl"
+        args: ["hdfs://hadoop:54310"]
+      - name: "withFileNameFormat"
+        args: [ref: "fileNameFormat"]
+      - name: "withRecordFormat"
+        args: [ref: "recordFormat"]
+      - name: "withRotationPolicy"
+        args: [ref: "rotationPolicy"]
+      - name: "withSyncPolicy"
+        args: [ref: "syncPolicy"]
+      - name: "addRotationAction"
+        args: [ref: "rotationAction"]
+    parallelism: 1
+    # ...
+
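The commented-out builder chain embedded in the YAML above maps one-to-one onto its `components` and `configMethods` sections. Below is a minimal, illustrative Java sketch of that wiring, assuming the storm-hdfs classes referenced in the config; the class name `HdfsBoltWiring` is hypothetical and not part of this commit:

```java
import org.apache.storm.hdfs.bolt.HdfsBolt;
import org.apache.storm.hdfs.bolt.format.DefaultFileNameFormat;
import org.apache.storm.hdfs.bolt.format.DelimitedRecordFormat;
import org.apache.storm.hdfs.bolt.rotation.FileSizeRotationPolicy;
import org.apache.storm.hdfs.bolt.rotation.FileSizeRotationPolicy.Units;
import org.apache.storm.hdfs.bolt.sync.CountSyncPolicy;
import org.apache.storm.hdfs.common.rotation.MoveFileAction;

public class HdfsBoltWiring {
    public static HdfsBolt build() {
        // each local variable corresponds to one entry under `components` in the YAML
        CountSyncPolicy syncPolicy = new CountSyncPolicy(1000);
        FileSizeRotationPolicy rotationPolicy = new FileSizeRotationPolicy(5.0f, Units.MB);
        DefaultFileNameFormat fileNameFormat = new DefaultFileNameFormat()
                .withPath("/tmp/foo/")
                .withExtension(".txt");
        DelimitedRecordFormat recordFormat = new DelimitedRecordFormat()
                .withFieldDelimiter("|");

        // the `configMethods` on "bolt-1" correspond to these fluent calls
        return new HdfsBolt()
                .withConfigKey("hdfs.config")
                .withFsUrl("hdfs://hadoop:54310")
                .withFileNameFormat(fileNameFormat)
                .withRecordFormat(recordFormat)
                .withRotationPolicy(rotationPolicy)
                .withSyncPolicy(syncPolicy)
                .addRotationAction(new MoveFileAction().toDestination("/tmp/dest2"));
    }
}
```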

http://git-wip-us.apache.org/repos/asf/storm/blob/a791604a/flux-examples/README.md
----------------------------------------------------------------------
diff --git a/flux-examples/README.md b/flux-examples/README.md
index 9f5682e..425ad98 100644
--- a/flux-examples/README.md
+++ b/flux-examples/README.md
@@ -28,21 +28,21 @@ storm jar ./target/flux-examples-0.2.3-SNAPSHOT.jar org.apache.storm.flux.Flux -
 
 ## Available Examples
 
-### simple_wordcount.yaml
+### [simple_wordcount.yaml](src/main/resources/simple_wordcount.yaml)
 
 This is a very basic wordcount example using Java spouts and bolts. It simply logs the running count of each word
 received.
 
-### multilang.yaml
+### [multilang.yaml](src/main/resources/multilang.yaml)
 
 Another wordcount example that uses a spout written in JavaScript (node.js), a bolt written in Python, and two bolts
 written in java.
 
-### kafka_spout.yaml
+### [kafka_spout.yaml](src/main/resources/kafka_spout.yaml)
 This example illustrates how to configure Storm's `storm-kafka` spout using Flux YAML DSL `components`, `references`,
 and `constructor arguments` constructs.
 
-### simple_hdfs.yaml
+### [simple_hdfs.yaml](src/main/resources/simple_hdfs.yaml)
 
 This example demonstrates using Flux to setup a storm-hdfs bolt to write to an HDFS cluster. It also demonstrates Flux's
 variable substitution/filtering feature.


[19/50] [abbrv] storm git commit: add missing license headers

Posted by pt...@apache.org.
add missing license headers


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/01a66bf5
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/01a66bf5
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/01a66bf5

Branch: refs/heads/0.10.x-branch
Commit: 01a66bf5b17a9fce94fcdac4fbaa2f28c2df9189
Parents: c48f63e
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Tue May 5 16:14:33 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Tue May 5 16:14:33 2015 -0400

----------------------------------------------------------------------
 .../apache/storm/flux/api/TopologySource.java   |  17 +++
 .../storm/flux/model/ConfigMethodDef.java       |  17 +++
 .../storm/flux/model/TopologySourceDef.java     |  17 +++
 .../java/org/apache/storm/flux/FrankenBean.java | 138 -------------------
 .../org/apache/storm/flux/IntegrationTest.java  |  17 +++
 .../src/main/resources/hbase_bolt.properties    |  16 +++
 .../src/main/resources/hdfs_bolt.properties     |  17 +++
 .../storm/flux/wrappers/bolts/LogInfoBolt.java  |  18 +++
 8 files changed, 119 insertions(+), 138 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/01a66bf5/flux-core/src/main/java/org/apache/storm/flux/api/TopologySource.java
----------------------------------------------------------------------
diff --git a/flux-core/src/main/java/org/apache/storm/flux/api/TopologySource.java b/flux-core/src/main/java/org/apache/storm/flux/api/TopologySource.java
index 779e676..fbccfb7 100644
--- a/flux-core/src/main/java/org/apache/storm/flux/api/TopologySource.java
+++ b/flux-core/src/main/java/org/apache/storm/flux/api/TopologySource.java
@@ -1,3 +1,20 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
 package org.apache.storm.flux.api;
 
 

http://git-wip-us.apache.org/repos/asf/storm/blob/01a66bf5/flux-core/src/main/java/org/apache/storm/flux/model/ConfigMethodDef.java
----------------------------------------------------------------------
diff --git a/flux-core/src/main/java/org/apache/storm/flux/model/ConfigMethodDef.java b/flux-core/src/main/java/org/apache/storm/flux/model/ConfigMethodDef.java
index e274915..6f7e4d4 100644
--- a/flux-core/src/main/java/org/apache/storm/flux/model/ConfigMethodDef.java
+++ b/flux-core/src/main/java/org/apache/storm/flux/model/ConfigMethodDef.java
@@ -1,3 +1,20 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
 package org.apache.storm.flux.model;
 
 import java.util.ArrayList;

http://git-wip-us.apache.org/repos/asf/storm/blob/01a66bf5/flux-core/src/main/java/org/apache/storm/flux/model/TopologySourceDef.java
----------------------------------------------------------------------
diff --git a/flux-core/src/main/java/org/apache/storm/flux/model/TopologySourceDef.java b/flux-core/src/main/java/org/apache/storm/flux/model/TopologySourceDef.java
index 2949659..d6a2f57 100644
--- a/flux-core/src/main/java/org/apache/storm/flux/model/TopologySourceDef.java
+++ b/flux-core/src/main/java/org/apache/storm/flux/model/TopologySourceDef.java
@@ -1,3 +1,20 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
 package org.apache.storm.flux.model;
 
 public class TopologySourceDef extends ObjectDef {

http://git-wip-us.apache.org/repos/asf/storm/blob/01a66bf5/flux-core/src/test/java/org/apache/storm/flux/FrankenBean.java
----------------------------------------------------------------------
diff --git a/flux-core/src/test/java/org/apache/storm/flux/FrankenBean.java b/flux-core/src/test/java/org/apache/storm/flux/FrankenBean.java
deleted file mode 100644
index 80a3c90..0000000
--- a/flux-core/src/test/java/org/apache/storm/flux/FrankenBean.java
+++ /dev/null
@@ -1,138 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-package org.apache.storm.flux;
-
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
-import java.lang.reflect.Constructor;
-import java.lang.reflect.Field;
-import java.lang.reflect.Method;
-import java.util.ArrayList;
-import java.util.HashMap;
-
-/**
- * Test class that is a hybrid java bean -- it has both standard
- * java bean setters, as well as public instance variables.
- */
-public class FrankenBean {
-    private static final Logger LOG = LoggerFactory.getLogger(FrankenBean.class);
-
-    public String publicString;
-    public boolean publicBoolean;
-
-    private String privateString;
-    private boolean privateBoolean;
-
-    public void setPrivateBoolean(boolean b){
-        this.privateBoolean = b;
-    }
-
-    public void setPrivateString(String string){
-        this.privateString = string;
-    }
-
-    static class Test {
-        public Test(int i){
-            System.out.println("Constructor: " + i);
-        }
-    }
-
-    public static void main(String[] args) throws Exception{
-        Integer i = new Integer(1);
-
-        Class clazz = Test.class;
-
-        Constructor ctor = clazz.getConstructor(new Class[]{int.class});
-        ctor.newInstance(i);
-
-        System.out.println("isNumber: " + Number.class.isAssignableFrom(i.getClass()));
-    }
-
-    public static void main2(String[] args) throws Exception {
-        Class clazz = Class.forName("org.apache.storm.flux.FrankenBean");
-        HashMap<String, Object> props = new HashMap<String, Object>();
-        props.put("publicString", "foo");
-        props.put("privateString", "bar");
-
-        props.put("privateBoolean", true);
-        props.put("publicBoolean", true);
-
-        props.put("notgonnafindit", "foobar");
-
-        // only support default constructors for now
-        Object instance = clazz.newInstance();
-
-        for(String key : props.keySet()){
-            Method setter = findSetter(clazz, key, props.get(key));
-            if(setter != null){
-                // invoke setter
-                setter.invoke(instance, new Object[]{props.get(key)});
-            } else {
-                // look for a public instance variable
-                Field field = findPublicField(clazz, key, props.get(key));
-                if(field != null) {
-                    field.set(instance, props.get(key));
-                }
-            }
-        }
-
-        LOG.info("Bean: {}", instance);
-
-    }
-
-
-    public static Field findPublicField(Class clazz, String property, Object arg){
-        Field field = null;
-        try{
-            field = clazz.getField(property);
-        } catch (NoSuchFieldException e){
-            LOG.warn("Could not find setter or public variable for property: " + property, e);
-        }
-        return field;
-    }
-
-    public static Method findSetter(Class clazz, String property, Object arg){
-        String setterName = toSetterName(property);
-
-//        ArrayList<Method> candidates = new ArrayList<Method>();
-        Method retval = null;
-        Method[] methods = clazz.getMethods();
-        for(Method method : methods){
-            if(setterName.equals(method.getName())) {
-                LOG.info("Found setter method: " + method.getName());
-                retval = method;
-            }
-        }
-        return retval;
-    }
-
-    public static String toSetterName(String name){
-        return "set" + name.substring(0,1).toUpperCase() + name.substring(1, name.length());
-    }
-
-    public String toString(){
-        return String.format("publicString: %s, privateString: %s, " +
-                "publicBoolean: %s, privateBoolean: %s",
-                this.publicString,
-                this.privateString,
-                this.publicBoolean,
-                this.privateBoolean);
-    }
-
-}

http://git-wip-us.apache.org/repos/asf/storm/blob/01a66bf5/flux-core/src/test/java/org/apache/storm/flux/IntegrationTest.java
----------------------------------------------------------------------
diff --git a/flux-core/src/test/java/org/apache/storm/flux/IntegrationTest.java b/flux-core/src/test/java/org/apache/storm/flux/IntegrationTest.java
index 2dea72a..5e17f5e 100644
--- a/flux-core/src/test/java/org/apache/storm/flux/IntegrationTest.java
+++ b/flux-core/src/test/java/org/apache/storm/flux/IntegrationTest.java
@@ -1,3 +1,20 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
 package org.apache.storm.flux;
 
 import org.junit.Test;

http://git-wip-us.apache.org/repos/asf/storm/blob/01a66bf5/flux-examples/src/main/resources/hbase_bolt.properties
----------------------------------------------------------------------
diff --git a/flux-examples/src/main/resources/hbase_bolt.properties b/flux-examples/src/main/resources/hbase_bolt.properties
index 9903b41..f8ed50c 100644
--- a/flux-examples/src/main/resources/hbase_bolt.properties
+++ b/flux-examples/src/main/resources/hbase_bolt.properties
@@ -1,2 +1,18 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
 hbase.rootdir=hdfs://hadoop:54310/hbase
 hbase.zookeeper.quorum=hadoop
\ No newline at end of file

http://git-wip-us.apache.org/repos/asf/storm/blob/01a66bf5/flux-examples/src/main/resources/hdfs_bolt.properties
----------------------------------------------------------------------
diff --git a/flux-examples/src/main/resources/hdfs_bolt.properties b/flux-examples/src/main/resources/hdfs_bolt.properties
index dd1307d..7bcbe7a 100644
--- a/flux-examples/src/main/resources/hdfs_bolt.properties
+++ b/flux-examples/src/main/resources/hdfs_bolt.properties
@@ -1,3 +1,20 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+
 # The HDFS url
 hdfs.url=hdfs://hadoop:54310
 

http://git-wip-us.apache.org/repos/asf/storm/blob/01a66bf5/flux-wrappers/src/main/java/org/apache/storm/flux/wrappers/bolts/LogInfoBolt.java
----------------------------------------------------------------------
diff --git a/flux-wrappers/src/main/java/org/apache/storm/flux/wrappers/bolts/LogInfoBolt.java b/flux-wrappers/src/main/java/org/apache/storm/flux/wrappers/bolts/LogInfoBolt.java
index 5f91909..a42d7c3 100644
--- a/flux-wrappers/src/main/java/org/apache/storm/flux/wrappers/bolts/LogInfoBolt.java
+++ b/flux-wrappers/src/main/java/org/apache/storm/flux/wrappers/bolts/LogInfoBolt.java
@@ -1,3 +1,21 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
 package org.apache.storm.flux.wrappers.bolts;
 
 import backtype.storm.topology.BasicOutputCollector;


[13/50] [abbrv] storm git commit: improve error logging

Posted by pt...@apache.org.
improve error logging


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/01702dc5
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/01702dc5
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/01702dc5

Branch: refs/heads/0.10.x-branch
Commit: 01702dc503e4f5fbcc499dcae48bb3ffad1229db
Parents: ae305c7
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Wed Apr 8 01:10:30 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Wed Apr 8 01:10:30 2015 -0400

----------------------------------------------------------------------
 .../java/org/apache/storm/flux/FluxBuilder.java |  9 +-
 .../java/org/apache/storm/flux/TCKTest.java     | 10 ++
 .../src/test/resources/configs/bad_hbase.yaml   | 98 ++++++++++++++++++++
 3 files changed, 114 insertions(+), 3 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/01702dc5/flux-core/src/main/java/org/apache/storm/flux/FluxBuilder.java
----------------------------------------------------------------------
diff --git a/flux-core/src/main/java/org/apache/storm/flux/FluxBuilder.java b/flux-core/src/main/java/org/apache/storm/flux/FluxBuilder.java
index 964c62e..57237b6 100644
--- a/flux-core/src/main/java/org/apache/storm/flux/FluxBuilder.java
+++ b/flux-core/src/main/java/org/apache/storm/flux/FluxBuilder.java
@@ -286,7 +286,10 @@ public class FluxBuilder {
                 LOG.debug("Found something seemingly compatible, attempting invocation...");
                 obj = con.newInstance(getArgsWithListCoercian(cArgs, con.getParameterTypes()));
             } else {
-                throw new IllegalArgumentException("Couldn't find a suitable constructor.");
+                String msg = String.format("Couldn't find a suitable constructor for class '%s' with arguments '%s'.",
+                        clazz.getName(),
+                        cArgs);
+                throw new IllegalArgumentException(msg);
             }
         } else {
             obj = clazz.newInstance();
@@ -419,9 +422,9 @@ public class FluxBuilder {
                 Object[] methodArgs = getArgsWithListCoercian(args, method.getParameterTypes());
                 method.invoke(instance, methodArgs);
             } else {
-                LOG.warn("Unable to find method '{}' in class '{}' with arguments {}.",
+                String msg = String.format("Unable to find configuration method '%s' in class '%s' with arguments %s.",
                         new Object[]{methodName, clazz.getName(), args});
-                throw new IllegalArgumentException("Configuration method not found.");
+                throw new IllegalArgumentException(msg);
             }
         }
     }

http://git-wip-us.apache.org/repos/asf/storm/blob/01702dc5/flux-core/src/test/java/org/apache/storm/flux/TCKTest.java
----------------------------------------------------------------------
diff --git a/flux-core/src/test/java/org/apache/storm/flux/TCKTest.java b/flux-core/src/test/java/org/apache/storm/flux/TCKTest.java
index 27abfbe..9456d1b 100644
--- a/flux-core/src/test/java/org/apache/storm/flux/TCKTest.java
+++ b/flux-core/src/test/java/org/apache/storm/flux/TCKTest.java
@@ -91,6 +91,16 @@ public class TCKTest {
         topology.validate();
     }
 
+    @Test(expected = IllegalArgumentException.class)
+    public void testBadHbase() throws Exception {
+        TopologyDef topologyDef = FluxParser.parseResource("/configs/bad_hbase.yaml", false, true, null, false);
+        Config conf = FluxBuilder.buildConfig(topologyDef);
+        ExecutionContext context = new ExecutionContext(topologyDef, conf);
+        StormTopology topology = FluxBuilder.buildTopology(context);
+        assertNotNull(topology);
+        topology.validate();
+    }
+
     @Test
     public void testIncludes() throws Exception {
         TopologyDef topologyDef = FluxParser.parseResource("/configs/include_test.yaml", false, true, null, false);

http://git-wip-us.apache.org/repos/asf/storm/blob/01702dc5/flux-core/src/test/resources/configs/bad_hbase.yaml
----------------------------------------------------------------------
diff --git a/flux-core/src/test/resources/configs/bad_hbase.yaml b/flux-core/src/test/resources/configs/bad_hbase.yaml
new file mode 100644
index 0000000..5d91400
--- /dev/null
+++ b/flux-core/src/test/resources/configs/bad_hbase.yaml
@@ -0,0 +1,98 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+# Test that a bad component definition produces a meaningful IllegalArgumentException
+---
+
+# topology definition
+# name to be used when submitting
+name: "hbase-wordcount"
+
+# Components
+# Components are analogous to Spring beans. They are meant to be used as constructor,
+# property(setter), and builder arguments.
+#
+# for the time being, components must be declared in the order they are referenced
+
+components:
+  - id: "columnFields"
+    className: "backtype.storm.tuple.Fields"
+    constructorArgs:
+      - ["word"]
+
+  - id: "counterFields"
+    className: "backtype.storm.tuple.Fields"
+    constructorArgs:
+      # !!! the following won't work, and should throw an IllegalArgumentException...
+      - "count"
+
+  - id: "mapper"
+    className: "org.apache.storm.hbase.bolt.mapper.SimpleHBaseMapper"
+    configMethods:
+      - name: "withRowKeyField"
+        args: ["word"]
+      - name: "withColumnFields"
+        args: [ref: "columnFields"]
+      - name: "withCounterFields"
+        args: [ref: "counterFields"]
+      - name: "withColumnFamily"
+        args: ["cf"]
+
+# topology configuration
+# this will be passed to the submitter as a map of config options
+#
+config:
+  topology.workers: 1
+  hbase.conf:
+    hbase.rootdir: "hdfs://hadoop:54310/hbase"
+    hbase.zookeeper.quorum: "hadoop"
+
+# spout definitions
+spouts:
+  - id: "word-spout"
+    className: "backtype.storm.testing.TestWordSpout"
+    parallelism: 1
+
+# bolt definitions
+
+bolts:
+  - id: "count-bolt"
+    className: "backtype.storm.testing.TestWordCounter"
+
+  - id: "hbase-bolt"
+    className: "org.apache.storm.hbase.bolt.HBaseBolt"
+    constructorArgs:
+      - "WordCount" # HBase table name
+      - ref: "mapper"
+    configMethods:
+      - name: "withConfigKey"
+        args: ["hbase.conf"]
+    parallelism: 1
+
+
+streams:
+  - name: "" # name isn't used (placeholder for logging, UI, etc.)
+    from: "word-spout"
+    to: "count-bolt"
+    grouping:
+      type: SHUFFLE
+
+  - name: "" # name isn't used (placeholder for logging, UI, etc.)
+    from: "count-bolt"
+    to: "hbase-bolt"
+    grouping:
+      type: FIELDS
+      args: ["word"]
\ No newline at end of file
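The deliberately broken `counterFields` component above passes a bare scalar where the working `columnFields` component passes a list. The following illustrative sketch (the class `FieldsConstruction` is hypothetical, not part of this commit) shows the two `backtype.storm.tuple.Fields` constructors involved; the list form is the shape the working component supplies, while the bare scalar is the case the new `testBadHbase` test expects to be rejected with an `IllegalArgumentException`:

```java
import backtype.storm.tuple.Fields;

import java.util.Arrays;

public class FieldsConstruction {
    public static void main(String[] args) {
        // list form: the shape the working "columnFields" component supplies (["word"])
        Fields viaList = new Fields(Arrays.asList("word"));
        // varargs form: valid Java, but the bare scalar used for "counterFields"
        // is the case the bad_hbase.yaml test expects Flux to reject
        Fields viaVarargs = new Fields("count");
        System.out.println(viaList.size() + " / " + viaVarargs.size());
    }
}
```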


[15/50] [abbrv] storm git commit: set default parallelism of spouts/bolts to 1

Posted by pt...@apache.org.
set default parallelism of spouts/bolts to 1


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/abf2924e
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/abf2924e
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/abf2924e

Branch: refs/heads/0.10.x-branch
Commit: abf2924e8a89d9b9fc28037a4ce079c607f62939
Parents: 9f5f822
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Fri Apr 10 15:47:16 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Fri Apr 10 15:47:16 2015 -0400

----------------------------------------------------------------------
 .../src/main/java/org/apache/storm/flux/model/VertexDef.java      | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/abf2924e/flux-core/src/main/java/org/apache/storm/flux/model/VertexDef.java
----------------------------------------------------------------------
diff --git a/flux-core/src/main/java/org/apache/storm/flux/model/VertexDef.java b/flux-core/src/main/java/org/apache/storm/flux/model/VertexDef.java
index f8bf607..e71bcc2 100644
--- a/flux-core/src/main/java/org/apache/storm/flux/model/VertexDef.java
+++ b/flux-core/src/main/java/org/apache/storm/flux/model/VertexDef.java
@@ -23,7 +23,8 @@ package org.apache.storm.flux.model;
  */
 public abstract class VertexDef extends BeanDef {
 
-    private int parallelism;
+    // default parallelism to 1 so that if it's omitted, the topology will still function.
+    private int parallelism = 1;
 
     public int getParallelism() {
         return parallelism;

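A minimal illustrative sketch of the effect of this default (the class `ParallelismDefaultSketch` is hypothetical, not part of the commit): anything that forwards `VertexDef#getParallelism()` as a parallelism hint now receives at least 1 when a spout or bolt definition omits `parallelism`:

```java
import backtype.storm.testing.TestWordSpout;
import backtype.storm.topology.TopologyBuilder;

public class ParallelismDefaultSketch {
    public static void main(String[] args) {
        TopologyBuilder builder = new TopologyBuilder();
        // what VertexDef#getParallelism() now returns when the YAML omits `parallelism`
        int parallelism = 1;
        builder.setSpout("word-spout", new TestWordSpout(), parallelism);
        builder.createTopology();
    }
}
```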

[43/50] [abbrv] storm git commit: add STORM-818 to changelog

Posted by pt...@apache.org.
add STORM-818 to changelog


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/23137753
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/23137753
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/23137753

Branch: refs/heads/0.10.x-branch
Commit: 23137753250b4d8e8e15cfe661a9ba6cdc037273
Parents: 6978b58
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Wed Jun 3 13:08:46 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Wed Jun 3 13:08:46 2015 -0400

----------------------------------------------------------------------
 CHANGELOG.md | 1 +
 1 file changed, 1 insertion(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/23137753/CHANGELOG.md
----------------------------------------------------------------------
diff --git a/CHANGELOG.md b/CHANGELOG.md
index 6329872..02aa95d 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -1,6 +1,7 @@
 ## 0.11.0
 
 ## 0.10.0
+ * STORM-818: storm-eventhubs configuration improvement and refactoring
  * STORM-842: Drop Support for Java 1.6
  * STORM-835: Netty Client hold batch object until io operation complete
  * STORM-827: Allow AutoTGT to work with storm-hdfs too.


[42/50] [abbrv] storm git commit: Merge branch 'ehimprove' of github.com:shanyu/storm

Posted by pt...@apache.org.
Merge branch 'ehimprove' of github.com:shanyu/storm


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/6978b585
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/6978b585
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/6978b585

Branch: refs/heads/0.10.x-branch
Commit: 6978b58574a4de67ef12ded945e5183d96680618
Parents: a55bbbe 9c2972a
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Wed Jun 3 13:03:48 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Wed Jun 3 13:03:48 2015 -0400

----------------------------------------------------------------------
 external/storm-eventhubs/pom.xml                |  38 ++---
 .../eventhubs/bolt/DefaultEventDataFormat.java  |  47 +++++++
 .../storm/eventhubs/bolt/EventHubBolt.java      |  56 +++++---
 .../eventhubs/bolt/EventHubBoltConfig.java      | 109 +++++++++++++++
 .../storm/eventhubs/bolt/IEventDataFormat.java  |  28 ++++
 .../client/ConnectionStringBuilder.java         | 116 ----------------
 .../storm/eventhubs/client/Constants.java       |  32 -----
 .../storm/eventhubs/client/EventHubClient.java  |  92 ------------
 .../eventhubs/client/EventHubConsumerGroup.java |  72 ----------
 .../eventhubs/client/EventHubException.java     |  37 -----
 .../eventhubs/client/EventHubReceiver.java      | 139 -------------------
 .../eventhubs/client/EventHubSendClient.java    |  70 ----------
 .../storm/eventhubs/client/EventHubSender.java  |  95 -------------
 .../storm/eventhubs/client/SelectorFilter.java  |  38 -----
 .../eventhubs/client/SelectorFilterWriter.java  |  64 ---------
 .../storm/eventhubs/samples/EventCount.java     |   5 +-
 .../storm/eventhubs/samples/EventHubLoop.java   |   9 +-
 .../eventhubs/spout/EventHubReceiverFilter.java |  56 --------
 .../eventhubs/spout/EventHubReceiverImpl.java   |  49 ++++---
 .../storm/eventhubs/spout/EventHubSpout.java    |   5 +
 .../eventhubs/spout/EventHubSpoutConfig.java    | 126 +++++++++--------
 .../eventhubs/spout/IEventHubReceiver.java      |   5 +-
 .../spout/IEventHubReceiverFilter.java          |  35 -----
 .../eventhubs/spout/SimplePartitionManager.java |  11 +-
 .../spout/StaticPartitionCoordinator.java       |   2 +-
 .../TransactionalTridentEventHubEmitter.java    |   2 +-
 .../trident/TridentPartitionManager.java        |  12 +-
 .../src/main/resources/config.properties        |   5 +-
 .../eventhubs/spout/EventHubReceiverMock.java   |  18 +--
 .../eventhubs/spout/TestEventHubSpout.java      |   4 +-
 30 files changed, 376 insertions(+), 1001 deletions(-)
----------------------------------------------------------------------



[50/50] [abbrv] storm git commit: fix version in pom

Posted by pt...@apache.org.
fix version in pom


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/b2642962
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/b2642962
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/b2642962

Branch: refs/heads/0.10.x-branch
Commit: b26429621741a785658d1984d113b15d491052e3
Parents: c3cc4dc
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Wed Jun 3 22:02:44 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Wed Jun 3 22:02:44 2015 -0400

----------------------------------------------------------------------
 storm-core/pom.xml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/b2642962/storm-core/pom.xml
----------------------------------------------------------------------
diff --git a/storm-core/pom.xml b/storm-core/pom.xml
index 765a1dd..db54481 100644
--- a/storm-core/pom.xml
+++ b/storm-core/pom.xml
@@ -20,7 +20,7 @@
     <parent>
         <artifactId>storm</artifactId>
         <groupId>org.apache.storm</groupId>
-        <version>0.11.0-SNAPSHOT</version>
+        <version>0.10.0-SNAPSHOT</version>
         <relativePath>..</relativePath>
     </parent>
     <groupId>org.apache.storm</groupId>


[14/50] [abbrv] storm git commit: update hbase and hdfs examples

Posted by pt...@apache.org.
update hbase and hdfs examples


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/9f5f8227
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/9f5f8227
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/9f5f8227

Branch: refs/heads/0.10.x-branch
Commit: 9f5f822720544e356682d3c007c3407f5c2e5058
Parents: 01702dc
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Wed Apr 8 12:50:59 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Wed Apr 8 12:50:59 2015 -0400

----------------------------------------------------------------------
 .../storm/flux/examples/WordCountClient.java    | 17 +++++++--
 flux-examples/src/main/resources/hbase-site.xml | 36 --------------------
 .../src/main/resources/hbase_bolt.properties    |  2 ++
 .../src/main/resources/hdfs_bolt.properties     |  6 ++--
 .../src/main/resources/simple_hbase.yaml        |  3 +-
 .../src/main/resources/simple_hdfs.yaml         |  6 ++--
 6 files changed, 24 insertions(+), 46 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/9f5f8227/flux-examples/src/main/java/org/apache/storm/flux/examples/WordCountClient.java
----------------------------------------------------------------------
diff --git a/flux-examples/src/main/java/org/apache/storm/flux/examples/WordCountClient.java b/flux-examples/src/main/java/org/apache/storm/flux/examples/WordCountClient.java
index 55873d5..eb4fb7a 100644
--- a/flux-examples/src/main/java/org/apache/storm/flux/examples/WordCountClient.java
+++ b/flux-examples/src/main/java/org/apache/storm/flux/examples/WordCountClient.java
@@ -24,6 +24,9 @@ import org.apache.hadoop.hbase.client.HTable;
 import org.apache.hadoop.hbase.client.Result;
 import org.apache.hadoop.hbase.util.Bytes;
 
+import java.io.FileInputStream;
+import java.util.Properties;
+
 /**
  * Connects to the 'WordCount' HBase table and prints counts for each word.
  *
@@ -39,8 +42,17 @@ public class WordCountClient {
 
     public static void main(String[] args) throws Exception {
         Configuration config = HBaseConfiguration.create();
-        if(args.length > 0){
-            config.set("hbase.rootdir", args[0]);
+        if(args.length == 1){
+            Properties props = new Properties();
+            props.load(new FileInputStream(args[0]));
+            System.out.println("HBase configuration:");
+            for(Object key : props.keySet()) {
+                System.out.println(key + "=" + props.get(key));
+                config.set((String)key, props.getProperty((String)key));
+            }
+        } else {
+            System.out.println("Usage: WordCountClient <hbase_config.properties>");
+            System.exit(1);
         }
 
         HTable table = new HTable(config, "WordCount");
@@ -54,7 +66,6 @@ public class WordCountClient {
             byte[] wordBytes = result.getValue(Bytes.toBytes("cf"), Bytes.toBytes("word"));
 
             String wordStr = Bytes.toString(wordBytes);
-            System.out.println(wordStr);
             long count = Bytes.toLong(countBytes);
             System.out.println("Word: '" + wordStr + "', Count: " + count);
         }

http://git-wip-us.apache.org/repos/asf/storm/blob/9f5f8227/flux-examples/src/main/resources/hbase-site.xml
----------------------------------------------------------------------
diff --git a/flux-examples/src/main/resources/hbase-site.xml b/flux-examples/src/main/resources/hbase-site.xml
deleted file mode 100644
index 06c3031..0000000
--- a/flux-examples/src/main/resources/hbase-site.xml
+++ /dev/null
@@ -1,36 +0,0 @@
-<?xml version="1.0"?>
-<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
-<!--
-/**
- *
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *     http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
--->
-<configuration>
-	<property>
-	  <name>hbase.cluster.distributed</name>
-	  <value>true</value>
-	</property>
-	<property>
-	  <name>hbase.rootdir</name>
-	  <value>hdfs://hadoop:54310/hbase</value>
-	</property>
-	<property>
-	  <name>hbase.zookeeper.quorum</name>
-	  <value>hadoop</value>
-	</property>
-</configuration>
\ No newline at end of file

http://git-wip-us.apache.org/repos/asf/storm/blob/9f5f8227/flux-examples/src/main/resources/hbase_bolt.properties
----------------------------------------------------------------------
diff --git a/flux-examples/src/main/resources/hbase_bolt.properties b/flux-examples/src/main/resources/hbase_bolt.properties
new file mode 100644
index 0000000..9903b41
--- /dev/null
+++ b/flux-examples/src/main/resources/hbase_bolt.properties
@@ -0,0 +1,2 @@
+hbase.rootdir=hdfs://hadoop:54310/hbase
+hbase.zookeeper.quorum=hadoop
\ No newline at end of file

http://git-wip-us.apache.org/repos/asf/storm/blob/9f5f8227/flux-examples/src/main/resources/hdfs_bolt.properties
----------------------------------------------------------------------
diff --git a/flux-examples/src/main/resources/hdfs_bolt.properties b/flux-examples/src/main/resources/hdfs_bolt.properties
index 34a7a23..dd1307d 100644
--- a/flux-examples/src/main/resources/hdfs_bolt.properties
+++ b/flux-examples/src/main/resources/hdfs_bolt.properties
@@ -1,9 +1,9 @@
 # The HDFS url
-hdfs.url="hdfs://hadoop:54310"
+hdfs.url=hdfs://hadoop:54310
 
 # The HDFS directory where the bolt will write incoming data
-hdfs.write.dir="/incoming"
+hdfs.write.dir=/incoming
 
 # The HDFS directory where files will be moved once the bolt has
 # finished writing to it.
-hdfs.dest.dir="/complete"
\ No newline at end of file
+hdfs.dest.dir=/complete
\ No newline at end of file
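A plausible reason the quotes moved out of the .properties values (and into the YAML placeholders instead) is that `java.util.Properties` keeps quote characters as part of the value, so a quoted value would reach the bolt verbatim, quotes included. A small sketch demonstrating this (the class `PropertiesQuoting` is hypothetical, not part of the commit):

```java
import java.io.StringReader;
import java.util.Properties;

public class PropertiesQuoting {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.load(new StringReader("hdfs.url=\"hdfs://hadoop:54310\"\n"));
        // prints "hdfs://hadoop:54310" -- including the literal double quotes
        System.out.println(props.getProperty("hdfs.url"));
    }
}
```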

http://git-wip-us.apache.org/repos/asf/storm/blob/9f5f8227/flux-examples/src/main/resources/simple_hbase.yaml
----------------------------------------------------------------------
diff --git a/flux-examples/src/main/resources/simple_hbase.yaml b/flux-examples/src/main/resources/simple_hbase.yaml
index 5eb70ed..62686d0 100644
--- a/flux-examples/src/main/resources/simple_hbase.yaml
+++ b/flux-examples/src/main/resources/simple_hbase.yaml
@@ -51,7 +51,8 @@ components:
 config:
   topology.workers: 1
   hbase.conf:
-    hbase.rootdir: "hdfs://hadoop:54310/hbase"
+    hbase.rootdir: "${hbase.rootdir}"
+    hbase.zookeeper.quorum: "${hbase.zookeeper.quorum}"
 
 # spout definitions
 spouts:

http://git-wip-us.apache.org/repos/asf/storm/blob/9f5f8227/flux-examples/src/main/resources/simple_hdfs.yaml
----------------------------------------------------------------------
diff --git a/flux-examples/src/main/resources/simple_hdfs.yaml b/flux-examples/src/main/resources/simple_hdfs.yaml
index ea7721d..9007869 100644
--- a/flux-examples/src/main/resources/simple_hdfs.yaml
+++ b/flux-examples/src/main/resources/simple_hdfs.yaml
@@ -41,7 +41,7 @@ components:
     className: "org.apache.storm.hdfs.bolt.format.DefaultFileNameFormat"
     configMethods:
       - name: "withPath"
-        args: [${hdfs.write.dir}]
+        args: ["${hdfs.write.dir}"]
       - name: "withExtension"
         args: [".txt"]
 
@@ -55,7 +55,7 @@ components:
     className: "org.apache.storm.hdfs.common.rotation.MoveFileAction"
     configMethods:
       - name: "toDestination"
-        args: [${hdfs.dest.dir}]
+        args: ["${hdfs.dest.dir}"]
 
 # spout definitions
 spouts:
@@ -73,7 +73,7 @@ bolts:
       - name: "withConfigKey"
         args: ["hdfs.config"]
       - name: "withFsUrl"
-        args: [${hdfs.url}]
+        args: ["${hdfs.url}"]
       - name: "withFileNameFormat"
         args: [ref: "fileNameFormat"]
       - name: "withRecordFormat"


[07/50] [abbrv] storm git commit: make examples produce a shaded jar

Posted by pt...@apache.org.
make examples produce a shaded jar


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/3411bc7d
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/3411bc7d
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/3411bc7d

Branch: refs/heads/0.10.x-branch
Commit: 3411bc7d296c948e3ca6fc1a02ef8a118df39e65
Parents: 8b690e6
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Thu Apr 2 12:20:59 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Thu Apr 2 12:20:59 2015 -0400

----------------------------------------------------------------------
 flux-examples/pom.xml | 46 +++++++++++++++++++++++++++++++++++++++++++++-
 1 file changed, 45 insertions(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/3411bc7d/flux-examples/pom.xml
----------------------------------------------------------------------
diff --git a/flux-examples/pom.xml b/flux-examples/pom.xml
index 29a2e62..63bc312 100644
--- a/flux-examples/pom.xml
+++ b/flux-examples/pom.xml
@@ -15,7 +15,8 @@
  See the License for the specific language governing permissions and
  limitations under the License.
 -->
-<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
     <modelVersion>4.0.0</modelVersion>
 
     <parent>
@@ -32,4 +33,47 @@
     <name>flux-examples</name>
     <url>https://github.com/ptgoetz/flux</url>
 
+    <dependencies>
+        <dependency>
+            <groupId>com.github.ptgoetz</groupId>
+            <artifactId>flux-core</artifactId>
+            <version>${project.version}</version>
+        </dependency>
+        <dependency>
+            <groupId>com.github.ptgoetz</groupId>
+            <artifactId>flux-wrappers</artifactId>
+            <version>${project.version}</version>
+        </dependency>
+    </dependencies>
+
+    <build>
+        <plugins>
+            <plugin>
+                <groupId>org.apache.maven.plugins</groupId>
+                <artifactId>maven-shade-plugin</artifactId>
+                <version>1.4</version>
+                <configuration>
+                    <createDependencyReducedPom>true</createDependencyReducedPom>
+                </configuration>
+                <executions>
+                    <execution>
+                        <phase>package</phase>
+                        <goals>
+                            <goal>shade</goal>
+                        </goals>
+                        <configuration>
+                            <transformers>
+                                <transformer
+                                        implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
+                                <transformer
+                                        implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
+                                    <mainClass>org.apache.storm.flux.Flux</mainClass>
+                                </transformer>
+                            </transformers>
+                        </configuration>
+                    </execution>
+                </executions>
+            </plugin>
+        </plugins>
+    </build>
 </project>


[17/50] [abbrv] storm git commit: document how to build from source

Posted by pt...@apache.org.
document how to build from source


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/601cee72
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/601cee72
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/601cee72

Branch: refs/heads/0.10.x-branch
Commit: 601cee72aba6d9130bd02a08617f973e14c61eea
Parents: f432abf
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Tue May 5 15:39:25 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Tue May 5 15:39:25 2015 -0400

----------------------------------------------------------------------
 README.md                                       | 33 ++++++++++++++++++++
 .../apache/storm/flux/model/TopologyDef.java    |  1 -
 2 files changed, 33 insertions(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/601cee72/README.md
----------------------------------------------------------------------
diff --git a/README.md b/README.md
index 92fec10..6683848 100644
--- a/README.md
+++ b/README.md
@@ -67,6 +67,39 @@ the layout and configuration of your topologies.
 To use Flux, add it as a dependency and package all your Storm components in a fat jar, then create a YAML document
 to define your topology (see below for YAML configuration options).
 
+### Building from Source
+The easiest way to use Flux is to add it as a Maven dependency in your project, as described below.
+
+If you would like to build Flux from source and run the unit/integration tests, you will need the following installed
+on your system:
+
+* Python 2.6.x or later
+* Node.js 0.10.x or later
+
+#### Building with unit tests enabled:
+
+```
+mvn clean install
+```
+
+#### Building with unit tests disabled:
+If you would like to build Flux without installing Python or Node.js, you can simply skip the unit tests:
+
+```
+mvn clean install -DskipTests=true
+```
+
+Note that if you plan on using Flux to deploy topologies to a remote cluster, you will still need to have Python
+installed since it is required by Apache Storm.
+
+
+#### Building with integration tests enabled:
+
+```
+mvn clean install -DskipIntegration=false
+```
+
+
 ### Packaging with Maven
 To enable Flux for your Storm components, you need to add it as a dependency such that it's included in the Storm
 topology jar. This can be accomplished with the Maven shade plugin (preferred) or the Maven assembly plugin (not

http://git-wip-us.apache.org/repos/asf/storm/blob/601cee72/flux-core/src/main/java/org/apache/storm/flux/model/TopologyDef.java
----------------------------------------------------------------------
diff --git a/flux-core/src/main/java/org/apache/storm/flux/model/TopologyDef.java b/flux-core/src/main/java/org/apache/storm/flux/model/TopologyDef.java
index 6c34018..a6ae450 100644
--- a/flux-core/src/main/java/org/apache/storm/flux/model/TopologyDef.java
+++ b/flux-core/src/main/java/org/apache/storm/flux/model/TopologyDef.java
@@ -45,7 +45,6 @@ public class TopologyDef {
     private TopologySourceDef topologySource;
 
     // the following are required if we're defining a core storm topology DAG in YAML, etc.
-    //TODO if any of these are specified and `topologySource != null` it should be considered an error.
     private Map<String, BoltDef> boltMap = new LinkedHashMap<String, BoltDef>();
     private Map<String, SpoutDef> spoutMap = new LinkedHashMap<String, SpoutDef>();
     private List<StreamDef> streams = new ArrayList<StreamDef>();


[28/50] [abbrv] storm git commit: add flux to binary distribution

Posted by pt...@apache.org.
add flux to binary distribution


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/e1e1609d
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/e1e1609d
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/e1e1609d

Branch: refs/heads/0.10.x-branch
Commit: e1e1609dd3642b820fe6f514b25454dedd446836
Parents: 2094a08
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Wed May 6 14:30:38 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Wed May 6 14:30:38 2015 -0400

----------------------------------------------------------------------
 storm-dist/binary/src/main/assembly/binary.xml | 44 +++++++++++++++++++++
 1 file changed, 44 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/e1e1609d/storm-dist/binary/src/main/assembly/binary.xml
----------------------------------------------------------------------
diff --git a/storm-dist/binary/src/main/assembly/binary.xml b/storm-dist/binary/src/main/assembly/binary.xml
index fef56bb..31d8b9e 100644
--- a/storm-dist/binary/src/main/assembly/binary.xml
+++ b/storm-dist/binary/src/main/assembly/binary.xml
@@ -159,6 +159,50 @@
             </includes>
         </fileSet>
 
+        <fileSet>
+            <directory>${project.basedir}/../../external/flux/flux-core/target</directory>
+            <outputDirectory>external/flux</outputDirectory>
+            <includes>
+                <include>flux*jar</include>
+            </includes>
+        </fileSet>
+        <fileSet>
+            <directory>${project.basedir}/../../external/flux/flux-wrappers/target</directory>
+            <outputDirectory>external/flux</outputDirectory>
+            <includes>
+                <include>flux*jar</include>
+            </includes>
+        </fileSet>
+        <fileSet>
+            <directory>${project.basedir}/../../external/flux/flux-examples/target</directory>
+            <outputDirectory>external/flux</outputDirectory>
+            <includes>
+                <include>flux*jar</include>
+            </includes>
+        </fileSet>
+        <fileSet>
+            <directory>${project.basedir}/../../external/flux/flux-examples</directory>
+            <outputDirectory>external/flux/examples</outputDirectory>
+            <includes>
+                <include>README.*</include>
+            </includes>
+        </fileSet>
+        <fileSet>
+            <directory>${project.basedir}/../../external/flux/flux-examples/src/main/resources</directory>
+            <outputDirectory>external/flux/examples</outputDirectory>
+            <includes>
+                <include>*</include>
+            </includes>
+        </fileSet>
+        <fileSet>
+            <directory>${project.basedir}/../../external/flux</directory>
+            <outputDirectory>external/flux</outputDirectory>
+            <includes>
+                <include>README.*</include>
+            </includes>
+        </fileSet>
+
+
         <!-- $STORM_HOME/extlib -->
         <fileSet>
             <directory></directory>


[20/50] [abbrv] storm git commit: update readme for release 0.3.0

Posted by pt...@apache.org.
update readme for release 0.3.0


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/8e0f167a
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/8e0f167a
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/8e0f167a

Branch: refs/heads/0.10.x-branch
Commit: 8e0f167adf3155f55a7c24875e5e7542615e0c5a
Parents: 01a66bf
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Tue May 5 16:52:03 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Tue May 5 16:52:03 2015 -0400

----------------------------------------------------------------------
 README.md | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/8e0f167a/README.md
----------------------------------------------------------------------
diff --git a/README.md b/README.md
index 6683848..6f27219 100644
--- a/README.md
+++ b/README.md
@@ -111,7 +111,7 @@ The current version of Flux is available in Maven Central at the following coord
 <dependency>
     <groupId>com.github.ptgoetz</groupId>
     <artifactId>flux-core</artifactId>
-    <version>0.2.2</version>
+    <version>0.3.0</version>
 </dependency>
 ```
 
@@ -125,7 +125,7 @@ The example below illustrates Flux usage with the Maven shade plugin:
     <dependency>
         <groupId>com.github.ptgoetz</groupId>
         <artifactId>flux-core</artifactId>
-        <version>0.2.2</version>
+        <version>0.3.0</version>
     </dependency>
 
     <!-- add user dependencies here... -->
@@ -227,7 +227,7 @@ storm jar myTopology-0.1.0-SNAPSHOT.jar org.apache.storm.flux.Flux --remote my_c
 ╚═╝     ╚══════╝ ╚═════╝ ╚═╝  ╚═╝
 +-         Apache Storm        -+
 +-  data FLow User eXperience  -+
-Version: 0.2.2
+Version: 0.3.0
 Parsing file: /Users/hsimpson/Projects/donut_domination/storm/shell_test.yaml
 ---------- TOPOLOGY DETAILS ----------
 Name: shell-topology


[23/50] [abbrv] storm git commit: merge flux into external/flux/

Posted by pt...@apache.org.
http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-examples/src/main/resources/simple_hdfs.yaml
----------------------------------------------------------------------
diff --cc external/flux/flux-examples/src/main/resources/simple_hdfs.yaml
index 0000000,0000000..9007869
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-examples/src/main/resources/simple_hdfs.yaml
@@@ -1,0 -1,0 +1,105 @@@
++# Licensed to the Apache Software Foundation (ASF) under one
++# or more contributor license agreements.  See the NOTICE file
++# distributed with this work for additional information
++# regarding copyright ownership.  The ASF licenses this file
++# to you under the Apache License, Version 2.0 (the
++# "License"); you may not use this file except in compliance
++# with the License.  You may obtain a copy of the License at
++#
++# http://www.apache.org/licenses/LICENSE-2.0
++#
++# Unless required by applicable law or agreed to in writing, software
++# distributed under the License is distributed on an "AS IS" BASIS,
++# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++# See the License for the specific language governing permissions and
++# limitations under the License.
++
++# Example topology that writes test data to HDFS using the storm-hdfs bolt
++---
++
++# topology definition
++# name to be used when submitting
++name: "hdfs-topology"
++
++# Components
++# Components are analogous to Spring beans. They are meant to be used as constructor,
++# property(setter), and builder arguments.
++#
++# for the time being, components must be declared in the order they are referenced
++components:
++  - id: "syncPolicy"
++    className: "org.apache.storm.hdfs.bolt.sync.CountSyncPolicy"
++    constructorArgs:
++      - 1000
++  - id: "rotationPolicy"
++    className: "org.apache.storm.hdfs.bolt.rotation.TimedRotationPolicy"
++    constructorArgs:
++      - 30
++      - SECONDS
++
++  - id: "fileNameFormat"
++    className: "org.apache.storm.hdfs.bolt.format.DefaultFileNameFormat"
++    configMethods:
++      - name: "withPath"
++        args: ["${hdfs.write.dir}"]
++      - name: "withExtension"
++        args: [".txt"]
++
++  - id: "recordFormat"
++    className: "org.apache.storm.hdfs.bolt.format.DelimitedRecordFormat"
++    configMethods:
++      - name: "withFieldDelimiter"
++        args: ["|"]
++
++  - id: "rotationAction"
++    className: "org.apache.storm.hdfs.common.rotation.MoveFileAction"
++    configMethods:
++      - name: "toDestination"
++        args: ["${hdfs.dest.dir}"]
++
++# spout definitions
++spouts:
++  - id: "spout-1"
++    className: "backtype.storm.testing.TestWordSpout"
++    parallelism: 1
++    # ...
++
++# bolt definitions
++
++bolts:
++  - id: "bolt-1"
++    className: "org.apache.storm.hdfs.bolt.HdfsBolt"
++    configMethods:
++      - name: "withConfigKey"
++        args: ["hdfs.config"]
++      - name: "withFsUrl"
++        args: ["${hdfs.url}"]
++      - name: "withFileNameFormat"
++        args: [ref: "fileNameFormat"]
++      - name: "withRecordFormat"
++        args: [ref: "recordFormat"]
++      - name: "withRotationPolicy"
++        args: [ref: "rotationPolicy"]
++      - name: "withSyncPolicy"
++        args: [ref: "syncPolicy"]
++      - name: "addRotationAction"
++        args: [ref: "rotationAction"]
++    parallelism: 1
++    # ...
++
++  - id: "bolt-2"
++    className: "org.apache.storm.flux.wrappers.bolts.LogInfoBolt"
++    parallelism: 1
++
++streams:
++  - name: "" # name isn't used (placeholder for logging, UI, etc.)
++    from: "spout-1"
++    to: "bolt-1"
++    grouping:
++      type: SHUFFLE
++
++  - name: "" # name isn't used (placeholder for logging, UI, etc.)
++    from: "spout-1"
++    to: "bolt-2"
++    grouping:
++      type: SHUFFLE

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-examples/src/main/resources/simple_wordcount.yaml
----------------------------------------------------------------------
diff --cc external/flux/flux-examples/src/main/resources/simple_wordcount.yaml
index 0000000,0000000..380f9d2
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-examples/src/main/resources/simple_wordcount.yaml
@@@ -1,0 -1,0 +1,68 @@@
++# Licensed to the Apache Software Foundation (ASF) under one
++# or more contributor license agreements.  See the NOTICE file
++# distributed with this work for additional information
++# regarding copyright ownership.  The ASF licenses this file
++# to you under the Apache License, Version 2.0 (the
++# "License"); you may not use this file except in compliance
++# with the License.  You may obtain a copy of the License at
++#
++# http://www.apache.org/licenses/LICENSE-2.0
++#
++# Unless required by applicable law or agreed to in writing, software
++# distributed under the License is distributed on an "AS IS" BASIS,
++# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++# See the License for the specific language governing permissions and
++# limitations under the License.
++
++---
++
++# topology definition
++# name to be used when submitting
++name: "yaml-topology"
++
++# topology configuration
++# this will be passed to the submitter as a map of config options
++#
++config:
++  topology.workers: 1
++
++# spout definitions
++spouts:
++  - id: "spout-1"
++    className: "backtype.storm.testing.TestWordSpout"
++    parallelism: 1
++
++# bolt definitions
++bolts:
++  - id: "bolt-1"
++    className: "backtype.storm.testing.TestWordCounter"
++    parallelism: 1
++
++  - id: "bolt-2"
++    className: "org.apache.storm.flux.wrappers.bolts.LogInfoBolt"
++    parallelism: 1
++
++#stream definitions
++# stream definitions define connections between spouts and bolts.
++# note that such connections can be cyclical
++streams:
++  - name: "spout-1 --> bolt-1" # name isn't used (placeholder for logging, UI, etc.)
++#    id: "connection-1"
++    from: "spout-1"
++    to: "bolt-1"
++    grouping:
++      type: FIELDS
++      args: ["word"]
++
++  - name: "bolt-1 --> bolt2"
++    from: "bolt-1"
++    to: "bolt-2"
++    grouping:
++      type: SHUFFLE
++
++
++
++
++
++
++

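For readers comparing the YAML above with the plain Storm API, here is a rough, hand-written Java equivalent of the topology that simple_wordcount.yaml describes. This is a sketch for illustration only (Flux builds the equivalent wiring from the YAML at submission time), and the class name `YamlTopologyEquivalent` is hypothetical.

```java
import backtype.storm.generated.StormTopology;
import backtype.storm.testing.TestWordCounter;
import backtype.storm.testing.TestWordSpout;
import backtype.storm.topology.TopologyBuilder;
import backtype.storm.tuple.Fields;
import org.apache.storm.flux.wrappers.bolts.LogInfoBolt;

// Sketch only: the wiring Flux derives from simple_wordcount.yaml, written by hand.
public class YamlTopologyEquivalent {
    public static StormTopology build() {
        TopologyBuilder builder = new TopologyBuilder();
        // spout definitions
        builder.setSpout("spout-1", new TestWordSpout(), 1);
        // bolt definitions plus the groupings from the "streams" section
        builder.setBolt("bolt-1", new TestWordCounter(), 1)
               .fieldsGrouping("spout-1", new Fields("word"));
        builder.setBolt("bolt-2", new LogInfoBolt(), 1)
               .shuffleGrouping("bolt-1");
        return builder.createTopology();
    }
}
```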
http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-ui/README.md
----------------------------------------------------------------------
diff --cc external/flux/flux-ui/README.md
index 0000000,0000000..8b6bd5f
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-ui/README.md
@@@ -1,0 -1,0 +1,3 @@@
++# Flux-UI
++
++Placeholder for Flux GUI

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-wrappers/pom.xml
----------------------------------------------------------------------
diff --cc external/flux/flux-wrappers/pom.xml
index 0000000,0000000..6784141
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-wrappers/pom.xml
@@@ -1,0 -1,0 +1,35 @@@
++<?xml version="1.0" encoding="UTF-8"?>
++<!--
++ Licensed to the Apache Software Foundation (ASF) under one or more
++ contributor license agreements.  See the NOTICE file distributed with
++ this work for additional information regarding copyright ownership.
++ The ASF licenses this file to You under the Apache License, Version 2.0
++ (the "License"); you may not use this file except in compliance with
++ the License.  You may obtain a copy of the License at
++
++     http://www.apache.org/licenses/LICENSE-2.0
++
++ Unless required by applicable law or agreed to in writing, software
++ distributed under the License is distributed on an "AS IS" BASIS,
++ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++ See the License for the specific language governing permissions and
++ limitations under the License.
++-->
++<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
++    <modelVersion>4.0.0</modelVersion>
++
++    <parent>
++        <groupId>com.github.ptgoetz</groupId>
++        <artifactId>flux</artifactId>
++        <version>0.3.1-SNAPSHOT</version>
++        <relativePath>../pom.xml</relativePath>
++    </parent>
++
++    <groupId>com.github.ptgoetz</groupId>
++    <artifactId>flux-wrappers</artifactId>
++    <packaging>jar</packaging>
++
++    <name>flux-wrappers</name>
++    <url>https://github.com/ptgoetz/flux</url>
++
++</project>

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-wrappers/src/main/java/org/apache/storm/flux/wrappers/bolts/FluxShellBolt.java
----------------------------------------------------------------------
diff --cc external/flux/flux-wrappers/src/main/java/org/apache/storm/flux/wrappers/bolts/FluxShellBolt.java
index 0000000,0000000..4e0f91c
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-wrappers/src/main/java/org/apache/storm/flux/wrappers/bolts/FluxShellBolt.java
@@@ -1,0 -1,0 +1,56 @@@
++/*
++ * Licensed to the Apache Software Foundation (ASF) under one
++ * or more contributor license agreements.  See the NOTICE file
++ * distributed with this work for additional information
++ * regarding copyright ownership.  The ASF licenses this file
++ * to you under the Apache License, Version 2.0 (the
++ * "License"); you may not use this file except in compliance
++ * with the License.  You may obtain a copy of the License at
++ *
++ * http://www.apache.org/licenses/LICENSE-2.0
++ *
++ * Unless required by applicable law or agreed to in writing, software
++ * distributed under the License is distributed on an "AS IS" BASIS,
++ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++ * See the License for the specific language governing permissions and
++ * limitations under the License.
++ */
++package org.apache.storm.flux.wrappers.bolts;
++
++import backtype.storm.task.ShellBolt;
++import backtype.storm.topology.IRichBolt;
++import backtype.storm.topology.OutputFieldsDeclarer;
++import backtype.storm.tuple.Fields;
++
++import java.util.Map;
++
++/**
++ * A generic `ShellBolt` implementation that allows you to specify output fields
++ * without having to subclass `ShellBolt` to do so.
++ *
++ */
++public class FluxShellBolt extends ShellBolt implements IRichBolt{
++    private String[] outputFields;
++    private Map<String, Object> componentConfig;
++
++    /**
++     * Create a ShellBolt with command line arguments and output fields
++     * @param command Command line arguments for the bolt
++     * @param outputFields Names of fields the bolt will emit (if any).
++     */
++
++    public FluxShellBolt(String[] command, String[] outputFields){
++        super(command);
++        this.outputFields = outputFields;
++    }
++
++    @Override
++    public void declareOutputFields(OutputFieldsDeclarer declarer) {
++        declarer.declare(new Fields(this.outputFields));
++    }
++
++    @Override
++    public Map<String, Object> getComponentConfiguration() {
++        return this.componentConfig;
++    }
++}

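A minimal usage sketch for the class above. The interpreter, script, and field names below are assumptions based on the multilang resources bundled with flux-wrappers, not something this commit mandates.

```java
import org.apache.storm.flux.wrappers.bolts.FluxShellBolt;

// Hypothetical helper: build a FluxShellBolt that runs the bundled
// splitsentence.py script and declares a single "word" output field.
public class ShellBoltExample {
    public static FluxShellBolt splitSentenceBolt() {
        return new FluxShellBolt(
                new String[]{"python", "splitsentence.py"}, // command line for the subprocess
                new String[]{"word"});                      // output fields to declare
    }
}
```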
http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-wrappers/src/main/java/org/apache/storm/flux/wrappers/bolts/LogInfoBolt.java
----------------------------------------------------------------------
diff --cc external/flux/flux-wrappers/src/main/java/org/apache/storm/flux/wrappers/bolts/LogInfoBolt.java
index 0000000,0000000..a42d7c3
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-wrappers/src/main/java/org/apache/storm/flux/wrappers/bolts/LogInfoBolt.java
@@@ -1,0 -1,0 +1,44 @@@
++/*
++ * Licensed to the Apache Software Foundation (ASF) under one
++ * or more contributor license agreements.  See the NOTICE file
++ * distributed with this work for additional information
++ * regarding copyright ownership.  The ASF licenses this file
++ * to you under the Apache License, Version 2.0 (the
++ * "License"); you may not use this file except in compliance
++ * with the License.  You may obtain a copy of the License at
++ *
++ * http://www.apache.org/licenses/LICENSE-2.0
++ *
++ * Unless required by applicable law or agreed to in writing, software
++ * distributed under the License is distributed on an "AS IS" BASIS,
++ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++ * See the License for the specific language governing permissions and
++ * limitations under the License.
++ */
++
++package org.apache.storm.flux.wrappers.bolts;
++
++import backtype.storm.topology.BasicOutputCollector;
++import backtype.storm.topology.OutputFieldsDeclarer;
++import backtype.storm.topology.base.BaseBasicBolt;
++import backtype.storm.tuple.Tuple;
++import org.slf4j.Logger;
++import org.slf4j.LoggerFactory;
++
++/**
++ * Simple bolt that does nothing other than LOG.info() every tuple received.
++ *
++ */
++public class LogInfoBolt extends BaseBasicBolt {
++    private static final Logger LOG = LoggerFactory.getLogger(LogInfoBolt.class);
++
++    @Override
++    public void execute(Tuple tuple, BasicOutputCollector basicOutputCollector) {
++       LOG.info("{}", tuple);
++    }
++
++    @Override
++    public void declareOutputFields(OutputFieldsDeclarer outputFieldsDeclarer) {
++
++    }
++}

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-wrappers/src/main/java/org/apache/storm/flux/wrappers/spouts/FluxShellSpout.java
----------------------------------------------------------------------
diff --cc external/flux/flux-wrappers/src/main/java/org/apache/storm/flux/wrappers/spouts/FluxShellSpout.java
index 0000000,0000000..c7e9058
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-wrappers/src/main/java/org/apache/storm/flux/wrappers/spouts/FluxShellSpout.java
@@@ -1,0 -1,0 +1,55 @@@
++/*
++ * Licensed to the Apache Software Foundation (ASF) under one
++ * or more contributor license agreements.  See the NOTICE file
++ * distributed with this work for additional information
++ * regarding copyright ownership.  The ASF licenses this file
++ * to you under the Apache License, Version 2.0 (the
++ * "License"); you may not use this file except in compliance
++ * with the License.  You may obtain a copy of the License at
++ *
++ * http://www.apache.org/licenses/LICENSE-2.0
++ *
++ * Unless required by applicable law or agreed to in writing, software
++ * distributed under the License is distributed on an "AS IS" BASIS,
++ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++ * See the License for the specific language governing permissions and
++ * limitations under the License.
++ */
++package org.apache.storm.flux.wrappers.spouts;
++
++import backtype.storm.spout.ShellSpout;
++import backtype.storm.topology.IRichSpout;
++import backtype.storm.topology.OutputFieldsDeclarer;
++import backtype.storm.tuple.Fields;
++
++import java.util.Map;
++
++/**
++ * A generic `ShellSpout` implementation that allows you to specify output fields
++ * without having to subclass `ShellSpout` to do so.
++ *
++ */
++public class FluxShellSpout extends ShellSpout implements IRichSpout {
++    private String[] outputFields;
++    private Map<String, Object> componentConfig;
++
++    /**
++     * Create a ShellSpout with command line arguments and output fields
++     * @param args Command line arguments for the spout
++     * @param outputFields Names of fields the spout will emit.
++     */
++    public FluxShellSpout(String[] args, String[] outputFields){
++        super(args);
++        this.outputFields = outputFields;
++    }
++
++    @Override
++    public void declareOutputFields(OutputFieldsDeclarer declarer) {
++        declarer.declare(new Fields(this.outputFields));
++    }
++
++    @Override
++    public Map<String, Object> getComponentConfiguration() {
++        return this.componentConfig;
++    }
++}

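Putting the two wrappers together, a topology can be assembled by hand as sketched below. Assumptions: node and python are available on the worker hosts, the scripts are packaged under resources/ as in this commit, and the component ids and output field names are illustrative.

```java
import backtype.storm.generated.StormTopology;
import backtype.storm.topology.TopologyBuilder;
import org.apache.storm.flux.wrappers.bolts.FluxShellBolt;
import org.apache.storm.flux.wrappers.bolts.LogInfoBolt;
import org.apache.storm.flux.wrappers.spouts.FluxShellSpout;

// Sketch only: manual wiring of the shell wrappers defined above.
public class ShellWrappersTopology {
    public static StormTopology build() {
        TopologyBuilder builder = new TopologyBuilder();
        builder.setSpout("sentence-spout",
                new FluxShellSpout(new String[]{"node", "randomsentence.js"},
                                   new String[]{"sentence"}), 1);
        builder.setBolt("split-bolt",
                new FluxShellBolt(new String[]{"python", "splitsentence.py"},
                                  new String[]{"word"}), 1)
               .shuffleGrouping("sentence-spout");
        builder.setBolt("log-bolt", new LogInfoBolt(), 1)
               .shuffleGrouping("split-bolt");
        return builder.createTopology();
    }
}
```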
http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-wrappers/src/main/resources/resources/randomsentence.js
----------------------------------------------------------------------
diff --cc external/flux/flux-wrappers/src/main/resources/resources/randomsentence.js
index 0000000,0000000..36fc5f5
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-wrappers/src/main/resources/resources/randomsentence.js
@@@ -1,0 -1,0 +1,93 @@@
++/*
++ * Licensed to the Apache Software Foundation (ASF) under one
++ * or more contributor license agreements.  See the NOTICE file
++ * distributed with this work for additional information
++ * regarding copyright ownership.  The ASF licenses this file
++ * to you under the Apache License, Version 2.0 (the
++ * "License"); you may not use this file except in compliance
++ * with the License.  You may obtain a copy of the License at
++ *
++ * http://www.apache.org/licenses/LICENSE-2.0
++ *
++ * Unless required by applicable law or agreed to in writing, software
++ * distributed under the License is distributed on an "AS IS" BASIS,
++ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++ * See the License for the specific language governing permissions and
++ * limitations under the License.
++ */
++
++/**
++ * Example Storm spout that emits random sentences.
++ * The original Java class is storm.starter.spout.RandomSentenceSpout.
++ *
++ */
++
++var storm = require('./storm');
++var Spout = storm.Spout;
++
++
++var SENTENCES = [
++    "the cow jumped over the moon",
++    "an apple a day keeps the doctor away",
++    "four score and seven years ago",
++    "snow white and the seven dwarfs",
++    "i am at two with nature"]
++
++function RandomSentenceSpout(sentences) {
++    Spout.call(this);
++    this.runningTupleId = 0;
++    this.sentences = sentences;
++    this.pending = {};
++};
++
++RandomSentenceSpout.prototype = Object.create(Spout.prototype);
++RandomSentenceSpout.prototype.constructor = RandomSentenceSpout;
++
++RandomSentenceSpout.prototype.getRandomSentence = function() {
++    return this.sentences[getRandomInt(0, this.sentences.length - 1)];
++}
++
++RandomSentenceSpout.prototype.nextTuple = function(done) {
++    var self = this;
++    var sentence = this.getRandomSentence();
++    var tup = [sentence];
++    var id = this.createNextTupleId();
++    this.pending[id] = tup;
++    //This timeout can be removed if TOPOLOGY_SLEEP_SPOUT_WAIT_STRATEGY_TIME_MS is configured to 100
++    setTimeout(function() {
++        self.emit({tuple: tup, id: id}, function(taskIds) {
++            self.log(tup + ' sent to task ids - ' + taskIds);
++        });
++        done();
++    },100);
++}
++
++RandomSentenceSpout.prototype.createNextTupleId = function() {
++    var id = this.runningTupleId;
++    this.runningTupleId++;
++    return id;
++}
++
++RandomSentenceSpout.prototype.ack = function(id, done) {
++    this.log('Received ack for - ' + id);
++    delete this.pending[id];
++    done();
++}
++
++RandomSentenceSpout.prototype.fail = function(id, done) {
++    var self = this;
++    this.log('Received fail for - ' + id + '. Retrying.');
++    this.emit({tuple: this.pending[id], id:id}, function(taskIds) {
++        self.log(self.pending[id] + ' sent to task ids - ' + taskIds);
++    });
++    done();
++}
++
++/**
++ * Returns a random integer between min (inclusive) and max (inclusive)
++ */
++function getRandomInt(min, max) {
++    return Math.floor(Math.random() * (max - min + 1)) + min;
++}
++
++new RandomSentenceSpout(SENTENCES).run();

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-wrappers/src/main/resources/resources/splitsentence.py
----------------------------------------------------------------------
diff --cc external/flux/flux-wrappers/src/main/resources/resources/splitsentence.py
index 0000000,0000000..300105f
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-wrappers/src/main/resources/resources/splitsentence.py
@@@ -1,0 -1,0 +1,24 @@@
++# Licensed to the Apache Software Foundation (ASF) under one
++# or more contributor license agreements.  See the NOTICE file
++# distributed with this work for additional information
++# regarding copyright ownership.  The ASF licenses this file
++# to you under the Apache License, Version 2.0 (the
++# "License"); you may not use this file except in compliance
++# with the License.  You may obtain a copy of the License at
++#
++# http://www.apache.org/licenses/LICENSE-2.0
++#
++# Unless required by applicable law or agreed to in writing, software
++# distributed under the License is distributed on an "AS IS" BASIS,
++# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++# See the License for the specific language governing permissions and
++# limitations under the License.
++import storm
++
++class SplitSentenceBolt(storm.BasicBolt):
++    def process(self, tup):
++        words = tup.values[0].split(" ")
++        for word in words:
++          storm.emit([word])
++
++SplitSentenceBolt().run()

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-wrappers/src/main/resources/resources/storm.js
----------------------------------------------------------------------
diff --cc external/flux/flux-wrappers/src/main/resources/resources/storm.js
index 0000000,0000000..355c2d2
new file mode 100755
--- /dev/null
+++ b/external/flux/flux-wrappers/src/main/resources/resources/storm.js
@@@ -1,0 -1,0 +1,373 @@@
++/*
++ * Licensed to the Apache Software Foundation (ASF) under one
++ * or more contributor license agreements.  See the NOTICE file
++ * distributed with this work for additional information
++ * regarding copyright ownership.  The ASF licenses this file
++ * to you under the Apache License, Version 2.0 (the
++ * "License"); you may not use this file except in compliance
++ * with the License.  You may obtain a copy of the License at
++ *
++ * http://www.apache.org/licenses/LICENSE-2.0
++ *
++ * Unless required by applicable law or agreed to in writing, software
++ * distributed under the License is distributed on an "AS IS" BASIS,
++ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++ * See the License for the specific language governing permissions and
++ * limitations under the License.
++ */
++
++/**
++ * Base classes in node-js for storm Bolt and Spout.
++ * Implements the storm multilang protocol for nodejs.
++ */
++
++
++var fs = require('fs');
++
++function Storm() {
++    this.messagePart = "";
++    this.taskIdsCallbacks = [];
++    this.isFirstMessage = true;
++    this.separator = '\nend\n';
++}
++
++Storm.prototype.sendMsgToParent = function(msg) {
++    var str = JSON.stringify(msg);
++    process.stdout.write(str + this.separator);
++}
++
++Storm.prototype.sync = function() {
++    this.sendMsgToParent({"command":"sync"});
++}
++
++Storm.prototype.sendPid = function(heartbeatdir) {
++    var pid = process.pid;
++    fs.closeSync(fs.openSync(heartbeatdir + "/" + pid, "w"));
++    this.sendMsgToParent({"pid": pid})
++}
++
++Storm.prototype.log = function(msg) {
++    this.sendMsgToParent({"command": "log", "msg": msg});
++}
++
++Storm.prototype.initSetupInfo = function(setupInfo) {
++    var self = this;
++    var callback = function() {
++        self.sendPid(setupInfo['pidDir']);
++    }
++    this.initialize(setupInfo['conf'], setupInfo['context'], callback);
++}
++
++Storm.prototype.startReadingInput = function() {
++    var self = this;
++    process.stdin.on('readable', function() {
++        var chunk = process.stdin.read();
++        var messages = self.handleNewChunk(chunk);
++        messages.forEach(function(message) {
++            self.handleNewMessage(message);
++        })
++
++    });
++}
++
++/**
++ * receives a new string chunk and returns a list of new messages with the separator removed
++ * stores state in this.messagePart
++ * @param chunk
++ */
++Storm.prototype.handleNewChunk = function(chunk) {
++    //invariant: this.messagePart has no separator otherwise we would have parsed it already
++    var messages = [];
++    if (chunk && chunk.length !== 0) {
++        //"{}".split("\nend\n")           ==> ['{}']
++        //"\nend\n".split("\nend\n")      ==> [''  , '']
++        //"{}\nend\n".split("\nend\n")    ==> ['{}', '']
++        //"\nend\n{}".split("\nend\n")    ==> [''  , '{}']
++        // "{}\nend\n{}".split("\nend\n") ==> ['{}', '{}' ]
++        this.messagePart = this.messagePart + chunk;
++        var newMessageParts = this.messagePart.split(this.separator);
++        while (newMessageParts.length > 0) {
++            var potentialMessage = newMessageParts.shift();
++            var anotherMessageAhead = newMessageParts.length > 0;
++            if  (!anotherMessageAhead) {
++                this.messagePart = potentialMessage;
++            }
++            else if (potentialMessage.length > 0) {
++                messages.push(potentialMessage);
++            }
++        }
++    }
++    return messages;
++}
++
++Storm.prototype.isTaskIds = function(msg) {
++    return (msg instanceof Array);
++}
++
++Storm.prototype.handleNewMessage = function(msg) {
++    var parsedMsg = JSON.parse(msg);
++
++    if (this.isFirstMessage) {
++        this.initSetupInfo(parsedMsg);
++        this.isFirstMessage = false;
++    } else if (this.isTaskIds(parsedMsg)) {
++        this.handleNewTaskId(parsedMsg);
++    } else {
++        this.handleNewCommand(parsedMsg);
++    }
++}
++
++Storm.prototype.handleNewTaskId = function(taskIds) {
++    //When new list of task ids arrives, the callback that was passed with the corresponding emit should be called.
++    //Storm assures that the task ids will be sent in the same order as their corresponding emits, so we can simply
++    //take the first callback in the list and be sure it is the right one.
++
++    var callback = this.taskIdsCallbacks.shift();
++    if (callback) {
++        callback(taskIds);
++    } else {
++        throw new Error('Something went wrong, we ran out of task id callbacks');
++    }
++}
++
++
++
++/**
++ *
++ * @param messageDetails json with the emit details.
++ *
++ * For bolt, the json must contain the required fields:
++ * - tuple - the value to emit
++ * - anchorTupleId - the value of the anchor tuple (the input tuple that led to this emit). Used to track the source
++ * tuple and return an ack when all components have successfully finished processing it.
++ * and may contain the optional fields:
++ * - stream (if empty - emit to default stream)
++ *
++ * For spout, the json must contain the required fields:
++ * - tuple - the value to emit
++ *
++ * and may contain the optional fields:
++ * - id - pass id for reliable emit (and receive ack/fail later).
++ * - stream - if empty - emit to default stream.
++ *
++ * @param onTaskIds function that will be called with the list of task ids the message was emitted to (when received).
++ */
++Storm.prototype.emit = function(messageDetails, onTaskIds) {
++    //Every emit triggers a response - list of task ids to which the tuple was emitted. The task ids are accessible
++    //through the callback (will be called when the response arrives). The callback is stored in a list until the
++    //corresponding task id list arrives.
++    if (messageDetails.task) {
++        throw new Error('Illegal input - task. To emit to specific task use emit direct!');
++    }
++
++    if (!onTaskIds) {
++        throw new Error('You must pass an onTaskIds callback when using emit!');
++    }
++
++    this.taskIdsCallbacks.push(onTaskIds);
++    this.__emit(messageDetails);
++}
++
++
++/**
++ * Emit message to specific task.
++ * @param messageDetails json with the emit details.
++ *
++ * For bolt, the json must contain the required fields:
++ * - tuple - the value to emit
++ * - anchorTupleId - the value of the anchor tuple (the input tuple that led to this emit). Used to track the source
++ * tuple and return an ack when all components have successfully finished processing it.
++ * - task - indicate the task to send the tuple to.
++ * and may contain the optional fields:
++ * - stream (if empty - emit to default stream)
++ *
++ * For spout, the json must contain the required fields:
++ * - tuple - the value to emit
++ * - task - indicate the task to send the tuple to.
++ * and may contain the optional fields:
++ * - id - pass id for reliable emit (and receive ack/fail later).
++ * - stream - if empty - emit to default stream.
++ *
++ * @param onTaskIds function that will be called with the list of task ids the message was emitted to (when received).
++ */
++Storm.prototype.emitDirect = function(commandDetails) {
++    if (!commandDetails.task) {
++        throw new Error("Emit direct must receive task id!")
++    }
++    this.__emit(commandDetails);
++}
++
++/**
++ * Initialize storm component according to the configuration received.
++ * @param conf configuration object according to the storm protocol.
++ * @param context context object according to storm protocol.
++ * @param done callback. Call this method when finished initializing.
++ */
++Storm.prototype.initialize = function(conf, context, done) {
++    done();
++}
++
++Storm.prototype.run = function() {
++    process.stdout.setEncoding('utf8');
++    process.stdin.setEncoding('utf8');
++    this.startReadingInput();
++}
++
++function Tuple(id, component, stream, task, values) {
++    this.id = id;
++    this.component = component;
++    this.stream = stream;
++    this.task = task;
++    this.values = values;
++}
++
++/**
++ * Base class for storm bolt.
++ * To create a bolt implement 'process' method.
++ * You may also implement initialize method to
++ */
++function BasicBolt() {
++    Storm.call(this);
++    this.anchorTuple = null;
++};
++
++BasicBolt.prototype = Object.create(Storm.prototype);
++BasicBolt.prototype.constructor = BasicBolt;
++
++/**
++ * Emit message.
++ * @param commandDetails json with the required fields:
++ * - tuple - the value to emit
++ * - anchorTupleId - the value of the anchor tuple (the input tuple that led to this emit). Used to track the source
++ * tuple and return an ack when all components have successfully finished processing it.
++ * and the optional fields:
++ * - stream (if empty - emit to default stream)
++ * - task (pass only to emit to specific task)
++ */
++BasicBolt.prototype.__emit = function(commandDetails) {
++    var self = this;
++
++    var message = {
++        command: "emit",
++        tuple: commandDetails.tuple,
++        stream: commandDetails.stream,
++        task: commandDetails.task,
++        anchors: [commandDetails.anchorTupleId]
++    };
++
++    this.sendMsgToParent(message);
++}
++
++BasicBolt.prototype.handleNewCommand = function(command) {
++    var self = this;
++    var tup = new Tuple(command["id"], command["comp"], command["stream"], command["task"], command["tuple"]);
++
++    if (tup.task === -1 && tup.stream === "__heartbeat") {
++        self.sync();
++        return;
++    }
++
++    var callback = function(err) {
++        if (err) {
++            self.fail(tup, err);
++            return;
++        }
++        self.ack(tup);
++    }
++    this.process(tup, callback);
++}
++
++/**
++ * Implement this method when creating a bolt. This is the main method that provides the logic of the bolt (what
++ * should it do?).
++ * @param tuple the input of the bolt - what to process.
++ * @param done call this method when done processing.
++ */
++BasicBolt.prototype.process = function(tuple, done) {};
++
++BasicBolt.prototype.ack = function(tup) {
++    this.sendMsgToParent({"command": "ack", "id": tup.id});
++}
++
++BasicBolt.prototype.fail = function(tup, err) {
++    this.sendMsgToParent({"command": "fail", "id": tup.id});
++}
++
++
++/**
++ * Base class for storm spout.
++ * To create a spout, implement the following methods: nextTuple, ack and fail (nextTuple is mandatory; ack and fail
++ * can stay empty).
++ * You may also implement the 'initialize' method.
++ *
++ */
++function Spout() {
++    Storm.call(this);
++};
++
++Spout.prototype = Object.create(Storm.prototype);
++
++Spout.prototype.constructor = Spout;
++
++/**
++ * This method will be called when an ack is received for a previously sent tuple. One may implement it.
++ * @param id The id of the tuple.
++ * @param done Call this method when finished and ready to receive more tuples.
++ */
++Spout.prototype.ack = function(id, done) {};
++
++/**
++ * This method will be called when a fail is received for a previously sent tuple. One may implement it (for example,
++ * log the failure or send the tuple again).
++ * @param id The id of the tuple.
++ * @param done Call this method when finished and ready to receive more tuples.
++ */
++Spout.prototype.fail = function(id, done) {};
++
++/**
++ * Method that indicates it is time to emit the next tuple.
++ * @param done call this method when done sending the output.
++ */
++Spout.prototype.nextTuple = function(done) {};
++
++Spout.prototype.handleNewCommand = function(command) {
++    var self = this;
++    var callback = function() {
++        self.sync();
++    }
++
++    if (command["command"] === "next") {
++        this.nextTuple(callback);
++    }
++
++    if (command["command"] === "ack") {
++        this.ack(command["id"], callback);
++    }
++
++    if (command["command"] === "fail") {
++        this.fail(command["id"], callback);
++    }
++}
++
++/**
++ * @param commandDetails json with the required fields:
++ * - tuple - the value to emit.
++ * and the optional fields:
++ * - id - pass id for reliable emit (and receive ack/fail later).
++ * - stream - if empty - emit to default stream.
++ * - task - pass only to emit to specific task.
++ */
++Spout.prototype.__emit = function(commandDetails) {
++    var message = {
++        command: "emit",
++        tuple: commandDetails.tuple,
++        id: commandDetails.id,
++        stream: commandDetails.stream,
++        task: commandDetails.task
++    };
++
++    this.sendMsgToParent(message);
++}
++
++module.exports.BasicBolt = BasicBolt;
++module.exports.Spout = Spout;

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-wrappers/src/main/resources/resources/storm.py
----------------------------------------------------------------------
diff --cc external/flux/flux-wrappers/src/main/resources/resources/storm.py
index 0000000,0000000..642c393
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-wrappers/src/main/resources/resources/storm.py
@@@ -1,0 -1,0 +1,260 @@@
++# -*- coding: utf-8 -*-
++
++# Licensed to the Apache Software Foundation (ASF) under one
++# or more contributor license agreements.  See the NOTICE file
++# distributed with this work for additional information
++# regarding copyright ownership.  The ASF licenses this file
++# to you under the Apache License, Version 2.0 (the
++# "License"); you may not use this file except in compliance
++# with the License.  You may obtain a copy of the License at
++#
++# http://www.apache.org/licenses/LICENSE-2.0
++#
++# Unless required by applicable law or agreed to in writing, software
++# distributed under the License is distributed on an "AS IS" BASIS,
++# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++# See the License for the specific language governing permissions and
++# limitations under the License.
++
++import sys
++import os
++import traceback
++from collections import deque
++
++try:
++    import simplejson as json
++except ImportError:
++    import json
++
++json_encode = lambda x: json.dumps(x)
++json_decode = lambda x: json.loads(x)
++
++#reads lines and reconstructs newlines appropriately
++def readMsg():
++    msg = ""
++    while True:
++        line = sys.stdin.readline()
++        if not line:
++            raise Exception('Read EOF from stdin')
++        if line[0:-1] == "end":
++            break
++        msg = msg + line
++    return json_decode(msg[0:-1])
++
++MODE = None
++ANCHOR_TUPLE = None
++
++#queue up commands we read while trying to read taskids
++pending_commands = deque()
++
++def readTaskIds():
++    if pending_taskids:
++        return pending_taskids.popleft()
++    else:
++        msg = readMsg()
++        while type(msg) is not list:
++            pending_commands.append(msg)
++            msg = readMsg()
++        return msg
++
++#queue up taskids we read while trying to read commands/tuples
++pending_taskids = deque()
++
++def readCommand():
++    if pending_commands:
++        return pending_commands.popleft()
++    else:
++        msg = readMsg()
++        while type(msg) is list:
++            pending_taskids.append(msg)
++            msg = readMsg()
++        return msg
++
++def readTuple():
++    cmd = readCommand()
++    return Tuple(cmd["id"], cmd["comp"], cmd["stream"], cmd["task"], cmd["tuple"])
++
++def sendMsgToParent(msg):
++    print json_encode(msg)
++    print "end"
++    sys.stdout.flush()
++
++def sync():
++    sendMsgToParent({'command':'sync'})
++
++def sendpid(heartbeatdir):
++    pid = os.getpid()
++    sendMsgToParent({'pid':pid})
++    open(heartbeatdir + "/" + str(pid), "w").close()
++
++def emit(*args, **kwargs):
++    __emit(*args, **kwargs)
++    return readTaskIds()
++
++def emitDirect(task, *args, **kwargs):
++    kwargs["directTask"] = task
++    __emit(*args, **kwargs)
++
++def __emit(*args, **kwargs):
++    global MODE
++    if MODE == Bolt:
++        emitBolt(*args, **kwargs)
++    elif MODE == Spout:
++        emitSpout(*args, **kwargs)
++
++def emitBolt(tup, stream=None, anchors = [], directTask=None):
++    global ANCHOR_TUPLE
++    if ANCHOR_TUPLE is not None:
++        anchors = [ANCHOR_TUPLE]
++    m = {"command": "emit"}
++    if stream is not None:
++        m["stream"] = stream
++    m["anchors"] = map(lambda a: a.id, anchors)
++    if directTask is not None:
++        m["task"] = directTask
++    m["tuple"] = tup
++    sendMsgToParent(m)
++
++def emitSpout(tup, stream=None, id=None, directTask=None):
++    m = {"command": "emit"}
++    if id is not None:
++        m["id"] = id
++    if stream is not None:
++        m["stream"] = stream
++    if directTask is not None:
++        m["task"] = directTask
++    m["tuple"] = tup
++    sendMsgToParent(m)
++
++def ack(tup):
++    sendMsgToParent({"command": "ack", "id": tup.id})
++
++def fail(tup):
++    sendMsgToParent({"command": "fail", "id": tup.id})
++
++def reportError(msg):
++    sendMsgToParent({"command": "error", "msg": msg})
++
++def log(msg, level=2):
++    sendMsgToParent({"command": "log", "msg": msg, "level":level})
++
++def logTrace(msg):
++    log(msg, 0)
++
++def logDebug(msg):
++    log(msg, 1)
++
++def logInfo(msg):
++    log(msg, 2)
++
++def logWarn(msg):
++    log(msg, 3)
++
++def logError(msg):
++    log(msg, 4)
++
++def rpcMetrics(name, params):
++    sendMsgToParent({"command": "metrics", "name": name, "params": params})
++
++def initComponent():
++    setupInfo = readMsg()
++    sendpid(setupInfo['pidDir'])
++    return [setupInfo['conf'], setupInfo['context']]
++
++class Tuple(object):
++    def __init__(self, id, component, stream, task, values):
++        self.id = id
++        self.component = component
++        self.stream = stream
++        self.task = task
++        self.values = values
++
++    def __repr__(self):
++        return '<%s%s>' % (
++            self.__class__.__name__,
++            ''.join(' %s=%r' % (k, self.__dict__[k]) for k in sorted(self.__dict__.keys())))
++
++    def is_heartbeat_tuple(self):
++        return self.task == -1 and self.stream == "__heartbeat"
++
++class Bolt(object):
++    def initialize(self, stormconf, context):
++        pass
++
++    def process(self, tuple):
++        pass
++
++    def run(self):
++        global MODE
++        MODE = Bolt
++        conf, context = initComponent()
++        try:
++            self.initialize(conf, context)
++            while True:
++                tup = readTuple()
++                if tup.is_heartbeat_tuple():
++                    sync()
++                else:
++                    self.process(tup)
++        except Exception, e:
++            reportError(traceback.format_exc(e))
++
++class BasicBolt(object):
++    def initialize(self, stormconf, context):
++        pass
++
++    def process(self, tuple):
++        pass
++
++    def run(self):
++        global MODE
++        MODE = Bolt
++        global ANCHOR_TUPLE
++        conf, context = initComponent()
++        try:
++            self.initialize(conf, context)
++            while True:
++                tup = readTuple()
++                if tup.is_heartbeat_tuple():
++                    sync()
++                else:
++                    ANCHOR_TUPLE = tup
++                    try:
++                        self.process(tup)
++                        ack(tup)
++                    except Exception, e:
++                        reportError(traceback.format_exc(e))
++                        fail(tup)
++        except Exception, e:
++            reportError(traceback.format_exc(e))
++
++class Spout(object):
++    def initialize(self, conf, context):
++        pass
++
++    def ack(self, id):
++        pass
++
++    def fail(self, id):
++        pass
++
++    def nextTuple(self):
++        pass
++
++    def run(self):
++        global MODE
++        MODE = Spout
++        conf, context = initComponent()
++        try:
++            self.initialize(conf, context)
++            while True:
++                msg = readCommand()
++                if msg["command"] == "next":
++                    self.nextTuple()
++                if msg["command"] == "ack":
++                    self.ack(msg["id"])
++                if msg["command"] == "fail":
++                    self.fail(msg["id"])
++                sync()
++        except Exception, e:
++            reportError(traceback.format_exc(e))

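The two scripts above (storm.js and storm.py) both frame every message as JSON text followed by a line containing only "end". As a reference, a minimal Java sketch of that framing is shown below; the helper class is hypothetical and not part of Storm's public API.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Writer;

// Hypothetical helper illustrating the multilang message framing used above:
// write a JSON payload followed by a line containing only "end", and read
// messages back by accumulating lines until that terminator is seen.
public final class MultilangFraming {
    private MultilangFraming() {}

    public static void writeMessage(Writer out, String json) throws IOException {
        out.write(json);
        out.write("\nend\n");
        out.flush();
    }

    public static String readMessage(BufferedReader in) throws IOException {
        StringBuilder msg = new StringBuilder();
        String line;
        while ((line = in.readLine()) != null && !"end".equals(line)) {
            msg.append(line).append('\n');
        }
        return msg.toString();
    }
}
```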
http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/pom.xml
----------------------------------------------------------------------
diff --cc external/flux/pom.xml
index 0000000,0000000..5ea1b40
new file mode 100644
--- /dev/null
+++ b/external/flux/pom.xml
@@@ -1,0 -1,0 +1,126 @@@
++<?xml version="1.0" encoding="UTF-8"?>
++<!--
++ Licensed to the Apache Software Foundation (ASF) under one or more
++ contributor license agreements.  See the NOTICE file distributed with
++ this work for additional information regarding copyright ownership.
++ The ASF licenses this file to You under the Apache License, Version 2.0
++ (the "License"); you may not use this file except in compliance with
++ the License.  You may obtain a copy of the License at
++
++     http://www.apache.org/licenses/LICENSE-2.0
++
++ Unless required by applicable law or agreed to in writing, software
++ distributed under the License is distributed on an "AS IS" BASIS,
++ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++ See the License for the specific language governing permissions and
++ limitations under the License.
++-->
++<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
++    <modelVersion>4.0.0</modelVersion>
++
++    <groupId>com.github.ptgoetz</groupId>
++    <artifactId>flux</artifactId>
++    <version>0.3.1-SNAPSHOT</version>
++    <packaging>pom</packaging>
++    <name>flux</name>
++    <url>https://github.com/ptgoetz/flux</url>
++
++    <parent>
++        <groupId>org.sonatype.oss</groupId>
++        <artifactId>oss-parent</artifactId>
++        <version>7</version>
++    </parent>
++    <scm>
++        <connection>scm:git:git@github.com:ptgoetz/flux.git</connection>
++        <developerConnection>scm:git:git@github.com:ptgoetz/flux.git</developerConnection>
++        <url>git@github.com:ptgoetz/flux.git</url>
++    </scm>
++
++    <developers>
++        <developer>
++            <id>ptgoetz</id>
++            <name>P. Taylor Goetz</name>
++            <email>ptgoetz@apache.org</email>
++        </developer>
++    </developers>
++
++    <properties>
++        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
++        <storm.version>0.9.3</storm.version>
++        <!-- see comment below... This fixes an annoyance with intellij -->
++        <provided.scope>provided</provided.scope>
++    </properties>
++
++    <profiles>
++        <!--
++            Hack to make intellij behave.
++            If you use intellij, enable this profile in your IDE.
++            It should make life easier.
++        -->
++        <profile>
++            <id>intellij</id>
++            <properties>
++                <provided.scope>compile</provided.scope>
++            </properties>
++        </profile>
++    </profiles>
++
++    <modules>
++        <module>flux-wrappers</module>
++        <module>flux-core</module>
++        <module>flux-examples</module>
++    </modules>
++
++    <dependencies>
++        <dependency>
++            <groupId>org.apache.storm</groupId>
++            <artifactId>storm-core</artifactId>
++            <version>${storm.version}</version>
++            <scope>${provided.scope}</scope>
++        </dependency>
++        <dependency>
++            <groupId>commons-cli</groupId>
++            <artifactId>commons-cli</artifactId>
++            <version>1.2</version>
++        </dependency>
++        <dependency>
++            <groupId>org.apache.kafka</groupId>
++            <artifactId>kafka_2.10</artifactId>
++            <version>0.8.1.1</version>
++            <scope>test</scope>
++            <exclusions>
++                <exclusion>
++                    <groupId>org.apache.zookeeper</groupId>
++                    <artifactId>zookeeper</artifactId>
++                </exclusion>
++                <exclusion>
++                    <groupId>log4j</groupId>
++                    <artifactId>log4j</artifactId>
++                </exclusion>
++            </exclusions>
++        </dependency>
++        <dependency>
++            <groupId>junit</groupId>
++            <artifactId>junit</artifactId>
++            <version>4.11</version>
++            <scope>test</scope>
++        </dependency>
++    </dependencies>
++    <build>
++        <resources>
++
++        </resources>
++        <plugins>
++            <plugin>
++                <groupId>org.apache.maven.plugins</groupId>
++                <artifactId>maven-compiler-plugin</artifactId>
++                <version>3.3</version>
++                <configuration>
++                    <source>1.6</source>
++                    <target>1.6</target>
++                    <encoding>UTF-8</encoding>
++                </configuration>
++            </plugin>
++        </plugins>
++    </build>
++</project>
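
A note on the provided.scope indirection above: storm-core needs to stay in
"provided" scope for normal builds so it is not packaged into the shaded
topology jar (the Storm classes are already on the worker classpath), but
IDEs such as IntelliJ have historically left provided-scope dependencies off
the run classpath. The "intellij" profile flips the scope to "compile" to
work around that. Besides enabling it in the IDE, the profile can also be
activated on the command line, for example "mvn clean package -Pintellij".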


[35/50] [abbrv] storm git commit: Remove the in-tree EventHubs client, add a dependency on the eventhubs-client package, and use ResilientEventHubReceiver

Posted by pt...@apache.org.
Remove the in-tree EventHubs client, add a dependency on the eventhubs-client package, and use ResilientEventHubReceiver

Signed-off-by: Shanyu Zhao <sh...@microsoft.com>


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/85aeb3d4
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/85aeb3d4
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/85aeb3d4

Branch: refs/heads/0.10.x-branch
Commit: 85aeb3d48efcaa28d6fc5dbfe6ce87af9f3e2615
Parents: 1f13f15
Author: Shanyu Zhao <sh...@microsoft.com>
Authored: Sun May 17 01:14:01 2015 -0700
Committer: Shanyu Zhao <sh...@microsoft.com>
Committed: Sun May 17 01:14:01 2015 -0700

----------------------------------------------------------------------
 external/storm-eventhubs/pom.xml                |  13 +-
 .../storm/eventhubs/bolt/EventHubBolt.java      |   6 +-
 .../eventhubs/bolt/EventHubBoltConfig.java      |   4 +-
 .../client/ConnectionStringBuilder.java         | 116 ----------------
 .../storm/eventhubs/client/Constants.java       |  32 -----
 .../storm/eventhubs/client/EventHubClient.java  |  95 -------------
 .../eventhubs/client/EventHubConsumerGroup.java |  72 ----------
 .../eventhubs/client/EventHubException.java     |  37 -----
 .../eventhubs/client/EventHubReceiver.java      | 139 -------------------
 .../eventhubs/client/EventHubSendClient.java    |  70 ----------
 .../storm/eventhubs/client/EventHubSender.java  |  99 -------------
 .../storm/eventhubs/client/SelectorFilter.java  |  38 -----
 .../eventhubs/client/SelectorFilterWriter.java  |  64 ---------
 .../eventhubs/spout/EventHubReceiverFilter.java |  56 --------
 .../eventhubs/spout/EventHubReceiverImpl.java   |  50 +++----
 .../eventhubs/spout/EventHubSpoutConfig.java    |  31 +----
 .../eventhubs/spout/IEventHubReceiver.java      |   5 +-
 .../spout/IEventHubReceiverFilter.java          |  35 -----
 .../eventhubs/spout/SimplePartitionManager.java |  11 +-
 .../spout/StaticPartitionCoordinator.java       |   2 +-
 .../TransactionalTridentEventHubEmitter.java    |   2 +-
 .../trident/TridentPartitionManager.java        |  12 +-
 .../eventhubs/spout/EventHubReceiverMock.java   |  18 +--
 23 files changed, 56 insertions(+), 951 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/85aeb3d4/external/storm-eventhubs/pom.xml
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/pom.xml b/external/storm-eventhubs/pom.xml
index 2dfb739..6d4a47b 100755
--- a/external/storm-eventhubs/pom.xml
+++ b/external/storm-eventhubs/pom.xml
@@ -33,7 +33,7 @@
 
     <properties>
         <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
-        <qpid.version>0.32</qpid.version>
+        <eventhubs.client.version>0.9</eventhubs.client.version>
     </properties>
     <build>
         <plugins>
@@ -77,14 +77,9 @@
     </build>
     <dependencies>
         <dependency>
-            <groupId>org.apache.qpid</groupId>
-            <artifactId>qpid-client</artifactId>
-            <version>${qpid.version}</version>
-        </dependency>
-        <dependency>
-            <groupId>org.apache.qpid</groupId>
-            <artifactId>qpid-amqp-1-0-client-jms</artifactId>
-            <version>${qpid.version}</version>
+            <groupId>com.microsoft.eventhubs.client</groupId>
+            <artifactId>eventhubs-client</artifactId>
+            <version>${eventhubs.client.version}</version>
         </dependency>
         <dependency>
             <groupId>org.apache.storm</groupId>

http://git-wip-us.apache.org/repos/asf/storm/blob/85aeb3d4/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/EventHubBolt.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/EventHubBolt.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/EventHubBolt.java
index a817744..9acf7fa 100755
--- a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/EventHubBolt.java
+++ b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/EventHubBolt.java
@@ -22,9 +22,9 @@ import java.util.Map;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 
-import org.apache.storm.eventhubs.client.EventHubClient;
-import org.apache.storm.eventhubs.client.EventHubException;
-import org.apache.storm.eventhubs.client.EventHubSender;
+import com.microsoft.eventhubs.client.EventHubClient;
+import com.microsoft.eventhubs.client.EventHubException;
+import com.microsoft.eventhubs.client.EventHubSender;
 
 import backtype.storm.task.OutputCollector;
 import backtype.storm.task.TopologyContext;

http://git-wip-us.apache.org/repos/asf/storm/blob/85aeb3d4/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/EventHubBoltConfig.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/EventHubBoltConfig.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/EventHubBoltConfig.java
index 4383a72..10b4e39 100644
--- a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/EventHubBoltConfig.java
+++ b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/EventHubBoltConfig.java
@@ -20,6 +20,7 @@ package org.apache.storm.eventhubs.bolt;
 import java.io.Serializable;
 
 import org.apache.storm.eventhubs.spout.EventHubSpoutConfig;
+import com.microsoft.eventhubs.client.ConnectionStringBuilder;
 
 /*
  * EventHubs bolt configurations
@@ -80,7 +81,8 @@ public class EventHubBoltConfig implements Serializable {
   public EventHubBoltConfig(String userName, String password, String namespace,
       String targetFqnAddress, String entityPath, boolean partitionMode,
       IEventDataFormat dataFormat) {
-    this.connectionString = EventHubSpoutConfig.buildConnectionString(userName, password, namespace, targetFqnAddress);
+    this.connectionString = new ConnectionStringBuilder(userName, password,
+    		namespace, targetFqnAddress).getConnectionString();
     this.entityPath = entityPath;
     this.partitionMode = partitionMode;
     this.dataFormat = dataFormat;
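
For reference, a minimal sketch of the replacement connection-string builder
as it is used in the hunk above (the credentials, namespace and FQDN suffix
below are placeholders, not values from this patch):

import com.microsoft.eventhubs.client.ConnectionStringBuilder;

public class ConnectionStringExample {
  public static void main(String[] args) {
    // Same 4-argument constructor as in EventHubBoltConfig above:
    // (userName, password, namespace, targetFqnAddress).
    String connectionString = new ConnectionStringBuilder(
        "mySasKeyName",            // placeholder SAS policy name
        "mySasKey",                // placeholder SAS key
        "mynamespace",             // placeholder Service Bus namespace
        "servicebus.windows.net")  // placeholder FQDN suffix
        .getConnectionString();
    System.out.println(connectionString);
  }
}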

http://git-wip-us.apache.org/repos/asf/storm/blob/85aeb3d4/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/ConnectionStringBuilder.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/ConnectionStringBuilder.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/ConnectionStringBuilder.java
deleted file mode 100755
index 518c88d..0000000
--- a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/ConnectionStringBuilder.java
+++ /dev/null
@@ -1,116 +0,0 @@
-/*******************************************************************************
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- *******************************************************************************/
-package org.apache.storm.eventhubs.client;
-
-import java.io.IOException;
-import java.net.MalformedURLException;
-import java.net.URL;
-import java.net.URLConnection;
-import java.net.URLDecoder;
-import java.net.URLStreamHandler;
-
-public class ConnectionStringBuilder {
-
-  private final String connectionString;
-
-  private String host;
-  private int port;
-  private String userName;
-  private String password;
-  private boolean ssl;
-
-  // amqps://[username]:[password]@[namespace].servicebus.windows.net/
-  public ConnectionStringBuilder(String connectionString) throws EventHubException {
-    this.connectionString = connectionString;
-    this.initialize();
-  }
-
-  public String getHost() {
-    return this.host;
-  }
-
-  public void setHost(String value) {
-    this.host = value;
-  }
-
-  public int getPort() {
-    return this.port;
-  }
-
-  public void setPort(int value) {
-    this.port = value;
-  }
-
-  public String getUserName() {
-    return this.userName;
-  }
-
-  public void setUserName(String value) {
-    this.userName = value;
-  }
-
-  public String getPassword() {
-    return this.password;
-  }
-
-  public void setPassword(String value) {
-    this.password = value;
-  }
-
-  public boolean getSsl() {
-    return this.ssl;
-  }
-
-  public void setSsl(boolean value) {
-    this.ssl = value;
-  }
-
-  private void initialize() throws EventHubException {
-
-    URL url;
-    try {
-      url = new URL(null, this.connectionString, new NullURLStreamHandler());
-    } catch (MalformedURLException e) {
-      throw new EventHubException("connectionString is not valid.", e);
-    }
-
-    String protocol = url.getProtocol();
-    this.ssl = protocol.equalsIgnoreCase(Constants.SslScheme);
-    this.host = url.getHost();
-    this.port = url.getPort();
-
-    if (this.port == -1) {
-      this.port = this.ssl ? Constants.DefaultSslPort : Constants.DefaultPort;
-    }
-
-    String userInfo = url.getUserInfo();
-    if (userInfo != null) {
-      String[] credentials = userInfo.split(":", 2);
-      this.userName = URLDecoder.decode(credentials[0]);
-      this.password = URLDecoder.decode(credentials[1]);
-    }
-  }
-
-  class NullURLStreamHandler extends URLStreamHandler {
-
-    @Override
-    protected URLConnection openConnection(URL u) throws IOException {
-      throw new UnsupportedOperationException("Not supported yet.");
-    }
-  }
-}

http://git-wip-us.apache.org/repos/asf/storm/blob/85aeb3d4/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/Constants.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/Constants.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/Constants.java
deleted file mode 100755
index d87ad53..0000000
--- a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/Constants.java
+++ /dev/null
@@ -1,32 +0,0 @@
-/*******************************************************************************
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- *******************************************************************************/
-package org.apache.storm.eventhubs.client;
-
-public class Constants {
-
-  public static final String DefaultStartingOffset = "-1";
-  public static final String SelectorFilterName = "apache.org:selector-filter:string";
-  public static final String OffsetFilterFormatString = "amqp.annotation.x-opt-offset > '%s'";
-  public static final String EnqueueTimeFilterFormatString = "amqp.annotation.x-opt-enqueuedtimeutc > %d";
-  public static final String ConsumerAddressFormatString = "%s/ConsumerGroups/%s/Partitions/%s";
-  public static final String DestinationAddressFormatString = "%s/Partitions/%s";
-
-  public static final String SslScheme = "amqps";
-  public static final int DefaultPort = 5672;
-  public static final int DefaultSslPort = 5671;
-}

http://git-wip-us.apache.org/repos/asf/storm/blob/85aeb3d4/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubClient.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubClient.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubClient.java
deleted file mode 100755
index 564a26f..0000000
--- a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubClient.java
+++ /dev/null
@@ -1,95 +0,0 @@
-/*******************************************************************************
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- *******************************************************************************/
-package org.apache.storm.eventhubs.client;
-
-import org.apache.qpid.amqp_1_0.client.Connection;
-import org.apache.qpid.amqp_1_0.client.ConnectionErrorException;
-import org.apache.qpid.amqp_1_0.client.ConnectionException;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
-public class EventHubClient {
-
-  private static final String DefaultConsumerGroupName = "$default";
-  private static final Logger logger = LoggerFactory.getLogger(EventHubClient.class);
-  private static final long ConnectionSyncTimeout = 60000L;
-
-  private final String connectionString;
-  private final String entityPath;
-  private final Connection connection;
-
-  private EventHubClient(String connectionString, String entityPath) throws EventHubException {
-    this.connectionString = connectionString;
-    this.entityPath = entityPath;
-    this.connection = this.createConnection();
-  }
-
-  /**
-   * creates a new instance of EventHubClient using the supplied connection string and entity path.
-   *
-   * @param connectionString connection string to the namespace of event hubs. connection string format:
-   * amqps://{userId}:{password}@{namespaceName}.servicebus.windows.net
-   * @param entityPath the name of event hub entity.
-   *
-   * @return EventHubClient
-   * @throws org.apache.storm.eventhubs.client.EventHubException
-   */
-  public static EventHubClient create(String connectionString, String entityPath) throws EventHubException {
-    return new EventHubClient(connectionString, entityPath);
-  }
-
-  public EventHubSender createPartitionSender(String partitionId) throws Exception {
-    return new EventHubSender(this.connection.createSession(), this.entityPath, partitionId);
-  }
-
-  public EventHubConsumerGroup getConsumerGroup(String cgName) {
-    if(cgName == null || cgName.length() == 0) {
-      cgName = DefaultConsumerGroupName;
-    }
-    return new EventHubConsumerGroup(connection, entityPath, cgName);
-  }
-
-  public void close() {
-    try {
-      this.connection.close();
-    } catch (ConnectionErrorException e) {
-      logger.error(e.toString());
-    }
-  }
-
-  private Connection createConnection() throws EventHubException {
-    ConnectionStringBuilder connectionStringBuilder = new ConnectionStringBuilder(this.connectionString);
-    Connection clientConnection;
-
-    try {
-      clientConnection = new Connection(
-        connectionStringBuilder.getHost(),
-        connectionStringBuilder.getPort(),
-        connectionStringBuilder.getUserName(),
-        connectionStringBuilder.getPassword(),
-        connectionStringBuilder.getHost(),
-        connectionStringBuilder.getSsl());
-    } catch (ConnectionException e) {
-      logger.error(e.toString());
-      throw new EventHubException(e);
-    }
-    clientConnection.getEndpoint().setSyncTimeout(ConnectionSyncTimeout);
-    SelectorFilterWriter.register(clientConnection.getEndpoint().getDescribedTypeRegistry());
-    return clientConnection;
-  }
-}

http://git-wip-us.apache.org/repos/asf/storm/blob/85aeb3d4/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubConsumerGroup.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubConsumerGroup.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubConsumerGroup.java
deleted file mode 100755
index 892ff9c..0000000
--- a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubConsumerGroup.java
+++ /dev/null
@@ -1,72 +0,0 @@
-/*******************************************************************************
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- *******************************************************************************/
-package org.apache.storm.eventhubs.client;
-
-import org.apache.qpid.amqp_1_0.client.Connection;
-import org.apache.qpid.amqp_1_0.client.ConnectionException;
-import org.apache.qpid.amqp_1_0.client.Session;
-
-public class EventHubConsumerGroup {
-
-  private final Connection connection;
-  private final String entityPath;
-  private final String consumerGroupName;
-
-  private Session session;
-
-  public EventHubConsumerGroup(Connection connection, String entityPath, String consumerGroupName) {
-    this.connection = connection;
-    this.entityPath = entityPath;
-    this.consumerGroupName = consumerGroupName;
-  }
-
-  public EventHubReceiver createReceiver(String partitionId, String startingOffset, int defaultCredits) throws EventHubException {
-    this.ensureSessionCreated();
-
-    if (startingOffset == null || startingOffset.equals("")) {
-      startingOffset = Constants.DefaultStartingOffset;
-    }
-
-    String filterStr = String.format(Constants.OffsetFilterFormatString, startingOffset);
-    return new EventHubReceiver(this.session, this.entityPath, this.consumerGroupName, partitionId, filterStr, defaultCredits);
-  }
-  
-  public EventHubReceiver createReceiver(String partitionId, long timeAfter, int defaultCredits) throws EventHubException {
-    this.ensureSessionCreated();
-
-    String filterStr = String.format(Constants.EnqueueTimeFilterFormatString, timeAfter);
-    return new EventHubReceiver(this.session, this.entityPath, this.consumerGroupName, partitionId, filterStr, defaultCredits);
-  }
-
-  public void close() {
-    if (this.session != null) {
-      this.session.close();
-    }
-  }
-
-  synchronized void ensureSessionCreated() throws EventHubException {
-
-    try {
-      if (this.session == null) {
-        this.session = this.connection.createSession();
-      }
-    } catch (ConnectionException e) {
-      throw new EventHubException(e);
-    }
-  }
-}

http://git-wip-us.apache.org/repos/asf/storm/blob/85aeb3d4/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubException.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubException.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubException.java
deleted file mode 100755
index 3e94573..0000000
--- a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubException.java
+++ /dev/null
@@ -1,37 +0,0 @@
-/*******************************************************************************
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- *******************************************************************************/
-package org.apache.storm.eventhubs.client;
-
-public class EventHubException extends Exception {
-
-  public EventHubException() {
-    super();
-  }
-
-  public EventHubException(String message) {
-    super(message);
-  }
-
-  public EventHubException(Throwable cause) {
-    super(cause);
-  }
-
-  public EventHubException(String message, Throwable cause) {
-    super(message, cause);
-  }
-}

http://git-wip-us.apache.org/repos/asf/storm/blob/85aeb3d4/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubReceiver.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubReceiver.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubReceiver.java
deleted file mode 100755
index c8900a8..0000000
--- a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubReceiver.java
+++ /dev/null
@@ -1,139 +0,0 @@
-/*******************************************************************************
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- *******************************************************************************/
-package org.apache.storm.eventhubs.client;
-
-import java.util.Collections;
-import java.util.Map;
-import org.apache.qpid.amqp_1_0.client.AcknowledgeMode;
-import org.apache.qpid.amqp_1_0.client.ConnectionErrorException;
-import org.apache.qpid.amqp_1_0.client.Message;
-import org.apache.qpid.amqp_1_0.client.Receiver;
-import org.apache.qpid.amqp_1_0.client.Session;
-import org.apache.qpid.amqp_1_0.type.Symbol;
-import org.apache.qpid.amqp_1_0.type.UnsignedInteger;
-import org.apache.qpid.amqp_1_0.type.messaging.Filter;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
-public final class EventHubReceiver {
-
-  private static final Logger logger = LoggerFactory
-      .getLogger(EventHubReceiver.class);
-  private static final String linkName = "eventhubs-receiver-link";
-
-  private final Session session;
-  private final String entityPath;
-  private final String consumerGroupName;
-  private final String partitionId;
-  private final String consumerAddress;
-  private final Map<Symbol, Filter> filters;
-  private final int defaultCredits;
-
-  private Receiver receiver;
-  private boolean isClosed;
-
-  public EventHubReceiver(Session session, String entityPath,
-      String consumerGroupName, String partitionId, String filterStr, int defaultCredits)
-      throws EventHubException {
-
-    this.session = session;
-    this.entityPath = entityPath;
-    this.consumerGroupName = consumerGroupName;
-    this.partitionId = partitionId;
-    this.consumerAddress = this.getConsumerAddress();
-    this.filters = Collections.singletonMap(
-        Symbol.valueOf(Constants.SelectorFilterName),
-        (Filter) new SelectorFilter(filterStr));
-    logger.info("receiver filter string: " + filterStr);
-    this.defaultCredits = defaultCredits;
-
-    this.ensureReceiverCreated();
-  }
-
-  // receive without timeout means wait until a message is delivered.
-  public Message receive() {
-    return this.receive(-1L);
-  }
-
-  public Message receive(long waitTimeInMilliseconds) {
-
-    this.checkIfClosed();
-
-    Message message = this.receiver.receive(waitTimeInMilliseconds);
-
-    if (message != null) {
-      // Let's acknowledge a message although EH service doesn't need it
-      // to avoid AMQP flow issue.
-      receiver.acknowledge(message);
-
-      return message;
-    } else {
-      this.checkError();
-    }
-
-    return null;
-  }
-
-  public void close() {
-    if (!isClosed) {
-      receiver.close();
-      isClosed = true;
-    }
-  }
-
-  private String getConsumerAddress() {
-    return String.format(Constants.ConsumerAddressFormatString,
-        entityPath, consumerGroupName, partitionId);
-  }
-
-  private void ensureReceiverCreated() throws EventHubException {
-    try {
-      logger.info("defaultCredits: " + defaultCredits);
-      receiver = session.createReceiver(consumerAddress,
-          AcknowledgeMode.ALO, linkName, false, filters, null);
-      receiver.setCredit(UnsignedInteger.valueOf(defaultCredits), true);
-    } catch (ConnectionErrorException e) {
-      // caller (EventHubSpout) will log the error
-      throw new EventHubException(e);
-    }
-  }
-
-  private void checkError() {
-    org.apache.qpid.amqp_1_0.type.transport.Error error = this.receiver.getError();
-    if (error != null) {
-      String errorMessage = error.toString();
-      logger.error(errorMessage);
-      this.close();
-
-      throw new RuntimeException(errorMessage);
-    } else {
-      // adding a sleep here to avoid any potential tight-loop issue.
-      try {
-        Thread.sleep(10);
-      } catch (InterruptedException e) {
-        logger.error(e.toString());
-      }
-    }
-  }
-  
-  private void checkIfClosed() {
-    if (this.isClosed) {
-      throw new RuntimeException("receiver was closed.");
-    }
-  }
-}

http://git-wip-us.apache.org/repos/asf/storm/blob/85aeb3d4/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubSendClient.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubSendClient.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubSendClient.java
deleted file mode 100755
index ad31cc1..0000000
--- a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubSendClient.java
+++ /dev/null
@@ -1,70 +0,0 @@
-/*******************************************************************************
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- *******************************************************************************/
-package org.apache.storm.eventhubs.client;
-
-import org.apache.storm.eventhubs.spout.EventHubSpoutConfig;
-
-public class EventHubSendClient {
-  
-  public static void main(String[] args) throws Exception {
-    
-    if (args == null || args.length < 7) {
-      throw new IllegalArgumentException(
-        "arguments are missing. [username] [password] [namespace] [entityPath] [partitionId] [messageSize] [messageCount] are required.");
-    }
-    
-    String username = args[0];
-    String password = args[1];
-    String namespace = args[2];
-    String entityPath = args[3];
-    String partitionId = args[4];
-    int messageSize = Integer.parseInt(args[5]);
-    int messageCount = Integer.parseInt(args[6]);
-    assert(messageSize > 0);
-    assert(messageCount > 0);
-    
-    if (partitionId.equals("-1")) {
-      // -1 means we want to send data to partitions in round-robin fashion.
-      partitionId = null;
-    }
-    
-    try {
-      String connectionString = EventHubSpoutConfig.buildConnectionString(username, password, namespace);
-      EventHubClient client = EventHubClient.create(connectionString, entityPath);
-      EventHubSender sender = client.createPartitionSender(partitionId);
-      
-      StringBuilder sb = new StringBuilder(messageSize);
-      for(int i=1; i<messageCount+1; ++i) {
-        while(sb.length() < messageSize) {
-          sb.append(" current message: " + i);
-        }
-        sb.setLength(messageSize);
-        sender.send(sb.toString());
-        sb.setLength(0);
-        if(i % 1000 == 0) {
-          System.out.println("Number of messages sent: " + i);
-        }
-      }
-      System.out.println("Total Number of messages sent: " + messageCount);
-    } catch (Exception e) {
-      System.out.println("Exception: " + e.getMessage());
-    }
-    
-    System.out.println("done");
-  }
-}
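
The command-line sender deleted above documents the client API that the
external com.microsoft.eventhubs.client package is expected to provide (the
bolt earlier in this patch compiles against it with only the imports
changed). A trimmed-down sketch of the same send path, with placeholder
connection details:

import com.microsoft.eventhubs.client.EventHubClient;
import com.microsoft.eventhubs.client.EventHubSender;

public class SendSketch {
  public static void main(String[] args) throws Exception {
    // Placeholders; the format matches the amqps:// style connection string
    // documented in the deleted ConnectionStringBuilder above.
    String connectionString = "amqps://mySasKeyName:mySasKey@mynamespace.servicebus.windows.net";
    String entityPath = "myeventhub";

    EventHubClient client = EventHubClient.create(connectionString, entityPath);
    // Per the deleted EventHubSendClient, a null partition id sends round-robin.
    EventHubSender sender = client.createPartitionSender(null);
    sender.send("hello from storm-eventhubs");
    client.close();
  }
}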

http://git-wip-us.apache.org/repos/asf/storm/blob/85aeb3d4/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubSender.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubSender.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubSender.java
deleted file mode 100755
index 435893e..0000000
--- a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubSender.java
+++ /dev/null
@@ -1,99 +0,0 @@
-/*******************************************************************************
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- *******************************************************************************/
-package org.apache.storm.eventhubs.client;
-
-import java.util.concurrent.TimeoutException;
-import org.apache.qpid.amqp_1_0.client.LinkDetachedException;
-import org.apache.qpid.amqp_1_0.client.Message;
-import org.apache.qpid.amqp_1_0.client.Sender;
-import org.apache.qpid.amqp_1_0.client.Session;
-import org.apache.qpid.amqp_1_0.type.Binary;
-import org.apache.qpid.amqp_1_0.type.messaging.Data;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
-public class EventHubSender {
-
-  private static final Logger logger = LoggerFactory.getLogger(EventHubSender.class);
-
-  private final Session session;
-  private final String entityPath;
-  private final String partitionId;
-  private final String destinationAddress;
-
-  private Sender sender;
-
-  public EventHubSender(Session session, String entityPath, String partitionId) {
-    this.session = session;
-    this.entityPath = entityPath;
-    this.partitionId = partitionId;
-    this.destinationAddress = this.getDestinationAddress();
-  }
-  
-  public void send(byte[] data) throws EventHubException {
-    try {
-      if (this.sender == null) {
-        this.ensureSenderCreated();
-      }
-
-      Binary bin = new Binary(data);
-      Message message = new Message(new Data(bin));
-      this.sender.send(message);
-
-    } catch (LinkDetachedException e) {
-      logger.error(e.getMessage());
-
-      EventHubException eventHubException = new EventHubException("Sender has been closed");
-      throw eventHubException;
-    } catch (TimeoutException e) {
-      logger.error(e.getMessage());
-
-      EventHubException eventHubException = new EventHubException("Timed out while waiting to get credit to send");
-      throw eventHubException;
-    } catch (Exception e) {
-      logger.error(e.getMessage());
-    }
-  }
-
-  public void send(String data) throws EventHubException {
-    //For interop with other language, convert string to bytes
-    send(data.getBytes());
-  }
-
-  public void close() {
-    try {
-      this.sender.close();
-    } catch (Sender.SenderClosingException e) {
-      logger.error("Closing a sender encountered error: " + e.getMessage());
-    }
-  }
-
-  private String getDestinationAddress() {
-    if (this.partitionId == null || this.partitionId.equals("")) {
-      return this.entityPath;
-    } else {
-      return String.format(Constants.DestinationAddressFormatString, this.entityPath, this.partitionId);
-    }
-  }
-
-  private synchronized void ensureSenderCreated() throws Exception {
-    if (this.sender == null) {
-      this.sender = this.session.createSender(this.destinationAddress);
-    }
-  }
-}

http://git-wip-us.apache.org/repos/asf/storm/blob/85aeb3d4/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/SelectorFilter.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/SelectorFilter.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/SelectorFilter.java
deleted file mode 100755
index 7869cce..0000000
--- a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/SelectorFilter.java
+++ /dev/null
@@ -1,38 +0,0 @@
-/*******************************************************************************
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- *******************************************************************************/
-package org.apache.storm.eventhubs.client;
-
-import org.apache.qpid.amqp_1_0.type.messaging.Filter;
-
-public class SelectorFilter implements Filter {
-
-  private final String value;
-
-  public SelectorFilter(String value) {
-    this.value = value;
-  }
-
-  public String getValue() {
-    return value;
-  }
-
-  @Override
-  public String toString() {
-    return value;
-  }
-}

http://git-wip-us.apache.org/repos/asf/storm/blob/85aeb3d4/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/SelectorFilterWriter.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/SelectorFilterWriter.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/SelectorFilterWriter.java
deleted file mode 100755
index 102b6b6..0000000
--- a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/SelectorFilterWriter.java
+++ /dev/null
@@ -1,64 +0,0 @@
-/*******************************************************************************
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- *******************************************************************************/
-package org.apache.storm.eventhubs.client;
-
-import org.apache.qpid.amqp_1_0.codec.AbstractDescribedTypeWriter;
-import org.apache.qpid.amqp_1_0.codec.ValueWriter;
-import org.apache.qpid.amqp_1_0.type.UnsignedLong;
-
-public class SelectorFilterWriter extends
-  AbstractDescribedTypeWriter<SelectorFilter> {
-
-  private static final ValueWriter.Factory<SelectorFilter> FACTORY = new ValueWriter.Factory<SelectorFilter>() {
-
-    @Override
-    public ValueWriter<SelectorFilter> newInstance(ValueWriter.Registry registry) {
-      return new SelectorFilterWriter(registry);
-    }
-  };
-
-  private SelectorFilter value;
-
-  public SelectorFilterWriter(final ValueWriter.Registry registry) {
-    super(registry);
-  }
-
-  public static void register(ValueWriter.Registry registry) {
-    registry.register(SelectorFilter.class, FACTORY);
-  }
-
-  @Override
-  protected void onSetValue(final SelectorFilter value) {
-    this.value = value;
-  }
-
-  @Override
-  protected void clear() {
-    value = null;
-  }
-
-  @Override
-  protected Object getDescriptor() {
-    return UnsignedLong.valueOf(0x00000137000000AL);
-  }
-
-  @Override
-  protected ValueWriter<String> createDescribedWriter() {
-    return getRegistry().getValueWriter(value.getValue());
-  }
-}

http://git-wip-us.apache.org/repos/asf/storm/blob/85aeb3d4/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/EventHubReceiverFilter.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/EventHubReceiverFilter.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/EventHubReceiverFilter.java
deleted file mode 100755
index e80cd25..0000000
--- a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/EventHubReceiverFilter.java
+++ /dev/null
@@ -1,56 +0,0 @@
-/*******************************************************************************
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- *******************************************************************************/
-package org.apache.storm.eventhubs.spout;
-
-
-public class EventHubReceiverFilter implements IEventHubReceiverFilter {
-  String offset = null;
-  long enqueueTime = 0;
-  public EventHubReceiverFilter() {
-    
-  }
-  
-  public EventHubReceiverFilter(String offset) {
-    //Creates offset only filter
-    this.offset = offset;
-  }
-  
-  public EventHubReceiverFilter(long enqueueTime) {
-    //Creates enqueue time only filter
-    this.enqueueTime = enqueueTime;
-  }
-  
-  public void setOffset(String offset) {
-    this.offset = offset;
-  }
-  
-  public void setEnqueueTime(long enqueueTime) {
-    this.enqueueTime = enqueueTime;
-  }
-  
-  @Override
-  public String getOffset() {
-    return offset;
-  }
-
-  @Override
-  public long getEnqueueTime() {
-    return enqueueTime;
-  }
-
-}

http://git-wip-us.apache.org/repos/asf/storm/blob/85aeb3d4/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/EventHubReceiverImpl.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/EventHubReceiverImpl.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/EventHubReceiverImpl.java
index 68302af..7454af4 100755
--- a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/EventHubReceiverImpl.java
+++ b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/EventHubReceiverImpl.java
@@ -25,10 +25,10 @@ import backtype.storm.metric.api.CountMetric;
 import backtype.storm.metric.api.MeanReducer;
 import backtype.storm.metric.api.ReducedMetric;
 
-import org.apache.storm.eventhubs.client.Constants;
-import org.apache.storm.eventhubs.client.EventHubClient;
-import org.apache.storm.eventhubs.client.EventHubException;
-import org.apache.storm.eventhubs.client.EventHubReceiver;
+import com.microsoft.eventhubs.client.Constants;
+import com.microsoft.eventhubs.client.EventHubException;
+import com.microsoft.eventhubs.client.IEventHubFilter;
+import com.microsoft.eventhubs.client.ResilientEventHubReceiver;
 
 import java.util.HashMap;
 import java.util.Map;
@@ -39,8 +39,8 @@ import org.apache.qpid.amqp_1_0.type.messaging.MessageAnnotations;
 
 public class EventHubReceiverImpl implements IEventHubReceiver {
   private static final Logger logger = LoggerFactory.getLogger(EventHubReceiverImpl.class);
-  private static final Symbol OffsetKey = Symbol.valueOf("x-opt-offset");
-  private static final Symbol SequenceNumberKey = Symbol.valueOf("x-opt-sequence-number");
+  private static final Symbol OffsetKey = Symbol.valueOf(Constants.OffsetKey);
+  private static final Symbol SequenceNumberKey = Symbol.valueOf(Constants.SequenceNumberKey);
 
   private final String connectionString;
   private final String entityName;
@@ -48,8 +48,7 @@ public class EventHubReceiverImpl implements IEventHubReceiver {
   private final int defaultCredits;
   private final String consumerGroupName;
 
-  private EventHubReceiver receiver;
-  private String lastOffset = null;
+  private ResilientEventHubReceiver receiver;
   private ReducedMetric receiveApiLatencyMean;
   private CountMetric receiveApiCallCount;
   private CountMetric receiveMessageCount;
@@ -66,27 +65,13 @@ public class EventHubReceiverImpl implements IEventHubReceiver {
   }
 
   @Override
-  public void open(IEventHubReceiverFilter filter) throws EventHubException {
-    logger.info("creating eventhub receiver: partitionId=" + partitionId + ", offset=" + filter.getOffset()
-        + ", enqueueTime=" + filter.getEnqueueTime());
+  public void open(IEventHubFilter filter) throws EventHubException {
+    logger.info("creating eventhub receiver: partitionId=" + partitionId + 
+    		", filterString=" + filter.getFilterString());
     long start = System.currentTimeMillis();
-    EventHubClient eventHubClient = EventHubClient.create(connectionString, entityName);
-    if(filter.getOffset() != null) {
-      receiver = eventHubClient
-          .getConsumerGroup(consumerGroupName)
-          .createReceiver(partitionId, filter.getOffset(), defaultCredits);
-    }
-    else if(filter.getEnqueueTime() != 0) {
-      receiver = eventHubClient
-          .getConsumerGroup(consumerGroupName)
-          .createReceiver(partitionId, filter.getEnqueueTime(), defaultCredits);
-    }
-    else {
-      logger.error("Invalid IEventHubReceiverFilter, use default offset as filter");
-      receiver = eventHubClient
-          .getConsumerGroup(consumerGroupName)
-          .createReceiver(partitionId, Constants.DefaultStartingOffset, defaultCredits);
-    }
+    receiver = new ResilientEventHubReceiver(connectionString, entityName,
+    		partitionId, consumerGroupName, defaultCredits, filter);
+    
     long end = System.currentTimeMillis();
     logger.info("created eventhub receiver, time taken(ms): " + (end-start));
   }
@@ -113,21 +98,20 @@ public class EventHubReceiverImpl implements IEventHubReceiver {
     long millis = (end - start);
     receiveApiLatencyMean.update(millis);
     receiveApiCallCount.incr();
-
+    
     if (message == null) {
       //Temporary workaround for AMQP/EH bug of failing to receive messages
-      if(timeoutInMilliseconds > 100 && millis < timeoutInMilliseconds/2) {
+      /*if(timeoutInMilliseconds > 100 && millis < timeoutInMilliseconds/2) {
         throw new RuntimeException(
             "Restart EventHubSpout due to failure of receiving messages in "
             + millis + " millisecond");
-      }
+      }*/
       return null;
     }
+
     receiveMessageCount.incr();
 
-    //logger.info(String.format("received a message. PartitionId: %s, Offset: %s", partitionId, this.lastOffset));
     MessageId messageId = createMessageId(message);
-
     return EventData.create(message, messageId);
   }
   

http://git-wip-us.apache.org/repos/asf/storm/blob/85aeb3d4/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/EventHubSpoutConfig.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/EventHubSpoutConfig.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/EventHubSpoutConfig.java
index 0238e40..77cd998 100755
--- a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/EventHubSpoutConfig.java
+++ b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/EventHubSpoutConfig.java
@@ -18,10 +18,9 @@
 package org.apache.storm.eventhubs.spout;
 
 import java.io.Serializable;
-import java.io.UnsupportedEncodingException;
-import java.net.URLEncoder;
 import java.util.ArrayList;
 import java.util.List;
+import com.microsoft.eventhubs.client.ConnectionStringBuilder;
 
 public class EventHubSpoutConfig implements Serializable {
   private static final long serialVersionUID = 1L; 
@@ -48,7 +47,8 @@ public class EventHubSpoutConfig implements Serializable {
       String entityPath, int partitionCount) {
     this.userName = username;
     this.password = password;
-    this.connectionString = buildConnectionString(username, password, namespace);
+    this.connectionString = new ConnectionStringBuilder(username, password,
+    		namespace).getConnectionString();
     this.namespace = namespace;
     this.entityPath = entityPath;
     this.partitionCount = partitionCount;
@@ -173,28 +173,7 @@ public class EventHubSpoutConfig implements Serializable {
   }
 
   public void setTargetAddress(String targetFqnAddress) {
-    this.connectionString = buildConnectionString(
-        userName, password, namespace, targetFqnAddress);
+    this.connectionString = new ConnectionStringBuilder(userName, password,
+    		namespace, targetFqnAddress).getConnectionString();
   }
-
-  public static String buildConnectionString(String username, String password, String namespace) {
-    return buildConnectionString(username, password, namespace, EH_SERVICE_FQDN_SUFFIX);
-  }
-
-  public static String buildConnectionString(String username, String password,
-      String namespace, String targetFqnAddress) {
-    return "amqps://" + username + ":" + encodeString(password)
-        + "@" + namespace + "." + targetFqnAddress;
-  }	
-
-  private static String encodeString(String input) {
-    try {
-      return URLEncoder.encode(input, "UTF-8");
-    } catch (UnsupportedEncodingException e) {
-      //We don't need to throw this exception because the exception won't
-      //happen because of user input. Our unit tests will catch this error.
-      return "";
-    }
-  }
-
 }

http://git-wip-us.apache.org/repos/asf/storm/blob/85aeb3d4/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/IEventHubReceiver.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/IEventHubReceiver.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/IEventHubReceiver.java
index 45e9e57..bc2db14 100755
--- a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/IEventHubReceiver.java
+++ b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/IEventHubReceiver.java
@@ -19,11 +19,12 @@ package org.apache.storm.eventhubs.spout;
 
 import java.util.Map;
 
-import org.apache.storm.eventhubs.client.EventHubException;
+import com.microsoft.eventhubs.client.EventHubException;
+import com.microsoft.eventhubs.client.IEventHubFilter;
 
 public interface IEventHubReceiver {
 
-  void open(IEventHubReceiverFilter filter) throws EventHubException;
+  void open(IEventHubFilter filter) throws EventHubException;
 
   void close();
   

http://git-wip-us.apache.org/repos/asf/storm/blob/85aeb3d4/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/IEventHubReceiverFilter.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/IEventHubReceiverFilter.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/IEventHubReceiverFilter.java
deleted file mode 100755
index e5b93cf..0000000
--- a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/IEventHubReceiverFilter.java
+++ /dev/null
@@ -1,35 +0,0 @@
-/*******************************************************************************
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- *******************************************************************************/
-package org.apache.storm.eventhubs.spout;
-
-/**
- * The filter to create an EventHubs receiver
- */
-public interface IEventHubReceiverFilter {
-  /**
-   * Get offset to filter events based on offset 
-   * @return null if offset not set
-   */
-  String getOffset();
-  
-  /**
-   * Get timestamp to filter events based on enqueue time.
-   * @return 0 if enqueue time is not set
-   */
-  long getEnqueueTime();
-}

http://git-wip-us.apache.org/repos/asf/storm/blob/85aeb3d4/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/SimplePartitionManager.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/SimplePartitionManager.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/SimplePartitionManager.java
index bcbcbac..b66a785 100755
--- a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/SimplePartitionManager.java
+++ b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/SimplePartitionManager.java
@@ -22,7 +22,10 @@ import java.util.Map;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 
-import org.apache.storm.eventhubs.client.Constants;
+import com.microsoft.eventhubs.client.Constants;
+import com.microsoft.eventhubs.client.EventHubEnqueueTimeFilter;
+import com.microsoft.eventhubs.client.EventHubOffsetFilter;
+import com.microsoft.eventhubs.client.IEventHubFilter;
 
 /**
  * A simple partition manager that does not re-send failed messages
@@ -62,13 +65,13 @@ public class SimplePartitionManager implements IPartitionManager {
       offset = Constants.DefaultStartingOffset;
     }
 
-    EventHubReceiverFilter filter = new EventHubReceiverFilter();
+    IEventHubFilter filter;
     if (offset.equals(Constants.DefaultStartingOffset)
         && config.getEnqueueTimeFilter() != 0) {
-      filter.setEnqueueTime(config.getEnqueueTimeFilter());
+      filter = new EventHubEnqueueTimeFilter(config.getEnqueueTimeFilter());
     }
     else {
-      filter.setOffset(offset);
+      filter = new EventHubOffsetFilter(offset);
     }
 
     receiver.open(filter);
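
Putting the new filter types together, a rough sketch of the receiver
start-up logic this hunk implements (the helper method and the stored-offset
handling are illustrative, not part of the patch):

import com.microsoft.eventhubs.client.EventHubEnqueueTimeFilter;
import com.microsoft.eventhubs.client.EventHubException;
import com.microsoft.eventhubs.client.EventHubOffsetFilter;
import com.microsoft.eventhubs.client.IEventHubFilter;
import org.apache.storm.eventhubs.spout.EventHubSpoutConfig;
import org.apache.storm.eventhubs.spout.IEventHubReceiver;

public class FilterSelectionSketch {
  // Use the enqueue-time filter only when no real offset has been
  // checkpointed ("-1" is the default starting offset); otherwise resume
  // from the stored offset, mirroring SimplePartitionManager above.
  static void openReceiver(IEventHubReceiver receiver,
                           EventHubSpoutConfig config,
                           String offset) throws EventHubException {
    IEventHubFilter filter;
    if ("-1".equals(offset) && config.getEnqueueTimeFilter() != 0) {
      filter = new EventHubEnqueueTimeFilter(config.getEnqueueTimeFilter());
    } else {
      filter = new EventHubOffsetFilter(offset);
    }
    receiver.open(filter);
  }
}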

http://git-wip-us.apache.org/repos/asf/storm/blob/85aeb3d4/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/StaticPartitionCoordinator.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/StaticPartitionCoordinator.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/StaticPartitionCoordinator.java
index 3f5f156..8d2c485 100755
--- a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/StaticPartitionCoordinator.java
+++ b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/StaticPartitionCoordinator.java
@@ -25,7 +25,7 @@ import java.util.Map;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 
-import org.apache.storm.eventhubs.client.Constants;
+import com.microsoft.eventhubs.client.Constants;
 
 public class StaticPartitionCoordinator implements IPartitionCoordinator {
 

http://git-wip-us.apache.org/repos/asf/storm/blob/85aeb3d4/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/trident/TransactionalTridentEventHubEmitter.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/trident/TransactionalTridentEventHubEmitter.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/trident/TransactionalTridentEventHubEmitter.java
index 2b92c3c..bf7f339 100755
--- a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/trident/TransactionalTridentEventHubEmitter.java
+++ b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/trident/TransactionalTridentEventHubEmitter.java
@@ -29,7 +29,7 @@ import org.apache.storm.eventhubs.spout.EventHubReceiverImpl;
 import org.apache.storm.eventhubs.spout.EventHubSpoutConfig;
 import org.apache.storm.eventhubs.spout.IEventHubReceiver;
 import org.apache.storm.eventhubs.spout.IEventHubReceiverFactory;
-import org.apache.storm.eventhubs.client.Constants;
+import com.microsoft.eventhubs.client.Constants;
 
 import storm.trident.operation.TridentCollector;
 import storm.trident.spout.IOpaquePartitionedTridentSpout;

http://git-wip-us.apache.org/repos/asf/storm/blob/85aeb3d4/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/trident/TridentPartitionManager.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/trident/TridentPartitionManager.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/trident/TridentPartitionManager.java
index 60391c3..159fe41 100755
--- a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/trident/TridentPartitionManager.java
+++ b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/trident/TridentPartitionManager.java
@@ -23,10 +23,12 @@ import java.util.List;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 
-import org.apache.storm.eventhubs.client.Constants;
-import org.apache.storm.eventhubs.client.EventHubException;
+import com.microsoft.eventhubs.client.Constants;
+import com.microsoft.eventhubs.client.EventHubEnqueueTimeFilter;
+import com.microsoft.eventhubs.client.EventHubException;
+import com.microsoft.eventhubs.client.EventHubOffsetFilter;
+
 import org.apache.storm.eventhubs.spout.EventData;
-import org.apache.storm.eventhubs.spout.EventHubReceiverFilter;
 import org.apache.storm.eventhubs.spout.EventHubSpoutConfig;
 import org.apache.storm.eventhubs.spout.IEventHubReceiver;
 
@@ -47,10 +49,10 @@ public class TridentPartitionManager implements ITridentPartitionManager {
     try {
       if((offset == null || offset.equals(Constants.DefaultStartingOffset)) 
         && spoutConfig.getEnqueueTimeFilter() != 0) {
-          receiver.open(new EventHubReceiverFilter(spoutConfig.getEnqueueTimeFilter()));
+          receiver.open(new EventHubEnqueueTimeFilter(spoutConfig.getEnqueueTimeFilter()));
       }
       else {
-        receiver.open(new EventHubReceiverFilter(offset));
+        receiver.open(new EventHubOffsetFilter(offset));
       }
       lastOffset = offset;
       return true;

http://git-wip-us.apache.org/repos/asf/storm/blob/85aeb3d4/external/storm-eventhubs/src/test/java/org/apache/storm/eventhubs/spout/EventHubReceiverMock.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/test/java/org/apache/storm/eventhubs/spout/EventHubReceiverMock.java b/external/storm-eventhubs/src/test/java/org/apache/storm/eventhubs/spout/EventHubReceiverMock.java
index 740ef63..b176598 100755
--- a/external/storm-eventhubs/src/test/java/org/apache/storm/eventhubs/spout/EventHubReceiverMock.java
+++ b/external/storm-eventhubs/src/test/java/org/apache/storm/eventhubs/spout/EventHubReceiverMock.java
@@ -24,14 +24,15 @@ import java.util.Map;
 import org.apache.storm.eventhubs.spout.MessageId;
 import org.apache.storm.eventhubs.spout.EventData;
 import org.apache.storm.eventhubs.spout.IEventHubReceiver;
-
 import org.apache.qpid.amqp_1_0.client.Message;
 import org.apache.qpid.amqp_1_0.jms.impl.TextMessageImpl;
 import org.apache.qpid.amqp_1_0.type.Binary;
 import org.apache.qpid.amqp_1_0.type.Section;
 import org.apache.qpid.amqp_1_0.type.messaging.Data;
 
-import org.apache.storm.eventhubs.client.EventHubException;
+import com.microsoft.eventhubs.client.EventHubException;
+import com.microsoft.eventhubs.client.EventHubOffsetFilter;
+import com.microsoft.eventhubs.client.IEventHubFilter;
 
 /**
  * A mock receiver that emits fake data with offset starting from given offset
@@ -58,17 +59,8 @@ public class EventHubReceiverMock implements IEventHubReceiver {
   }
 
   @Override
-  public void open(IEventHubReceiverFilter filter) throws EventHubException {
-    if(filter.getOffset() != null) {
-      currentOffset = Long.parseLong(filter.getOffset());
-    }
-    else if(filter.getEnqueueTime() != 0) {
-      //assume if it's time based filter the offset matches the enqueue time.
-      currentOffset = filter.getEnqueueTime();
-    }
-    else {
-      throw new EventHubException("Invalid IEventHubReceiverFilter");
-    }
+  public void open(IEventHubFilter filter) throws EventHubException {
+    currentOffset = Long.parseLong(filter.getFilterValue());
     isOpen = true;
   }
 


[26/50] [abbrv] storm git commit: merge flux into external/flux/

Posted by pt...@apache.org.
merge flux into external/flux/


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/b21a98dd
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/b21a98dd
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/b21a98dd

Branch: refs/heads/0.10.x-branch
Commit: b21a98dd87f82a06a6295ab2bfd832c2810ca57e
Parents: ea0fe12 b372a11
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Wed May 6 13:31:04 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Wed May 6 13:31:04 2015 -0400

----------------------------------------------------------------------
 external/flux/.gitignore                        |  15 +
 external/flux/LICENSE                           | 202 +++++
 external/flux/README.md                         | 845 +++++++++++++++++++
 external/flux/flux-core/pom.xml                 |  94 +++
 .../main/java/org/apache/storm/flux/Flux.java   | 263 ++++++
 .../java/org/apache/storm/flux/FluxBuilder.java | 591 +++++++++++++
 .../apache/storm/flux/api/TopologySource.java   |  39 +
 .../org/apache/storm/flux/model/BeanDef.java    |  39 +
 .../apache/storm/flux/model/BeanReference.java  |  39 +
 .../org/apache/storm/flux/model/BoltDef.java    |  24 +
 .../storm/flux/model/ConfigMethodDef.java       |  62 ++
 .../storm/flux/model/ExecutionContext.java      |  77 ++
 .../apache/storm/flux/model/GroupingDef.java    |  77 ++
 .../org/apache/storm/flux/model/IncludeDef.java |  54 ++
 .../org/apache/storm/flux/model/ObjectDef.java  |  90 ++
 .../apache/storm/flux/model/PropertyDef.java    |  58 ++
 .../org/apache/storm/flux/model/SpoutDef.java   |  24 +
 .../org/apache/storm/flux/model/StreamDef.java  |  64 ++
 .../apache/storm/flux/model/TopologyDef.java    | 216 +++++
 .../storm/flux/model/TopologySourceDef.java     |  36 +
 .../org/apache/storm/flux/model/VertexDef.java  |  36 +
 .../apache/storm/flux/parser/FluxParser.java    | 202 +++++
 .../flux-core/src/main/resources/splash.txt     |   9 +
 .../org/apache/storm/flux/FluxBuilderTest.java  |  31 +
 .../org/apache/storm/flux/IntegrationTest.java  |  41 +
 .../java/org/apache/storm/flux/TCKTest.java     | 234 +++++
 .../multilang/MultilangEnvirontmentTest.java    |  89 ++
 .../apache/storm/flux/test/SimpleTopology.java  |  42 +
 .../storm/flux/test/SimpleTopologySource.java   |  35 +
 .../test/SimpleTopologyWithConfigParam.java     |  38 +
 .../org/apache/storm/flux/test/TestBolt.java    |  63 ++
 .../storm/flux/test/TridentTopologySource.java  |  54 ++
 .../src/test/resources/configs/bad_hbase.yaml   |  98 +++
 .../resources/configs/config-methods-test.yaml  |  70 ++
 .../existing-topology-method-override.yaml      |  10 +
 .../existing-topology-reflection-config.yaml    |   9 +
 .../configs/existing-topology-reflection.yaml   |   9 +
 .../configs/existing-topology-trident.yaml      |   9 +
 .../resources/configs/existing-topology.yaml    |   8 +
 .../src/test/resources/configs/hdfs_test.yaml   |  97 +++
 .../test/resources/configs/include_test.yaml    |  25 +
 .../configs/invalid-existing-topology.yaml      |  17 +
 .../src/test/resources/configs/kafka_test.yaml  | 126 +++
 .../src/test/resources/configs/shell_test.yaml  | 104 +++
 .../test/resources/configs/simple_hbase.yaml    | 120 +++
 .../resources/configs/substitution-test.yaml    | 106 +++
 .../src/test/resources/configs/tck.yaml         |  95 +++
 .../src/test/resources/configs/test.properties  |   2 +
 .../flux-core/src/test/resources/logback.xml    |  30 +
 external/flux/flux-examples/README.md           |  68 ++
 external/flux/flux-examples/pom.xml             |  87 ++
 .../storm/flux/examples/WordCountClient.java    |  74 ++
 .../apache/storm/flux/examples/WordCounter.java |  71 ++
 .../src/main/resources/hbase_bolt.properties    |  18 +
 .../src/main/resources/hdfs_bolt.properties     |  26 +
 .../src/main/resources/kafka_spout.yaml         | 136 +++
 .../src/main/resources/multilang.yaml           |  89 ++
 .../src/main/resources/simple_hbase.yaml        |  92 ++
 .../src/main/resources/simple_hdfs.yaml         | 105 +++
 .../src/main/resources/simple_wordcount.yaml    |  68 ++
 external/flux/flux-ui/README.md                 |   3 +
 external/flux/flux-wrappers/pom.xml             |  35 +
 .../flux/wrappers/bolts/FluxShellBolt.java      |  56 ++
 .../storm/flux/wrappers/bolts/LogInfoBolt.java  |  44 +
 .../flux/wrappers/spouts/FluxShellSpout.java    |  55 ++
 .../main/resources/resources/randomsentence.js  |  93 ++
 .../main/resources/resources/splitsentence.py   |  24 +
 .../src/main/resources/resources/storm.js       | 373 ++++++++
 .../src/main/resources/resources/storm.py       | 260 ++++++
 external/flux/pom.xml                           | 126 +++
 70 files changed, 6621 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/.gitignore
----------------------------------------------------------------------
diff --cc external/flux/.gitignore
index 0000000,0000000..35fb1db
new file mode 100644
--- /dev/null
+++ b/external/flux/.gitignore
@@@ -1,0 -1,0 +1,15 @@@
++*.class
++**/target
++
++# Package Files #
++*.jar
++*.war
++*.ear
++
++# Intellij
++**/*.iml
++**/*.ipr
++**/*.iws
++
++# virtual machine crash logs, see http://www.java.com/en/download/help/error_hotspot.xml
++hs_err_pid*

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/LICENSE
----------------------------------------------------------------------
diff --cc external/flux/LICENSE
index 0000000,0000000..e06d208
new file mode 100644
--- /dev/null
+++ b/external/flux/LICENSE
@@@ -1,0 -1,0 +1,202 @@@
++Apache License
++                           Version 2.0, January 2004
++                        http://www.apache.org/licenses/
++
++   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
++
++   1. Definitions.
++
++      "License" shall mean the terms and conditions for use, reproduction,
++      and distribution as defined by Sections 1 through 9 of this document.
++
++      "Licensor" shall mean the copyright owner or entity authorized by
++      the copyright owner that is granting the License.
++
++      "Legal Entity" shall mean the union of the acting entity and all
++      other entities that control, are controlled by, or are under common
++      control with that entity. For the purposes of this definition,
++      "control" means (i) the power, direct or indirect, to cause the
++      direction or management of such entity, whether by contract or
++      otherwise, or (ii) ownership of fifty percent (50%) or more of the
++      outstanding shares, or (iii) beneficial ownership of such entity.
++
++      "You" (or "Your") shall mean an individual or Legal Entity
++      exercising permissions granted by this License.
++
++      "Source" form shall mean the preferred form for making modifications,
++      including but not limited to software source code, documentation
++      source, and configuration files.
++
++      "Object" form shall mean any form resulting from mechanical
++      transformation or translation of a Source form, including but
++      not limited to compiled object code, generated documentation,
++      and conversions to other media types.
++
++      "Work" shall mean the work of authorship, whether in Source or
++      Object form, made available under the License, as indicated by a
++      copyright notice that is included in or attached to the work
++      (an example is provided in the Appendix below).
++
++      "Derivative Works" shall mean any work, whether in Source or Object
++      form, that is based on (or derived from) the Work and for which the
++      editorial revisions, annotations, elaborations, or other modifications
++      represent, as a whole, an original work of authorship. For the purposes
++      of this License, Derivative Works shall not include works that remain
++      separable from, or merely link (or bind by name) to the interfaces of,
++      the Work and Derivative Works thereof.
++
++      "Contribution" shall mean any work of authorship, including
++      the original version of the Work and any modifications or additions
++      to that Work or Derivative Works thereof, that is intentionally
++      submitted to Licensor for inclusion in the Work by the copyright owner
++      or by an individual or Legal Entity authorized to submit on behalf of
++      the copyright owner. For the purposes of this definition, "submitted"
++      means any form of electronic, verbal, or written communication sent
++      to the Licensor or its representatives, including but not limited to
++      communication on electronic mailing lists, source code control systems,
++      and issue tracking systems that are managed by, or on behalf of, the
++      Licensor for the purpose of discussing and improving the Work, but
++      excluding communication that is conspicuously marked or otherwise
++      designated in writing by the copyright owner as "Not a Contribution."
++
++      "Contributor" shall mean Licensor and any individual or Legal Entity
++      on behalf of whom a Contribution has been received by Licensor and
++      subsequently incorporated within the Work.
++
++   2. Grant of Copyright License. Subject to the terms and conditions of
++      this License, each Contributor hereby grants to You a perpetual,
++      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
++      copyright license to reproduce, prepare Derivative Works of,
++      publicly display, publicly perform, sublicense, and distribute the
++      Work and such Derivative Works in Source or Object form.
++
++   3. Grant of Patent License. Subject to the terms and conditions of
++      this License, each Contributor hereby grants to You a perpetual,
++      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
++      (except as stated in this section) patent license to make, have made,
++      use, offer to sell, sell, import, and otherwise transfer the Work,
++      where such license applies only to those patent claims licensable
++      by such Contributor that are necessarily infringed by their
++      Contribution(s) alone or by combination of their Contribution(s)
++      with the Work to which such Contribution(s) was submitted. If You
++      institute patent litigation against any entity (including a
++      cross-claim or counterclaim in a lawsuit) alleging that the Work
++      or a Contribution incorporated within the Work constitutes direct
++      or contributory patent infringement, then any patent licenses
++      granted to You under this License for that Work shall terminate
++      as of the date such litigation is filed.
++
++   4. Redistribution. You may reproduce and distribute copies of the
++      Work or Derivative Works thereof in any medium, with or without
++      modifications, and in Source or Object form, provided that You
++      meet the following conditions:
++
++      (a) You must give any other recipients of the Work or
++          Derivative Works a copy of this License; and
++
++      (b) You must cause any modified files to carry prominent notices
++          stating that You changed the files; and
++
++      (c) You must retain, in the Source form of any Derivative Works
++          that You distribute, all copyright, patent, trademark, and
++          attribution notices from the Source form of the Work,
++          excluding those notices that do not pertain to any part of
++          the Derivative Works; and
++
++      (d) If the Work includes a "NOTICE" text file as part of its
++          distribution, then any Derivative Works that You distribute must
++          include a readable copy of the attribution notices contained
++          within such NOTICE file, excluding those notices that do not
++          pertain to any part of the Derivative Works, in at least one
++          of the following places: within a NOTICE text file distributed
++          as part of the Derivative Works; within the Source form or
++          documentation, if provided along with the Derivative Works; or,
++          within a display generated by the Derivative Works, if and
++          wherever such third-party notices normally appear. The contents
++          of the NOTICE file are for informational purposes only and
++          do not modify the License. You may add Your own attribution
++          notices within Derivative Works that You distribute, alongside
++          or as an addendum to the NOTICE text from the Work, provided
++          that such additional attribution notices cannot be construed
++          as modifying the License.
++
++      You may add Your own copyright statement to Your modifications and
++      may provide additional or different license terms and conditions
++      for use, reproduction, or distribution of Your modifications, or
++      for any such Derivative Works as a whole, provided Your use,
++      reproduction, and distribution of the Work otherwise complies with
++      the conditions stated in this License.
++
++   5. Submission of Contributions. Unless You explicitly state otherwise,
++      any Contribution intentionally submitted for inclusion in the Work
++      by You to the Licensor shall be under the terms and conditions of
++      this License, without any additional terms or conditions.
++      Notwithstanding the above, nothing herein shall supersede or modify
++      the terms of any separate license agreement you may have executed
++      with Licensor regarding such Contributions.
++
++   6. Trademarks. This License does not grant permission to use the trade
++      names, trademarks, service marks, or product names of the Licensor,
++      except as required for reasonable and customary use in describing the
++      origin of the Work and reproducing the content of the NOTICE file.
++
++   7. Disclaimer of Warranty. Unless required by applicable law or
++      agreed to in writing, Licensor provides the Work (and each
++      Contributor provides its Contributions) on an "AS IS" BASIS,
++      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
++      implied, including, without limitation, any warranties or conditions
++      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
++      PARTICULAR PURPOSE. You are solely responsible for determining the
++      appropriateness of using or redistributing the Work and assume any
++      risks associated with Your exercise of permissions under this License.
++
++   8. Limitation of Liability. In no event and under no legal theory,
++      whether in tort (including negligence), contract, or otherwise,
++      unless required by applicable law (such as deliberate and grossly
++      negligent acts) or agreed to in writing, shall any Contributor be
++      liable to You for damages, including any direct, indirect, special,
++      incidental, or consequential damages of any character arising as a
++      result of this License or out of the use or inability to use the
++      Work (including but not limited to damages for loss of goodwill,
++      work stoppage, computer failure or malfunction, or any and all
++      other commercial damages or losses), even if such Contributor
++      has been advised of the possibility of such damages.
++
++   9. Accepting Warranty or Additional Liability. While redistributing
++      the Work or Derivative Works thereof, You may choose to offer,
++      and charge a fee for, acceptance of support, warranty, indemnity,
++      or other liability obligations and/or rights consistent with this
++      License. However, in accepting such obligations, You may act only
++      on Your own behalf and on Your sole responsibility, not on behalf
++      of any other Contributor, and only if You agree to indemnify,
++      defend, and hold each Contributor harmless for any liability
++      incurred by, or claims asserted against, such Contributor by reason
++      of your accepting any such warranty or additional liability.
++
++   END OF TERMS AND CONDITIONS
++
++   APPENDIX: How to apply the Apache License to your work.
++
++      To apply the Apache License to your work, attach the following
++      boilerplate notice, with the fields enclosed by brackets "{}"
++      replaced with your own identifying information. (Don't include
++      the brackets!)  The text should be enclosed in the appropriate
++      comment syntax for the file format. We also recommend that a
++      file or class name and description of purpose be included on the
++      same "printed page" as the copyright notice for easier
++      identification within third-party archives.
++
++   Copyright {yyyy} {name of copyright owner}
++
++   Licensed under the Apache License, Version 2.0 (the "License");
++   you may not use this file except in compliance with the License.
++   You may obtain a copy of the License at
++
++       http://www.apache.org/licenses/LICENSE-2.0
++
++   Unless required by applicable law or agreed to in writing, software
++   distributed under the License is distributed on an "AS IS" BASIS,
++   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++   See the License for the specific language governing permissions and
++   limitations under the License.
++

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/README.md
----------------------------------------------------------------------
diff --cc external/flux/README.md
index 0000000,0000000..6f27219
new file mode 100644
--- /dev/null
+++ b/external/flux/README.md
@@@ -1,0 -1,0 +1,845 @@@
++# flux
++A framework for creating and deploying Apache Storm streaming computations with less friction.
++
++## Definition
++**flux** |fləks| _noun_
++
++1. The action or process of flowing or flowing out
++2. Continuous change
++3. In physics, the rate of flow of a fluid, radiant energy, or particles across a given area
++4. A substance mixed with a solid to lower its melting point
++
++## Rationale
++Bad things happen when configuration is hard-coded. No one should have to recompile or repackage an application in
++order to change configuration.
++
++## About
++Flux is a framework and set of utilities that make defining and deploying Apache Storm topologies less painful and
++developer-intensive.
++
++Have you ever found yourself repeating this pattern?:
++
++```java
++
++public static void main(String[] args) throws Exception {
++    // logic to determine if we're running locally or not...
++    // create necessary config options...
++    boolean runLocal = shouldRunLocal();
++    if(runLocal){
++        LocalCluster cluster = new LocalCluster();
++        cluster.submitTopology(name, conf, topology);
++    } else {
++        StormSubmitter.submitTopology(name, conf, topology);
++    }
++}
++```
++
++Wouldn't something like this be easier:
++
++```bash
++storm jar mytopology.jar org.apache.storm.flux.Flux --local config.yaml
++```
++
++or:
++
++```bash
++storm jar mytopology.jar org.apache.storm.flux.Flux --remote config.yaml
++```
++
++Another pain point often mentioned is the fact that the wiring for a Topology graph is often tied up in Java code,
++and that any changes require recompilation and repackaging of the topology jar file. Flux aims to alleviate that
++pain by allowing you to package all your Storm components in a single jar, and use an external text file to define
++the layout and configuration of your topologies.
++
++## Features
++
++ * Easily configure and deploy Storm topologies (Both Storm core and Microbatch API) without embedding configuration
++   in your topology code
++ * Support for existing topology code (see below)
++ * Define Storm Core API (Spouts/Bolts) using a flexible YAML DSL
++ * YAML DSL support for most Storm components (storm-kafka, storm-hdfs, storm-hbase, etc.)
++ * Convenient support for multi-lang components
++ * External property substitution/filtering for easily switching between configurations/environments (similar to Maven-style
++   `${variable.name}` substitution)
++
++## Usage
++
++To use Flux, add it as a dependency and package all your Storm components in a fat jar, then create a YAML document
++to define your topology (see below for YAML configuration options).
++
++### Building from Source
++The easiest way to use Flux is to add it as a Maven dependency in your project as described below.
++
++If you would like to build Flux from source and run the unit/integration tests, you will need the following installed
++on your system:
++
++* Python 2.6.x or later
++* Node.js 0.10.x or later
++
++#### Building with unit tests enabled:
++
++```
++mvn clean install
++```
++
++#### Building with unit tests disabled:
++If you would like to build Flux without installing Python or Node.js you can simply skip the unit tests:
++
++```
++mvn clean install -DskipTests=true
++```
++
++Note that if you plan on using Flux to deploy topologies to a remote cluster, you will still need to have Python
++installed since it is required by Apache Storm.
++
++
++#### Building with integration tests enabled:
++
++```
++mvn clean install -DskipIntegration=false
++```
++
++
++### Packaging with Maven
++To enable Flux for your Storm components, you need to add it as a dependency such that it's included in the Storm
++topology jar. This can be accomplished with the Maven shade plugin (preferred) or the Maven assembly plugin (not
++recommended).
++
++#### Flux Maven Dependency
++The current version of Flux is available in Maven Central at the following coordinates:
++```xml
++<dependency>
++    <groupId>com.github.ptgoetz</groupId>
++    <artifactId>flux-core</artifactId>
++    <version>0.3.0</version>
++</dependency>
++```
++
++#### Creating a Flux-Enabled Topology JAR
++The example below illustrates Flux usage with the Maven shade plugin:
++
++ ```xml
++<!-- include Flux and user dependencies in the shaded jar -->
++<dependencies>
++    <!-- Flux include -->
++    <dependency>
++        <groupId>com.github.ptgoetz</groupId>
++        <artifactId>flux-core</artifactId>
++        <version>0.3.0</version>
++    </dependency>
++
++    <!-- add user dependencies here... -->
++
++</dependencies>
++<!-- create a fat jar that includes all dependencies -->
++<build>
++    <plugins>
++        <plugin>
++            <groupId>org.apache.maven.plugins</groupId>
++            <artifactId>maven-shade-plugin</artifactId>
++            <version>1.4</version>
++            <configuration>
++                <createDependencyReducedPom>true</createDependencyReducedPom>
++            </configuration>
++            <executions>
++                <execution>
++                    <phase>package</phase>
++                    <goals>
++                        <goal>shade</goal>
++                    </goals>
++                    <configuration>
++                        <transformers>
++                            <transformer
++                                    implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
++                            <transformer
++                                    implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
++                                <mainClass>org.apache.storm.flux.Flux</mainClass>
++                            </transformer>
++                        </transformers>
++                    </configuration>
++                </execution>
++            </executions>
++        </plugin>
++    </plugins>
++</build>
++ ```
++
++### Deploying and Running a Flux Topology
++Once your topology components are packaged with the Flux dependency, you can run different topologies either locally
++or remotely using the `storm jar` command. For example, if your fat jar is named `myTopology-0.1.0-SNAPSHOT.jar` you
++could run it locally with the command:
++
++
++```bash
++storm jar myTopology-0.1.0-SNAPSHOT.jar org.apache.storm.flux.Flux --local my_config.yaml
++
++```
++
++### Command line options
++```
++usage: storm jar <my_topology_uber_jar.jar> org.apache.storm.flux.Flux
++             [options] <topology-config.yaml>
++ -d,--dry-run                 Do not run or deploy the topology. Just
++                              build, validate, and print information about
++                              the topology.
++ -e,--env-filter              Perform environment variable substitution.
++                              Keys identified with `${ENV-[NAME]}` will be
++                              replaced with the corresponding `NAME`
++                              environment variable value.
++ -f,--filter <file>           Perform property substitution. Use the
++                              specified file as a source of properties,
++                              and replace keys identified with {$[property
++                              name]} with the value defined in the
++                              properties file.
++ -i,--inactive                Deploy the topology, but do not activate it.
++ -l,--local                   Run the topology in local mode.
++ -n,--no-splash               Suppress the printing of the splash screen.
++ -q,--no-detail               Suppress the printing of topology details.
++ -r,--remote                  Deploy the topology to a remote cluster.
++ -R,--resource                Treat the supplied path as a classpath
++                              resource instead of a file.
++ -s,--sleep <ms>              When running locally, the amount of time to
++                              sleep (in ms.) before killing the topology
++                              and shutting down the local cluster.
++ -z,--zookeeper <host:port>   When running in local mode, use the
++                              ZooKeeper at the specified <host>:<port>
++                              instead of the in-process ZooKeeper.
++                              (requires Storm 0.9.3 or later)
++```
++
++**NOTE:** Flux tries to avoid command line switch collision with the `storm` command, and allows any other command line
++switches to pass through to the `storm` command.
++
++For example, you can use the `storm` command switch `-c` to override a topology configuration property. The following
++example command will run Flux and override the `nimbus.host` configuration:
++
++```bash
++storm jar myTopology-0.1.0-SNAPSHOT.jar org.apache.storm.flux.Flux --remote my_config.yaml -c nimbus.host=localhost
++```
++
++### Sample output
++```
++███████╗██╗     ██╗   ██╗██╗  ██╗
++██╔════╝██║     ██║   ██║╚██╗██╔╝
++█████╗  ██║     ██║   ██║ ╚███╔╝
++██╔══╝  ██║     ██║   ██║ ██╔██╗
++██║     ███████╗╚██████╔╝██╔╝ ██╗
++╚═╝     ╚══════╝ ╚═════╝ ╚═╝  ╚═╝
+++-         Apache Storm        -+
+++-  data FLow User eXperience  -+
++Version: 0.3.0
++Parsing file: /Users/hsimpson/Projects/donut_domination/storm/shell_test.yaml
++---------- TOPOLOGY DETAILS ----------
++Name: shell-topology
++--------------- SPOUTS ---------------
++sentence-spout[1](org.apache.storm.flux.spouts.GenericShellSpout)
++---------------- BOLTS ---------------
++splitsentence[1](org.apache.storm.flux.bolts.GenericShellBolt)
++log[1](org.apache.storm.flux.wrappers.bolts.LogInfoBolt)
++count[1](backtype.storm.testing.TestWordCounter)
++--------------- STREAMS ---------------
++sentence-spout --SHUFFLE--> splitsentence
++splitsentence --FIELDS--> count
++count --SHUFFLE--> log
++--------------------------------------
++Submitting topology: 'shell-topology' to remote cluster...
++```
++
++## YAML Configuration
++Flux topologies are defined in a YAML file that describes a topology. A Flux topology
++definition consists of the following:
++
++  1. A topology name
++  2. A list of topology "components" (named Java objects that will be made available in the environment)
++  3. **EITHER** (A DSL topology definition):
++      * A list of spouts, each identified by a unique ID
++      * A list of bolts, each identified by a unique ID
++      * A list of "stream" objects representing a flow of tuples between spouts and bolts
++  4. **OR** (A JVM class that can produce a `backtype.storm.generated.StormTopology` instance):
++      * A `topologySource` definition.
++
++
++
++For example, here is a simple definition of a wordcount topology using the YAML DSL:
++
++```yaml
++name: "yaml-topology"
++config:
++  topology.workers: 1
++
++# spout definitions
++spouts:
++  - id: "spout-1"
++    className: "backtype.storm.testing.TestWordSpout"
++    parallelism: 1
++
++# bolt definitions
++bolts:
++  - id: "bolt-1"
++    className: "backtype.storm.testing.TestWordCounter"
++    parallelism: 1
++  - id: "bolt-2"
++    className: "org.apache.storm.flux.wrappers.bolts.LogInfoBolt"
++    parallelism: 1
++
++#stream definitions
++streams:
++  - name: "spout-1 --> bolt-1" # name isn't used (placeholder for logging, UI, etc.)
++    from: "spout-1"
++    to: "bolt-1"
++    grouping:
++      type: FIELDS
++      args: ["word"]
++
++  - name: "bolt-1 --> bolt2"
++    from: "bolt-1"
++    to: "bolt-2"
++    grouping:
++      type: SHUFFLE
++
++
++```
++## Property Substitution/Filtering
++It's common for developers to want to easily switch between configurations, for example switching deployment between
++a development environment and a production environment. This can be accomplished by using separate YAML configuration
++files, but that approach would lead to unnecessary duplication, especially in situations where the Storm topology
++does not change, but configuration settings such as host names, ports, and parallelism parameters do.
++
++For this case, Flux offers properties filtering to allow you to externalize values to a `.properties` file and have
++them substituted before the `.yaml` file is parsed.
++
++To enable property filtering, use the `--filter` command line option and specify a `.properties` file. For example,
++if you invoked flux like so:
++
++```bash
++storm jar myTopology-0.1.0-SNAPSHOT.jar org.apache.storm.flux.Flux --local my_config.yaml --filter dev.properties
++```
++With the following `dev.properties` file:
++
++```properties
++kafka.zookeeper.hosts: localhost:2181
++```
++
++You would then be able to reference those properties by key in your `.yaml` file using `${}` syntax:
++
++```yaml
++  - id: "zkHosts"
++    className: "storm.kafka.ZkHosts"
++    constructorArgs:
++      - "${kafka.zookeeper.hosts}"
++```
++
++In this case, Flux would replace `${kafka.zookeeper.hosts}` with `localhost:2181` before parsing the YAML contents.
++
++### Environment Variable Substitution/Filtering
++Flux also allows environment variable substitution. For example, if an environment variable named `ZK_HOSTS` is defined,
++you can reference it in a Flux YAML file with the following syntax:
++
++```
++${ENV-ZK_HOSTS}
++```
++
++## Components
++Components are essentially named object instances that are made available as configuration options for spouts and
++bolts. If you are familiar with the Spring framework, components are roughly analogous to Spring beans.
++
++Every component is identified, at a minimum, by a unique identifier (String) and a class name (String). For example,
++the following will make an instance of the `storm.kafka.StringScheme` class available as a reference under the key
++`"stringScheme"` . This assumes the `storm.kafka.StringScheme` has a default constructor.
++
++```yaml
++components:
++  - id: "stringScheme"
++    className: "storm.kafka.StringScheme"
++```
++
++### Constructor Arguments, References, Properties and Configuration Methods
++
++#### Constructor Arguments
++Arguments to a class constructor can be configured by adding a `constructorArgs` element to a component.
++`constructorArgs` is a list of objects that will be passed to the class' constructor. The following example creates an
++object by calling the constructor that takes a single string as an argument:
++
++```yaml
++  - id: "zkHosts"
++    className: "storm.kafka.ZkHosts"
++    constructorArgs:
++      - "localhost:2181"
++```
++
++#### References
++Each component instance is identified by a unique id that allows it to be used/reused by other components. To
++reference an existing component, you specify the id of the component with the `ref` tag.
++
++In the following example, a component with the id `"stringScheme"` is created, and later referenced as an argument
++to another component's constructor:
++
++```yaml
++components:
++  - id: "stringScheme"
++    className: "storm.kafka.StringScheme"
++
++  - id: "stringMultiScheme"
++    className: "backtype.storm.spout.SchemeAsMultiScheme"
++    constructorArgs:
++      - ref: "stringScheme" # component with id "stringScheme" must be declared above.
++```
++**N.B.:** References can only be used after (below) the object they point to has been declared.
++
++#### Properties
++In addition to calling constructors with different arguments, Flux also allows you to configure components using
++JavaBean-like setter methods and fields declared as `public`:
++
++```yaml
++  - id: "spoutConfig"
++    className: "storm.kafka.SpoutConfig"
++    constructorArgs:
++      # brokerHosts
++      - ref: "zkHosts"
++      # topic
++      - "myKafkaTopic"
++      # zkRoot
++      - "/kafkaSpout"
++      # id
++      - "myId"
++    properties:
++      - name: "forceFromStart"
++        value: true
++      - name: "scheme"
++        ref: "stringMultiScheme"
++```
++
++In the example above, the `properties` declaration will cause Flux to look for a public method in the `SpoutConfig` with
++the signature `setForceFromStart(boolean b)` and attempt to invoke it. If a setter method is not found, Flux will then
++look for a public instance variable with the name `forceFromStart` and attempt to set its value.
++
++References may also be used as property values.
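++
++For illustration, the sketch below shows the kind of class shape Flux can configure this way: a JavaBean-style setter plus a public field fallback. This is a simplified, hypothetical stand-in (class and field names are assumptions), not the actual storm-kafka `SpoutConfig` source:
++
++```java
++// Hypothetical example class. For the "forceFromStart" property Flux would call the
++// setter below; if no setter existed, it would assign the public field directly.
++public class ExampleSpoutConfig implements java.io.Serializable {
++    public boolean forceFromStart = false;   // public field fallback
++    public Object scheme;                    // set from the "stringMultiScheme" reference
++
++    public void setForceFromStart(boolean forceFromStart) {
++        this.forceFromStart = forceFromStart;
++    }
++
++    public void setScheme(Object scheme) {
++        this.scheme = scheme;
++    }
++}
++```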
++
++#### Configuration Methods
++Conceptually, configuration methods are similar to Properties and Constructor Args -- they allow you to invoke an
++arbitrary method on an object after it is constructed. Configuration methods are useful for working with classes that
++don't expose JavaBean methods or have constructors that can fully configure the object. Common examples include classes
++that use the builder pattern for configuration/composition.
++
++The following YAML example creates a bolt and configures it by calling several methods:
++
++```yaml
++bolts:
++  - id: "bolt-1"
++    className: "org.apache.storm.flux.test.TestBolt"
++    parallelism: 1
++    configMethods:
++      - name: "withFoo"
++        args:
++          - "foo"
++      - name: "withBar"
++        args:
++          - "bar"
++      - name: "withFooBar"
++        args:
++          - "foo"
++          - "bar"
++```
++
++The signatures of the corresponding methods are as follows:
++
++```java
++    public void withFoo(String foo);
++    public void withBar(String bar);
++    public void withFooBar(String foo, String bar);
++```
++
++Arguments passed to configuration methods work much the same way as constructor arguments, and support references as
++well.
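++
++As a rough sketch (a hypothetical bolt, not the actual `org.apache.storm.flux.test.TestBolt`), a class exposing configuration methods is just an ordinary bolt with mutator methods; a method taking an object parameter can be wired with a `ref:` argument:
++
++```java
++import backtype.storm.topology.BasicOutputCollector;
++import backtype.storm.topology.OutputFieldsDeclarer;
++import backtype.storm.topology.base.BaseBasicBolt;
++import backtype.storm.tuple.Fields;
++import backtype.storm.tuple.Tuple;
++import backtype.storm.tuple.Values;
++
++// Hypothetical bolt with builder-style configuration methods.
++public class ConfigurableBolt extends BaseBasicBolt {
++    private String foo;
++    private Object scheme;
++
++    public void withFoo(String foo) { this.foo = foo; }
++
++    // A `ref:` argument in `configMethods` resolves to the referenced component
++    // instance before being passed in here.
++    public void withScheme(Object scheme) { this.scheme = scheme; }
++
++    @Override
++    public void execute(Tuple input, BasicOutputCollector collector) {
++        collector.emit(new Values(input.getString(0)));
++    }
++
++    @Override
++    public void declareOutputFields(OutputFieldsDeclarer declarer) {
++        declarer.declare(new Fields("word"));
++    }
++}
++```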
++
++### Using Java `enum`s in Constructor Arguments, References, Properties and Configuration Methods
++You can easily use Java `enum` values as arguments in a Flux YAML file, simply by referencing the name of the `enum`.
++
++For example, [Storm's HDFS module]() includes the following `enum` definition (simplified for brevity):
++
++```java
++public static enum Units {
++    KB, MB, GB, TB
++}
++```
++
++And the `org.apache.storm.hdfs.bolt.rotation.FileSizeRotationPolicy` class has the following constructor:
++
++```java
++public FileSizeRotationPolicy(float count, Units units)
++
++```
++The following Flux `component` definition could be used to call the constructor:
++
++```yaml
++  - id: "rotationPolicy"
++    className: "org.apache.storm.hdfs.bolt.rotation.FileSizeRotationPolicy"
++    constructorArgs:
++      - 5.0
++      - MB
++```
++
++The above definition is functionally equivalent to the following Java code:
++
++```java
++// rotate files when they reach 5MB
++FileRotationPolicy rotationPolicy = new FileSizeRotationPolicy(5.0f, Units.MB);
++```
++
++## Topology Config
++The `config` section is simply a map of Storm topology configuration parameters that will be passed to the
++`backtype.storm.StormSubmitter` as an instance of the `backtype.storm.Config` class:
++
++```yaml
++config:
++  topology.workers: 4
++  topology.max.spout.pending: 1000
++  topology.message.timeout.secs: 30
++```
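++
++For reference, the YAML above is roughly what the following Java fragment would do before submitting the topology (a sketch using the `backtype.storm.Config` convenience setters):
++
++```java
++Config conf = new Config();                 // backtype.storm.Config
++conf.setNumWorkers(4);
++conf.setMaxSpoutPending(1000);
++conf.setMessageTimeoutSecs(30);
++// Flux builds an equivalent Config from the `config` map and hands it to the submitter.
++```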
++
++# Existing Topologies
++If you have existing Storm topologies, you can still use Flux to deploy/run/test them. This feature allows you to
++leverage Flux Constructor Arguments, References, Properties, and Topology Config declarations for existing topology
++classes.
++
++The easiest way to use an existing topology class is to define
++a `getTopology()` instance method with one of the following signatures:
++
++```java
++public StormTopology getTopology(Map<String, Object> config)
++```
++or:
++
++```java
++public StormTopology getTopology(Config config)
++```
++
++You could then use the following YAML to configure your topology:
++
++```yaml
++name: "existing-topology"
++topologySource:
++  className: "org.apache.storm.flux.test.SimpleTopology"
++```
++
++If the class you would like to use as a topology source has a different method name (i.e. not `getTopology`), you can
++override it:
++
++```yaml
++name: "existing-topology"
++topologySource:
++  className: "org.apache.storm.flux.test.SimpleTopology"
++  methodName: "getTopologyWithDifferentMethodName"
++```
++
++__N.B.:__ The specified method must accept a single argument of type `java.util.Map<String, Object>` or
++`backtype.storm.Config`, and return a `backtype.storm.generated.StormTopology` object.
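++
++For example, a minimal topology source compatible with the default `getTopology` lookup might look like the sketch below (a hypothetical class, shown only to illustrate the expected shape):
++
++```java
++import java.util.Map;
++
++import backtype.storm.generated.StormTopology;
++import backtype.storm.testing.TestWordCounter;
++import backtype.storm.testing.TestWordSpout;
++import backtype.storm.topology.TopologyBuilder;
++import backtype.storm.tuple.Fields;
++
++public class MyExistingTopology {
++    // Flux invokes this reflectively; the Map holds the merged topology config.
++    public StormTopology getTopology(Map<String, Object> config) {
++        TopologyBuilder builder = new TopologyBuilder();
++        builder.setSpout("word-spout", new TestWordSpout(), 1);
++        builder.setBolt("word-count", new TestWordCounter(), 1)
++               .fieldsGrouping("word-spout", new Fields("word"));
++        return builder.createTopology();
++    }
++}
++```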
++
++# YAML DSL
++## Spouts and Bolts
++Spouts and Bolts are configured in their own respective sections of the YAML configuration. Spout and Bolt definitions
++are extensions of the `component` definition that add a `parallelism` parameter, which sets the parallelism for a
++component when the topology is deployed.
++
++Because spout and bolt definitions extend `component` they support constructor arguments, references, and properties as
++well.
++
++Shell spout example:
++
++```yaml
++spouts:
++  - id: "sentence-spout"
++    className: "org.apache.storm.flux.spouts.GenericShellSpout"
++    # shell spout constructor takes 2 arguments: String[], String[]
++    constructorArgs:
++      # command line
++      - ["node", "randomsentence.js"]
++      # output fields
++      - ["word"]
++    parallelism: 1
++```
++
++Kafka spout example:
++
++```yaml
++components:
++  - id: "stringScheme"
++    className: "storm.kafka.StringScheme"
++
++  - id: "stringMultiScheme"
++    className: "backtype.storm.spout.SchemeAsMultiScheme"
++    constructorArgs:
++      - ref: "stringScheme"
++
++  - id: "zkHosts"
++    className: "storm.kafka.ZkHosts"
++    constructorArgs:
++      - "localhost:2181"
++
++# Alternative kafka config
++#  - id: "kafkaConfig"
++#    className: "storm.kafka.KafkaConfig"
++#    constructorArgs:
++#      # brokerHosts
++#      - ref: "zkHosts"
++#      # topic
++#      - "myKafkaTopic"
++#      # clientId (optional)
++#      - "myKafkaClientId"
++
++  - id: "spoutConfig"
++    className: "storm.kafka.SpoutConfig"
++    constructorArgs:
++      # brokerHosts
++      - ref: "zkHosts"
++      # topic
++      - "myKafkaTopic"
++      # zkRoot
++      - "/kafkaSpout"
++      # id
++      - "myId"
++    properties:
++      - name: "forceFromStart"
++        value: true
++      - name: "scheme"
++        ref: "stringMultiScheme"
++
++config:
++  topology.workers: 1
++
++# spout definitions
++spouts:
++  - id: "kafka-spout"
++    className: "storm.kafka.KafkaSpout"
++    constructorArgs:
++      - ref: "spoutConfig"
++
++```
++
++Bolt Examples:
++
++```yaml
++# bolt definitions
++bolts:
++  - id: "splitsentence"
++    className: "org.apache.storm.flux.bolts.GenericShellBolt"
++    constructorArgs:
++      # command line
++      - ["python", "splitsentence.py"]
++      # output fields
++      - ["word"]
++    parallelism: 1
++    # ...
++
++  - id: "log"
++    className: "org.apache.storm.flux.wrappers.bolts.LogInfoBolt"
++    parallelism: 1
++    # ...
++
++  - id: "count"
++    className: "backtype.storm.testing.TestWordCounter"
++    parallelism: 1
++    # ...
++```
++## Streams and Stream Groupings
++Streams in Flux are represented as a list of connections (Graph edges, data flow, etc.) between the Spouts and Bolts in
++a topology, with an associated Grouping definition.
++
++A Stream definition has the following properties:
++
++**`name`:** A name for the connection (optional, currently unused)
++
++**`from`:** The `id` of a Spout or Bolt that is the source (publisher)
++
++**`to`:** The `id` of a Spout or Bolt that is the destination (subscriber)
++
++**`grouping`:** The stream grouping definition for the Stream
++
++A Grouping definition has the following properties:
++
++**`type`:** The type of grouping. One of `ALL`,`CUSTOM`,`DIRECT`,`SHUFFLE`,`LOCAL_OR_SHUFFLE`,`FIELDS`,`GLOBAL`, or `NONE`.
++
++**`streamId`:** The Storm stream ID (Optional. If unspecified will use the default stream)
++
++**`args`:** For the `FIELDS` grouping, a list of field names.
++
++**`customClass`:** For the `CUSTOM` grouping, a definition of the custom grouping class instance
++
++The `streams` definition example below sets up a topology with the following wiring:
++
++```
++    kafka-spout --> splitsentence --> count --> log
++```
++
++
++```yaml
++#stream definitions
++# stream definitions define connections between spouts and bolts.
++# note that such connections can be cyclical
++# custom stream groupings are also supported
++
++streams:
++  - name: "kafka --> split" # name isn't used (placeholder for logging, UI, etc.)
++    from: "kafka-spout"
++    to: "splitsentence"
++    grouping:
++      type: SHUFFLE
++
++  - name: "split --> count"
++    from: "splitsentence"
++    to: "count"
++    grouping:
++      type: FIELDS
++      args: ["word"]
++
++  - name: "count --> log"
++    from: "count"
++    to: "log"
++    grouping:
++      type: SHUFFLE
++```
++
++### Custom Stream Groupings
++Custom stream groupings are defined by setting the grouping type to `CUSTOM` and defining a `customClass` parameter
++that tells Flux how to instantiate the custom class. The `customClass` definition extends `component`, so it supports
++constructor arguments, references, and properties as well.
++
++The example below creates a Stream with an instance of the `backtype.storm.testing.NGrouping` custom stream grouping
++class.
++
++```yaml
++  - name: "bolt-1 --> bolt2"
++    from: "bolt-1"
++    to: "bolt-2"
++    grouping:
++      type: CUSTOM
++      customClass:
++        className: "backtype.storm.testing.NGrouping"
++        constructorArgs:
++          - 1
++```
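++
++Since the custom class is instantiated like any other component, it only needs to implement Storm's `backtype.storm.grouping.CustomStreamGrouping` interface. A minimal (hypothetical) grouping that always routes tuples to the first target task might look like:
++
++```java
++import java.util.Arrays;
++import java.util.List;
++
++import backtype.storm.generated.GlobalStreamId;
++import backtype.storm.grouping.CustomStreamGrouping;
++import backtype.storm.task.WorkerTopologyContext;
++
++public class FirstTaskGrouping implements CustomStreamGrouping {
++    private List<Integer> targetTasks;
++
++    @Override
++    public void prepare(WorkerTopologyContext context, GlobalStreamId stream, List<Integer> targetTasks) {
++        this.targetTasks = targetTasks;
++    }
++
++    @Override
++    public List<Integer> chooseTasks(int taskId, List<Object> values) {
++        // Route every tuple to the first task in the target list.
++        return Arrays.asList(targetTasks.get(0));
++    }
++}
++```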
++
++## Includes and Overrides
++Flux allows you to include the contents of other YAML files, and have them treated as though they were defined in the
++same file. Includes may be either files, or classpath resources.
++
++Includes are specified as a list of maps:
++
++```yaml
++includes:
++  - resource: false
++    file: "src/test/resources/configs/shell_test.yaml"
++    override: false
++```
++
++If the `resource` property is set to `true`, the include will be loaded as a classpath resource from the value of the
++`file` attribute, otherwise it will be treated as a regular file.
++
++The `override` property controls how includes affect the values defined in the current file. If `override` is set to
++`true`, values in the included file will replace values in the current file being parsed. If `override` is set to
++`false`, values in the current file being parsed will take precedence, and the parser will refuse to replace them.
++
++**N.B.:** Includes are not yet recursive. Includes from included files will be ignored.
++
++
++## Basic Word Count Example
++
++This example uses a spout implemented in JavaScript, a bolt implemented in Python, and a bolt implemented in Java.
++
++Topology YAML config:
++
++```yaml
++---
++name: "shell-topology"
++config:
++  topology.workers: 1
++
++# spout definitions
++spouts:
++  - id: "sentence-spout"
++    className: "org.apache.storm.flux.spouts.GenericShellSpout"
++    # shell spout constructor takes 2 arguments: String[], String[]
++    constructorArgs:
++      # command line
++      - ["node", "randomsentence.js"]
++      # output fields
++      - ["word"]
++    parallelism: 1
++
++# bolt definitions
++bolts:
++  - id: "splitsentence"
++    className: "org.apache.storm.flux.bolts.GenericShellBolt"
++    constructorArgs:
++      # command line
++      - ["python", "splitsentence.py"]
++      # output fields
++      - ["word"]
++    parallelism: 1
++
++  - id: "log"
++    className: "org.apache.storm.flux.wrappers.bolts.LogInfoBolt"
++    parallelism: 1
++
++  - id: "count"
++    className: "backtype.storm.testing.TestWordCounter"
++    parallelism: 1
++
++#stream definitions
++# stream definitions define connections between spouts and bolts.
++# note that such connections can be cyclical
++# custom stream groupings are also supported
++
++streams:
++  - name: "spout --> split" # name isn't used (placeholder for logging, UI, etc.)
++    from: "sentence-spout"
++    to: "splitsentence"
++    grouping:
++      type: SHUFFLE
++
++  - name: "split --> count"
++    from: "splitsentence"
++    to: "count"
++    grouping:
++      type: FIELDS
++      args: ["word"]
++
++  - name: "count --> log"
++    from: "count"
++    to: "log"
++    grouping:
++      type: SHUFFLE
++```
++
++
++## Micro-Batching (Trident) API Support
++Currently, the Flux YAML DSL only supports the Core Storm API, but support for Storm's micro-batching API is planned.
++
++To use Flux with a Trident topology, define a topology getter method and reference it in your YAML config:
++
++```yaml
++name: "my-trident-topology"
++
++config:
++  topology.workers: 1
++
++topologySource:
++  className: "org.apache.storm.flux.test.TridentTopologySource"
++  # Flux will look for "getTopology", this will override that.
++  methodName: "getTopologyWithDifferentMethodName"
++```
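++
++The referenced class is simply a topology source whose getter builds a Trident topology. A rough sketch is shown below (hypothetical; the bundled `org.apache.storm.flux.test.TridentTopologySource` test class may differ):
++
++```java
++import backtype.storm.Config;
++import backtype.storm.generated.StormTopology;
++import backtype.storm.tuple.Fields;
++import backtype.storm.tuple.Values;
++import storm.trident.TridentTopology;
++import storm.trident.testing.FixedBatchSpout;
++
++public class MyTridentTopologySource {
++    // Matches the `methodName` override in the YAML above.
++    public StormTopology getTopologyWithDifferentMethodName(Config config) {
++        FixedBatchSpout spout = new FixedBatchSpout(new Fields("sentence"), 3,
++                new Values("the cow jumped over the moon"),
++                new Values("an apple a day keeps the doctor away"));
++        TridentTopology topology = new TridentTopology();
++        topology.newStream("sentence-spout", spout);
++        return topology.build();
++    }
++}
++```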
++
++## Author
++P. Taylor Goetz
++
++## Contributors
++
++
++## Contributing
++
++Contributions in any form are more than welcome.
++
++The intent of this project is that it will be donated to Apache Storm.
++
++By offering any contributions to this project, you should be willing and able to submit an
++[Apache ICLA](http://www.apache.org/licenses/icla.txt), if you have not done so already.

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/pom.xml
----------------------------------------------------------------------
diff --cc external/flux/flux-core/pom.xml
index 0000000,0000000..600613d
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/pom.xml
@@@ -1,0 -1,0 +1,94 @@@
++<?xml version="1.0" encoding="UTF-8"?>
++<!--
++ Licensed to the Apache Software Foundation (ASF) under one or more
++ contributor license agreements.  See the NOTICE file distributed with
++ this work for additional information regarding copyright ownership.
++ The ASF licenses this file to You under the Apache License, Version 2.0
++ (the "License"); you may not use this file except in compliance with
++ the License.  You may obtain a copy of the License at
++
++     http://www.apache.org/licenses/LICENSE-2.0
++
++ Unless required by applicable law or agreed to in writing, software
++ distributed under the License is distributed on an "AS IS" BASIS,
++ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++ See the License for the specific language governing permissions and
++ limitations under the License.
++-->
++<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
++    <modelVersion>4.0.0</modelVersion>
++
++    <parent>
++        <groupId>com.github.ptgoetz</groupId>
++        <artifactId>flux</artifactId>
++        <version>0.3.1-SNAPSHOT</version>
++        <relativePath>../pom.xml</relativePath>
++    </parent>
++
++    <groupId>com.github.ptgoetz</groupId>
++    <artifactId>flux-core</artifactId>
++    <packaging>jar</packaging>
++
++    <name>flux-core</name>
++    <url>https://github.com/ptgoetz/flux</url>
++
++    <dependencies>
++        <dependency>
++            <groupId>com.github.ptgoetz</groupId>
++            <artifactId>flux-wrappers</artifactId>
++            <version>${project.version}</version>
++        </dependency>
++        <dependency>
++            <groupId>org.apache.storm</groupId>
++            <artifactId>storm-kafka</artifactId>
++            <version>${storm.version}</version>
++            <scope>test</scope>
++        </dependency>
++        <dependency>
++            <groupId>org.apache.storm</groupId>
++            <artifactId>storm-hdfs</artifactId>
++            <version>${storm.version}</version>
++            <scope>test</scope>
++        </dependency>
++        <dependency>
++            <groupId>org.apache.storm</groupId>
++            <artifactId>storm-hbase</artifactId>
++            <version>${storm.version}</version>
++            <scope>test</scope>
++        </dependency>
++    </dependencies>
++    <build>
++        <resources>
++            <resource>
++                <directory>src/main/resources</directory>
++                <filtering>true</filtering>
++            </resource>
++        </resources>
++        <plugins>
++        <plugin>
++            <groupId>org.apache.maven.plugins</groupId>
++            <artifactId>maven-shade-plugin</artifactId>
++            <version>1.4</version>
++            <configuration>
++                <createDependencyReducedPom>true</createDependencyReducedPom>
++            </configuration>
++            <executions>
++                <execution>
++                    <phase>package</phase>
++                    <goals>
++                        <goal>shade</goal>
++                    </goals>
++                    <configuration>
++                        <transformers>
++                            <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer" />
++                            <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
++                                <mainClass>org.apache.storm.flux.Flux</mainClass>
++                            </transformer>
++                        </transformers>
++                    </configuration>
++                </execution>
++            </executions>
++        </plugin>
++        </plugins>
++    </build>
++</project>

http://git-wip-us.apache.org/repos/asf/storm/blob/b21a98dd/external/flux/flux-core/src/main/java/org/apache/storm/flux/Flux.java
----------------------------------------------------------------------
diff --cc external/flux/flux-core/src/main/java/org/apache/storm/flux/Flux.java
index 0000000,0000000..6300631
new file mode 100644
--- /dev/null
+++ b/external/flux/flux-core/src/main/java/org/apache/storm/flux/Flux.java
@@@ -1,0 -1,0 +1,263 @@@
++/*
++ * Licensed to the Apache Software Foundation (ASF) under one
++ * or more contributor license agreements.  See the NOTICE file
++ * distributed with this work for additional information
++ * regarding copyright ownership.  The ASF licenses this file
++ * to you under the Apache License, Version 2.0 (the
++ * "License"); you may not use this file except in compliance
++ * with the License.  You may obtain a copy of the License at
++ *
++ * http://www.apache.org/licenses/LICENSE-2.0
++ *
++ * Unless required by applicable law or agreed to in writing, software
++ * distributed under the License is distributed on an "AS IS" BASIS,
++ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
++ * See the License for the specific language governing permissions and
++ * limitations under the License.
++ */
++package org.apache.storm.flux;
++
++import backtype.storm.Config;
++import backtype.storm.LocalCluster;
++import backtype.storm.StormSubmitter;
++import backtype.storm.generated.StormTopology;
++import backtype.storm.generated.SubmitOptions;
++import backtype.storm.generated.TopologyInitialStatus;
++import backtype.storm.utils.Utils;
++import org.apache.commons.cli.*;
++import org.apache.storm.flux.model.*;
++import org.apache.storm.flux.parser.FluxParser;
++import org.slf4j.Logger;
++import org.slf4j.LoggerFactory;
++
++import java.io.*;
++import java.util.Map;
++import java.util.Properties;
++
++/**
++ * Flux entry point.
++ *
++ */
++public class Flux {
++    private static final Logger LOG = LoggerFactory.getLogger(Flux.class);
++
++    private static final Long DEFAULT_LOCAL_SLEEP_TIME = 60000L;
++
++    private static final Long DEFAULT_ZK_PORT = 2181L;
++
++    private static final String OPTION_LOCAL = "local";
++    private static final String OPTION_REMOTE = "remote";
++    private static final String OPTION_RESOURCE = "resource";
++    private static final String OPTION_SLEEP = "sleep";
++    private static final String OPTION_DRY_RUN = "dry-run";
++    private static final String OPTION_NO_DETAIL = "no-detail";
++    private static final String OPTION_NO_SPLASH = "no-splash";
++    private static final String OPTION_INACTIVE = "inactive";
++    private static final String OPTION_ZOOKEEPER = "zookeeper";
++    private static final String OPTION_FILTER = "filter";
++    private static final String OPTION_ENV_FILTER = "env-filter";
++
++    public static void main(String[] args) throws Exception {
++        Options options = new Options();
++
++        options.addOption(option(0, "l", OPTION_LOCAL, "Run the topology in local mode."));
++
++        options.addOption(option(0, "r", OPTION_REMOTE, "Deploy the topology to a remote cluster."));
++
++        options.addOption(option(0, "R", OPTION_RESOURCE, "Treat the supplied path as a classpath resource instead of a file."));
++
++        options.addOption(option(1, "s", OPTION_SLEEP, "ms", "When running locally, the amount of time to sleep (in ms.) " +
++                "before killing the topology and shutting down the local cluster."));
++
++        options.addOption(option(0, "d", OPTION_DRY_RUN, "Do not run or deploy the topology. Just build, validate, " +
++                "and print information about the topology."));
++
++        options.addOption(option(0, "q", OPTION_NO_DETAIL, "Suppress the printing of topology details."));
++
++        options.addOption(option(0, "n", OPTION_NO_SPLASH, "Suppress the printing of the splash screen."));
++
++        options.addOption(option(0, "i", OPTION_INACTIVE, "Deploy the topology, but do not activate it."));
++
++        options.addOption(option(1, "z", OPTION_ZOOKEEPER, "host:port", "When running in local mode, use the ZooKeeper at the " +
++                "specified <host>:<port> instead of the in-process ZooKeeper. (requires Storm 0.9.3 or later)"));
++
++        options.addOption(option(1, "f", OPTION_FILTER, "file", "Perform property substitution. Use the specified file " +
++                "as a source of properties, and replace keys identified with {$[property name]} with the value defined " +
++                "in the properties file."));
++
++        options.addOption(option(0, "e", OPTION_ENV_FILTER, "Perform environment variable substitution. Replace keys" +
++                "identified with `${ENV-[NAME]}` will be replaced with the corresponding `NAME` environment value"));
++
++        CommandLineParser parser = new BasicParser();
++        CommandLine cmd = parser.parse(options, args);
++
++        if (cmd.getArgs().length != 1) {
++            usage(options);
++            System.exit(1);
++        }
++        runCli(cmd);
++    }
++
++    private static Option option(int argCount, String shortName, String longName, String description){
++       return option(argCount, shortName, longName, longName, description);
++    }
++
++    private static Option option(int argCount, String shortName, String longName, String argName, String description){
++        Option option = OptionBuilder.hasArgs(argCount)
++                .withArgName(argName)
++                .withLongOpt(longName)
++                .withDescription(description)
++                .create(shortName);
++        return option;
++    }
++
++    private static void usage(Options options) {
++        HelpFormatter formatter = new HelpFormatter();
++        formatter.printHelp("storm jar <my_topology_uber_jar.jar> " +
++                Flux.class.getName() +
++                " [options] <topology-config.yaml>", options);
++    }
++
++    private static void runCli(CommandLine cmd) throws Exception {
++        if(!cmd.hasOption(OPTION_NO_SPLASH)) {
++            printSplash();
++        }
++
++        boolean dumpYaml = cmd.hasOption("dump-yaml");
++
++        TopologyDef topologyDef = null;
++        String filePath = (String)cmd.getArgList().get(0);
++
++        // TODO conditionally load properties from a file or resource
++        String filterProps = null;
++        if(cmd.hasOption(OPTION_FILTER)){
++            filterProps = cmd.getOptionValue(OPTION_FILTER);
++        }
++
++
++        boolean envFilter = cmd.hasOption(OPTION_ENV_FILTER);
++        if(cmd.hasOption(OPTION_RESOURCE)){
++            printf("Parsing classpath resource: %s", filePath);
++            topologyDef = FluxParser.parseResource(filePath, dumpYaml, true, filterProps, envFilter);
++        } else {
++            printf("Parsing file: %s",
++                    new File(filePath).getAbsolutePath());
++            topologyDef = FluxParser.parseFile(filePath, dumpYaml, true, filterProps, envFilter);
++        }
++
++
++        String topologyName = topologyDef.getName();
++        // merge contents of `config` into topology config
++        Config conf = FluxBuilder.buildConfig(topologyDef);
++        ExecutionContext context = new ExecutionContext(topologyDef, conf);
++        StormTopology topology = FluxBuilder.buildTopology(context);
++
++        if(!cmd.hasOption(OPTION_NO_DETAIL)){
++            printTopologyInfo(context);
++        }
++
++        if(!cmd.hasOption(OPTION_DRY_RUN)) {
++            if (cmd.hasOption(OPTION_REMOTE)) {
++                LOG.info("Running remotely...");
++                try {
++                    // should the topology be active or inactive
++                    SubmitOptions submitOptions = null;
++                    if(cmd.hasOption(OPTION_INACTIVE)){
++                        LOG.info("Deploying topology in an INACTIVE state...");
++                        submitOptions = new SubmitOptions(TopologyInitialStatus.INACTIVE);
++                    } else {
++                        LOG.info("Deploying topology in an ACTIVE state...");
++                        submitOptions = new SubmitOptions(TopologyInitialStatus.ACTIVE);
++                    }
++                    StormSubmitter.submitTopology(topologyName, conf, topology, submitOptions, null);
++                } catch (Exception e) {
++                    LOG.warn("Unable to deploy topology to remote cluster.", e);
++                }
++            } else {
++                LOG.info("Running in local mode...");
++
++                String sleepStr = cmd.getOptionValue(OPTION_SLEEP);
++                Long sleepTime = DEFAULT_LOCAL_SLEEP_TIME;
++                if (sleepStr != null) {
++                    sleepTime = Long.parseLong(sleepStr);
++                }
++                LOG.debug("Sleep time: {}", sleepTime);
++                LocalCluster cluster = null;
++
++                // in-process or external zookeeper
++                if(cmd.hasOption(OPTION_ZOOKEEPER)){
++                    String zkStr = cmd.getOptionValue(OPTION_ZOOKEEPER);
++                    LOG.info("Using ZooKeeper at '{}' instead of in-process one.", zkStr);
++                    long zkPort = DEFAULT_ZK_PORT;
++                    String zkHost = null;
++                    if(zkStr.contains(":")){
++                        String[] hostPort = zkStr.split(":");
++                        zkHost = hostPort[0];
++                        zkPort = hostPort.length > 1 ? Long.parseLong(hostPort[1]) : DEFAULT_ZK_PORT;
++
++                    } else {
++                        zkHost = zkStr;
++                    }
++                    // the following constructor is only available in 0.9.3 and later
++                    try {
++                        cluster = new LocalCluster(zkHost, zkPort);
++                    } catch (NoSuchMethodError e){
++                        LOG.error("The --zookeeper option can only be used with Apache Storm 0.9.3 and later.");
++                        System.exit(1);
++                    }
++                } else {
++                    cluster = new LocalCluster();
++                }
++                cluster.submitTopology(topologyName, conf, topology);
++
++                Utils.sleep(sleepTime);
++                cluster.killTopology(topologyName);
++                cluster.shutdown();
++            }
++        }
++    }
++
++    static void printTopologyInfo(ExecutionContext ctx){
++        TopologyDef t = ctx.getTopologyDef();
++        if(t.isDslTopology()) {
++            print("---------- TOPOLOGY DETAILS ----------");
++
++            printf("Topology Name: %s", t.getName());
++            print("--------------- SPOUTS ---------------");
++            for (SpoutDef s : t.getSpouts()) {
++                printf("%s [%d] (%s)", s.getId(), s.getParallelism(), s.getClassName());
++            }
++            print("---------------- BOLTS ---------------");
++            for (BoltDef b : t.getBolts()) {
++                printf("%s [%d] (%s)", b.getId(), b.getParallelism(), b.getClassName());
++            }
++
++            print("--------------- STREAMS ---------------");
++            for (StreamDef sd : t.getStreams()) {
++                printf("%s --%s--> %s", sd.getFrom(), sd.getGrouping().getType(), sd.getTo());
++            }
++            print("--------------------------------------");
++        }
++    }
++
++    // save a little typing
++    private static void printf(String format, Object... args){
++        print(String.format(format, args));
++    }
++
++    private static void print(String string){
++        System.out.println(string);
++    }
++
++    private static void printSplash() throws IOException {
++        // banner
++        InputStream is = Flux.class.getResourceAsStream("/splash.txt");
++        if(is != null){
++            BufferedReader br = new BufferedReader(new InputStreamReader(is));
++            String line = null;
++            while((line = br.readLine()) != null){
++                System.out.println(line);
++            }
++        }
++    }
++}
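
A note on the flow above: Flux's dry-run path parses the YAML definition, merges its `config` section into a Storm `Config`, assembles the `StormTopology`, and prints the topology details. The sketch below drives the same steps programmatically. It is only a sketch: it assumes the `FluxParser`, `FluxBuilder`, and `ExecutionContext` signatures shown in Flux.java above, and `my_topology.yaml` is a hypothetical file path used purely for illustration.

```java
import backtype.storm.Config;
import backtype.storm.generated.StormTopology;

import org.apache.storm.flux.FluxBuilder;
import org.apache.storm.flux.model.ExecutionContext;
import org.apache.storm.flux.model.TopologyDef;
import org.apache.storm.flux.parser.FluxParser;

public class FluxDryRunSketch {
    public static void main(String[] args) throws Exception {
        // Parse the YAML topology definition (no property or environment filtering).
        TopologyDef topologyDef = FluxParser.parseFile("my_topology.yaml", false, true, null, false);

        // Merge the YAML `config` section into a Storm Config and build the topology graph.
        Config conf = FluxBuilder.buildConfig(topologyDef);
        ExecutionContext context = new ExecutionContext(topologyDef, conf);
        StormTopology topology = FluxBuilder.buildTopology(context);
        // `topology` could now be submitted or inspected; here we only report what was built,
        // which is the equivalent of a --dry-run.
        System.out.println("Built topology '" + topologyDef.getName() + "' with "
                + topologyDef.getSpouts().size() + " spout(s) and "
                + topologyDef.getBolts().size() + " bolt(s).");
    }
}
```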


[29/50] [abbrv] storm git commit: cleanup github references and versions

Posted by pt...@apache.org.
cleanup github references and versions


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/f7d2f7fa
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/f7d2f7fa
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/f7d2f7fa

Branch: refs/heads/0.10.x-branch
Commit: f7d2f7fa0c7a0511911e984f13c20fceed743b17
Parents: e1e1609
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Thu May 7 10:33:44 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Thu May 7 10:33:44 2015 -0400

----------------------------------------------------------------------
 external/flux/README.md               |  8 ++++----
 external/flux/flux-core/pom.xml       |  1 -
 external/flux/flux-examples/README.md | 14 +++++++-------
 external/flux/flux-examples/pom.xml   |  1 -
 4 files changed, 11 insertions(+), 13 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/f7d2f7fa/external/flux/README.md
----------------------------------------------------------------------
diff --git a/external/flux/README.md b/external/flux/README.md
index d09a73c..0387f3f 100644
--- a/external/flux/README.md
+++ b/external/flux/README.md
@@ -109,9 +109,9 @@ recommended).
 The current version of Flux is available in Maven Central at the following coordinates:
 ```xml
 <dependency>
-    <groupId>com.github.ptgoetz</groupId>
+    <groupId>org.apache.storm</groupId>
     <artifactId>flux-core</artifactId>
-    <version>0.3.0</version>
+    <version>${storm.version}</version>
 </dependency>
 ```
 
@@ -123,9 +123,9 @@ The example below illustrates Flux usage with the Maven shade plugin:
 <dependencies>
     <!-- Flux include -->
     <dependency>
-        <groupId>com.github.ptgoetz</groupId>
+        <groupId>org.apache.storm</groupId>
         <artifactId>flux-core</artifactId>
-        <version>0.3.0</version>
+        <version>${storm.version}</version>
     </dependency>
 
     <!-- add user dependencies here... -->

http://git-wip-us.apache.org/repos/asf/storm/blob/f7d2f7fa/external/flux/flux-core/pom.xml
----------------------------------------------------------------------
diff --git a/external/flux/flux-core/pom.xml b/external/flux/flux-core/pom.xml
index c3842bd..f74d034 100644
--- a/external/flux/flux-core/pom.xml
+++ b/external/flux/flux-core/pom.xml
@@ -29,7 +29,6 @@
     <packaging>jar</packaging>
 
     <name>flux-core</name>
-    <url>https://github.com/ptgoetz/flux</url>
 
     <dependencies>
         <dependency>

http://git-wip-us.apache.org/repos/asf/storm/blob/f7d2f7fa/external/flux/flux-examples/README.md
----------------------------------------------------------------------
diff --git a/external/flux/flux-examples/README.md b/external/flux/flux-examples/README.md
index b3798a6..fceebd8 100644
--- a/external/flux/flux-examples/README.md
+++ b/external/flux/flux-examples/README.md
@@ -6,24 +6,24 @@ A collection of examples illustrating various capabilities.
 Check out the project's source and perform a top-level Maven build (i.e. from the `flux` directory):
 
 ```bash
-git clone https://github.com/ptgoetz/flux.git
-cd flux
-mvn install
+git clone https://github.com/apache/storm.git
+cd storm
+mvn install -DskipTests=true
 ```
 
-This will create a shaded (i.e. "fat" or "uber") jar in the `flux-examples/target` directory that can be run/deployed with
+This will create a shaded (i.e. "fat" or "uber") jar in the `external/flux/flux-examples/target` directory that can be run/deployed with
 the `storm` command:
 
 ```bash
 cd flux-examples
-storm jar ./target/flux-examples-0.2.3-SNAPSHOT.jar org.apache.storm.flux.Flux --local ./src/main/resources/simple_wordcount.yaml
+storm jar ./target/flux-examples-*-SNAPSHOT.jar org.apache.storm.flux.Flux --local ./src/main/resources/simple_wordcount.yaml
 ```
 
 The example YAML files are also packaged in the examples jar, so they can also be referenced with Flux's `--resource`
 command line switch:
 
 ```bash
-storm jar ./target/flux-examples-0.2.3-SNAPSHOT.jar org.apache.storm.flux.Flux --local --resource /simple_wordcount.yaml
+storm jar ./target/flux-examples-*.jar org.apache.storm.flux.Flux --local --resource /simple_wordcount.yaml
 ```
 
 ## Available Examples
@@ -51,7 +51,7 @@ To run the `simple_hdfs.yaml` example, copy the `hdfs_bolt.properties` file to a
 least, the property `hdfs.url` to point to an HDFS cluster. Then you can run the example with something like:
 
 ```bash
-storm jar ./target/flux-examples-0.2.3-SNAPSHOT.jar org.apache.storm.flux.Flux --local ./src/main/resources/simple_hdfs.yaml --filter my_hdfs_bolt.properties
+storm jar ./target/flux-examples-*.jar org.apache.storm.flux.Flux --local ./src/main/resources/simple_hdfs.yaml --filter my_hdfs_bolt.properties
 ```
 
 ### [simple_hbase.yaml](src/main/resources/simple_hbase.yaml)

http://git-wip-us.apache.org/repos/asf/storm/blob/f7d2f7fa/external/flux/flux-examples/pom.xml
----------------------------------------------------------------------
diff --git a/external/flux/flux-examples/pom.xml b/external/flux/flux-examples/pom.xml
index 709b20b..571f302 100644
--- a/external/flux/flux-examples/pom.xml
+++ b/external/flux/flux-examples/pom.xml
@@ -29,7 +29,6 @@
     <packaging>jar</packaging>
 
     <name>flux-examples</name>
-    <url>https://github.com/ptgoetz/flux</url>
 
     <dependencies>
         <dependency>


[34/50] [abbrv] storm git commit: Fix line endings so that the diff is meaningful.

Posted by pt...@apache.org.
Fix line endings so that the diff is meaningful.

Signed-off-by: Shanyu Zhao <sh...@microsoft.com>


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/1f13f15d
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/1f13f15d
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/1f13f15d

Branch: refs/heads/0.10.x-branch
Commit: 1f13f15d0c0de233fd4cc4ff6d6de586c4736142
Parents: e515492
Author: Shanyu Zhao <sh...@microsoft.com>
Authored: Wed May 13 14:50:02 2015 -0700
Committer: Shanyu Zhao <sh...@microsoft.com>
Committed: Wed May 13 14:50:02 2015 -0700

----------------------------------------------------------------------
 external/storm-eventhubs/pom.xml                | 236 +++++++++----------
 .../eventhubs/bolt/DefaultEventDataFormat.java  |  94 ++++----
 .../storm/eventhubs/bolt/EventHubBolt.java      | 202 ++++++++--------
 .../eventhubs/bolt/EventHubBoltConfig.java      | 214 ++++++++---------
 .../storm/eventhubs/bolt/IEventDataFormat.java  |  56 ++---
 .../storm/eventhubs/client/EventHubClient.java  | 190 +++++++--------
 .../storm/eventhubs/client/EventHubSender.java  | 198 ++++++++--------
 .../storm/eventhubs/samples/EventHubLoop.java   | 104 ++++----
 8 files changed, 647 insertions(+), 647 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/1f13f15d/external/storm-eventhubs/pom.xml
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/pom.xml b/external/storm-eventhubs/pom.xml
index 2ceed09..2dfb739 100755
--- a/external/storm-eventhubs/pom.xml
+++ b/external/storm-eventhubs/pom.xml
@@ -1,119 +1,119 @@
-<?xml version="1.0" encoding="UTF-8"?>
-<!--
- Licensed to the Apache Software Foundation (ASF) under one or more
- contributor license agreements.  See the NOTICE file distributed with
- this work for additional information regarding copyright ownership.
- The ASF licenses this file to You under the Apache License, Version 2.0
- (the "License"); you may not use this file except in compliance with
- the License.  You may obtain a copy of the License at
-
-     http://www.apache.org/licenses/LICENSE-2.0
-
- Unless required by applicable law or agreed to in writing, software
- distributed under the License is distributed on an "AS IS" BASIS,
- WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- See the License for the specific language governing permissions and
- limitations under the License.
--->
-<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
-    <modelVersion>4.0.0</modelVersion>
-    
-    <parent>
-        <artifactId>storm</artifactId>
-        <groupId>org.apache.storm</groupId>
-        <version>0.11.0-SNAPSHOT</version>
-        <relativePath>../../pom.xml</relativePath>
-    </parent>
-    
-    <artifactId>storm-eventhubs</artifactId>
-    <version>0.11.0-SNAPSHOT</version>
-    <packaging>jar</packaging>
-    <name>storm-eventhubs</name>
-    <description>EventHubs Storm Spout</description>
-
-    <properties>
-        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
-        <qpid.version>0.32</qpid.version>
-    </properties>
-    <build>
-        <plugins>
-            <plugin>
-                <groupId>org.apache.maven.plugins</groupId>
-                <artifactId>maven-shade-plugin</artifactId>
-                <version>2.3</version>
-                <executions>
-                    <execution>
-                        <goals>
-                            <goal>shade</goal>
-                        </goals>
-                        <phase>package</phase>
-                    </execution>
-                </executions>
-                <configuration>
-                    <transformers>
-                        <transformer implementation="org.apache.maven.plugins.shade.resource.ApacheLicenseResourceTransformer">
-                        </transformer>
-                    </transformers>
-                    <outputFile>target/${project.artifactId}-${project.version}-jar-with-dependencies.jar</outputFile>
-                </configuration>
-	        </plugin>
-            <plugin>
-		        <artifactId>maven-antrun-plugin</artifactId>
-		        <executions>
-		          <execution>
-		            <phase>package</phase>
-		            <configuration>
-		              <tasks>
-		                <copy file="src/main/resources/config.properties" tofile="target/eventhubs-config.properties"/>
-                    </tasks>
-		            </configuration>
-		            <goals>
-		              <goal>run</goal>
-		            </goals>
-		          </execution>
-		        </executions>
-	        </plugin>
-        </plugins>
-    </build>
-    <dependencies>
-        <dependency>
-            <groupId>org.apache.qpid</groupId>
-            <artifactId>qpid-client</artifactId>
-            <version>${qpid.version}</version>
-        </dependency>
-        <dependency>
-            <groupId>org.apache.qpid</groupId>
-            <artifactId>qpid-amqp-1-0-client-jms</artifactId>
-            <version>${qpid.version}</version>
-        </dependency>
-        <dependency>
-            <groupId>org.apache.storm</groupId>
-            <artifactId>storm-core</artifactId>
-            <version>${project.version}</version>
-            <!-- keep storm out of the jar-with-dependencies -->
-            <type>jar</type>
-            <scope>provided</scope>
-        </dependency>
-        <dependency>
-            <groupId>org.apache.curator</groupId>
-            <artifactId>curator-framework</artifactId>
-            <version>${curator.version}</version>
-            <exclusions>
-                <exclusion>
-                    <groupId>log4j</groupId>
-                    <artifactId>log4j</artifactId>
-                </exclusion>
-                <exclusion>
-                    <groupId>org.slf4j</groupId>
-                    <artifactId>slf4j-log4j12</artifactId>
-                </exclusion>
-            </exclusions>
-        </dependency>
-        <dependency>
-            <groupId>junit</groupId>
-            <artifactId>junit</artifactId>
-            <version>4.11</version>
-            <scope>test</scope>
-        </dependency>
-    </dependencies> 
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+ Licensed to the Apache Software Foundation (ASF) under one or more
+ contributor license agreements.  See the NOTICE file distributed with
+ this work for additional information regarding copyright ownership.
+ The ASF licenses this file to You under the Apache License, Version 2.0
+ (the "License"); you may not use this file except in compliance with
+ the License.  You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+    <modelVersion>4.0.0</modelVersion>
+    
+    <parent>
+        <artifactId>storm</artifactId>
+        <groupId>org.apache.storm</groupId>
+        <version>0.11.0-SNAPSHOT</version>
+        <relativePath>../../pom.xml</relativePath>
+    </parent>
+    
+    <artifactId>storm-eventhubs</artifactId>
+    <version>0.11.0-SNAPSHOT</version>
+    <packaging>jar</packaging>
+    <name>storm-eventhubs</name>
+    <description>EventHubs Storm Spout</description>
+
+    <properties>
+        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
+        <qpid.version>0.32</qpid.version>
+    </properties>
+    <build>
+        <plugins>
+            <plugin>
+                <groupId>org.apache.maven.plugins</groupId>
+                <artifactId>maven-shade-plugin</artifactId>
+                <version>2.3</version>
+                <executions>
+                    <execution>
+                        <goals>
+                            <goal>shade</goal>
+                        </goals>
+                        <phase>package</phase>
+                    </execution>
+                </executions>
+                <configuration>
+                    <transformers>
+                        <transformer implementation="org.apache.maven.plugins.shade.resource.ApacheLicenseResourceTransformer">
+                        </transformer>
+                    </transformers>
+                    <outputFile>target/${project.artifactId}-${project.version}-jar-with-dependencies.jar</outputFile>
+                </configuration>
+	        </plugin>
+            <plugin>
+		        <artifactId>maven-antrun-plugin</artifactId>
+		        <executions>
+		          <execution>
+		            <phase>package</phase>
+		            <configuration>
+		              <tasks>
+		                <copy file="src/main/resources/config.properties" tofile="target/eventhubs-config.properties"/>
+                    </tasks>
+		            </configuration>
+		            <goals>
+		              <goal>run</goal>
+		            </goals>
+		          </execution>
+		        </executions>
+	        </plugin>
+        </plugins>
+    </build>
+    <dependencies>
+        <dependency>
+            <groupId>org.apache.qpid</groupId>
+            <artifactId>qpid-client</artifactId>
+            <version>${qpid.version}</version>
+        </dependency>
+        <dependency>
+            <groupId>org.apache.qpid</groupId>
+            <artifactId>qpid-amqp-1-0-client-jms</artifactId>
+            <version>${qpid.version}</version>
+        </dependency>
+        <dependency>
+            <groupId>org.apache.storm</groupId>
+            <artifactId>storm-core</artifactId>
+            <version>${project.version}</version>
+            <!-- keep storm out of the jar-with-dependencies -->
+            <type>jar</type>
+            <scope>provided</scope>
+        </dependency>
+        <dependency>
+            <groupId>org.apache.curator</groupId>
+            <artifactId>curator-framework</artifactId>
+            <version>${curator.version}</version>
+            <exclusions>
+                <exclusion>
+                    <groupId>log4j</groupId>
+                    <artifactId>log4j</artifactId>
+                </exclusion>
+                <exclusion>
+                    <groupId>org.slf4j</groupId>
+                    <artifactId>slf4j-log4j12</artifactId>
+                </exclusion>
+            </exclusions>
+        </dependency>
+        <dependency>
+            <groupId>junit</groupId>
+            <artifactId>junit</artifactId>
+            <version>4.11</version>
+            <scope>test</scope>
+        </dependency>
+    </dependencies> 
 </project>
\ No newline at end of file

http://git-wip-us.apache.org/repos/asf/storm/blob/1f13f15d/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/DefaultEventDataFormat.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/DefaultEventDataFormat.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/DefaultEventDataFormat.java
index 1bd8288..6b3eba7 100644
--- a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/DefaultEventDataFormat.java
+++ b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/DefaultEventDataFormat.java
@@ -1,47 +1,47 @@
-/*******************************************************************************
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- *******************************************************************************/
-package org.apache.storm.eventhubs.bolt;
-
-import backtype.storm.tuple.Tuple;
-
-/**
- * A default implementation of IEventDataFormat that converts the tuple
- * into a delimited string.
- */
-public class DefaultEventDataFormat implements IEventDataFormat {
-  private static final long serialVersionUID = 1L;
-  private String delimiter = ",";
-  
-  public DefaultEventDataFormat withFieldDelimiter(String delimiter) {
-    this.delimiter = delimiter;
-    return this;
-  }
-
-  @Override
-  public byte[] serialize(Tuple tuple) {
-    StringBuilder sb = new StringBuilder();
-    for(Object obj : tuple.getValues()) {
-      if(sb.length() != 0) {
-        sb.append(delimiter);
-      }
-      sb.append(obj.toString());
-    }
-    return sb.toString().getBytes();
-  }
-
-}
+/*******************************************************************************
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *******************************************************************************/
+package org.apache.storm.eventhubs.bolt;
+
+import backtype.storm.tuple.Tuple;
+
+/**
+ * A default implementation of IEventDataFormat that converts the tuple
+ * into a delimited string.
+ */
+public class DefaultEventDataFormat implements IEventDataFormat {
+  private static final long serialVersionUID = 1L;
+  private String delimiter = ",";
+  
+  public DefaultEventDataFormat withFieldDelimiter(String delimiter) {
+    this.delimiter = delimiter;
+    return this;
+  }
+
+  @Override
+  public byte[] serialize(Tuple tuple) {
+    StringBuilder sb = new StringBuilder();
+    for(Object obj : tuple.getValues()) {
+      if(sb.length() != 0) {
+        sb.append(delimiter);
+      }
+      sb.append(obj.toString());
+    }
+    return sb.toString().getBytes();
+  }
+
+}

http://git-wip-us.apache.org/repos/asf/storm/blob/1f13f15d/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/EventHubBolt.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/EventHubBolt.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/EventHubBolt.java
index 09f90b1..a817744 100755
--- a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/EventHubBolt.java
+++ b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/EventHubBolt.java
@@ -1,101 +1,101 @@
-/*******************************************************************************
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- *******************************************************************************/
-package org.apache.storm.eventhubs.bolt;
-
-import java.util.Map;
-
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
-import org.apache.storm.eventhubs.client.EventHubClient;
-import org.apache.storm.eventhubs.client.EventHubException;
-import org.apache.storm.eventhubs.client.EventHubSender;
-
-import backtype.storm.task.OutputCollector;
-import backtype.storm.task.TopologyContext;
-import backtype.storm.topology.OutputFieldsDeclarer;
-import backtype.storm.topology.base.BaseRichBolt;
-import backtype.storm.tuple.Tuple;
-
-/**
- * A bolt that writes event messages to EventHub.
- */
-public class EventHubBolt extends BaseRichBolt {
-  private static final long serialVersionUID = 1L;
-  private static final Logger logger = LoggerFactory
-      .getLogger(EventHubBolt.class);
-  
-  protected OutputCollector collector;
-  protected EventHubSender sender;
-  protected EventHubBoltConfig boltConfig;
-  
-  
-  public EventHubBolt(String connectionString, String entityPath) {
-    boltConfig = new EventHubBoltConfig(connectionString, entityPath);
-  }
-
-  public EventHubBolt(String userName, String password, String namespace,
-      String entityPath, boolean partitionMode) {
-    boltConfig = new EventHubBoltConfig(userName, password, namespace,
-        entityPath, partitionMode);
-  }
-  
-  public EventHubBolt(EventHubBoltConfig config) {
-    boltConfig = config;
-  }
-
-  @Override
-  public void prepare(Map config, TopologyContext context, OutputCollector collector) {
-    this.collector = collector;
-    String myPartitionId = null;
-    if(boltConfig.getPartitionMode()) {
-      //We can use the task index (starting from 0) as the partition ID
-      myPartitionId = "" + context.getThisTaskIndex();
-    }
-    logger.info("creating sender: " + boltConfig.getConnectionString()
-        + ", " + boltConfig.getEntityPath() + ", " + myPartitionId);
-    try {
-      EventHubClient eventHubClient = EventHubClient.create(
-          boltConfig.getConnectionString(), boltConfig.getEntityPath());
-      sender = eventHubClient.createPartitionSender(myPartitionId);
-    }
-    catch(Exception ex) {
-      logger.error(ex.getMessage());
-      throw new RuntimeException(ex);
-    }
-
-  }
-
-  @Override
-  public void execute(Tuple tuple) {
-    try {
-      sender.send(boltConfig.getEventDataFormat().serialize(tuple));
-      collector.ack(tuple);
-    }
-    catch(EventHubException ex) {
-      logger.error(ex.getMessage());
-      collector.fail(tuple);
-    }
-  }
-
-  @Override
-  public void declareOutputFields(OutputFieldsDeclarer declarer) {
-    
-  }
-
-}
+/*******************************************************************************
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *******************************************************************************/
+package org.apache.storm.eventhubs.bolt;
+
+import java.util.Map;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import org.apache.storm.eventhubs.client.EventHubClient;
+import org.apache.storm.eventhubs.client.EventHubException;
+import org.apache.storm.eventhubs.client.EventHubSender;
+
+import backtype.storm.task.OutputCollector;
+import backtype.storm.task.TopologyContext;
+import backtype.storm.topology.OutputFieldsDeclarer;
+import backtype.storm.topology.base.BaseRichBolt;
+import backtype.storm.tuple.Tuple;
+
+/**
+ * A bolt that writes event messages to EventHub.
+ */
+public class EventHubBolt extends BaseRichBolt {
+  private static final long serialVersionUID = 1L;
+  private static final Logger logger = LoggerFactory
+      .getLogger(EventHubBolt.class);
+  
+  protected OutputCollector collector;
+  protected EventHubSender sender;
+  protected EventHubBoltConfig boltConfig;
+  
+  
+  public EventHubBolt(String connectionString, String entityPath) {
+    boltConfig = new EventHubBoltConfig(connectionString, entityPath);
+  }
+
+  public EventHubBolt(String userName, String password, String namespace,
+      String entityPath, boolean partitionMode) {
+    boltConfig = new EventHubBoltConfig(userName, password, namespace,
+        entityPath, partitionMode);
+  }
+  
+  public EventHubBolt(EventHubBoltConfig config) {
+    boltConfig = config;
+  }
+
+  @Override
+  public void prepare(Map config, TopologyContext context, OutputCollector collector) {
+    this.collector = collector;
+    String myPartitionId = null;
+    if(boltConfig.getPartitionMode()) {
+      //We can use the task index (starting from 0) as the partition ID
+      myPartitionId = "" + context.getThisTaskIndex();
+    }
+    logger.info("creating sender: " + boltConfig.getConnectionString()
+        + ", " + boltConfig.getEntityPath() + ", " + myPartitionId);
+    try {
+      EventHubClient eventHubClient = EventHubClient.create(
+          boltConfig.getConnectionString(), boltConfig.getEntityPath());
+      sender = eventHubClient.createPartitionSender(myPartitionId);
+    }
+    catch(Exception ex) {
+      logger.error(ex.getMessage());
+      throw new RuntimeException(ex);
+    }
+
+  }
+
+  @Override
+  public void execute(Tuple tuple) {
+    try {
+      sender.send(boltConfig.getEventDataFormat().serialize(tuple));
+      collector.ack(tuple);
+    }
+    catch(EventHubException ex) {
+      logger.error(ex.getMessage());
+      collector.fail(tuple);
+    }
+  }
+
+  @Override
+  public void declareOutputFields(OutputFieldsDeclarer declarer) {
+    
+  }
+
+}

http://git-wip-us.apache.org/repos/asf/storm/blob/1f13f15d/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/EventHubBoltConfig.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/EventHubBoltConfig.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/EventHubBoltConfig.java
index 909e8ac..4383a72 100644
--- a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/EventHubBoltConfig.java
+++ b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/EventHubBoltConfig.java
@@ -1,107 +1,107 @@
-/*******************************************************************************
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- *******************************************************************************/
-package org.apache.storm.eventhubs.bolt;
-
-import java.io.Serializable;
-
-import org.apache.storm.eventhubs.spout.EventHubSpoutConfig;
-
-/*
- * EventHubs bolt configurations
- *
- * Partition mode:
- * With partitionMode=true you need to create the same number of tasks as the number of 
- * EventHubs partitions, and each bolt task will only send data to one partition.
- * The partition ID is the task ID of the bolt.
- * 
- * Event format:
- * The formatter used to convert a tuple to bytes for EventHubs.
- * If null, the default format is comma-delimited tuple fields.
- */
-public class EventHubBoltConfig implements Serializable {
-  private static final long serialVersionUID = 1L;
-  
-  private String connectionString;
-  private final String entityPath;
-  protected boolean partitionMode;
-  protected IEventDataFormat dataFormat;
-  
-  public EventHubBoltConfig(String connectionString, String entityPath) {
-    this(connectionString, entityPath, false, null);
-  }
-  
-  public EventHubBoltConfig(String connectionString, String entityPath,
-      boolean partitionMode) {
-    this(connectionString, entityPath, partitionMode, null);
-  }
-  
-  public EventHubBoltConfig(String userName, String password, String namespace,
-      String entityPath, boolean partitionMode) {
-    this(userName, password, namespace,
-        EventHubSpoutConfig.EH_SERVICE_FQDN_SUFFIX, entityPath, partitionMode);
-  }
-  
-  public EventHubBoltConfig(String connectionString, String entityPath,
-      boolean partitionMode, IEventDataFormat dataFormat) {
-    this.connectionString = connectionString;
-    this.entityPath = entityPath;
-    this.partitionMode = partitionMode;
-    this.dataFormat = dataFormat;
-    if(this.dataFormat == null) {
-      this.dataFormat = new DefaultEventDataFormat();
-    }
-  }
-  
-  public EventHubBoltConfig(String userName, String password, String namespace,
-      String targetFqnAddress, String entityPath) {
-    this(userName, password, namespace, targetFqnAddress, entityPath, false, null);
-  }
-  
-  public EventHubBoltConfig(String userName, String password, String namespace,
-      String targetFqnAddress, String entityPath, boolean partitionMode) {
-    this(userName, password, namespace, targetFqnAddress, entityPath, partitionMode, null);
-  }
-  
-  public EventHubBoltConfig(String userName, String password, String namespace,
-      String targetFqnAddress, String entityPath, boolean partitionMode,
-      IEventDataFormat dataFormat) {
-    this.connectionString = EventHubSpoutConfig.buildConnectionString(userName, password, namespace, targetFqnAddress);
-    this.entityPath = entityPath;
-    this.partitionMode = partitionMode;
-    this.dataFormat = dataFormat;
-    if(this.dataFormat == null) {
-      this.dataFormat = new DefaultEventDataFormat();
-    }
-  }
-  
-  public String getConnectionString() {
-    return connectionString;
-  }
-  
-  public String getEntityPath() {
-    return entityPath;
-  }
-  
-  public boolean getPartitionMode() {
-    return partitionMode;
-  }
-  
-  public IEventDataFormat getEventDataFormat() {
-    return dataFormat;
-  }
-}
+/*******************************************************************************
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *******************************************************************************/
+package org.apache.storm.eventhubs.bolt;
+
+import java.io.Serializable;
+
+import org.apache.storm.eventhubs.spout.EventHubSpoutConfig;
+
+/*
+ * EventHubs bolt configurations
+ *
+ * Partition mode:
+ * With partitionMode=true you need to create the same number of tasks as the number of 
+ * EventHubs partitions, and each bolt task will only send data to one partition.
+ * The partition ID is the task ID of the bolt.
+ * 
+ * Event format:
+ * The formatter used to convert a tuple to bytes for EventHubs.
+ * If null, the default format is comma-delimited tuple fields.
+ */
+public class EventHubBoltConfig implements Serializable {
+  private static final long serialVersionUID = 1L;
+  
+  private String connectionString;
+  private final String entityPath;
+  protected boolean partitionMode;
+  protected IEventDataFormat dataFormat;
+  
+  public EventHubBoltConfig(String connectionString, String entityPath) {
+    this(connectionString, entityPath, false, null);
+  }
+  
+  public EventHubBoltConfig(String connectionString, String entityPath,
+      boolean partitionMode) {
+    this(connectionString, entityPath, partitionMode, null);
+  }
+  
+  public EventHubBoltConfig(String userName, String password, String namespace,
+      String entityPath, boolean partitionMode) {
+    this(userName, password, namespace,
+        EventHubSpoutConfig.EH_SERVICE_FQDN_SUFFIX, entityPath, partitionMode);
+  }
+  
+  public EventHubBoltConfig(String connectionString, String entityPath,
+      boolean partitionMode, IEventDataFormat dataFormat) {
+    this.connectionString = connectionString;
+    this.entityPath = entityPath;
+    this.partitionMode = partitionMode;
+    this.dataFormat = dataFormat;
+    if(this.dataFormat == null) {
+      this.dataFormat = new DefaultEventDataFormat();
+    }
+  }
+  
+  public EventHubBoltConfig(String userName, String password, String namespace,
+      String targetFqnAddress, String entityPath) {
+    this(userName, password, namespace, targetFqnAddress, entityPath, false, null);
+  }
+  
+  public EventHubBoltConfig(String userName, String password, String namespace,
+      String targetFqnAddress, String entityPath, boolean partitionMode) {
+    this(userName, password, namespace, targetFqnAddress, entityPath, partitionMode, null);
+  }
+  
+  public EventHubBoltConfig(String userName, String password, String namespace,
+      String targetFqnAddress, String entityPath, boolean partitionMode,
+      IEventDataFormat dataFormat) {
+    this.connectionString = EventHubSpoutConfig.buildConnectionString(userName, password, namespace, targetFqnAddress);
+    this.entityPath = entityPath;
+    this.partitionMode = partitionMode;
+    this.dataFormat = dataFormat;
+    if(this.dataFormat == null) {
+      this.dataFormat = new DefaultEventDataFormat();
+    }
+  }
+  
+  public String getConnectionString() {
+    return connectionString;
+  }
+  
+  public String getEntityPath() {
+    return entityPath;
+  }
+  
+  public boolean getPartitionMode() {
+    return partitionMode;
+  }
+  
+  public IEventDataFormat getEventDataFormat() {
+    return dataFormat;
+  }
+}
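
The class comment above calls out partition mode: with `partitionMode=true` the bolt's parallelism is expected to match the number of EventHubs partitions, because each task uses its task index as the partition id. The wiring sketch below illustrates that, using only the constructors shown in this commit; the connection string, entity path, upstream spout id, and partition count are hypothetical placeholders.

```java
import backtype.storm.topology.TopologyBuilder;

import org.apache.storm.eventhubs.bolt.DefaultEventDataFormat;
import org.apache.storm.eventhubs.bolt.EventHubBolt;
import org.apache.storm.eventhubs.bolt.EventHubBoltConfig;

public class EventHubBoltWiringSketch {
    public static void main(String[] args) {
        int partitionCount = 8; // assumed number of EventHubs partitions

        EventHubBoltConfig boltConfig = new EventHubBoltConfig(
                "amqps://user:password@mynamespace.servicebus.windows.net", // hypothetical connection string
                "my-event-hub",                                             // hypothetical entity path
                true,                                                       // partitionMode
                new DefaultEventDataFormat().withFieldDelimiter("|"));      // pipe-delimited tuple fields

        TopologyBuilder builder = new TopologyBuilder();
        // "events" is a hypothetical upstream spout id. In partition mode the bolt
        // parallelism equals the partition count, since task index == partition id.
        builder.setBolt("eventhub-sink", new EventHubBolt(boltConfig), partitionCount)
               .shuffleGrouping("events");
        // builder.createTopology() would then be submitted with StormSubmitter as usual.
    }
}
```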

http://git-wip-us.apache.org/repos/asf/storm/blob/1f13f15d/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/IEventDataFormat.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/IEventDataFormat.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/IEventDataFormat.java
index cb05c0f..2003c34 100644
--- a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/IEventDataFormat.java
+++ b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/bolt/IEventDataFormat.java
@@ -1,28 +1,28 @@
-/*******************************************************************************
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- *******************************************************************************/
-package org.apache.storm.eventhubs.bolt;
-
-import java.io.Serializable;
-import backtype.storm.tuple.Tuple;
-
-/**
- * Serialize a tuple to a byte array to be sent to EventHubs
- */
-public interface IEventDataFormat extends Serializable {
-  public byte[] serialize(Tuple tuple);
-}
+/*******************************************************************************
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *******************************************************************************/
+package org.apache.storm.eventhubs.bolt;
+
+import java.io.Serializable;
+import backtype.storm.tuple.Tuple;
+
+/**
+ * Serialize a tuple to a byte array to be sent to EventHubs
+ */
+public interface IEventDataFormat extends Serializable {
+  public byte[] serialize(Tuple tuple);
+}
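
`IEventDataFormat` has a single method: turn a `Tuple` into the `byte[]` payload that `EventHubBolt` sends. Beyond the comma-delimited `DefaultEventDataFormat` above, a custom implementation controls the wire format. Below is a minimal sketch of one, assuming only the interface shown here; the tab-separated layout is purely illustrative.

```java
package org.apache.storm.eventhubs.bolt;

import java.nio.charset.StandardCharsets;

import backtype.storm.tuple.Tuple;

/**
 * Illustrative IEventDataFormat that emits tab-separated, newline-terminated records.
 */
public class TsvEventDataFormat implements IEventDataFormat {
  private static final long serialVersionUID = 1L;

  @Override
  public byte[] serialize(Tuple tuple) {
    StringBuilder sb = new StringBuilder();
    for (Object value : tuple.getValues()) {
      if (sb.length() != 0) {
        sb.append('\t');
      }
      sb.append(value); // StringBuilder appends "null" for null fields
    }
    sb.append('\n'); // one record per line
    return sb.toString().getBytes(StandardCharsets.UTF_8);
  }
}
```

An instance could be passed as the `dataFormat` argument of an `EventHubBoltConfig` constructor, as in the wiring sketch earlier.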

http://git-wip-us.apache.org/repos/asf/storm/blob/1f13f15d/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubClient.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubClient.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubClient.java
index 2afe5b4..564a26f 100755
--- a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubClient.java
+++ b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubClient.java
@@ -1,95 +1,95 @@
-/*******************************************************************************
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- *******************************************************************************/
-package org.apache.storm.eventhubs.client;
-
-import org.apache.qpid.amqp_1_0.client.Connection;
-import org.apache.qpid.amqp_1_0.client.ConnectionErrorException;
-import org.apache.qpid.amqp_1_0.client.ConnectionException;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
-public class EventHubClient {
-
-  private static final String DefaultConsumerGroupName = "$default";
-  private static final Logger logger = LoggerFactory.getLogger(EventHubClient.class);
-  private static final long ConnectionSyncTimeout = 60000L;
-
-  private final String connectionString;
-  private final String entityPath;
-  private final Connection connection;
-
-  private EventHubClient(String connectionString, String entityPath) throws EventHubException {
-    this.connectionString = connectionString;
-    this.entityPath = entityPath;
-    this.connection = this.createConnection();
-  }
-
-  /**
-   * creates a new instance of EventHubClient using the supplied connection string and entity path.
-   *
-   * @param connectionString connection string to the namespace of event hubs. connection string format:
-   * amqps://{userId}:{password}@{namespaceName}.servicebus.windows.net
-   * @param entityPath the name of event hub entity.
-   *
-   * @return EventHubClient
-   * @throws org.apache.storm.eventhubs.client.EventHubException
-   */
-  public static EventHubClient create(String connectionString, String entityPath) throws EventHubException {
-    return new EventHubClient(connectionString, entityPath);
-  }
-
-  public EventHubSender createPartitionSender(String partitionId) throws Exception {
-    return new EventHubSender(this.connection.createSession(), this.entityPath, partitionId);
-  }
-
-  public EventHubConsumerGroup getConsumerGroup(String cgName) {
-    if(cgName == null || cgName.length() == 0) {
-      cgName = DefaultConsumerGroupName;
-    }
-    return new EventHubConsumerGroup(connection, entityPath, cgName);
-  }
-
-  public void close() {
-    try {
-      this.connection.close();
-    } catch (ConnectionErrorException e) {
-      logger.error(e.toString());
-    }
-  }
-
-  private Connection createConnection() throws EventHubException {
-    ConnectionStringBuilder connectionStringBuilder = new ConnectionStringBuilder(this.connectionString);
-    Connection clientConnection;
-
-    try {
-      clientConnection = new Connection(
-        connectionStringBuilder.getHost(),
-        connectionStringBuilder.getPort(),
-        connectionStringBuilder.getUserName(),
-        connectionStringBuilder.getPassword(),
-        connectionStringBuilder.getHost(),
-        connectionStringBuilder.getSsl());
-    } catch (ConnectionException e) {
-      logger.error(e.toString());
-      throw new EventHubException(e);
-    }
-    clientConnection.getEndpoint().setSyncTimeout(ConnectionSyncTimeout);
-    SelectorFilterWriter.register(clientConnection.getEndpoint().getDescribedTypeRegistry());
-    return clientConnection;
-  }
-}
+/*******************************************************************************
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *******************************************************************************/
+package org.apache.storm.eventhubs.client;
+
+import org.apache.qpid.amqp_1_0.client.Connection;
+import org.apache.qpid.amqp_1_0.client.ConnectionErrorException;
+import org.apache.qpid.amqp_1_0.client.ConnectionException;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+public class EventHubClient {
+
+  private static final String DefaultConsumerGroupName = "$default";
+  private static final Logger logger = LoggerFactory.getLogger(EventHubClient.class);
+  private static final long ConnectionSyncTimeout = 60000L;
+
+  private final String connectionString;
+  private final String entityPath;
+  private final Connection connection;
+
+  private EventHubClient(String connectionString, String entityPath) throws EventHubException {
+    this.connectionString = connectionString;
+    this.entityPath = entityPath;
+    this.connection = this.createConnection();
+  }
+
+  /**
+   * creates a new instance of EventHubClient using the supplied connection string and entity path.
+   *
+   * @param connectionString connection string to the namespace of event hubs. connection string format:
+   * amqps://{userId}:{password}@{namespaceName}.servicebus.windows.net
+   * @param entityPath the name of event hub entity.
+   *
+   * @return EventHubClient
+   * @throws org.apache.storm.eventhubs.client.EventHubException
+   */
+  public static EventHubClient create(String connectionString, String entityPath) throws EventHubException {
+    return new EventHubClient(connectionString, entityPath);
+  }
+
+  public EventHubSender createPartitionSender(String partitionId) throws Exception {
+    return new EventHubSender(this.connection.createSession(), this.entityPath, partitionId);
+  }
+
+  public EventHubConsumerGroup getConsumerGroup(String cgName) {
+    if(cgName == null || cgName.length() == 0) {
+      cgName = DefaultConsumerGroupName;
+    }
+    return new EventHubConsumerGroup(connection, entityPath, cgName);
+  }
+
+  public void close() {
+    try {
+      this.connection.close();
+    } catch (ConnectionErrorException e) {
+      logger.error(e.toString());
+    }
+  }
+
+  private Connection createConnection() throws EventHubException {
+    ConnectionStringBuilder connectionStringBuilder = new ConnectionStringBuilder(this.connectionString);
+    Connection clientConnection;
+
+    try {
+      clientConnection = new Connection(
+        connectionStringBuilder.getHost(),
+        connectionStringBuilder.getPort(),
+        connectionStringBuilder.getUserName(),
+        connectionStringBuilder.getPassword(),
+        connectionStringBuilder.getHost(),
+        connectionStringBuilder.getSsl());
+    } catch (ConnectionException e) {
+      logger.error(e.toString());
+      throw new EventHubException(e);
+    }
+    clientConnection.getEndpoint().setSyncTimeout(ConnectionSyncTimeout);
+    SelectorFilterWriter.register(clientConnection.getEndpoint().getDescribedTypeRegistry());
+    return clientConnection;
+  }
+}

http://git-wip-us.apache.org/repos/asf/storm/blob/1f13f15d/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubSender.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubSender.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubSender.java
index 7c45578..435893e 100755
--- a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubSender.java
+++ b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/client/EventHubSender.java
@@ -1,99 +1,99 @@
-/*******************************************************************************
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- *******************************************************************************/
-package org.apache.storm.eventhubs.client;
-
-import java.util.concurrent.TimeoutException;
-import org.apache.qpid.amqp_1_0.client.LinkDetachedException;
-import org.apache.qpid.amqp_1_0.client.Message;
-import org.apache.qpid.amqp_1_0.client.Sender;
-import org.apache.qpid.amqp_1_0.client.Session;
-import org.apache.qpid.amqp_1_0.type.Binary;
-import org.apache.qpid.amqp_1_0.type.messaging.Data;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
-public class EventHubSender {
-
-  private static final Logger logger = LoggerFactory.getLogger(EventHubSender.class);
-
-  private final Session session;
-  private final String entityPath;
-  private final String partitionId;
-  private final String destinationAddress;
-
-  private Sender sender;
-
-  public EventHubSender(Session session, String entityPath, String partitionId) {
-    this.session = session;
-    this.entityPath = entityPath;
-    this.partitionId = partitionId;
-    this.destinationAddress = this.getDestinationAddress();
-  }
-  
-  public void send(byte[] data) throws EventHubException {
-    try {
-      if (this.sender == null) {
-        this.ensureSenderCreated();
-      }
-
-      Binary bin = new Binary(data);
-      Message message = new Message(new Data(bin));
-      this.sender.send(message);
-
-    } catch (LinkDetachedException e) {
-      logger.error(e.getMessage());
-
-      EventHubException eventHubException = new EventHubException("Sender has been closed");
-      throw eventHubException;
-    } catch (TimeoutException e) {
-      logger.error(e.getMessage());
-
-      EventHubException eventHubException = new EventHubException("Timed out while waiting to get credit to send");
-      throw eventHubException;
-    } catch (Exception e) {
-      logger.error(e.getMessage());
-    }
-  }
-
-  public void send(String data) throws EventHubException {
-    //For interop with other language, convert string to bytes
-    send(data.getBytes());
-  }
-
-  public void close() {
-    try {
-      this.sender.close();
-    } catch (Sender.SenderClosingException e) {
-      logger.error("Closing a sender encountered error: " + e.getMessage());
-    }
-  }
-
-  private String getDestinationAddress() {
-    if (this.partitionId == null || this.partitionId.equals("")) {
-      return this.entityPath;
-    } else {
-      return String.format(Constants.DestinationAddressFormatString, this.entityPath, this.partitionId);
-    }
-  }
-
-  private synchronized void ensureSenderCreated() throws Exception {
-    if (this.sender == null) {
-      this.sender = this.session.createSender(this.destinationAddress);
-    }
-  }
-}
+/*******************************************************************************
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *******************************************************************************/
+package org.apache.storm.eventhubs.client;
+
+import java.util.concurrent.TimeoutException;
+import org.apache.qpid.amqp_1_0.client.LinkDetachedException;
+import org.apache.qpid.amqp_1_0.client.Message;
+import org.apache.qpid.amqp_1_0.client.Sender;
+import org.apache.qpid.amqp_1_0.client.Session;
+import org.apache.qpid.amqp_1_0.type.Binary;
+import org.apache.qpid.amqp_1_0.type.messaging.Data;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+public class EventHubSender {
+
+  private static final Logger logger = LoggerFactory.getLogger(EventHubSender.class);
+
+  private final Session session;
+  private final String entityPath;
+  private final String partitionId;
+  private final String destinationAddress;
+
+  private Sender sender;
+
+  public EventHubSender(Session session, String entityPath, String partitionId) {
+    this.session = session;
+    this.entityPath = entityPath;
+    this.partitionId = partitionId;
+    this.destinationAddress = this.getDestinationAddress();
+  }
+  
+  public void send(byte[] data) throws EventHubException {
+    try {
+      if (this.sender == null) {
+        this.ensureSenderCreated();
+      }
+
+      Binary bin = new Binary(data);
+      Message message = new Message(new Data(bin));
+      this.sender.send(message);
+
+    } catch (LinkDetachedException e) {
+      logger.error(e.getMessage());
+
+      EventHubException eventHubException = new EventHubException("Sender has been closed");
+      throw eventHubException;
+    } catch (TimeoutException e) {
+      logger.error(e.getMessage());
+
+      EventHubException eventHubException = new EventHubException("Timed out while waiting to get credit to send");
+      throw eventHubException;
+    } catch (Exception e) {
+      logger.error(e.getMessage());
+    }
+  }
+
+  public void send(String data) throws EventHubException {
+    //For interop with other language, convert string to bytes
+    send(data.getBytes());
+  }
+
+  public void close() {
+    try {
+      this.sender.close();
+    } catch (Sender.SenderClosingException e) {
+      logger.error("Closing a sender encountered error: " + e.getMessage());
+    }
+  }
+
+  private String getDestinationAddress() {
+    if (this.partitionId == null || this.partitionId.equals("")) {
+      return this.entityPath;
+    } else {
+      return String.format(Constants.DestinationAddressFormatString, this.entityPath, this.partitionId);
+    }
+  }
+
+  private synchronized void ensureSenderCreated() throws Exception {
+    if (this.sender == null) {
+      this.sender = this.session.createSender(this.destinationAddress);
+    }
+  }
+}
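
Taken together, EventHubClient and EventHubSender form the send-side API used by the EventHubs bolt. A minimal usage sketch follows; the connection string, entity path, and partition id are placeholder values, and error handling is left to the caller:

    import org.apache.storm.eventhubs.client.EventHubClient;
    import org.apache.storm.eventhubs.client.EventHubSender;

    public class EventHubSendSketch {
      public static void main(String[] args) throws Exception {
        // placeholder values; real ones come from topology configuration
        String connectionString = "amqps://user:password@mynamespace.servicebus.windows.net";
        String entityPath = "myeventhub";

        EventHubClient client = EventHubClient.create(connectionString, entityPath);
        EventHubSender sender = client.createPartitionSender("0"); // pin to partition 0
        sender.send("hello from storm"); // String overload converts to bytes
        sender.close();
        client.close();
      }
    }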

http://git-wip-us.apache.org/repos/asf/storm/blob/1f13f15d/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/samples/EventHubLoop.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/samples/EventHubLoop.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/samples/EventHubLoop.java
index c908f9d..2f62a23 100755
--- a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/samples/EventHubLoop.java
+++ b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/samples/EventHubLoop.java
@@ -1,52 +1,52 @@
-/*******************************************************************************
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- *******************************************************************************/
-package org.apache.storm.eventhubs.samples;
-
-import backtype.storm.generated.StormTopology;
-import backtype.storm.topology.TopologyBuilder;
-
-import org.apache.storm.eventhubs.bolt.EventHubBolt;
-import org.apache.storm.eventhubs.bolt.EventHubBoltConfig;
-import org.apache.storm.eventhubs.spout.EventHubSpout;
-
-/**
- * A sample topology that loops message back to EventHub
- */
-public class EventHubLoop extends EventCount {
-
-  @Override
-  protected StormTopology buildTopology(EventHubSpout eventHubSpout) {
-    TopologyBuilder topologyBuilder = new TopologyBuilder();
-
-    topologyBuilder.setSpout("EventHubsSpout", eventHubSpout, spoutConfig.getPartitionCount())
-      .setNumTasks(spoutConfig.getPartitionCount());
-    EventHubBoltConfig boltConfig = new EventHubBoltConfig(spoutConfig.getConnectionString(),
-        spoutConfig.getEntityPath(), true);
-    
-    EventHubBolt eventHubBolt = new EventHubBolt(boltConfig);
-    int boltTasks = spoutConfig.getPartitionCount();
-    topologyBuilder.setBolt("EventHubsBolt", eventHubBolt, boltTasks)
-      .localOrShuffleGrouping("EventHubsSpout").setNumTasks(boltTasks);
-    return topologyBuilder.createTopology();
-  }
-  
-  public static void main(String[] args) throws Exception {
-    EventHubLoop scenario = new EventHubLoop();
-    scenario.runScenario(args);
-  }
-}
+/*******************************************************************************
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *******************************************************************************/
+package org.apache.storm.eventhubs.samples;
+
+import backtype.storm.generated.StormTopology;
+import backtype.storm.topology.TopologyBuilder;
+
+import org.apache.storm.eventhubs.bolt.EventHubBolt;
+import org.apache.storm.eventhubs.bolt.EventHubBoltConfig;
+import org.apache.storm.eventhubs.spout.EventHubSpout;
+
+/**
+ * A sample topology that loops message back to EventHub
+ */
+public class EventHubLoop extends EventCount {
+
+  @Override
+  protected StormTopology buildTopology(EventHubSpout eventHubSpout) {
+    TopologyBuilder topologyBuilder = new TopologyBuilder();
+
+    topologyBuilder.setSpout("EventHubsSpout", eventHubSpout, spoutConfig.getPartitionCount())
+      .setNumTasks(spoutConfig.getPartitionCount());
+    EventHubBoltConfig boltConfig = new EventHubBoltConfig(spoutConfig.getConnectionString(),
+        spoutConfig.getEntityPath(), true);
+    
+    EventHubBolt eventHubBolt = new EventHubBolt(boltConfig);
+    int boltTasks = spoutConfig.getPartitionCount();
+    topologyBuilder.setBolt("EventHubsBolt", eventHubBolt, boltTasks)
+      .localOrShuffleGrouping("EventHubsSpout").setNumTasks(boltTasks);
+    return topologyBuilder.createTopology();
+  }
+  
+  public static void main(String[] args) throws Exception {
+    EventHubLoop scenario = new EventHubLoop();
+    scenario.runScenario(args);
+  }
+}


[22/50] [abbrv] storm git commit: [maven-release-plugin] prepare for next development iteration

Posted by pt...@apache.org.
[maven-release-plugin] prepare for next development iteration


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/b372a119
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/b372a119
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/b372a119

Branch: refs/heads/0.10.x-branch
Commit: b372a1198f2227ccc2951b1f5ffccd21a98b589b
Parents: 20a3011
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Tue May 5 16:54:05 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Tue May 5 16:54:05 2015 -0400

----------------------------------------------------------------------
 flux-core/pom.xml     | 2 +-
 flux-examples/pom.xml | 2 +-
 flux-wrappers/pom.xml | 2 +-
 pom.xml               | 2 +-
 4 files changed, 4 insertions(+), 4 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/b372a119/flux-core/pom.xml
----------------------------------------------------------------------
diff --git a/flux-core/pom.xml b/flux-core/pom.xml
index 12312f5..600613d 100644
--- a/flux-core/pom.xml
+++ b/flux-core/pom.xml
@@ -21,7 +21,7 @@
     <parent>
         <groupId>com.github.ptgoetz</groupId>
         <artifactId>flux</artifactId>
-        <version>0.3.0</version>
+        <version>0.3.1-SNAPSHOT</version>
         <relativePath>../pom.xml</relativePath>
     </parent>
 

http://git-wip-us.apache.org/repos/asf/storm/blob/b372a119/flux-examples/pom.xml
----------------------------------------------------------------------
diff --git a/flux-examples/pom.xml b/flux-examples/pom.xml
index e186b9c..0b9796e 100644
--- a/flux-examples/pom.xml
+++ b/flux-examples/pom.xml
@@ -21,7 +21,7 @@
     <parent>
         <groupId>com.github.ptgoetz</groupId>
         <artifactId>flux</artifactId>
-        <version>0.3.0</version>
+        <version>0.3.1-SNAPSHOT</version>
         <relativePath>../pom.xml</relativePath>
     </parent>
 

http://git-wip-us.apache.org/repos/asf/storm/blob/b372a119/flux-wrappers/pom.xml
----------------------------------------------------------------------
diff --git a/flux-wrappers/pom.xml b/flux-wrappers/pom.xml
index 532da15..6784141 100644
--- a/flux-wrappers/pom.xml
+++ b/flux-wrappers/pom.xml
@@ -21,7 +21,7 @@
     <parent>
         <groupId>com.github.ptgoetz</groupId>
         <artifactId>flux</artifactId>
-        <version>0.3.0</version>
+        <version>0.3.1-SNAPSHOT</version>
         <relativePath>../pom.xml</relativePath>
     </parent>
 

http://git-wip-us.apache.org/repos/asf/storm/blob/b372a119/pom.xml
----------------------------------------------------------------------
diff --git a/pom.xml b/pom.xml
index 942cbeb..5ea1b40 100644
--- a/pom.xml
+++ b/pom.xml
@@ -20,7 +20,7 @@
 
     <groupId>com.github.ptgoetz</groupId>
     <artifactId>flux</artifactId>
-    <version>0.3.0</version>
+    <version>0.3.1-SNAPSHOT</version>
     <packaging>pom</packaging>
     <name>flux</name>
     <url>https://github.com/ptgoetz/flux</url>


[16/50] [abbrv] storm git commit: fix validation

Posted by pt...@apache.org.
fix validation


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/f432abf5
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/f432abf5
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/f432abf5

Branch: refs/heads/0.10.x-branch
Commit: f432abf523a54806e4b927f676b93575bdf687f2
Parents: abf2924
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Fri Apr 10 15:47:39 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Fri Apr 10 15:47:39 2015 -0400

----------------------------------------------------------------------
 .../apache/storm/flux/model/TopologyDef.java    | 23 +++++++++++---------
 1 file changed, 13 insertions(+), 10 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/f432abf5/flux-core/src/main/java/org/apache/storm/flux/model/TopologyDef.java
----------------------------------------------------------------------
diff --git a/flux-core/src/main/java/org/apache/storm/flux/model/TopologyDef.java b/flux-core/src/main/java/org/apache/storm/flux/model/TopologyDef.java
index 3be7dd6..6c34018 100644
--- a/flux-core/src/main/java/org/apache/storm/flux/model/TopologyDef.java
+++ b/flux-core/src/main/java/org/apache/storm/flux/model/TopologyDef.java
@@ -196,19 +196,22 @@ public class TopologyDef {
     }
 
     public boolean isDslTopology(){
-        boolean hasSpouts = this.spoutMap != null && this.spoutMap.size() > 0;
-        boolean hasBolts = this.boltMap != null && this.boltMap.size() > 0;
-        boolean hasStreams = this.streams != null && this.streams.size() > 0;
-        boolean isDslTopology = hasSpouts || hasBolts || hasStreams;
-
-        return isDslTopology;
+        return this.topologySource == null;
     }
 
 
     public boolean validate(){
-        // we can't have a topology source and spout/bolt/stream definitions at the same time
-        boolean isDslTopology = isDslTopology();
-        boolean isTopologySource = this.topologySource != null;
-        return !(isDslTopology && isTopologySource);
+        boolean hasSpouts = this.spoutMap != null && this.spoutMap.size() > 0;
+        boolean hasBolts = this.boltMap != null && this.boltMap.size() > 0;
+        boolean hasStreams = this.streams != null && this.streams.size() > 0;
+        boolean hasSpoutsBoltsStreams = hasStreams && hasBolts && hasSpouts;
+        // you cant define a topologySource and a DSL topology at the same time...
+        if (!isDslTopology() && ((hasSpouts || hasBolts || hasStreams))) {
+            return false;
+        }
+        if(isDslTopology() && (hasSpouts && hasBolts && hasStreams)) {
+            return true;
+        }
+        return true;
     }
 }
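
The hasSpoutsBoltsStreams flag is never read and the second if branch returns the same value as the final return, so the committed logic reduces to: reject a definition only when it combines a topologySource with DSL spout/bolt/stream definitions. A condensed equivalent, shown only to make the rule explicit (sketch, not repository code):

    public boolean validate() {
        boolean hasSpouts  = this.spoutMap != null && this.spoutMap.size() > 0;
        boolean hasBolts   = this.boltMap != null && this.boltMap.size() > 0;
        boolean hasStreams = this.streams != null && this.streams.size() > 0;
        // a topologySource and DSL definitions are mutually exclusive
        return isDslTopology() || !(hasSpouts || hasBolts || hasStreams);
    }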


[30/50] [abbrv] storm git commit: add back missing multi-lang scripts

Posted by pt...@apache.org.
add back missing multi-lang scripts


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/9fad816f
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/9fad816f
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/9fad816f

Branch: refs/heads/0.10.x-branch
Commit: 9fad816fd5840d5ad7d2727db4c7dafe4f1355ba
Parents: f7d2f7f
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Fri May 8 15:23:54 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Fri May 8 15:23:54 2015 -0400

----------------------------------------------------------------------
 .../main/resources/resources/randomsentence.js  | 93 ++++++++++++++++++++
 .../main/resources/resources/splitsentence.py   | 24 +++++
 2 files changed, 117 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/9fad816f/external/flux/flux-wrappers/src/main/resources/resources/randomsentence.js
----------------------------------------------------------------------
diff --git a/external/flux/flux-wrappers/src/main/resources/resources/randomsentence.js b/external/flux/flux-wrappers/src/main/resources/resources/randomsentence.js
new file mode 100644
index 0000000..36fc5f5
--- /dev/null
+++ b/external/flux/flux-wrappers/src/main/resources/resources/randomsentence.js
@@ -0,0 +1,93 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+/**
+ * Example for storm spout. Emits random sentences.
+ * The original class in java - storm.starter.spout.RandomSentenceSpout.
+ *
+ */
+
+var storm = require('./storm');
+var Spout = storm.Spout;
+
+
+var SENTENCES = [
+    "the cow jumped over the moon",
+    "an apple a day keeps the doctor away",
+    "four score and seven years ago",
+    "snow white and the seven dwarfs",
+    "i am at two with nature"]
+
+function RandomSentenceSpout(sentences) {
+    Spout.call(this);
+    this.runningTupleId = 0;
+    this.sentences = sentences;
+    this.pending = {};
+};
+
+RandomSentenceSpout.prototype = Object.create(Spout.prototype);
+RandomSentenceSpout.prototype.constructor = RandomSentenceSpout;
+
+RandomSentenceSpout.prototype.getRandomSentence = function() {
+    return this.sentences[getRandomInt(0, this.sentences.length - 1)];
+}
+
+RandomSentenceSpout.prototype.nextTuple = function(done) {
+    var self = this;
+    var sentence = this.getRandomSentence();
+    var tup = [sentence];
+    var id = this.createNextTupleId();
+    this.pending[id] = tup;
+    //This timeout can be removed if TOPOLOGY_SLEEP_SPOUT_WAIT_STRATEGY_TIME_MS is configured to 100
+    setTimeout(function() {
+        self.emit({tuple: tup, id: id}, function(taskIds) {
+            self.log(tup + ' sent to task ids - ' + taskIds);
+        });
+        done();
+    },100);
+}
+
+RandomSentenceSpout.prototype.createNextTupleId = function() {
+    var id = this.runningTupleId;
+    this.runningTupleId++;
+    return id;
+}
+
+RandomSentenceSpout.prototype.ack = function(id, done) {
+    this.log('Received ack for - ' + id);
+    delete this.pending[id];
+    done();
+}
+
+RandomSentenceSpout.prototype.fail = function(id, done) {
+    var self = this;
+    this.log('Received fail for - ' + id + '. Retrying.');
+    this.emit({tuple: this.pending[id], id:id}, function(taskIds) {
+        self.log(self.pending[id] + ' sent to task ids - ' + taskIds);
+    });
+    done();
+}
+
+/**
+ * Returns a random integer between min (inclusive) and max (inclusive)
+ */
+function getRandomInt(min, max) {
+    return Math.floor(Math.random() * (max - min + 1)) + min;
+}
+
+new RandomSentenceSpout(SENTENCES).run();

http://git-wip-us.apache.org/repos/asf/storm/blob/9fad816f/external/flux/flux-wrappers/src/main/resources/resources/splitsentence.py
----------------------------------------------------------------------
diff --git a/external/flux/flux-wrappers/src/main/resources/resources/splitsentence.py b/external/flux/flux-wrappers/src/main/resources/resources/splitsentence.py
new file mode 100644
index 0000000..300105f
--- /dev/null
+++ b/external/flux/flux-wrappers/src/main/resources/resources/splitsentence.py
@@ -0,0 +1,24 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+import storm
+
+class SplitSentenceBolt(storm.BasicBolt):
+    def process(self, tup):
+        words = tup.values[0].split(" ")
+        for word in words:
+          storm.emit([word])
+
+SplitSentenceBolt().run()
\ No newline at end of file
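
The JavaScript spout notes that it mirrors storm.starter.spout.RandomSentenceSpout; for the Python bolt, a JVM-side equivalent of the same word-splitting logic would look roughly like the following sketch (the class name is illustrative, not part of the commit):

    import backtype.storm.topology.BasicOutputCollector;
    import backtype.storm.topology.OutputFieldsDeclarer;
    import backtype.storm.topology.base.BaseBasicBolt;
    import backtype.storm.tuple.Fields;
    import backtype.storm.tuple.Tuple;
    import backtype.storm.tuple.Values;

    // Same behaviour as splitsentence.py: split the incoming sentence on
    // spaces and emit one tuple per word.
    public class SplitSentenceJavaBolt extends BaseBasicBolt {
      @Override
      public void execute(Tuple tuple, BasicOutputCollector collector) {
        for (String word : tuple.getString(0).split(" ")) {
          collector.emit(new Values(word));
        }
      }

      @Override
      public void declareOutputFields(OutputFieldsDeclarer declarer) {
        declarer.declare(new Fields("word"));
      }
    }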


[40/50] [abbrv] storm git commit: Merge branch 'java17'

Posted by pt...@apache.org.
Merge branch 'java17'


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/d285d94d
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/d285d94d
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/d285d94d

Branch: refs/heads/0.10.x-branch
Commit: d285d94d418bede3376e743f882c157a83f68a7a
Parents: e7818af fc73600
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Tue Jun 2 17:15:30 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Tue Jun 2 17:15:30 2015 -0400

----------------------------------------------------------------------
 pom.xml | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)
----------------------------------------------------------------------



[04/50] [abbrv] storm git commit: lock down methods that don't need to be public

Posted by pt...@apache.org.
lock down methods that don't need to be public


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/11b01518
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/11b01518
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/11b01518

Branch: refs/heads/0.10.x-branch
Commit: 11b01518f0935dedfd71d09b697d7189a29b91b5
Parents: df34930
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Thu Apr 2 00:08:12 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Thu Apr 2 00:08:12 2015 -0400

----------------------------------------------------------------------
 .../main/java/org/apache/storm/flux/FluxBuilder.java    | 12 ++++++------
 1 file changed, 6 insertions(+), 6 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/11b01518/flux-core/src/main/java/org/apache/storm/flux/FluxBuilder.java
----------------------------------------------------------------------
diff --git a/flux-core/src/main/java/org/apache/storm/flux/FluxBuilder.java b/flux-core/src/main/java/org/apache/storm/flux/FluxBuilder.java
index 31b6e64..964c62e 100644
--- a/flux-core/src/main/java/org/apache/storm/flux/FluxBuilder.java
+++ b/flux-core/src/main/java/org/apache/storm/flux/FluxBuilder.java
@@ -61,7 +61,7 @@ public class FluxBuilder {
      * @throws NoSuchMethodException
      * @throws InvocationTargetException
      */
-    public static StormTopology buildTopology(ExecutionContext context) throws IllegalAccessException,
+    static StormTopology buildTopology(ExecutionContext context) throws IllegalAccessException,
             InstantiationException, ClassNotFoundException, NoSuchMethodException, InvocationTargetException {
 
         StormTopology topology = null;
@@ -206,7 +206,7 @@ public class FluxBuilder {
         }
     }
 
-    public static void applyProperties(ObjectDef bean, Object instance, ExecutionContext context) throws
+    private static void applyProperties(ObjectDef bean, Object instance, ExecutionContext context) throws
             IllegalAccessException, InvocationTargetException {
         List<PropertyDef> props = bean.getProperties();
         Class clazz = instance.getClass();
@@ -230,7 +230,7 @@ public class FluxBuilder {
         }
     }
 
-    public static Field findPublicField(Class clazz, String property, Object arg) {
+    private static Field findPublicField(Class clazz, String property, Object arg) {
         Field field = null;
         try {
             field = clazz.getField(property);
@@ -240,7 +240,7 @@ public class FluxBuilder {
         return field;
     }
 
-    public static Method findSetter(Class clazz, String property, Object arg) {
+    private static Method findSetter(Class clazz, String property, Object arg) {
         String setterName = toSetterName(property);
         Method retval = null;
         Method[] methods = clazz.getMethods();
@@ -253,11 +253,11 @@ public class FluxBuilder {
         return retval;
     }
 
-    public static String toSetterName(String name) {
+    private static String toSetterName(String name) {
         return "set" + name.substring(0, 1).toUpperCase() + name.substring(1, name.length());
     }
 
-    public static List<Object> resolveReferences(List<Object> args, ExecutionContext context) {
+    private static List<Object> resolveReferences(List<Object> args, ExecutionContext context) {
         LOG.debug("Checking arguments for references.");
         List<Object> cArgs = new ArrayList<Object>();
         // resolve references
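
For context, the methods made private here implement Flux's property-injection convention: a YAML property such as fsUrl is satisfied either by a public setter located via toSetterName("fsUrl"), which yields "setFsUrl", or by a public field of the same name. A hypothetical component showing both options (not from the repository):

    // Hypothetical bean configurable from a Flux "properties:" section.
    public class ExampleSink {
        // matched by findPublicField("retries", ...)
        public int retries;

        private String fsUrl;

        // matched by findSetter(...) via toSetterName("fsUrl") -> "setFsUrl"
        public void setFsUrl(String fsUrl) {
            this.fsUrl = fsUrl;
        }
    }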


[05/50] [abbrv] storm git commit: when a resource can't be found, log an error and exit rather than throw NPE

Posted by pt...@apache.org.
when a resource can't be found, log an error and exit rather than throw NPE


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/2e44c9e6
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/2e44c9e6
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/2e44c9e6

Branch: refs/heads/0.10.x-branch
Commit: 2e44c9e6a2fb4382950da463fad9f392f8991468
Parents: 11b0151
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Thu Apr 2 12:18:12 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Thu Apr 2 12:18:12 2015 -0400

----------------------------------------------------------------------
 .../src/main/java/org/apache/storm/flux/parser/FluxParser.java   | 4 ++++
 1 file changed, 4 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/2e44c9e6/flux-core/src/main/java/org/apache/storm/flux/parser/FluxParser.java
----------------------------------------------------------------------
diff --git a/flux-core/src/main/java/org/apache/storm/flux/parser/FluxParser.java b/flux-core/src/main/java/org/apache/storm/flux/parser/FluxParser.java
index 78c52d5..72f8a8e 100644
--- a/flux-core/src/main/java/org/apache/storm/flux/parser/FluxParser.java
+++ b/flux-core/src/main/java/org/apache/storm/flux/parser/FluxParser.java
@@ -63,6 +63,10 @@ public class FluxParser {
                                             String propertiesFile, boolean envSub) throws IOException {
         Yaml yaml = yaml();
         InputStream in = FluxParser.class.getResourceAsStream(resource);
+        if(in == null){
+            LOG.error("Unable to load classpath resource: " + resource);
+            System.exit(1);
+        }
         TopologyDef topology = loadYaml(yaml, in, propertiesFile, envSub);
         in.close();
         if(dumpYaml){


[18/50] [abbrv] storm git commit: force source/target to use Java 1.6

Posted by pt...@apache.org.
force source/target to use Java 1.6


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/c48f63e3
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/c48f63e3
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/c48f63e3

Branch: refs/heads/0.10.x-branch
Commit: c48f63e3f361e9416669d4d791de3b3a67c461d0
Parents: 601cee7
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Tue May 5 15:56:47 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Tue May 5 15:56:47 2015 -0400

----------------------------------------------------------------------
 flux-core/pom.xml |  2 +-
 pom.xml           | 11 ++++++++++-
 2 files changed, 11 insertions(+), 2 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/c48f63e3/flux-core/pom.xml
----------------------------------------------------------------------
diff --git a/flux-core/pom.xml b/flux-core/pom.xml
index fe2e301..2d03ea4 100644
--- a/flux-core/pom.xml
+++ b/flux-core/pom.xml
@@ -53,7 +53,7 @@
         <dependency>
             <groupId>org.apache.storm</groupId>
             <artifactId>storm-hbase</artifactId>
-            <version>0.11.0-SNAPSHOT</version>
+            <version>${storm.version}</version>
             <scope>test</scope>
         </dependency>
     </dependencies>

http://git-wip-us.apache.org/repos/asf/storm/blob/c48f63e3/pom.xml
----------------------------------------------------------------------
diff --git a/pom.xml b/pom.xml
index e79e25a..de48f7b 100644
--- a/pom.xml
+++ b/pom.xml
@@ -111,7 +111,16 @@
 
         </resources>
         <plugins>
-
+            <plugin>
+                <groupId>org.apache.maven.plugins</groupId>
+                <artifactId>maven-compiler-plugin</artifactId>
+                <version>3.3</version>
+                <configuration>
+                    <source>1.6</source>
+                    <target>1.6</target>
+                    <encoding>UTF-8</encoding>
+                </configuration>
+            </plugin>
         </plugins>
     </build>
 </project>


[38/50] [abbrv] storm git commit: Add initialization for ResilientEventHubReceiver

Posted by pt...@apache.org.
Add initialization for ResilientEventHubReceiver

Otherwise, the recovery logic kicks in to handle the first-time initialization.
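PLACEHOLDER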

Signed-off-by: Shanyu Zhao <sh...@microsoft.com>


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/9c2972ac
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/9c2972ac
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/9c2972ac

Branch: refs/heads/0.10.x-branch
Commit: 9c2972ac635ef5a7066b1abf19ffd5dc9e42718c
Parents: 86f326a
Author: Shanyu Zhao <sh...@microsoft.com>
Authored: Mon Jun 1 18:23:11 2015 -0700
Committer: Shanyu Zhao <sh...@microsoft.com>
Committed: Mon Jun 1 18:23:11 2015 -0700

----------------------------------------------------------------------
 .../java/org/apache/storm/eventhubs/spout/EventHubReceiverImpl.java | 1 +
 1 file changed, 1 insertion(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/9c2972ac/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/EventHubReceiverImpl.java
----------------------------------------------------------------------
diff --git a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/EventHubReceiverImpl.java b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/EventHubReceiverImpl.java
index 7454af4..0fcad99 100755
--- a/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/EventHubReceiverImpl.java
+++ b/external/storm-eventhubs/src/main/java/org/apache/storm/eventhubs/spout/EventHubReceiverImpl.java
@@ -71,6 +71,7 @@ public class EventHubReceiverImpl implements IEventHubReceiver {
     long start = System.currentTimeMillis();
     receiver = new ResilientEventHubReceiver(connectionString, entityName,
     		partitionId, consumerGroupName, defaultCredits, filter);
+    receiver.initialize();
     
     long end = System.currentTimeMillis();
     logger.info("created eventhub receiver, time taken(ms): " + (end-start));


[41/50] [abbrv] storm git commit: update changelog for STORM-842

Posted by pt...@apache.org.
update changelog for STORM-842


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/a55bbbea
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/a55bbbea
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/a55bbbea

Branch: refs/heads/0.10.x-branch
Commit: a55bbbea85fa83c3e9ab664922580d464de565fe
Parents: d285d94
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Tue Jun 2 17:16:53 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Tue Jun 2 17:16:53 2015 -0400

----------------------------------------------------------------------
 CHANGELOG.md | 1 +
 1 file changed, 1 insertion(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/a55bbbea/CHANGELOG.md
----------------------------------------------------------------------
diff --git a/CHANGELOG.md b/CHANGELOG.md
index f9f7999..6329872 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -1,6 +1,7 @@
 ## 0.11.0
 
 ## 0.10.0
+ * STORM-842: Drop Support for Java 1.6
  * STORM-835: Netty Client hold batch object until io operation complete
  * STORM-827: Allow AutoTGT to work with storm-hdfs too.
  * STORM-821: Adding connection provider interface to decouple jdbc connector from a single connection pooling implementation.


[08/50] [abbrv] storm git commit: update examples and docs for HDFS example

Posted by pt...@apache.org.
update examples and docs for HDFS example


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/0c1e0aa8
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/0c1e0aa8
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/0c1e0aa8

Branch: refs/heads/0.10.x-branch
Commit: 0c1e0aa81f39e473bcf3482448813d78065e1212
Parents: 3411bc7
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Mon Apr 6 23:48:38 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Mon Apr 6 23:48:38 2015 -0400

----------------------------------------------------------------------
 .../src/test/resources/configs/hdfs_test.yaml   |  97 -----------------
 flux-examples/README.md                         |  30 +++++-
 flux-examples/pom.xml                           |   6 ++
 .../src/main/resources/hdfs_bolt.properties     |   9 ++
 flux-examples/src/main/resources/multilang.yaml |  89 ++++++++++++++++
 flux-examples/src/main/resources/shell.yaml     |  89 ----------------
 .../src/main/resources/simple_hdfs.yaml         | 105 +++++++++++++++++++
 7 files changed, 238 insertions(+), 187 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/0c1e0aa8/flux-core/src/test/resources/configs/hdfs_test.yaml
----------------------------------------------------------------------
diff --git a/flux-core/src/test/resources/configs/hdfs_test.yaml b/flux-core/src/test/resources/configs/hdfs_test.yaml
deleted file mode 100644
index c1d28d2..0000000
--- a/flux-core/src/test/resources/configs/hdfs_test.yaml
+++ /dev/null
@@ -1,97 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-# Test ability to wire together shell spouts/bolts
----
-
-# topology definition
-# name to be used when submitting
-name: "hdfs-topology"
-
-# Components
-# Components are analagous to Spring beans. They are meant to be used as constructor,
-# property(setter), and builder arguments.
-#
-# for the time being, components must be declared in the order they are referenced
-components:
-  - id: "syncPolicy"
-    className: "org.apache.storm.hdfs.bolt.sync.CountSyncPolicy"
-    constructorArgs:
-      - 1000
-  - id: "rotationPolicy"
-    className: "org.apache.storm.hdfs.bolt.rotation.FileSizeRotationPolicy"
-    constructorArgs:
-      - 5.0
-      - MB
-
-  - id: "fileNameFormat"
-    className: "org.apache.storm.hdfs.bolt.format.DefaultFileNameFormat"
-    configMethods:
-      - name: "withPath"
-        args: ["/tmp/foo/"]
-      - name: "withExtension"
-        args: [".txt"]
-
-  - id: "recordFormat"
-    className: "org.apache.storm.hdfs.bolt.format.DelimitedRecordFormat"
-    configMethods:
-      - name: "withFieldDelimiter"
-        args: ["|"]
-
-  - id: "rotationAction"
-    className: "org.apache.storm.hdfs.common.rotation.MoveFileAction"
-    configMethods:
-      - name: "toDestination"
-        args: ["/tmp/dest2"]
-
-# spout definitions
-spouts:
-  - id: "spout-1"
-    className: "backtype.storm.testing.TestWordSpout"
-    parallelism: 1
-    # ...
-
-# bolt definitions
-
-#        HdfsBolt bolt = new HdfsBolt()
-#                .withConfigKey("hdfs.config")
-#                .withFsUrl(args[0])
-#                .withFileNameFormat(fileNameFormat)
-#                .withRecordFormat(format)
-#                .withRotationPolicy(rotationPolicy)
-#                .withSyncPolicy(syncPolicy)
-#                .addRotationAction(new MoveFileAction().toDestination("/tmp/dest2/"));
-bolts:
-  - id: "bolt-1"
-    className: "org.apache.storm.hdfs.bolt.HdfsBolt"
-    configMethods:
-      - name: "withConfigKey"
-        args: ["hdfs.config"]
-      - name: "withFsUrl"
-        args: ["hdfs://localhost:1234"]
-      - name: "withFileNameFormat"
-        args: [ref: "fileNameFormat"]
-      - name: "withRecordFormat"
-        args: [ref: "recordFormat"]
-      - name: "withRotationPolicy"
-        args: [ref: "rotationPolicy"]
-      - name: "withSyncPolicy"
-        args: [ref: "syncPolicy"]
-      - name: "addRotationAction"
-        args: [ref: "rotationAction"]
-    parallelism: 1
-    # ...
-

http://git-wip-us.apache.org/repos/asf/storm/blob/0c1e0aa8/flux-examples/README.md
----------------------------------------------------------------------
diff --git a/flux-examples/README.md b/flux-examples/README.md
index 2f107e7..9f5682e 100644
--- a/flux-examples/README.md
+++ b/flux-examples/README.md
@@ -23,6 +23,34 @@ The example YAML files are also packaged in the examples jar, so they can also b
 command line switch:
 
 ```bash
-storm jar ./target/flux-examples-0.2.3-SNAPSHOT.jar org.apache.storm.flux.Flux --local --resource /sime_wordcount.yaml
+storm jar ./target/flux-examples-0.2.3-SNAPSHOT.jar org.apache.storm.flux.Flux --local --resource /simple_wordcount.yaml
+```
+
+## Available Examples
+
+### simple_wordcount.yaml
+
+This is a very basic wordcount example using Java spouts and bolts. It simply logs the running count of each word
+received.
+
+### multilang.yaml
+
+Another wordcount example that uses a spout written in JavaScript (node.js), a bolt written in Python, and two bolts
+written in java.
+
+### kafka_spout.yaml
+This example illustrates how to configure Storm's `storm-kafka` spout using Flux YAML DSL `components`, `references`,
+and `constructor arguments` constructs.
+
+### simple_hdfs.yaml
+
+This example demonstrates using Flux to setup a storm-hdfs bolt to write to an HDFS cluster. It also demonstrates Flux's
+variable substitution/filtering feature.
+
+To run the `simple_hdfs.yaml` example, copy the `hdfs_bolt.properties` file to a convenient location and change, at
+least, the property `hdfs.url` to point to a HDFS cluster. Then you can run the example something like:
+
+```bash
+storm jar ./target/flux-examples-0.2.3-SNAPSHOT.jar org.apache.storm.flux.Flux --local ./src/main/resources/simple_hdfs.yaml --filter my_hdfs_bolt.properties
 ```
 

http://git-wip-us.apache.org/repos/asf/storm/blob/0c1e0aa8/flux-examples/pom.xml
----------------------------------------------------------------------
diff --git a/flux-examples/pom.xml b/flux-examples/pom.xml
index 63bc312..09db717 100644
--- a/flux-examples/pom.xml
+++ b/flux-examples/pom.xml
@@ -44,6 +44,12 @@
             <artifactId>flux-wrappers</artifactId>
             <version>${project.version}</version>
         </dependency>
+
+        <dependency>
+            <groupId>org.apache.storm</groupId>
+            <artifactId>storm-hdfs</artifactId>
+            <version>${storm.version}</version>
+        </dependency>
     </dependencies>
 
     <build>

http://git-wip-us.apache.org/repos/asf/storm/blob/0c1e0aa8/flux-examples/src/main/resources/hdfs_bolt.properties
----------------------------------------------------------------------
diff --git a/flux-examples/src/main/resources/hdfs_bolt.properties b/flux-examples/src/main/resources/hdfs_bolt.properties
new file mode 100644
index 0000000..34a7a23
--- /dev/null
+++ b/flux-examples/src/main/resources/hdfs_bolt.properties
@@ -0,0 +1,9 @@
+# The HDFS url
+hdfs.url="hdfs://hadoop:54310"
+
+# The HDFS directory where the bolt will write incoming data
+hdfs.write.dir="/incoming"
+
+# The HDFS directory where files will be moved once the bolt has
+# finished writing to it.
+hdfs.dest.dir="/complete"
\ No newline at end of file

http://git-wip-us.apache.org/repos/asf/storm/blob/0c1e0aa8/flux-examples/src/main/resources/multilang.yaml
----------------------------------------------------------------------
diff --git a/flux-examples/src/main/resources/multilang.yaml b/flux-examples/src/main/resources/multilang.yaml
new file mode 100644
index 0000000..4f80667
--- /dev/null
+++ b/flux-examples/src/main/resources/multilang.yaml
@@ -0,0 +1,89 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+# Test ability to wire together shell spouts/bolts
+---
+
+# topology definition
+# name to be used when submitting
+name: "shell-topology"
+
+# topology configuration
+# this will be passed to the submitter as a map of config options
+#
+config:
+  topology.workers: 1
+  # ...
+
+# spout definitions
+spouts:
+  - id: "sentence-spout"
+    className: "org.apache.storm.flux.wrappers.spouts.FluxShellSpout"
+    # shell spout constructor takes 2 arguments: String[], String[]
+    constructorArgs:
+      # command line
+      - ["node", "randomsentence.js"]
+      # output fields
+      - ["word"]
+    parallelism: 1
+    # ...
+
+# bolt definitions
+bolts:
+  - id: "splitsentence"
+    className: "org.apache.storm.flux.wrappers.bolts.FluxShellBolt"
+    constructorArgs:
+      # command line
+      - ["python", "splitsentence.py"]
+      # output fields
+      - ["word"]
+    parallelism: 1
+    # ...
+
+  - id: "log"
+    className: "org.apache.storm.flux.wrappers.bolts.LogInfoBolt"
+    parallelism: 1
+    # ...
+
+  - id: "count"
+    className: "backtype.storm.testing.TestWordCounter"
+    parallelism: 1
+    # ...
+
+#stream definitions
+# stream definitions define connections between spouts and bolts.
+# note that such connections can be cyclical
+# custom stream groupings are also supported
+
+streams:
+  - name: "spout --> split" # name isn't used (placeholder for logging, UI, etc.)
+    from: "sentence-spout"
+    to: "splitsentence"
+    grouping:
+      type: SHUFFLE
+
+  - name: "split --> count"
+    from: "splitsentence"
+    to: "count"
+    grouping:
+      type: FIELDS
+      args: ["word"]
+
+  - name: "count --> log"
+    from: "count"
+    to: "log"
+    grouping:
+      type: SHUFFLE
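
The spout, bolt, and stream definitions above map one-to-one onto Storm's plain TopologyBuilder API. A rough Java equivalent of this YAML, shown only to make that mapping concrete (wiring only, no submission code; the constructor arguments are the same command-line and output-field arrays as in the YAML):

    import backtype.storm.testing.TestWordCounter;
    import backtype.storm.topology.TopologyBuilder;
    import backtype.storm.tuple.Fields;
    import org.apache.storm.flux.wrappers.bolts.FluxShellBolt;
    import org.apache.storm.flux.wrappers.bolts.LogInfoBolt;
    import org.apache.storm.flux.wrappers.spouts.FluxShellSpout;

    public class MultilangWiring {
      public static TopologyBuilder build() {
        TopologyBuilder builder = new TopologyBuilder();
        builder.setSpout("sentence-spout",
            new FluxShellSpout(new String[]{"node", "randomsentence.js"},
                               new String[]{"word"}), 1);
        builder.setBolt("splitsentence",
            new FluxShellBolt(new String[]{"python", "splitsentence.py"},
                              new String[]{"word"}), 1)
            .shuffleGrouping("sentence-spout");                     // "spout --> split"
        builder.setBolt("count", new TestWordCounter(), 1)
            .fieldsGrouping("splitsentence", new Fields("word"));   // "split --> count"
        builder.setBolt("log", new LogInfoBolt(), 1)
            .shuffleGrouping("count");                              // "count --> log"
        return builder;
      }
    }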

http://git-wip-us.apache.org/repos/asf/storm/blob/0c1e0aa8/flux-examples/src/main/resources/shell.yaml
----------------------------------------------------------------------
diff --git a/flux-examples/src/main/resources/shell.yaml b/flux-examples/src/main/resources/shell.yaml
deleted file mode 100644
index 4f80667..0000000
--- a/flux-examples/src/main/resources/shell.yaml
+++ /dev/null
@@ -1,89 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-# Test ability to wire together shell spouts/bolts
----
-
-# topology definition
-# name to be used when submitting
-name: "shell-topology"
-
-# topology configuration
-# this will be passed to the submitter as a map of config options
-#
-config:
-  topology.workers: 1
-  # ...
-
-# spout definitions
-spouts:
-  - id: "sentence-spout"
-    className: "org.apache.storm.flux.wrappers.spouts.FluxShellSpout"
-    # shell spout constructor takes 2 arguments: String[], String[]
-    constructorArgs:
-      # command line
-      - ["node", "randomsentence.js"]
-      # output fields
-      - ["word"]
-    parallelism: 1
-    # ...
-
-# bolt definitions
-bolts:
-  - id: "splitsentence"
-    className: "org.apache.storm.flux.wrappers.bolts.FluxShellBolt"
-    constructorArgs:
-      # command line
-      - ["python", "splitsentence.py"]
-      # output fields
-      - ["word"]
-    parallelism: 1
-    # ...
-
-  - id: "log"
-    className: "org.apache.storm.flux.wrappers.bolts.LogInfoBolt"
-    parallelism: 1
-    # ...
-
-  - id: "count"
-    className: "backtype.storm.testing.TestWordCounter"
-    parallelism: 1
-    # ...
-
-#stream definitions
-# stream definitions define connections between spouts and bolts.
-# note that such connections can be cyclical
-# custom stream groupings are also supported
-
-streams:
-  - name: "spout --> split" # name isn't used (placeholder for logging, UI, etc.)
-    from: "sentence-spout"
-    to: "splitsentence"
-    grouping:
-      type: SHUFFLE
-
-  - name: "split --> count"
-    from: "splitsentence"
-    to: "count"
-    grouping:
-      type: FIELDS
-      args: ["word"]
-
-  - name: "count --> log"
-    from: "count"
-    to: "log"
-    grouping:
-      type: SHUFFLE

http://git-wip-us.apache.org/repos/asf/storm/blob/0c1e0aa8/flux-examples/src/main/resources/simple_hdfs.yaml
----------------------------------------------------------------------
diff --git a/flux-examples/src/main/resources/simple_hdfs.yaml b/flux-examples/src/main/resources/simple_hdfs.yaml
new file mode 100644
index 0000000..ea7721d
--- /dev/null
+++ b/flux-examples/src/main/resources/simple_hdfs.yaml
@@ -0,0 +1,105 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+# Test ability to wire together an HDFS bolt (HdfsBolt) topology
+---
+
+# topology definition
+# name to be used when submitting
+name: "hdfs-topology"
+
+# Components
+# Components are analogous to Spring beans. They are meant to be used as constructor,
+# property(setter), and builder arguments.
+#
+# for the time being, components must be declared in the order they are referenced
+components:
+  - id: "syncPolicy"
+    className: "org.apache.storm.hdfs.bolt.sync.CountSyncPolicy"
+    constructorArgs:
+      - 1000
+  - id: "rotationPolicy"
+    className: "org.apache.storm.hdfs.bolt.rotation.TimedRotationPolicy"
+    constructorArgs:
+      - 30
+      - SECONDS
+
+  - id: "fileNameFormat"
+    className: "org.apache.storm.hdfs.bolt.format.DefaultFileNameFormat"
+    configMethods:
+      - name: "withPath"
+        args: [${hdfs.write.dir}]
+      - name: "withExtension"
+        args: [".txt"]
+
+  - id: "recordFormat"
+    className: "org.apache.storm.hdfs.bolt.format.DelimitedRecordFormat"
+    configMethods:
+      - name: "withFieldDelimiter"
+        args: ["|"]
+
+  - id: "rotationAction"
+    className: "org.apache.storm.hdfs.common.rotation.MoveFileAction"
+    configMethods:
+      - name: "toDestination"
+        args: [${hdfs.dest.dir}]
+
+# spout definitions
+spouts:
+  - id: "spout-1"
+    className: "backtype.storm.testing.TestWordSpout"
+    parallelism: 1
+    # ...
+
+# bolt definitions
+
+bolts:
+  - id: "bolt-1"
+    className: "org.apache.storm.hdfs.bolt.HdfsBolt"
+    configMethods:
+      - name: "withConfigKey"
+        args: ["hdfs.config"]
+      - name: "withFsUrl"
+        args: [${hdfs.url}]
+      - name: "withFileNameFormat"
+        args: [ref: "fileNameFormat"]
+      - name: "withRecordFormat"
+        args: [ref: "recordFormat"]
+      - name: "withRotationPolicy"
+        args: [ref: "rotationPolicy"]
+      - name: "withSyncPolicy"
+        args: [ref: "syncPolicy"]
+      - name: "addRotationAction"
+        args: [ref: "rotationAction"]
+    parallelism: 1
+    # ...
+
+  - id: "bolt-2"
+    className: "org.apache.storm.flux.wrappers.bolts.LogInfoBolt"
+    parallelism: 1
+
+streams:
+  - name: "" # name isn't used (placeholder for logging, UI, etc.)
+    from: "spout-1"
+    to: "bolt-1"
+    grouping:
+      type: SHUFFLE
+
+  - name: "" # name isn't used (placeholder for logging, UI, etc.)
+    from: "spout-1"
+    to: "bolt-2"
+    grouping:
+      type: SHUFFLE
\ No newline at end of file
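
The ${hdfs.write.dir}, ${hdfs.dest.dir}, and ${hdfs.url} placeholders in the file above are left unresolved in the YAML and are meant to be filled in through Flux's property substitution (the --filter <file> switch described in the README). As a rough sketch, after substitution the HdfsBolt configMethods entries would read like the fragment below; the URL value is an assumption for illustration, not a value from this commit:

```
    configMethods:
      - name: "withFsUrl"
        args: ["hdfs://localhost:8020"]   # assumed value substituted for ${hdfs.url}
      - name: "withFileNameFormat"
        args: [ref: "fileNameFormat"]     # component references are unaffected by substitution
```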


[45/50] [abbrv] storm git commit: add STORM-561 to changelog

Posted by pt...@apache.org.
add STORM-561 to changelog


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/285d943b
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/285d943b
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/285d943b

Branch: refs/heads/0.10.x-branch
Commit: 285d943b87f5037f0eaf3d45af64622f13017dea
Parents: cb370a9
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Wed Jun 3 13:24:03 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Wed Jun 3 13:24:03 2015 -0400

----------------------------------------------------------------------
 CHANGELOG.md | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/285d943b/CHANGELOG.md
----------------------------------------------------------------------
diff --git a/CHANGELOG.md b/CHANGELOG.md
index 02aa95d..aa390f1 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -1,11 +1,11 @@
 ## 0.11.0
 
 ## 0.10.0
- * STORM-818: storm-eventhubs configuration improvement and refactoring
  * STORM-842: Drop Support for Java 1.6
  * STORM-835: Netty Client hold batch object until io operation complete
  * STORM-827: Allow AutoTGT to work with storm-hdfs too.
  * STORM-821: Adding connection provider interface to decouple jdbc connector from a single connection pooling implementation.
+ * STORM-818: storm-eventhubs configuration improvement and refactoring
  * STORM-816: maven-gpg-plugin does not work with gpg 2.1
  * STORM-811: remove old metastor_db before running tests again.
  * STORM-808: allow null to be parsed as null
@@ -124,6 +124,7 @@
  * STORM-567: Move Storm Documentation/Website from SVN to git
  * STORM-565: Fix NPE when topology.groups is null.
  * STORM-563: Kafka Spout doesn't pick up from the beginning of the queue unless forceFromStart specified.
+ * STORM-561: Add flux as an external module
  * STORM-557: High Quality Images for presentations
  * STORM-554: the type of first param "topology" should be ^StormTopology not ^TopologyContext
  * STORM-552: Add netty socket backlog config


[27/50] [abbrv] storm git commit: integrate flux with Storm build

Posted by pt...@apache.org.
integrate flux with Storm build


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/2094a08e
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/2094a08e
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/2094a08e

Branch: refs/heads/0.10.x-branch
Commit: 2094a08ea6ccdf81126c103930d9ec3e77fcd5c1
Parents: b21a98d
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Wed May 6 14:13:13 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Wed May 6 14:13:13 2015 -0400

----------------------------------------------------------------------
 external/flux/README.md                         |  15 +-
 external/flux/flux-core/pom.xml                 |  13 +-
 external/flux/flux-examples/pom.xml             |  13 +-
 external/flux/flux-wrappers/pom.xml             |  24 +-
 .../main/resources/resources/randomsentence.js  |  93 -----
 .../main/resources/resources/splitsentence.py   |  24 --
 .../src/main/resources/resources/storm.js       | 373 -------------------
 .../src/main/resources/resources/storm.py       | 260 -------------
 external/flux/pom.xml                           |  19 +-
 pom.xml                                         |   1 +
 10 files changed, 41 insertions(+), 794 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/2094a08e/external/flux/README.md
----------------------------------------------------------------------
diff --git a/external/flux/README.md b/external/flux/README.md
index 6f27219..d09a73c 100644
--- a/external/flux/README.md
+++ b/external/flux/README.md
@@ -829,17 +829,6 @@ topologySource:
   methodName: "getTopologyWithDifferentMethodName"
 ```
 
-## Author
-P. Taylor Goetz
+## Committer Sponsors
 
-## Contributors
-
-
-## Contributing
-
-Contributions in any form are more than welcome.
-
-The intent of this project is that it will be donated to Apache Storm.
-
-By offering any contributions to this project, you should be willing and able to submit an
-[Apache ICLA](http://www.apache.org/licenses/icla.txt), if you have not done so already.
\ No newline at end of file
+ * P. Taylor Goetz ([ptgoetz@apache.org](mailto:ptgoetz@apache.org))
\ No newline at end of file

http://git-wip-us.apache.org/repos/asf/storm/blob/2094a08e/external/flux/flux-core/pom.xml
----------------------------------------------------------------------
diff --git a/external/flux/flux-core/pom.xml b/external/flux/flux-core/pom.xml
index 600613d..c3842bd 100644
--- a/external/flux/flux-core/pom.xml
+++ b/external/flux/flux-core/pom.xml
@@ -19,13 +19,12 @@
     <modelVersion>4.0.0</modelVersion>
 
     <parent>
-        <groupId>com.github.ptgoetz</groupId>
+        <groupId>org.apache.storm</groupId>
         <artifactId>flux</artifactId>
-        <version>0.3.1-SNAPSHOT</version>
+        <version>0.11.0-SNAPSHOT</version>
         <relativePath>../pom.xml</relativePath>
     </parent>
 
-    <groupId>com.github.ptgoetz</groupId>
     <artifactId>flux-core</artifactId>
     <packaging>jar</packaging>
 
@@ -34,26 +33,26 @@
 
     <dependencies>
         <dependency>
-            <groupId>com.github.ptgoetz</groupId>
+            <groupId>org.apache.storm</groupId>
             <artifactId>flux-wrappers</artifactId>
             <version>${project.version}</version>
         </dependency>
         <dependency>
             <groupId>org.apache.storm</groupId>
             <artifactId>storm-kafka</artifactId>
-            <version>${storm.version}</version>
+            <version>${project.version}</version>
             <scope>test</scope>
         </dependency>
         <dependency>
             <groupId>org.apache.storm</groupId>
             <artifactId>storm-hdfs</artifactId>
-            <version>${storm.version}</version>
+            <version>${project.version}</version>
             <scope>test</scope>
         </dependency>
         <dependency>
             <groupId>org.apache.storm</groupId>
             <artifactId>storm-hbase</artifactId>
-            <version>${storm.version}</version>
+            <version>${project.version}</version>
             <scope>test</scope>
         </dependency>
     </dependencies>

http://git-wip-us.apache.org/repos/asf/storm/blob/2094a08e/external/flux/flux-examples/pom.xml
----------------------------------------------------------------------
diff --git a/external/flux/flux-examples/pom.xml b/external/flux/flux-examples/pom.xml
index 0b9796e..709b20b 100644
--- a/external/flux/flux-examples/pom.xml
+++ b/external/flux/flux-examples/pom.xml
@@ -19,13 +19,12 @@
     <modelVersion>4.0.0</modelVersion>
 
     <parent>
-        <groupId>com.github.ptgoetz</groupId>
+        <groupId>org.apache.storm</groupId>
         <artifactId>flux</artifactId>
-        <version>0.3.1-SNAPSHOT</version>
+        <version>0.11.0-SNAPSHOT</version>
         <relativePath>../pom.xml</relativePath>
     </parent>
 
-    <groupId>com.github.ptgoetz</groupId>
     <artifactId>flux-examples</artifactId>
     <packaging>jar</packaging>
 
@@ -34,12 +33,12 @@
 
     <dependencies>
         <dependency>
-            <groupId>com.github.ptgoetz</groupId>
+            <groupId>org.apache.storm</groupId>
             <artifactId>flux-core</artifactId>
             <version>${project.version}</version>
         </dependency>
         <dependency>
-            <groupId>com.github.ptgoetz</groupId>
+            <groupId>org.apache.storm</groupId>
             <artifactId>flux-wrappers</artifactId>
             <version>${project.version}</version>
         </dependency>
@@ -47,12 +46,12 @@
         <dependency>
             <groupId>org.apache.storm</groupId>
             <artifactId>storm-hdfs</artifactId>
-            <version>${storm.version}</version>
+            <version>${project.version}</version>
         </dependency>
         <dependency>
             <groupId>org.apache.storm</groupId>
             <artifactId>storm-hbase</artifactId>
-            <version>${storm.version}</version>
+            <version>${project.version}</version>
         </dependency>
     </dependencies>
 

http://git-wip-us.apache.org/repos/asf/storm/blob/2094a08e/external/flux/flux-wrappers/pom.xml
----------------------------------------------------------------------
diff --git a/external/flux/flux-wrappers/pom.xml b/external/flux/flux-wrappers/pom.xml
index 6784141..be042ff 100644
--- a/external/flux/flux-wrappers/pom.xml
+++ b/external/flux/flux-wrappers/pom.xml
@@ -19,17 +19,33 @@
     <modelVersion>4.0.0</modelVersion>
 
     <parent>
-        <groupId>com.github.ptgoetz</groupId>
+        <groupId>org.apache.storm</groupId>
         <artifactId>flux</artifactId>
-        <version>0.3.1-SNAPSHOT</version>
+        <version>0.11.0-SNAPSHOT</version>
         <relativePath>../pom.xml</relativePath>
     </parent>
 
-    <groupId>com.github.ptgoetz</groupId>
     <artifactId>flux-wrappers</artifactId>
     <packaging>jar</packaging>
 
     <name>flux-wrappers</name>
-    <url>https://github.com/ptgoetz/flux</url>
+
+    <dependencies>
+        <dependency>
+            <groupId>org.apache.storm</groupId>
+            <artifactId>multilang-javascript</artifactId>
+            <version>${project.version}</version>
+        </dependency>
+        <dependency>
+            <groupId>org.apache.storm</groupId>
+            <artifactId>multilang-ruby</artifactId>
+            <version>${project.version}</version>
+        </dependency>
+        <dependency>
+            <groupId>org.apache.storm</groupId>
+            <artifactId>multilang-python</artifactId>
+            <version>${project.version}</version>
+        </dependency>
+    </dependencies>
 
 </project>

http://git-wip-us.apache.org/repos/asf/storm/blob/2094a08e/external/flux/flux-wrappers/src/main/resources/resources/randomsentence.js
----------------------------------------------------------------------
diff --git a/external/flux/flux-wrappers/src/main/resources/resources/randomsentence.js b/external/flux/flux-wrappers/src/main/resources/resources/randomsentence.js
deleted file mode 100644
index 36fc5f5..0000000
--- a/external/flux/flux-wrappers/src/main/resources/resources/randomsentence.js
+++ /dev/null
@@ -1,93 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-/**
- * Example for storm spout. Emits random sentences.
- * The original class in java - storm.starter.spout.RandomSentenceSpout.
- *
- */
-
-var storm = require('./storm');
-var Spout = storm.Spout;
-
-
-var SENTENCES = [
-    "the cow jumped over the moon",
-    "an apple a day keeps the doctor away",
-    "four score and seven years ago",
-    "snow white and the seven dwarfs",
-    "i am at two with nature"]
-
-function RandomSentenceSpout(sentences) {
-    Spout.call(this);
-    this.runningTupleId = 0;
-    this.sentences = sentences;
-    this.pending = {};
-};
-
-RandomSentenceSpout.prototype = Object.create(Spout.prototype);
-RandomSentenceSpout.prototype.constructor = RandomSentenceSpout;
-
-RandomSentenceSpout.prototype.getRandomSentence = function() {
-    return this.sentences[getRandomInt(0, this.sentences.length - 1)];
-}
-
-RandomSentenceSpout.prototype.nextTuple = function(done) {
-    var self = this;
-    var sentence = this.getRandomSentence();
-    var tup = [sentence];
-    var id = this.createNextTupleId();
-    this.pending[id] = tup;
-    //This timeout can be removed if TOPOLOGY_SLEEP_SPOUT_WAIT_STRATEGY_TIME_MS is configured to 100
-    setTimeout(function() {
-        self.emit({tuple: tup, id: id}, function(taskIds) {
-            self.log(tup + ' sent to task ids - ' + taskIds);
-        });
-        done();
-    },100);
-}
-
-RandomSentenceSpout.prototype.createNextTupleId = function() {
-    var id = this.runningTupleId;
-    this.runningTupleId++;
-    return id;
-}
-
-RandomSentenceSpout.prototype.ack = function(id, done) {
-    this.log('Received ack for - ' + id);
-    delete this.pending[id];
-    done();
-}
-
-RandomSentenceSpout.prototype.fail = function(id, done) {
-    var self = this;
-    this.log('Received fail for - ' + id + '. Retrying.');
-    this.emit({tuple: this.pending[id], id:id}, function(taskIds) {
-        self.log(self.pending[id] + ' sent to task ids - ' + taskIds);
-    });
-    done();
-}
-
-/**
- * Returns a random integer between min (inclusive) and max (inclusive)
- */
-function getRandomInt(min, max) {
-    return Math.floor(Math.random() * (max - min + 1)) + min;
-}
-
-new RandomSentenceSpout(SENTENCES).run();

http://git-wip-us.apache.org/repos/asf/storm/blob/2094a08e/external/flux/flux-wrappers/src/main/resources/resources/splitsentence.py
----------------------------------------------------------------------
diff --git a/external/flux/flux-wrappers/src/main/resources/resources/splitsentence.py b/external/flux/flux-wrappers/src/main/resources/resources/splitsentence.py
deleted file mode 100644
index 300105f..0000000
--- a/external/flux/flux-wrappers/src/main/resources/resources/splitsentence.py
+++ /dev/null
@@ -1,24 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-import storm
-
-class SplitSentenceBolt(storm.BasicBolt):
-    def process(self, tup):
-        words = tup.values[0].split(" ")
-        for word in words:
-          storm.emit([word])
-
-SplitSentenceBolt().run()
\ No newline at end of file

http://git-wip-us.apache.org/repos/asf/storm/blob/2094a08e/external/flux/flux-wrappers/src/main/resources/resources/storm.js
----------------------------------------------------------------------
diff --git a/external/flux/flux-wrappers/src/main/resources/resources/storm.js b/external/flux/flux-wrappers/src/main/resources/resources/storm.js
deleted file mode 100755
index 355c2d2..0000000
--- a/external/flux/flux-wrappers/src/main/resources/resources/storm.js
+++ /dev/null
@@ -1,373 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-/**
- * Base classes in node-js for storm Bolt and Spout.
- * Implements the storm multilang protocol for nodejs.
- */
-
-
-var fs = require('fs');
-
-function Storm() {
-    this.messagePart = "";
-    this.taskIdsCallbacks = [];
-    this.isFirstMessage = true;
-    this.separator = '\nend\n';
-}
-
-Storm.prototype.sendMsgToParent = function(msg) {
-    var str = JSON.stringify(msg);
-    process.stdout.write(str + this.separator);
-}
-
-Storm.prototype.sync = function() {
-    this.sendMsgToParent({"command":"sync"});
-}
-
-Storm.prototype.sendPid = function(heartbeatdir) {
-    var pid = process.pid;
-    fs.closeSync(fs.openSync(heartbeatdir + "/" + pid, "w"));
-    this.sendMsgToParent({"pid": pid})
-}
-
-Storm.prototype.log = function(msg) {
-    this.sendMsgToParent({"command": "log", "msg": msg});
-}
-
-Storm.prototype.initSetupInfo = function(setupInfo) {
-    var self = this;
-    var callback = function() {
-        self.sendPid(setupInfo['pidDir']);
-    }
-    this.initialize(setupInfo['conf'], setupInfo['context'], callback);
-}
-
-Storm.prototype.startReadingInput = function() {
-    var self = this;
-    process.stdin.on('readable', function() {
-        var chunk = process.stdin.read();
-        var messages = self.handleNewChunk(chunk);
-        messages.forEach(function(message) {
-            self.handleNewMessage(message);
-        })
-
-    });
-}
-
-/**
- * receives a new string chunk and returns a list of new messages with the separator removed
- * stores state in this.messagePart
- * @param chunk
- */
-Storm.prototype.handleNewChunk = function(chunk) {
-    //invariant: this.messagePart has no separator otherwise we would have parsed it already
-    var messages = [];
-    if (chunk && chunk.length !== 0) {
-        //"{}".split("\nend\n")           ==> ['{}']
-        //"\nend\n".split("\nend\n")      ==> [''  , '']
-        //"{}\nend\n".split("\nend\n")    ==> ['{}', '']
-        //"\nend\n{}".split("\nend\n")    ==> [''  , '{}']
-        // "{}\nend\n{}".split("\nend\n") ==> ['{}', '{}' ]
-        this.messagePart = this.messagePart + chunk;
-        var newMessageParts = this.messagePart.split(this.separator);
-        while (newMessageParts.length > 0) {
-            var potentialMessage = newMessageParts.shift();
-            var anotherMessageAhead = newMessageParts.length > 0;
-            if  (!anotherMessageAhead) {
-                this.messagePart = potentialMessage;
-            }
-            else if (potentialMessage.length > 0) {
-                messages.push(potentialMessage);
-            }
-        }
-    }
-    return messages;
-}
-
-Storm.prototype.isTaskIds = function(msg) {
-    return (msg instanceof Array);
-}
-
-Storm.prototype.handleNewMessage = function(msg) {
-    var parsedMsg = JSON.parse(msg);
-
-    if (this.isFirstMessage) {
-        this.initSetupInfo(parsedMsg);
-        this.isFirstMessage = false;
-    } else if (this.isTaskIds(parsedMsg)) {
-        this.handleNewTaskId(parsedMsg);
-    } else {
-        this.handleNewCommand(parsedMsg);
-    }
-}
-
-Storm.prototype.handleNewTaskId = function(taskIds) {
-    //When new list of task ids arrives, the callback that was passed with the corresponding emit should be called.
-    //Storm assures that the task ids will be sent in the same order as their corresponding emits so we can simply
-    //take the first callback in the list and be sure it is the right one.
-
-    var callback = this.taskIdsCallbacks.shift();
-    if (callback) {
-        callback(taskIds);
-    } else {
-        throw new Error('Something went wrong, we off the split of task id callbacks');
-    }
-}
-
-
-
-/**
- *
- * @param messageDetails json with the emit details.
- *
- * For bolt, the json must contain the required fields:
- * - tuple - the value to emit
- * - anchorTupleId - the value of the anchor tuple (the input tuple that led to this emit). Used to track the source
- * tuple and return an ack when all components have successfully finished processing it.
- * and may contain the optional fields:
- * - stream (if empty - emit to default stream)
- *
- * For spout, the json must contain the required fields:
- * - tuple - the value to emit
- *
- * and may contain the optional fields:
- * - id - pass id for reliable emit (and receive ack/fail later).
- * - stream - if empty - emit to default stream.
- *
- * @param onTaskIds function that will be called with the list of task ids the message was emitted to (when received).
- */
-Storm.prototype.emit = function(messageDetails, onTaskIds) {
-    //Every emit triggers a response - list of task ids to which the tuple was emitted. The task ids are accessible
-    //through the callback (will be called when the response arrives). The callback is stored in a list until the
-    //corresponding task id list arrives.
-    if (messageDetails.task) {
-        throw new Error('Illegal input - task. To emit to specific task use emit direct!');
-    }
-
-    if (!onTaskIds) {
-        throw new Error('You must pass a onTaskIds callback when using emit!')
-    }
-
-    this.taskIdsCallbacks.push(onTaskIds);
-    this.__emit(messageDetails);;
-}
-
-
-/**
- * Emit message to specific task.
- * @param messageDetails json with the emit details.
- *
- * For bolt, the json must contain the required fields:
- * - tuple - the value to emit
- * - anchorTupleId - the value of the anchor tuple (the input tuple that led to this emit). Used to track the source
- * tuple and return an ack when all components have successfully finished processing it.
- * - task - indicate the task to send the tuple to.
- * and may contain the optional fields:
- * - stream (if empty - emit to default stream)
- *
- * For spout, the json must contain the required fields:
- * - tuple - the value to emit
- * - task - indicate the task to send the tuple to.
- * and may contain the optional fields:
- * - id - pass id for reliable emit (and receive ack/fail later).
- * - stream - if empty - emit to default stream.
- *
- * @param onTaskIds function that will be called with the list of task ids the message was emitted to (when received).
- */
-Storm.prototype.emitDirect = function(commandDetails) {
-    if (!commandDetails.task) {
-        throw new Error("Emit direct must receive task id!")
-    }
-    this.__emit(commandDetails);
-}
-
-/**
- * Initialize storm component according to the configuration received.
- * @param conf configuration object according to the storm protocol.
- * @param context context object according to storm protocol.
- * @param done callback. Call this method when finished initializing.
- */
-Storm.prototype.initialize = function(conf, context, done) {
-    done();
-}
-
-Storm.prototype.run = function() {
-    process.stdout.setEncoding('utf8');
-    process.stdin.setEncoding('utf8');
-    this.startReadingInput();
-}
-
-function Tuple(id, component, stream, task, values) {
-    this.id = id;
-    this.component = component;
-    this.stream = stream;
-    this.task = task;
-    this.values = values;
-}
-
-/**
- * Base class for storm bolt.
- * To create a bolt implement 'process' method.
- * You may also implement the initialize method.
- */
-function BasicBolt() {
-    Storm.call(this);
-    this.anchorTuple = null;
-};
-
-BasicBolt.prototype = Object.create(Storm.prototype);
-BasicBolt.prototype.constructor = BasicBolt;
-
-/**
- * Emit message.
- * @param commandDetails json with the required fields:
- * - tuple - the value to emit
- * - anchorTupleId - the value of the anchor tuple (the input tuple that led to this emit). Used to track the source
- * tuple and return an ack when all components have successfully finished processing it.
- * and the optional fields:
- * - stream (if empty - emit to default stream)
- * - task (pass only to emit to specific task)
- */
-BasicBolt.prototype.__emit = function(commandDetails) {
-    var self = this;
-
-    var message = {
-        command: "emit",
-        tuple: commandDetails.tuple,
-        stream: commandDetails.stream,
-        task: commandDetails.task,
-        anchors: [commandDetails.anchorTupleId]
-    };
-
-    this.sendMsgToParent(message);
-}
-
-BasicBolt.prototype.handleNewCommand = function(command) {
-    var self = this;
-    var tup = new Tuple(command["id"], command["comp"], command["stream"], command["task"], command["tuple"]);
-
-    if (tup.task === -1 && tup.stream === "__heartbeat") {
-        self.sync();
-        return;
-    }
-
-    var callback = function(err) {
-        if (err) {
-            self.fail(tup, err);
-            return;
-        }
-        self.ack(tup);
-    }
-    this.process(tup, callback);
-}
-
-/**
- * Implement this method when creating a bolt. This is the main method that provides the logic of the bolt (what
- * should it do?).
- * @param tuple the input of the bolt - what to process.
- * @param done call this method when done processing.
- */
-BasicBolt.prototype.process = function(tuple, done) {};
-
-BasicBolt.prototype.ack = function(tup) {
-    this.sendMsgToParent({"command": "ack", "id": tup.id});
-}
-
-BasicBolt.prototype.fail = function(tup, err) {
-    this.sendMsgToParent({"command": "fail", "id": tup.id});
-}
-
-
-/**
- * Base class for storm spout.
- * To create a spout implement the following methods: nextTuple, ack and fail (nextTuple - mandatory, ack and fail
- * can stay empty).
- * You may also implement the initialize method.
- *
- */
-function Spout() {
-    Storm.call(this);
-};
-
-Spout.prototype = Object.create(Storm.prototype);
-
-Spout.prototype.constructor = Spout;
-
-/**
- * This method will be called when an ack is received for a previously sent tuple. One may implement it.
- * @param id The id of the tuple.
- * @param done Call this method when finished and ready to receive more tuples.
- */
-Spout.prototype.ack = function(id, done) {};
-
-/**
- * This method will be called when a fail is received for a previously sent tuple. One may implement it (for example -
- * log the failure or send the tuple again).
- * @param id The id of the tuple.
- * @param done Call this method when finished and ready to receive more tuples.
- */
-Spout.prototype.fail = function(id, done) {};
-
-/**
- * Method that indicates it's time to emit the next tuple.
- * @param done call this method when done sending the output.
- */
-Spout.prototype.nextTuple = function(done) {};
-
-Spout.prototype.handleNewCommand = function(command) {
-    var self = this;
-    var callback = function() {
-        self.sync();
-    }
-
-    if (command["command"] === "next") {
-        this.nextTuple(callback);
-    }
-
-    if (command["command"] === "ack") {
-        this.ack(command["id"], callback);
-    }
-
-    if (command["command"] === "fail") {
-        this.fail(command["id"], callback);
-    }
-}
-
-/**
- * @param commandDetails json with the required fields:
- * - tuple - the value to emit.
- * and the optional fields:
- * - id - pass id for reliable emit (and receive ack/fail later).
- * - stream - if empty - emit to default stream.
- * - task - pass only to emit to specific task.
- */
-Spout.prototype.__emit = function(commandDetails) {
-    var message = {
-        command: "emit",
-        tuple: commandDetails.tuple,
-        id: commandDetails.id,
-        stream: commandDetails.stream,
-        task: commandDetails.task
-    };
-
-    this.sendMsgToParent(message);
-}
-
-module.exports.BasicBolt = BasicBolt;
-module.exports.Spout = Spout;

http://git-wip-us.apache.org/repos/asf/storm/blob/2094a08e/external/flux/flux-wrappers/src/main/resources/resources/storm.py
----------------------------------------------------------------------
diff --git a/external/flux/flux-wrappers/src/main/resources/resources/storm.py b/external/flux/flux-wrappers/src/main/resources/resources/storm.py
deleted file mode 100644
index 642c393..0000000
--- a/external/flux/flux-wrappers/src/main/resources/resources/storm.py
+++ /dev/null
@@ -1,260 +0,0 @@
-# -*- coding: utf-8 -*-
-
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-import sys
-import os
-import traceback
-from collections import deque
-
-try:
-    import simplejson as json
-except ImportError:
-    import json
-
-json_encode = lambda x: json.dumps(x)
-json_decode = lambda x: json.loads(x)
-
-#reads lines and reconstructs newlines appropriately
-def readMsg():
-    msg = ""
-    while True:
-        line = sys.stdin.readline()
-        if not line:
-            raise Exception('Read EOF from stdin')
-        if line[0:-1] == "end":
-            break
-        msg = msg + line
-    return json_decode(msg[0:-1])
-
-MODE = None
-ANCHOR_TUPLE = None
-
-#queue up commands we read while trying to read taskids
-pending_commands = deque()
-
-def readTaskIds():
-    if pending_taskids:
-        return pending_taskids.popleft()
-    else:
-        msg = readMsg()
-        while type(msg) is not list:
-            pending_commands.append(msg)
-            msg = readMsg()
-        return msg
-
-#queue up taskids we read while trying to read commands/tuples
-pending_taskids = deque()
-
-def readCommand():
-    if pending_commands:
-        return pending_commands.popleft()
-    else:
-        msg = readMsg()
-        while type(msg) is list:
-            pending_taskids.append(msg)
-            msg = readMsg()
-        return msg
-
-def readTuple():
-    cmd = readCommand()
-    return Tuple(cmd["id"], cmd["comp"], cmd["stream"], cmd["task"], cmd["tuple"])
-
-def sendMsgToParent(msg):
-    print json_encode(msg)
-    print "end"
-    sys.stdout.flush()
-
-def sync():
-    sendMsgToParent({'command':'sync'})
-
-def sendpid(heartbeatdir):
-    pid = os.getpid()
-    sendMsgToParent({'pid':pid})
-    open(heartbeatdir + "/" + str(pid), "w").close()
-
-def emit(*args, **kwargs):
-    __emit(*args, **kwargs)
-    return readTaskIds()
-
-def emitDirect(task, *args, **kwargs):
-    kwargs["directTask"] = task
-    __emit(*args, **kwargs)
-
-def __emit(*args, **kwargs):
-    global MODE
-    if MODE == Bolt:
-        emitBolt(*args, **kwargs)
-    elif MODE == Spout:
-        emitSpout(*args, **kwargs)
-
-def emitBolt(tup, stream=None, anchors = [], directTask=None):
-    global ANCHOR_TUPLE
-    if ANCHOR_TUPLE is not None:
-        anchors = [ANCHOR_TUPLE]
-    m = {"command": "emit"}
-    if stream is not None:
-        m["stream"] = stream
-    m["anchors"] = map(lambda a: a.id, anchors)
-    if directTask is not None:
-        m["task"] = directTask
-    m["tuple"] = tup
-    sendMsgToParent(m)
-
-def emitSpout(tup, stream=None, id=None, directTask=None):
-    m = {"command": "emit"}
-    if id is not None:
-        m["id"] = id
-    if stream is not None:
-        m["stream"] = stream
-    if directTask is not None:
-        m["task"] = directTask
-    m["tuple"] = tup
-    sendMsgToParent(m)
-
-def ack(tup):
-    sendMsgToParent({"command": "ack", "id": tup.id})
-
-def fail(tup):
-    sendMsgToParent({"command": "fail", "id": tup.id})
-
-def reportError(msg):
-    sendMsgToParent({"command": "error", "msg": msg})
-
-def log(msg, level=2):
-    sendMsgToParent({"command": "log", "msg": msg, "level":level})
-
-def logTrace(msg):
-    log(msg, 0)
-
-def logDebug(msg):
-    log(msg, 1)
-
-def logInfo(msg):
-    log(msg, 2)
-
-def logWarn(msg):
-    log(msg, 3)
-
-def logError(msg):
-    log(msg, 4)
-
-def rpcMetrics(name, params):
-    sendMsgToParent({"command": "metrics", "name": name, "params": params})
-
-def initComponent():
-    setupInfo = readMsg()
-    sendpid(setupInfo['pidDir'])
-    return [setupInfo['conf'], setupInfo['context']]
-
-class Tuple(object):
-    def __init__(self, id, component, stream, task, values):
-        self.id = id
-        self.component = component
-        self.stream = stream
-        self.task = task
-        self.values = values
-
-    def __repr__(self):
-        return '<%s%s>' % (
-            self.__class__.__name__,
-            ''.join(' %s=%r' % (k, self.__dict__[k]) for k in sorted(self.__dict__.keys())))
-
-    def is_heartbeat_tuple(self):
-        return self.task == -1 and self.stream == "__heartbeat"
-
-class Bolt(object):
-    def initialize(self, stormconf, context):
-        pass
-
-    def process(self, tuple):
-        pass
-
-    def run(self):
-        global MODE
-        MODE = Bolt
-        conf, context = initComponent()
-        try:
-            self.initialize(conf, context)
-            while True:
-                tup = readTuple()
-                if tup.is_heartbeat_tuple():
-                    sync()
-                else:
-                    self.process(tup)
-        except Exception, e:
-            reportError(traceback.format_exc(e))
-
-class BasicBolt(object):
-    def initialize(self, stormconf, context):
-        pass
-
-    def process(self, tuple):
-        pass
-
-    def run(self):
-        global MODE
-        MODE = Bolt
-        global ANCHOR_TUPLE
-        conf, context = initComponent()
-        try:
-            self.initialize(conf, context)
-            while True:
-                tup = readTuple()
-                if tup.is_heartbeat_tuple():
-                    sync()
-                else:
-                    ANCHOR_TUPLE = tup
-                    try:
-                        self.process(tup)
-                        ack(tup)
-                    except Exception, e:
-                        reportError(traceback.format_exc(e))
-                        fail(tup)
-        except Exception, e:
-            reportError(traceback.format_exc(e))
-
-class Spout(object):
-    def initialize(self, conf, context):
-        pass
-
-    def ack(self, id):
-        pass
-
-    def fail(self, id):
-        pass
-
-    def nextTuple(self):
-        pass
-
-    def run(self):
-        global MODE
-        MODE = Spout
-        conf, context = initComponent()
-        try:
-            self.initialize(conf, context)
-            while True:
-                msg = readCommand()
-                if msg["command"] == "next":
-                    self.nextTuple()
-                if msg["command"] == "ack":
-                    self.ack(msg["id"])
-                if msg["command"] == "fail":
-                    self.fail(msg["id"])
-                sync()
-        except Exception, e:
-            reportError(traceback.format_exc(e))

http://git-wip-us.apache.org/repos/asf/storm/blob/2094a08e/external/flux/pom.xml
----------------------------------------------------------------------
diff --git a/external/flux/pom.xml b/external/flux/pom.xml
index 5ea1b40..bf975cb 100644
--- a/external/flux/pom.xml
+++ b/external/flux/pom.xml
@@ -18,23 +18,17 @@
 <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
     <modelVersion>4.0.0</modelVersion>
 
-    <groupId>com.github.ptgoetz</groupId>
     <artifactId>flux</artifactId>
-    <version>0.3.1-SNAPSHOT</version>
     <packaging>pom</packaging>
     <name>flux</name>
-    <url>https://github.com/ptgoetz/flux</url>
+
 
     <parent>
-        <groupId>org.sonatype.oss</groupId>
-        <artifactId>oss-parent</artifactId>
-        <version>7</version>
+        <artifactId>storm</artifactId>
+        <groupId>org.apache.storm</groupId>
+        <version>0.11.0-SNAPSHOT</version>
+        <relativePath>../../pom.xml</relativePath>
     </parent>
-    <scm>
-        <connection>scm:git:git@github.com:ptgoetz/flux.git</connection>
-        <developerConnection>scm:git:git@github.com:ptgoetz/flux.git</developerConnection>
-        <url>:git@github.com:ptgoetz/flux.git</url>
-    </scm>
 
     <developers>
         <developer>
@@ -46,7 +40,6 @@
 
     <properties>
         <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
-        <storm.version>0.9.3</storm.version>
         <!-- see comment below... This fixes an annoyance with intellij -->
         <provided.scope>provided</provided.scope>
     </properties>
@@ -75,7 +68,7 @@
         <dependency>
             <groupId>org.apache.storm</groupId>
             <artifactId>storm-core</artifactId>
-            <version>${storm.version}</version>
+            <version>${project.version}</version>
             <scope>${provided.scope}</scope>
         </dependency>
         <dependency>

http://git-wip-us.apache.org/repos/asf/storm/blob/2094a08e/pom.xml
----------------------------------------------------------------------
diff --git a/pom.xml b/pom.xml
index 2e0c898..02417b8 100644
--- a/pom.xml
+++ b/pom.xml
@@ -169,6 +169,7 @@
         <module>external/storm-jdbc</module>
         <module>external/storm-redis</module>
         <module>external/storm-eventhubs</module>
+        <module>external/flux</module>
     </modules>
 
     <scm>


[12/50] [abbrv] storm git commit: use a better name

Posted by pt...@apache.org.
use a better name


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/ae305c72
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/ae305c72
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/ae305c72

Branch: refs/heads/0.10.x-branch
Commit: ae305c722c6252cd33530ebb74e56ef991451aba
Parents: edc5744
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Wed Apr 8 00:00:51 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Wed Apr 8 00:00:51 2015 -0400

----------------------------------------------------------------------
 flux-examples/src/main/resources/kafka_spout.yaml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/ae305c72/flux-examples/src/main/resources/kafka_spout.yaml
----------------------------------------------------------------------
diff --git a/flux-examples/src/main/resources/kafka_spout.yaml b/flux-examples/src/main/resources/kafka_spout.yaml
index 136f62d..8ffddc5 100644
--- a/flux-examples/src/main/resources/kafka_spout.yaml
+++ b/flux-examples/src/main/resources/kafka_spout.yaml
@@ -20,7 +20,7 @@
 
 # topology definition
 # name to be used when submitting
-name: "shell-topology"
+name: "kafka-topology"
 
 # Components
 # Components are analogous to Spring beans. They are meant to be used as constructor,
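
The hunk above repeats the note that Flux components are analogous to Spring beans and can serve as constructor, property (setter), and builder arguments. The examples in this patch only exercise constructor arguments and builder-style configMethods; a setter-style sketch could look like the fragment below, assuming Flux accepts a properties list of name/value pairs on a component (the component id, class name, and property name are hypothetical, for illustration only):

```
components:
  - id: "someBean"                       # hypothetical component id
    className: "com.example.SomeBean"    # hypothetical class
    properties:
      - name: "someProperty"             # would call setSomeProperty(...) if setter injection is supported
        value: true
```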


[32/50] [abbrv] storm git commit: fix typo in README

Posted by pt...@apache.org.
fix typo in README


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/91369ac2
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/91369ac2
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/91369ac2

Branch: refs/heads/0.10.x-branch
Commit: 91369ac2f15d4a2dade5d03ee2d0970d99e4d0bd
Parents: 8c9e6ce
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Fri May 8 15:43:00 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Fri May 8 15:43:00 2015 -0400

----------------------------------------------------------------------
 external/flux/README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/91369ac2/external/flux/README.md
----------------------------------------------------------------------
diff --git a/external/flux/README.md b/external/flux/README.md
index 0387f3f..2c5127e 100644
--- a/external/flux/README.md
+++ b/external/flux/README.md
@@ -183,7 +183,7 @@ usage: storm jar <my_topology_uber_jar.jar> org.apache.storm.flux.Flux
                               build, validate, and print information about
                               the topology.
  -e,--env-filter              Perform environment variable substitution.
-                              Replace keysidentified with `${ENV-[NAME]}`
+                              Replace keys identified with `${ENV-[NAME]}`
                               will be replaced with the corresponding
                               `NAME` environment value
  -f,--filter <file>           Perform property substitution. Use the
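
The --env-filter option documented in this hunk substitutes ${ENV-[NAME]} placeholders with the value of the corresponding environment variable. A minimal sketch of how such a placeholder might appear in a topology definition (the HDFS_URL variable name is an assumption for illustration):

```
      - name: "withFsUrl"
        args: ["${ENV-HDFS_URL}"]   # replaced with the value of $HDFS_URL when -e/--env-filter is passed
```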


[02/50] [abbrv] storm git commit: [maven-release-plugin] prepare release flux-0.2.2

Posted by pt...@apache.org.
[maven-release-plugin] prepare release flux-0.2.2


Project: http://git-wip-us.apache.org/repos/asf/storm/repo
Commit: http://git-wip-us.apache.org/repos/asf/storm/commit/54f5fb7c
Tree: http://git-wip-us.apache.org/repos/asf/storm/tree/54f5fb7c
Diff: http://git-wip-us.apache.org/repos/asf/storm/diff/54f5fb7c

Branch: refs/heads/0.10.x-branch
Commit: 54f5fb7ccf200ba9901dac9753ff292b27037fad
Parents: 14970de
Author: P. Taylor Goetz <pt...@gmail.com>
Authored: Wed Apr 1 13:27:33 2015 -0400
Committer: P. Taylor Goetz <pt...@gmail.com>
Committed: Wed Apr 1 13:27:33 2015 -0400

----------------------------------------------------------------------
 flux-core/pom.xml     | 2 +-
 flux-examples/pom.xml | 2 +-
 flux-wrappers/pom.xml | 2 +-
 pom.xml               | 2 +-
 4 files changed, 4 insertions(+), 4 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/storm/blob/54f5fb7c/flux-core/pom.xml
----------------------------------------------------------------------
diff --git a/flux-core/pom.xml b/flux-core/pom.xml
index c22be1a..1ea353a 100644
--- a/flux-core/pom.xml
+++ b/flux-core/pom.xml
@@ -21,7 +21,7 @@
     <parent>
         <groupId>com.github.ptgoetz</groupId>
         <artifactId>flux</artifactId>
-        <version>0.2.2-SNAPSHOT</version>
+        <version>0.2.2</version>
         <relativePath>../pom.xml</relativePath>
     </parent>
 

http://git-wip-us.apache.org/repos/asf/storm/blob/54f5fb7c/flux-examples/pom.xml
----------------------------------------------------------------------
diff --git a/flux-examples/pom.xml b/flux-examples/pom.xml
index d7b0edd..ac19e41 100644
--- a/flux-examples/pom.xml
+++ b/flux-examples/pom.xml
@@ -21,7 +21,7 @@
     <parent>
         <groupId>com.github.ptgoetz</groupId>
         <artifactId>flux</artifactId>
-        <version>0.2.2-SNAPSHOT</version>
+        <version>0.2.2</version>
         <relativePath>../pom.xml</relativePath>
     </parent>
 

http://git-wip-us.apache.org/repos/asf/storm/blob/54f5fb7c/flux-wrappers/pom.xml
----------------------------------------------------------------------
diff --git a/flux-wrappers/pom.xml b/flux-wrappers/pom.xml
index d6bfcca..7884af2 100644
--- a/flux-wrappers/pom.xml
+++ b/flux-wrappers/pom.xml
@@ -21,7 +21,7 @@
     <parent>
         <groupId>com.github.ptgoetz</groupId>
         <artifactId>flux</artifactId>
-        <version>0.2.2-SNAPSHOT</version>
+        <version>0.2.2</version>
         <relativePath>../pom.xml</relativePath>
     </parent>
 

http://git-wip-us.apache.org/repos/asf/storm/blob/54f5fb7c/pom.xml
----------------------------------------------------------------------
diff --git a/pom.xml b/pom.xml
index cf7e4a6..b06103b 100644
--- a/pom.xml
+++ b/pom.xml
@@ -20,7 +20,7 @@
 
     <groupId>com.github.ptgoetz</groupId>
     <artifactId>flux</artifactId>
-    <version>0.2.2-SNAPSHOT</version>
+    <version>0.2.2</version>
     <packaging>pom</packaging>
     <name>flux</name>
     <url>https://github.com/ptgoetz/flux</url>