Posted to commits@streams.apache.org by sb...@apache.org on 2016/10/14 16:24:49 UTC
[1/9] incubator-streams-examples git commit: root level example
reference pages
Repository: incubator-streams-examples
Updated Branches:
refs/heads/master 97fca1ac5 -> 34c1a7be2
root level example reference pages
Project: http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/commit/e949f585
Tree: http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/tree/e949f585
Diff: http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/diff/e949f585
Branch: refs/heads/master
Commit: e949f58545aab00e922b3e928d2b080d1ac49553
Parents: 97fca1a
Author: Steve Blackmon @steveblackmon <sb...@apache.org>
Authored: Tue Oct 11 16:24:48 2016 -0500
Committer: Steve Blackmon @steveblackmon <sb...@apache.org>
Committed: Tue Oct 11 16:24:48 2016 -0500
----------------------------------------------------------------------
src/site/markdown/credentials/twitter.md | 22 ++++++++++++++++
src/site/markdown/install/docker.md | 23 +++++++++++++++++
src/site/markdown/install/git.md | 13 ++++++++++
src/site/markdown/install/java.md | 17 ++++++++++++
src/site/markdown/install/maven.md | 20 ++++++++++++++
src/site/markdown/install/sbt.md | 15 +++++++++++
src/site/markdown/services/elasticsearch.md | 33 ++++++++++++++++++++++++
src/site/markdown/services/mongo.md | 33 ++++++++++++++++++++++++
src/site/markdown/services/neo4j.md | 28 ++++++++++++++++++++
src/site/site.xml | 25 +++++++++---------
10 files changed, 217 insertions(+), 12 deletions(-)
----------------------------------------------------------------------
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/e949f585/src/site/markdown/credentials/twitter.md
----------------------------------------------------------------------
diff --git a/src/site/markdown/credentials/twitter.md b/src/site/markdown/credentials/twitter.md
new file mode 100644
index 0000000..098dabd
--- /dev/null
+++ b/src/site/markdown/credentials/twitter.md
@@ -0,0 +1,22 @@
+## Twitter Credentials
+
+Create a local file `twitter.conf` with valid twitter credentials
+
+ twitter {
+ oauth {
+ consumerKey = ""
+ consumerSecret = ""
+ accessToken = ""
+ accessTokenSecret = ""
+ }
+ }
+
+Log in to https://developer.twitter.com
+
+Visit https://apps.twitter.com and create an application.
+
+Select your application and click 'Keys and Access Tokens'.
+
+The consumerKey and consumerSecret are application-wide.
+
+The accessToken and accessTokenSecret are per-user.
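
Editor's note: the stanza above is HOCON. A minimal sketch of rendering it from credential values (the helper name `twitter_conf` is hypothetical, not part of the project):

```python
def twitter_conf(consumer_key, consumer_secret, access_token, access_token_secret):
    """Render the twitter.conf HOCON stanza shown above."""
    return (
        "twitter {\n"
        "  oauth {\n"
        f'    consumerKey = "{consumer_key}"\n'
        f'    consumerSecret = "{consumer_secret}"\n'
        f'    accessToken = "{access_token}"\n'
        f'    accessTokenSecret = "{access_token_secret}"\n'
        "  }\n"
        "}\n"
    )

conf = twitter_conf("ck", "cs", "at", "ats")
```

Writing the result to `twitter.conf` next to the stream configuration is enough for the examples that include it.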
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/e949f585/src/site/markdown/install/docker.md
----------------------------------------------------------------------
diff --git a/src/site/markdown/install/docker.md b/src/site/markdown/install/docker.md
new file mode 100644
index 0000000..539b4df
--- /dev/null
+++ b/src/site/markdown/install/docker.md
@@ -0,0 +1,23 @@
+## Docker
+
+Run from your command line:
+
+ docker version
+
+| Possible result | Explanation |
+|-----------------|-------------|
+| bash: docker: command not found | You need to install docker |
+| Client: Version: < 1.0.0 | You need a newer version of docker |
+| Server: Version: < 1.0.0 | You need a newer version of docker |
+| Client: Version: > 1.0.0 <br/> Server: Version: > 1.0.0 | You are all good |
+
+If you need to install docker, start here:
+
+[https://docs.docker.com/engine/installation/](https://docs.docker.com/engine/installation/)
+
+Run from your command line:
+
+ docker ps
+
+If you see a (possibly empty) list of running containers, you are good.
+
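
Editor's note: the version gate in the table above amounts to a dotted-version comparison. A rough sketch, assuming a plain `MAJOR.MINOR.PATCH` string (suffixes like `-ce` would need extra handling):

```python
def docker_ok(version_string):
    """Return True when a dotted version string is at least 1.0.0,
    matching the table above."""
    parts = tuple(int(p) for p in version_string.split("."))
    return parts >= (1, 0, 0)

# e.g. the version reported by `docker version` for client and server
assert docker_ok("1.12.1")
assert not docker_ok("0.11.1")
```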
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/e949f585/src/site/markdown/install/git.md
----------------------------------------------------------------------
diff --git a/src/site/markdown/install/git.md b/src/site/markdown/install/git.md
new file mode 100644
index 0000000..42a1d2e
--- /dev/null
+++ b/src/site/markdown/install/git.md
@@ -0,0 +1,13 @@
+#### Git
+
+Run from your command line:
+
+ git --version
+
+| Possible result | Explanation |
+|-----------------|-------------|
+| bash: git: command not found | You need to install git |
+| git version < 2.7 | You should upgrade git for security reasons |
+| git version > 2.7 | You are all good |
+
+If you need to install git, start here:
+
+[https://git-scm.com/book/en/v2/Getting-Started-Installing-Git](https://git-scm.com/book/en/v2/Getting-Started-Installing-Git)
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/e949f585/src/site/markdown/install/java.md
----------------------------------------------------------------------
diff --git a/src/site/markdown/install/java.md b/src/site/markdown/install/java.md
new file mode 100644
index 0000000..05c79fd
--- /dev/null
+++ b/src/site/markdown/install/java.md
@@ -0,0 +1,17 @@
+#### Java SDK
+
+Run from your command line:
+
+ java -version
+
+| Possible result | Explanation |
+|-----------------|-------------|
+| Java Version >= 1.7.0u72 | You're all good |
+| Java Version >= 1.8.0u25 | You're all good |
+| Java Version < 1.7.0u72 | You need a newer JDK |
+| Java Version < 1.8.0u25 | You need a newer JDK |
+
+If you need to install or upgrade Java, start here:
+
+[http://www.oracle.com/technetwork/java/javase/downloads/index.html](http://www.oracle.com/technetwork/java/javase/downloads/index.html)
+
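
Editor's note: the thresholds above compare both the `1.x.0` base and the update number (`u72`, `u25`). A sketch of that check, assuming version strings in the `java -version` style `1.8.0_25` (the helper name is hypothetical):

```python
def java_ok(version):
    """Check a version string like '1.8.0_25' against the minimums
    in the table above (1.7.0u72 / 1.8.0u25)."""
    base, _, update = version.partition("_")
    major = tuple(int(p) for p in base.split("."))
    upd = int(update or 0)
    if major == (1, 7, 0):
        return upd >= 72
    if major == (1, 8, 0):
        return upd >= 25
    return major > (1, 8, 0)
```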
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/e949f585/src/site/markdown/install/maven.md
----------------------------------------------------------------------
diff --git a/src/site/markdown/install/maven.md b/src/site/markdown/install/maven.md
new file mode 100644
index 0000000..de620f2
--- /dev/null
+++ b/src/site/markdown/install/maven.md
@@ -0,0 +1,20 @@
+#### Maven
+
+Run from your command line:
+
+ mvn -version
+
+| Possible result | Explanation |
+|-----------------|-------------|
+| -bash: mvn: command not found | You need to install maven |
+| Error: JAVA_HOME is not defined correctly. | You need to install JDK |
+| Apache Maven >= 3.2.5 <br/> Java Version >= 1.7.0u72 | You're all good |
+| Apache Maven >= 3.2.5 <br/> Java Version >= 1.8.0u25 | You're all good |
+| Apache Maven < 3.2.5 | You need a newer version of maven |
+| Java Version < 1.7.0u72 | You need a newer JDK |
+| Java Version < 1.8.0u25 | You need a newer JDK |
+
+If you need to install maven, start here:
+
+[http://maven.apache.org/install.html](http://maven.apache.org/install.html)
+
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/e949f585/src/site/markdown/install/sbt.md
----------------------------------------------------------------------
diff --git a/src/site/markdown/install/sbt.md b/src/site/markdown/install/sbt.md
new file mode 100644
index 0000000..8fa3d88
--- /dev/null
+++ b/src/site/markdown/install/sbt.md
@@ -0,0 +1,15 @@
+## SBT
+
+Run from your command line:
+
+ sbt
+
+| Possible result | Explanation |
+|-----------------|-------------|
+| bash: sbt: command not found | You need to install sbt |
+
+You should really install sbt-extras, like this:
+
+ curl -s https://raw.githubusercontent.com/paulp/sbt-extras/master/sbt > /usr/bin/sbtx && chmod 0755 /usr/bin/sbtx
+
+Now you can easily run the streams examples using SBT, like a boss.
\ No newline at end of file
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/e949f585/src/site/markdown/services/elasticsearch.md
----------------------------------------------------------------------
diff --git a/src/site/markdown/services/elasticsearch.md b/src/site/markdown/services/elasticsearch.md
new file mode 100644
index 0000000..14454a7
--- /dev/null
+++ b/src/site/markdown/services/elasticsearch.md
@@ -0,0 +1,33 @@
+## Elasticsearch
+
+Start elasticsearch via docker with the docker maven plugin:
+
+ mvn -PdockerITs docker:start
+
+Confirm that elasticsearch is running:
+
+ docker ps
+
+Confirm that the host and port(s) are in the properties file:
+
+ cat elasticsearch.properties
+
+Create a local file `elasticsearch.conf` with cluster details:
+
+ elasticsearch {
+ hosts += ${es.tcp.host}
+ port = ${es.tcp.port}
+ clusterName = "elasticsearch"
+ }
+
+When configuring a stream, include these files:
+
+ include "elasticsearch.properties"
+ include "elasticsearch.conf"
+
+Supply application-specific configuration as well:
+
+ elasticsearch {
+ index: ""
+ type: ""
+ }
\ No newline at end of file
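
Editor's note: the included `.properties` file supplies the values that the `${es.tcp.host}` / `${es.tcp.port}` placeholders in the `.conf` file refer to. A rough Python sketch of that substitution step — not real HOCON resolution, values assumed for illustration:

```python
import re

def resolve(template, props):
    """Replace ${key} placeholders with values from a properties dict,
    loosely mimicking HOCON substitution."""
    return re.sub(r"\$\{([^}]+)\}", lambda m: str(props[m.group(1)]), template)

# values that would come from elasticsearch.properties
props = {"es.tcp.host": "127.0.0.1", "es.tcp.port": 9300}
conf = resolve("hosts += ${es.tcp.host}\nport = ${es.tcp.port}", props)
```

Real resolution is done by the Typesafe Config library when the stream loads the included files.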
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/e949f585/src/site/markdown/services/mongo.md
----------------------------------------------------------------------
diff --git a/src/site/markdown/services/mongo.md b/src/site/markdown/services/mongo.md
new file mode 100644
index 0000000..002c12d
--- /dev/null
+++ b/src/site/markdown/services/mongo.md
@@ -0,0 +1,33 @@
+## Mongo
+
+Start mongo via docker with the docker maven plugin:
+
+ mvn -PdockerITs docker:start
+
+Confirm that mongo is running:
+
+ docker ps
+
+Confirm that the host and port(s) are in the properties file:
+
+ cat mongo.properties
+
+Create a local file `mongo.conf` with connection details:
+
+ mongo {
+ host = ${mongo.tcp.host}
+ port = ${mongo.tcp.port}
+ }
+
+When configuring a stream, include these files:
+
+ include "mongo.properties"
+ include "mongo.conf"
+
+Supply application-specific configuration as well:
+
+ mongo {
+ db: "",
+ collection: ""
+ }
+
\ No newline at end of file
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/e949f585/src/site/markdown/services/neo4j.md
----------------------------------------------------------------------
diff --git a/src/site/markdown/services/neo4j.md b/src/site/markdown/services/neo4j.md
new file mode 100644
index 0000000..9d5895f
--- /dev/null
+++ b/src/site/markdown/services/neo4j.md
@@ -0,0 +1,28 @@
+## Neo4j
+
+Start neo4j via docker with the docker maven plugin:
+
+ mvn -PdockerITs docker:start
+
+Confirm that neo4j is running:
+
+ docker ps
+
+Confirm that the host and port(s) are in the properties file:
+
+ cat neo4j.properties
+
+Create a local file `neo4j.conf` with cluster details:
+
+ neo4j {
+ hostname = ${neo4j.tcp.host}
+ port = ${neo4j.tcp.port}
+ type = "neo4j"
+ graph = "data"
+ }
+
+When configuring a stream, include these files:
+
+ include "neo4j.properties"
+ include "neo4j.conf"
+
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/e949f585/src/site/site.xml
----------------------------------------------------------------------
diff --git a/src/site/site.xml b/src/site/site.xml
index 5cde002..b85609c 100644
--- a/src/site/site.xml
+++ b/src/site/site.xml
@@ -19,19 +19,20 @@
<project>
<custom>
<fluidoSkin>
- <topBarEnabled>true</topBarEnabled>
+ <topBarEnabled>false</topBarEnabled>
<navBarStyle>navbar-inverse</navBarStyle>
- <sideBarEnabled>false</sideBarEnabled>
- <gitHub>
- <projectId>apache/incubator-streams-examples</projectId>
- <ribbonOrientation>right</ribbonOrientation>
- <ribbonColor>black</ribbonColor>
- </gitHub>
- <twitter>
- <user>ApacheStreams</user>
- <showUser>true</showUser>
- <showFollowers>true</showFollowers>
- </twitter>
+ <sideBarEnabled>true</sideBarEnabled>
</fluidoSkin>
</custom>
+ <body>
+ <menu name="Overview" />
+ <menu name="Projects" />
+ <menu name="Resources" inherit="bottom" >
+ <item name="Install Git" href="install/git.html"/>
+ <item name="Install Java" href="install/java.html"/>
+ <item name="Install Maven" href="install/maven.html"/>
+ <item name="Install Docker" href="install/docker.html"/>
+ <item name="Install SBT" href="install/sbt.html"/>
+ </menu>
+ </body>
</project>
\ No newline at end of file
[5/9] incubator-streams-examples git commit: separate stream
markdowns from module markdowns
Posted by sb...@apache.org.
separate stream markdowns from module markdowns
Project: http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/commit/bed4b0f0
Tree: http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/tree/bed4b0f0
Diff: http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/diff/bed4b0f0
Branch: refs/heads/master
Commit: bed4b0f0db1f4b34ed9f39f78957f85ce5cabcab
Parents: 5b96588
Author: Steve Blackmon @steveblackmon <sb...@apache.org>
Authored: Tue Oct 11 16:39:55 2016 -0500
Committer: Steve Blackmon @steveblackmon <sb...@apache.org>
Committed: Tue Oct 11 16:39:55 2016 -0500
----------------------------------------------------------------------
.../src/site/markdown/ElasticsearchHdfs.md | 38 +++++------
.../src/site/markdown/HdfsElasticsearch.md | 38 +++++------
.../src/site/markdown/index.md | 26 ++------
.../src/site/markdown/ElasticsearchReindex.md | 33 ++++++++++
.../src/site/markdown/index.md | 49 +++-----------
.../src/site/markdown/MongoElasticsearchSync.md | 32 +++++++++
.../src/site/markdown/index.md | 69 +++-----------------
.../markdown/TwitterHistoryElasticsearch.md | 38 +++++++++++
.../src/site/markdown/index.md | 56 +++-------------
.../markdown/TwitterUserstreamElasticsearch.md | 32 +++++++++
.../src/site/markdown/index.md | 53 ++-------------
11 files changed, 209 insertions(+), 255 deletions(-)
----------------------------------------------------------------------
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/bed4b0f0/local/elasticsearch-hdfs/src/site/markdown/ElasticsearchHdfs.md
----------------------------------------------------------------------
diff --git a/local/elasticsearch-hdfs/src/site/markdown/ElasticsearchHdfs.md b/local/elasticsearch-hdfs/src/site/markdown/ElasticsearchHdfs.md
index 6db4329..ad8ad4a 100644
--- a/local/elasticsearch-hdfs/src/site/markdown/ElasticsearchHdfs.md
+++ b/local/elasticsearch-hdfs/src/site/markdown/ElasticsearchHdfs.md
@@ -1,36 +1,32 @@
-elasticsearch-hdfs
-==============================
+### ElasticsearchHdfs
-Description:
------------------
+#### Description:
Copies documents from elasticsearch to hdfs.
-Specification:
------------------
+#### Configuration:
-[ElasticsearchHdfs.dot](ElasticsearchHdfs.dot "ElasticsearchHdfs.dot" )
-
-Diagram:
------------------
+[ElasticsearchHdfsIT.conf](ElasticsearchHdfsIT.conf "ElasticsearchHdfsIT.conf" )
-![ElasticsearchHdfs.dot.svg](./ElasticsearchHdfs.dot.svg)
+#### Run (SBT):
-Example Configuration:
-----------------------
+ sbtx -210 -sbt-create
+ set resolvers += "Local Maven Repository" at "file://"+Path.userHome.absolutePath+"/.m2/repository"
+ set libraryDependencies += "org.apache.streams" % "elasticsearch-hdfs" % "0.4-incubating-SNAPSHOT"
+ set fork := true
+ set javaOptions +="-Dconfig.file=ElasticsearchHdfsIT.conf"
+ run elasticsearch-hdfs org.apache.streams.example.ElasticsearchHdfs
-[testBackup.json](testBackup.json "testBackup.json" )
+#### Run (Docker):
-Run (Local):
-------------
+ docker run apachestreams/elasticsearch-hdfs java -cp elasticsearch-hdfs-jar-with-dependencies.jar -Dconfig.url=http://<location_of_config_file> org.apache.streams.example.ElasticsearchHdfs
- java -cp dist/elasticsearch-hdfs-jar-with-dependencies.jar -Dconfig.file=file://<location_of_config_file> org.apache.streams.elasticsearch.example.ElasticsearchHdfs
+#### Specification:
-Run (Docker):
--------------
+[ElasticsearchHdfs.dot](ElasticsearchHdfs.dot "ElasticsearchHdfs.dot" )
- docker run elasticsearch-hdfs java -cp elasticsearch-hdfs-jar-with-dependencies.jar -Dconfig.url=http://<location_of_config_file> org.apache.streams.elasticsearch.example.ElasticsearchHdfs
+#### Diagram:
-[JavaDocs](apidocs/index.html "JavaDocs")
+![ElasticsearchHdfs.dot.svg](./ElasticsearchHdfs.dot.svg)
###### Licensed under Apache License 2.0 - http://www.apache.org/licenses/LICENSE-2.0
\ No newline at end of file
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/bed4b0f0/local/elasticsearch-hdfs/src/site/markdown/HdfsElasticsearch.md
----------------------------------------------------------------------
diff --git a/local/elasticsearch-hdfs/src/site/markdown/HdfsElasticsearch.md b/local/elasticsearch-hdfs/src/site/markdown/HdfsElasticsearch.md
index 2f90e44..136b110 100644
--- a/local/elasticsearch-hdfs/src/site/markdown/HdfsElasticsearch.md
+++ b/local/elasticsearch-hdfs/src/site/markdown/HdfsElasticsearch.md
@@ -1,36 +1,32 @@
-hdfs-elasticsearch
-==============================
+### HdfsElasticsearch
-Description:
------------------
+#### Description:
Copies documents from hdfs to elasticsearch.
-Specification:
------------------
+#### Configuration:
-[HdfsElasticsearch.dot](HdfsElasticsearch.dot "HdfsElasticsearch.dot" )
-
-Diagram:
------------------
+[HdfsElasticsearchIT.conf](HdfsElasticsearchIT.conf "HdfsElasticsearchIT.conf" )
-![HdfsElasticsearch.dot.svg](./HdfsElasticsearch.dot.svg)
+#### Run (SBT):
-Example Configuration:
-----------------------
+ sbtx -210 -sbt-create
+ set resolvers += "Local Maven Repository" at "file://"+Path.userHome.absolutePath+"/.m2/repository"
+ set libraryDependencies += "org.apache.streams" % "elasticsearch-hdfs" % "0.4-incubating-SNAPSHOT"
+ set fork := true
+ set javaOptions +="-Dconfig.file=HdfsElasticsearchIT.conf"
+ run elasticsearch-hdfs org.apache.streams.example.HdfsElasticsearch
-[testRestore.json](testRestore.json "testRestore.json" )
+#### Run (Docker):
-Run (Local):
-------------
+ docker run elasticsearch-hdfs java -cp elasticsearch-hdfs-jar-with-dependencies.jar -Dconfig.file=`pwd`/HdfsElasticsearchIT.conf org.apache.streams.example.HdfsElasticsearch
- java -cp dist/elasticsearch-hdfs-jar-with-dependencies.jar -Dconfig.file=file://<location_of_config_file> org.apache.streams.elasticsearch.example.HdfsElasticsearch
+#### Specification:
-Run (Docker):
--------------
+[HdfsElasticsearch.dot](HdfsElasticsearch.dot "HdfsElasticsearch.dot" )
- docker run elasticsearch-hdfs java -cp elasticsearch-hdfs-jar-with-dependencies.jar -Dconfig.url=http://<location_of_config_file> org.apache.streams.elasticsearch.example.HdfsElasticsearch
+#### Diagram:
-[JavaDocs](apidocs/index.html "JavaDocs")
+![HdfsElasticsearch.dot.svg](./HdfsElasticsearch.dot.svg)
###### Licensed under Apache License 2.0 - http://www.apache.org/licenses/LICENSE-2.0
\ No newline at end of file
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/bed4b0f0/local/elasticsearch-hdfs/src/site/markdown/index.md
----------------------------------------------------------------------
diff --git a/local/elasticsearch-hdfs/src/site/markdown/index.md b/local/elasticsearch-hdfs/src/site/markdown/index.md
index b014a19..d789a2f 100644
--- a/local/elasticsearch-hdfs/src/site/markdown/index.md
+++ b/local/elasticsearch-hdfs/src/site/markdown/index.md
@@ -1,28 +1,19 @@
-elasticsearch-hdfs
-==============================
+### elasticsearch-hdfs
-Requirements:
--------------
- - A running ElasticSearch 1.0.0+ instance
+#### Requirements:
+ - A running ElasticSearch 2.0.0+ instance
-Description:
-------------
-Copies documents between elasticsearch and file system using the hdfs persist module.
-
-Streams:
---------
+#### Streams:
<a href="HdfsElasticsearch.html" target="_self">HdfsElasticsearch</a>
<a href="ElasticsearchHdfs.html" target="_self">ElasticsearchHdfs</a>
-Build:
----------
+#### Build:
mvn clean install
-Testing:
----------
+#### Test:
Start up elasticsearch with docker:
@@ -36,11 +27,6 @@ Shutdown elasticsearch when finished:
mvn -PdockerITs docker:stop
-Deploy (Docker):
-----------------
-
- mvn -Pdocker -Ddocker.repo=<your docker host>:<your docker repo> clean package docker:build docker:push
-
[JavaDocs](apidocs/index.html "JavaDocs")
###### Licensed under Apache License 2.0 - http://www.apache.org/licenses/LICENSE-2.0
\ No newline at end of file
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/bed4b0f0/local/elasticsearch-reindex/src/site/markdown/ElasticsearchReindex.md
----------------------------------------------------------------------
diff --git a/local/elasticsearch-reindex/src/site/markdown/ElasticsearchReindex.md b/local/elasticsearch-reindex/src/site/markdown/ElasticsearchReindex.md
new file mode 100644
index 0000000..2a2a6b2
--- /dev/null
+++ b/local/elasticsearch-reindex/src/site/markdown/ElasticsearchReindex.md
@@ -0,0 +1,33 @@
+### ElasticsearchReindex
+
+#### Description:
+
+Copies documents into a different index
+
+#### Configuration:
+
+[ElasticsearchReindexIT.conf](ElasticsearchReindexIT.conf "ElasticsearchReindexIT.conf" )
+
+#### Run (SBT):
+
+ sbtx -210 -sbt-create
+ set resolvers += "Local Maven Repository" at "file://"+Path.userHome.absolutePath+"/.m2/repository"
+ set libraryDependencies += "org.apache.streams" % "elasticsearch-reindex" % "0.4-incubating-SNAPSHOT"
+ set fork := true
+ set javaOptions +="-Dconfig.file=ElasticsearchReindexIT.conf"
+ run elasticsearch-reindex org.apache.streams.example.ElasticsearchReindex
+
+#### Run (Docker):
+
+ docker run elasticsearch-reindex java -cp elasticsearch-reindex-jar-with-dependencies.jar -Dconfig.file=`pwd`/ElasticsearchReindexIT.conf org.apache.streams.example.ElasticsearchReindex
+
+#### Specification:
+
+[ElasticsearchReindex.dot](ElasticsearchReindex.dot "ElasticsearchReindex.dot" )
+
+#### Diagram:
+
+![ElasticsearchReindex.dot.svg](./ElasticsearchReindex.dot.svg)
+
+
+###### Licensed under Apache License 2.0 - http://www.apache.org/licenses/LICENSE-2.0
\ No newline at end of file
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/bed4b0f0/local/elasticsearch-reindex/src/site/markdown/index.md
----------------------------------------------------------------------
diff --git a/local/elasticsearch-reindex/src/site/markdown/index.md b/local/elasticsearch-reindex/src/site/markdown/index.md
index 8aa394d..87c3e04 100644
--- a/local/elasticsearch-reindex/src/site/markdown/index.md
+++ b/local/elasticsearch-reindex/src/site/markdown/index.md
@@ -1,35 +1,19 @@
-elasticsearch-reindex
-==============================
+### elasticsearch-reindex
-Requirements:
--------------
- - A running ElasticSearch 1.0.0+ cluster
+#### Requirements:
+ - A running ElasticSearch 2.0.0+ cluster
- Transport client access to cluster
- elasticsearch.version and lucene.version set to match cluster
-Description:
-------------
-Copies documents into a different index
+#### Streams:
-Specification:
------------------
+<a href="ElasticsearchReindex.html" target="_self">ElasticsearchReindex</a>
-[ElasticsearchReindex.dot](ElasticsearchReindex.dot "ElasticsearchReindex.dot" )
+#### Build:
-Diagram:
------------------
+ mvn clean install
-![ElasticsearchReindex.dot.svg](./ElasticsearchReindex.dot.svg)
-
-Example Configuration:
-----------------------
-
-[testReindex.json](testReindex.json "testReindex.json" )
-
-Populate source and destination in configuration with cluster / index / type details.
-
-Testing:
----------
+#### Testing:
Start up elasticsearch with docker:
@@ -37,27 +21,12 @@ Start up elasticsearch with docker:
Build with integration testing enabled, using your credentials
- mvn clean test verify -DskipITs=false -DargLine="-Dconfig.file=`pwd`/application.conf"
+ mvn clean test verify -DskipITs=false
Shutdown elasticsearch when finished:
mvn -PdockerITs docker:stop
-Run (Local):
-------------
-
- java -cp dist/elasticsearch-reindex-jar-with-dependencies.jar -Dconfig.file=file://<location_of_config_file> org.apache.streams.elasticsearch.example.ElasticsearchReindex
-
-Deploy (Docker):
-----------------
-
- mvn -Pdocker -Ddocker.repo=<your docker host>:<your docker repo> docker:build docker:push
-
-Run (Docker):
--------------
-
- docker run elasticsearch-reindex java -cp elasticsearch-reindex-jar-with-dependencies.jar -Dconfig.url=http://<location_of_config_file> org.apache.streams.elasticsearch.example.ElasticsearchReindex
-
[JavaDocs](apidocs/index.html "JavaDocs")
###### Licensed under Apache License 2.0 - http://www.apache.org/licenses/LICENSE-2.0
\ No newline at end of file
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/bed4b0f0/local/mongo-elasticsearch-sync/src/site/markdown/MongoElasticsearchSync.md
----------------------------------------------------------------------
diff --git a/local/mongo-elasticsearch-sync/src/site/markdown/MongoElasticsearchSync.md b/local/mongo-elasticsearch-sync/src/site/markdown/MongoElasticsearchSync.md
new file mode 100644
index 0000000..cdbdce1
--- /dev/null
+++ b/local/mongo-elasticsearch-sync/src/site/markdown/MongoElasticsearchSync.md
@@ -0,0 +1,32 @@
+### MongoElasticsearchSync
+
+#### Description:
+
+Copies documents from mongodb to elasticsearch
+
+#### Configuration:
+
+[MongoElasticsearchSyncIT.conf](MongoElasticsearchSyncIT.conf "MongoElasticsearchSyncIT.conf" )
+
+#### Run (SBT):
+
+ sbtx -210 -sbt-create
+ set resolvers += "Local Maven Repository" at "file://"+Path.userHome.absolutePath+"/.m2/repository"
+ set libraryDependencies += "org.apache.streams" % "mongo-elasticsearch-sync" % "0.4-incubating-SNAPSHOT"
+ set fork := true
+ set javaOptions +="-Dconfig.file=MongoElasticsearchSyncIT.conf"
+ run mongo-elasticsearch-sync org.apache.streams.example.MongoElasticsearchSync
+
+#### Run (Docker):
+
+ docker run apachestreams/mongo-elasticsearch-sync java -cp mongo-elasticsearch-sync-jar-with-dependencies.jar org.apache.streams.example.MongoElasticsearchSync
+
+#### Specification:
+
+[MongoElasticsearchSync.dot](MongoElasticsearchSync.dot "MongoElasticsearchSync.dot" )
+
+#### Diagram:
+
+![MongoElasticsearchSync.dot.svg](./MongoElasticsearchSync.dot.svg)
+
+###### Licensed under Apache License 2.0 - http://www.apache.org/licenses/LICENSE-2.0
\ No newline at end of file
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/bed4b0f0/local/mongo-elasticsearch-sync/src/site/markdown/index.md
----------------------------------------------------------------------
diff --git a/local/mongo-elasticsearch-sync/src/site/markdown/index.md b/local/mongo-elasticsearch-sync/src/site/markdown/index.md
index 42512b6..526375b 100644
--- a/local/mongo-elasticsearch-sync/src/site/markdown/index.md
+++ b/local/mongo-elasticsearch-sync/src/site/markdown/index.md
@@ -1,80 +1,31 @@
-Apache Streams (incubating)
-Licensed under Apache License 2.0 - http://www.apache.org/licenses/LICENSE-2.0
---------------------------------------------------------------------------------
+### mongo-elasticsearch-sync
-mongo-elasticsearch-sync
-==============================
-
-Requirements:
--------------
+#### Requirements:
- A running MongoDB 2.4+ instance
- - A running ElasticSearch 1.0.0+ instance
-
-Description:
-------------
-Copies documents from mongodb to elasticsearch
-
-Specification:
------------------
-
-[MongoElasticsearchSync.dot](MongoElasticsearchSync.dot "MongoElasticsearchSync.dot" )
+ - A running ElasticSearch 2.0+ instance
-Diagram:
------------------
+#### Streams:
-![MongoElasticsearchSync.dot.svg](./MongoElasticsearchSync.dot.svg)
+<a href="MongoElasticsearchSync.html" target="_self">MongoElasticsearchSync</a>
-Example Configuration:
-----------------------
-
-[testSync.json](testSync.json "testSync.json" )
-
-Build:
----------
+#### Build:
mvn clean package
-Testing:
----------
-
-Create a local file `application.conf` with valid twitter credentials
-
- twitter {
- oauth {
- consumerKey = ""
- consumerSecret = ""
- accessToken = ""
- accessTokenSecret = ""
- }
- }
+#### Test:
Start up elasticsearch and mongodb with docker:
- mvn -PdockerITs docker:start
+ mvn -PdockerITs docker:start
-Build with integration testing enabled, using your credentials
+Build with integration testing enabled:
- mvn clean test verify -DskipITs=false -DargLine="-Dconfig.file=`pwd`/application.conf"
+ mvn clean test verify -DskipITs=false
Shutdown elasticsearch and mongodb when finished:
mvn -PdockerITs docker:stop
-Run (Local):
-------------
-
- java -cp dist/mongo-elasticsearch-sync-jar-with-dependencies.jar -Dconfig.file=file://<location_of_config_file> org.apache.streams.example.elasticsearch.MongoElasticsearchSync
-
-Deploy (Docker):
-----------------
-
- mvn -Pdocker -Ddocker.repo=<your docker host>:<your docker repo> docker:build docker:push
-
-Run (Docker):
--------------
-
- docker run mongo-elasticsearch-sync java -cp mongo-elasticsearch-sync-jar-with-dependencies.jar -Dconfig.url=http://<location_of_config_file> org.apache.streams.elasticsearch.example.MongoElasticsearchSync
-
[JavaDocs](apidocs/index.html "JavaDocs")
###### Licensed under Apache License 2.0 - http://www.apache.org/licenses/LICENSE-2.0
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/bed4b0f0/local/twitter-history-elasticsearch/src/site/markdown/TwitterHistoryElasticsearch.md
----------------------------------------------------------------------
diff --git a/local/twitter-history-elasticsearch/src/site/markdown/TwitterHistoryElasticsearch.md b/local/twitter-history-elasticsearch/src/site/markdown/TwitterHistoryElasticsearch.md
new file mode 100644
index 0000000..9b696c2
--- /dev/null
+++ b/local/twitter-history-elasticsearch/src/site/markdown/TwitterHistoryElasticsearch.md
@@ -0,0 +1,38 @@
+### TwitterHistoryElasticsearch
+
+#### Description:
+
+Retrieves as many posts from a known list of users as twitter API allows.
+
+Converts them to activities, and writes them in activity format to Elasticsearch.
+
+#### Configuration:
+
+[TwitterHistoryElasticsearchIT.conf](TwitterHistoryElasticsearchIT.conf "TwitterHistoryElasticsearchIT.conf" )
+
+In the Twitter section you should place all of your relevant authentication keys and whichever Twitter IDs you want to pull history for.
+
+Twitter IDs can be converted from screennames at http://www.gettwitterid.com
+
+#### Run (SBT):
+
+ sbtx -210 -sbt-create
+ set resolvers += "Local Maven Repository" at "file://"+Path.userHome.absolutePath+"/.m2/repository"
+ set libraryDependencies += "org.apache.streams" % "twitter-history-elasticsearch" % "0.4-incubating-SNAPSHOT"
+ set fork := true
+ set javaOptions +="-Dconfig.file=application.conf"
+ run org.apache.streams.example.TwitterHistoryElasticsearch
+
+#### Run (Docker):
+
+ docker run apachestreams/twitter-history-elasticsearch java -cp twitter-history-elasticsearch-jar-with-dependencies.jar -Dconfig.file=`pwd`/application.conf org.apache.streams.example.TwitterHistoryElasticsearch
+
+#### Specification:
+
+[TwitterHistoryElasticsearch.dot](TwitterHistoryElasticsearch.dot "TwitterHistoryElasticsearch.dot" )
+
+#### Diagram:
+
+![TwitterHistoryElasticsearch.dot.svg](./TwitterHistoryElasticsearch.dot.svg)
+
+###### Licensed under Apache License 2.0 - http://www.apache.org/licenses/LICENSE-2.0
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/bed4b0f0/local/twitter-history-elasticsearch/src/site/markdown/index.md
----------------------------------------------------------------------
diff --git a/local/twitter-history-elasticsearch/src/site/markdown/index.md b/local/twitter-history-elasticsearch/src/site/markdown/index.md
index 22eee99..e737a12 100644
--- a/local/twitter-history-elasticsearch/src/site/markdown/index.md
+++ b/local/twitter-history-elasticsearch/src/site/markdown/index.md
@@ -1,43 +1,18 @@
-twitter-history-elasticsearch
-==============================
+### twitter-history-elasticsearch
-Requirements:
--------------
+#### Requirements:
- Authorized Twitter API credentials
- - A running ElasticSearch 1.0.0+ instance
+ - A running ElasticSearch 2.0.0+ instance
-Description:
-------------
-Retrieves as many posts from a known list of users as twitter API allows.
+#### Streams:
-Converts them to activities, and writes them in activity format to Elasticsearch.
+<a href="TwitterHistoryElasticsearch.html" target="_self">TwitterHistoryElasticsearch</a>
-Specification:
------------------
-
-[TwitterHistoryElasticsearch.dot](TwitterHistoryElasticsearch.dot "TwitterHistoryElasticsearch.dot" )
-
-Diagram:
------------------
-
-![TwitterHistoryElasticsearch.dot.svg](./TwitterHistoryElasticsearch.dot.svg)
-
-Example Configuration:
-----------------------
-
-[application.conf](application.conf "application.conf" )
-
-In the Twitter section you should place all of your relevant authentication keys and whichever Twitter IDs you want to pull history for.
-
-Twitter IDs can be converted from screennames at http://www.gettwitterid.com
-
-Build:
----------
+#### Build:
mvn clean package
-Testing:
----------
+#### Test:
Create a local file `application.conf` with valid twitter credentials
@@ -52,7 +27,7 @@ Create a local file `application.conf` with valid twitter credentials
Start up elasticsearch with docker:
- mvn -PdockerITs docker:start
+ mvn -PdockerITs docker:start
Build with integration testing enabled, using your credentials
@@ -62,21 +37,6 @@ Shutdown elasticsearch when finished:
mvn -PdockerITs docker:stop
-Run (Local):
-------------
-
- java -cp dist/twitter-history-elasticsearch-jar-with-dependencies.jar -Dconfig.file=file://<location_of_config_file> org.apache.streams.example.twitter.TwitterHistoryElasticsearch
-
-Deploy (Docker):
-----------------
-
- mvn -Pdocker -Ddocker.repo=<your docker host>:<your docker repo> docker:build docker:push
-
-Run (Docker):
--------------
-
- docker run twitter-history-elasticsearch java -cp twitter-history-elasticsearch-jar-with-dependencies.jar -Dconfig.url=http://<location_of_config_file> org.apache.streams.example.twitter.TwitterHistoryElasticsearch
-
[JavaDocs](apidocs/index.html "JavaDocs")
###### Licensed under Apache License 2.0 - http://www.apache.org/licenses/LICENSE-2.0
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/bed4b0f0/local/twitter-userstream-elasticsearch/src/site/markdown/TwitterUserstreamElasticsearch.md
----------------------------------------------------------------------
diff --git a/local/twitter-userstream-elasticsearch/src/site/markdown/TwitterUserstreamElasticsearch.md b/local/twitter-userstream-elasticsearch/src/site/markdown/TwitterUserstreamElasticsearch.md
new file mode 100644
index 0000000..36f4244
--- /dev/null
+++ b/local/twitter-userstream-elasticsearch/src/site/markdown/TwitterUserstreamElasticsearch.md
@@ -0,0 +1,32 @@
+### TwitterUserstreamElasticsearch
+
+#### Description:
+
+This example connects to an active Twitter account and stores the userstream as activities in Elasticsearch.
+
+#### Configuration:
+
+[TwitterUserstreamElasticsearchIT.conf](TwitterUserstreamElasticsearchIT.conf "TwitterUserstreamElasticsearchIT.conf" )
+
+#### Run (SBT):
+
+ sbtx -210 -sbt-create
+ set resolvers += "Local Maven Repository" at "file://"+Path.userHome.absolutePath+"/.m2/repository"
+ set libraryDependencies += "org.apache.streams" % "twitter-userstream-elasticsearch" % "0.4-incubating-SNAPSHOT"
+ set fork := true
+ set javaOptions +="-Dconfig.file=application.conf"
+ run org.apache.streams.example.TwitterUserstreamElasticsearch
+
+#### Run (Docker):
+
+ docker run apachestreams/twitter-userstream-elasticsearch java -cp twitter-userstream-elasticsearch-jar-with-dependencies.jar -Dconfig.file=`pwd`/application.conf org.apache.streams.example.TwitterUserstreamElasticsearch
+
+#### Specification:
+
+[TwitterUserstreamElasticsearch.dot](TwitterUserstreamElasticsearch.dot "TwitterUserstreamElasticsearch.dot" )
+
+#### Diagram:
+
+![TwitterUserstreamElasticsearch.dot.svg](./TwitterUserstreamElasticsearch.dot.svg)
+
+###### Licensed under Apache License 2.0 - http://www.apache.org/licenses/LICENSE-2.0
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/bed4b0f0/local/twitter-userstream-elasticsearch/src/site/markdown/index.md
----------------------------------------------------------------------
diff --git a/local/twitter-userstream-elasticsearch/src/site/markdown/index.md b/local/twitter-userstream-elasticsearch/src/site/markdown/index.md
index f5379c9..833efde 100644
--- a/local/twitter-userstream-elasticsearch/src/site/markdown/index.md
+++ b/local/twitter-userstream-elasticsearch/src/site/markdown/index.md
@@ -1,42 +1,18 @@
-twitter-userstream-elasticsearch
-==============================
+### twitter-userstream-elasticsearch
-Requirements:
--------------
+#### Requirements:
- Authorized Twitter API credentials
- A running ElasticSearch 1.0.0+ instance
-Description:
-------------
-This example connects to an active twitter account and stores the userstream as activities in Elasticsearch
+#### Streams:
-Specification:
------------------
+<a href="TwitterUserstreamElasticsearch.html" target="_self">TwitterUserstreamElasticsearch</a>
-[TwitterUserstreamElasticsearch.dot](TwitterUserstreamElasticsearch.dot "TwitterUserstreamElasticsearch.dot" )
-
-Diagram:
------------------
-
-![TwitterUserstreamElasticsearch.dot.svg](./TwitterUserstreamElasticsearch.dot.svg)
-
-Example Configuration:
-----------------------
-
-[application.conf](application.conf "application.conf" )
-
-The consumerKey and consumerSecret are set for our streams-example application
-The accessToken and accessTokenSecret can be obtained by navigating to:
-
- https://api.twitter.com/oauth/authenticate?oauth_token=UIJ0AUxCJatpKDUyFt0OTSEP4asZgqxRwUCT0AMSwc&oauth_callback=http%3A%2F%2Foauth.streamstutorial.w2odata.com%3A8080%2Fsocialauthdemo%2FsocialAuthSuccessAction.do
-
-Build:
----------
+#### Build:
mvn clean package
-Testing:
----------
+#### Test:
Create a local file `application.conf` with valid twitter credentials
@@ -51,7 +27,7 @@ Create a local file `application.conf` with valid twitter credentials
Start up elasticsearch with docker:
- mvn -PdockerITs docker:start
+ mvn -PdockerITs docker:start
Build with integration testing enabled, using your credentials
@@ -61,21 +37,6 @@ Shutdown elasticsearch when finished:
mvn -PdockerITs docker:stop
-Run (Local):
-------------
-
- java -cp dist/twitter-userstream-elasticsearch-jar-with-dependencies.jar -Dconfig.file=file://<location_of_config_file> org.apache.streams.example.twitter.TwitterUserstreamElasticsearch
-
-Deploy (Docker):
-----------------
-
- mvn -Pdocker -Ddocker.repo=<your docker host>:<your docker repo> docker:build docker:push
-
-Run (Docker):
--------------
-
- docker run twitter-userstream-elasticsearch java -cp twitter-userstream-elasticsearch-jar-with-dependencies.jar -Dconfig.url=http://<location_of_config_file> org.apache.streams.example.twitter.TwitterUserstreamElasticsearch
-
[JavaDocs](apidocs/index.html "JavaDocs")
###### Licensed under Apache License 2.0 - http://www.apache.org/licenses/LICENSE-2.0
[6/9] incubator-streams-examples git commit: add left side-bars on
example pages with contextual help references
Posted by sb...@apache.org.
add left side-bars on example pages with contextual help references
Project: http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/commit/3c1fbdee
Tree: http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/tree/3c1fbdee
Diff: http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/diff/3c1fbdee
Branch: refs/heads/master
Commit: 3c1fbdee2af9da666b0f1744ce04f714bc14d2c3
Parents: bed4b0f
Author: Steve Blackmon @steveblackmon <sb...@apache.org>
Authored: Tue Oct 11 16:41:04 2016 -0500
Committer: Steve Blackmon @steveblackmon <sb...@apache.org>
Committed: Tue Oct 11 16:41:04 2016 -0500
----------------------------------------------------------------------
.../flink-twitter-collection/src/site/site.xml | 28 ++++++++++++
flink/src/site/markdown/flink.md | 11 +++++
flink/src/site/site.xml | 25 +++++++++++
local/elasticsearch-hdfs/src/site/site.xml | 25 +++++++++++
local/elasticsearch-reindex/src/site/site.xml | 25 +++++++++++
.../mongo-elasticsearch-sync/src/site/site.xml | 33 ++++++++++++++
local/src/site/site.xml | 27 ++++++++++++
local/twitter-follow-neo4j/src/site/site.xml | 23 ++--------
.../src/site/site.xml | 45 ++++++++++++++++++++
.../src/site/site.xml | 45 ++++++++++++++++++++
10 files changed, 267 insertions(+), 20 deletions(-)
----------------------------------------------------------------------
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/3c1fbdee/flink/flink-twitter-collection/src/site/site.xml
----------------------------------------------------------------------
diff --git a/flink/flink-twitter-collection/src/site/site.xml b/flink/flink-twitter-collection/src/site/site.xml
new file mode 100644
index 0000000..f801659
--- /dev/null
+++ b/flink/flink-twitter-collection/src/site/site.xml
@@ -0,0 +1,28 @@
+<?xml version="1.0" encoding="ISO-8859-1"?>
+<!--
+ ~ Licensed to the Apache Software Foundation (ASF) under one
+ ~ or more contributor license agreements. See the NOTICE file
+ ~ distributed with this work for additional information
+ ~ regarding copyright ownership. The ASF licenses this file
+ ~ to you under the Apache License, Version 2.0 (the
+ ~ "License"); you may not use this file except in compliance
+ ~ with the License. You may obtain a copy of the License at
+ ~ http://www.apache.org/licenses/LICENSE-2.0
+ ~
+ ~ Unless required by applicable law or agreed to in writing,
+ ~ software distributed under the License is distributed on an
+ ~ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ ~ KIND, either express or implied. See the License for the
+ ~ specific language governing permissions and limitations
+ ~ under the License.
+ -->
+<project>
+ <body>
+ <menu name="Credentials">
+ <item name="Twitter" href="../../credentials/twitter.html"/>
+ </menu>
+ <menu name="Runtime">
+ <item name="Flink" href="../flink.html"/>
+ </menu>
+ </body>
+</project>
\ No newline at end of file
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/3c1fbdee/flink/src/site/markdown/flink.md
----------------------------------------------------------------------
diff --git a/flink/src/site/markdown/flink.md b/flink/src/site/markdown/flink.md
new file mode 100644
index 0000000..ed96496
--- /dev/null
+++ b/flink/src/site/markdown/flink.md
@@ -0,0 +1,11 @@
+## Flink
+
+Create a local file `flink.conf`
+
+ local = true
+ test = true
+
+When configuring a stream, include this file:
+
+ include "flink.conf"
+
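The `include` directive above is HOCON composition: the settings from `flink.conf` are merged into whatever configuration file includes it. A minimal sketch of how a stream configuration might pull it in (the surrounding keys are illustrative placeholders, not the example's actual schema):

```
# Hypothetical stream configuration composing flink.conf
include "flink.conf"   # merges in local = true, test = true from above

# stream-specific settings would follow, e.g. provider credentials
twitter {
  oauth {
    consumerKey = "YOUR_CONSUMER_KEY"
    consumerSecret = "YOUR_CONSUMER_SECRET"
  }
}
```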
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/3c1fbdee/flink/src/site/site.xml
----------------------------------------------------------------------
diff --git a/flink/src/site/site.xml b/flink/src/site/site.xml
new file mode 100644
index 0000000..382b3f2
--- /dev/null
+++ b/flink/src/site/site.xml
@@ -0,0 +1,25 @@
+<?xml version="1.0" encoding="ISO-8859-1"?>
+<!--
+ ~ Licensed to the Apache Software Foundation (ASF) under one
+ ~ or more contributor license agreements. See the NOTICE file
+ ~ distributed with this work for additional information
+ ~ regarding copyright ownership. The ASF licenses this file
+ ~ to you under the Apache License, Version 2.0 (the
+ ~ "License"); you may not use this file except in compliance
+ ~ with the License. You may obtain a copy of the License at
+ ~ http://www.apache.org/licenses/LICENSE-2.0
+ ~
+ ~ Unless required by applicable law or agreed to in writing,
+ ~ software distributed under the License is distributed on an
+ ~ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ ~ KIND, either express or implied. See the License for the
+ ~ specific language governing permissions and limitations
+ ~ under the License.
+ -->
+<project>
+ <body>
+ <menu name="Runtime">
+ <item name="Flink" href="../flink.html"/>
+ </menu>
+ </body>
+</project>
\ No newline at end of file
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/3c1fbdee/local/elasticsearch-hdfs/src/site/site.xml
----------------------------------------------------------------------
diff --git a/local/elasticsearch-hdfs/src/site/site.xml b/local/elasticsearch-hdfs/src/site/site.xml
new file mode 100644
index 0000000..2520343
--- /dev/null
+++ b/local/elasticsearch-hdfs/src/site/site.xml
@@ -0,0 +1,25 @@
+<?xml version="1.0" encoding="ISO-8859-1"?>
+<!--
+ ~ Licensed to the Apache Software Foundation (ASF) under one
+ ~ or more contributor license agreements. See the NOTICE file
+ ~ distributed with this work for additional information
+ ~ regarding copyright ownership. The ASF licenses this file
+ ~ to you under the Apache License, Version 2.0 (the
+ ~ "License"); you may not use this file except in compliance
+ ~ with the License. You may obtain a copy of the License at
+ ~ http://www.apache.org/licenses/LICENSE-2.0
+ ~
+ ~ Unless required by applicable law or agreed to in writing,
+ ~ software distributed under the License is distributed on an
+ ~ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ ~ KIND, either express or implied. See the License for the
+ ~ specific language governing permissions and limitations
+ ~ under the License.
+ -->
+<project>
+ <body>
+ <menu name="Services">
+ <item name="Elasticsearch" href="../../services/elasticsearch.html"/>
+ </menu>
+ </body>
+</project>
\ No newline at end of file
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/3c1fbdee/local/elasticsearch-reindex/src/site/site.xml
----------------------------------------------------------------------
diff --git a/local/elasticsearch-reindex/src/site/site.xml b/local/elasticsearch-reindex/src/site/site.xml
new file mode 100644
index 0000000..2520343
--- /dev/null
+++ b/local/elasticsearch-reindex/src/site/site.xml
@@ -0,0 +1,25 @@
+<?xml version="1.0" encoding="ISO-8859-1"?>
+<!--
+ ~ Licensed to the Apache Software Foundation (ASF) under one
+ ~ or more contributor license agreements. See the NOTICE file
+ ~ distributed with this work for additional information
+ ~ regarding copyright ownership. The ASF licenses this file
+ ~ to you under the Apache License, Version 2.0 (the
+ ~ "License"); you may not use this file except in compliance
+ ~ with the License. You may obtain a copy of the License at
+ ~ http://www.apache.org/licenses/LICENSE-2.0
+ ~
+ ~ Unless required by applicable law or agreed to in writing,
+ ~ software distributed under the License is distributed on an
+ ~ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ ~ KIND, either express or implied. See the License for the
+ ~ specific language governing permissions and limitations
+ ~ under the License.
+ -->
+<project>
+ <body>
+ <menu name="Services">
+ <item name="Elasticsearch" href="../../services/elasticsearch.html"/>
+ </menu>
+ </body>
+</project>
\ No newline at end of file
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/3c1fbdee/local/mongo-elasticsearch-sync/src/site/site.xml
----------------------------------------------------------------------
diff --git a/local/mongo-elasticsearch-sync/src/site/site.xml b/local/mongo-elasticsearch-sync/src/site/site.xml
new file mode 100644
index 0000000..92baae0
--- /dev/null
+++ b/local/mongo-elasticsearch-sync/src/site/site.xml
@@ -0,0 +1,33 @@
+<?xml version="1.0" encoding="ISO-8859-1"?>
+<!--
+ ~ Licensed to the Apache Software Foundation (ASF) under one
+ ~ or more contributor license agreements. See the NOTICE file
+ ~ distributed with this work for additional information
+ ~ regarding copyright ownership. The ASF licenses this file
+ ~ to you under the Apache License, Version 2.0 (the
+ ~ "License"); you may not use this file except in compliance
+ ~ with the License. You may obtain a copy of the License at
+ ~ http://www.apache.org/licenses/LICENSE-2.0
+ ~
+ ~ Unless required by applicable law or agreed to in writing,
+ ~ software distributed under the License is distributed on an
+ ~ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ ~ KIND, either express or implied. See the License for the
+ ~ specific language governing permissions and limitations
+ ~ under the License.
+ -->
+<project>
+ <custom>
+ <fluidoSkin>
+ <topBarEnabled>false</topBarEnabled>
+ <navBarStyle>navbar-inverse</navBarStyle>
+ <sideBarEnabled>true</sideBarEnabled>
+ </fluidoSkin>
+ </custom>
+ <body>
+ <menu name="Services">
+ <item name="Elasticsearch" href="../../services/elasticsearch.html"/>
+ <item name="Mongo" href="../../services/mongo.html"/>
+ </menu>
+ </body>
+</project>
\ No newline at end of file
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/3c1fbdee/local/src/site/site.xml
----------------------------------------------------------------------
diff --git a/local/src/site/site.xml b/local/src/site/site.xml
new file mode 100644
index 0000000..f9fbfac
--- /dev/null
+++ b/local/src/site/site.xml
@@ -0,0 +1,27 @@
+<?xml version="1.0" encoding="ISO-8859-1"?>
+<!--
+ ~ Licensed to the Apache Software Foundation (ASF) under one
+ ~ or more contributor license agreements. See the NOTICE file
+ ~ distributed with this work for additional information
+ ~ regarding copyright ownership. The ASF licenses this file
+ ~ to you under the Apache License, Version 2.0 (the
+ ~ "License"); you may not use this file except in compliance
+ ~ with the License. You may obtain a copy of the License at
+ ~ http://www.apache.org/licenses/LICENSE-2.0
+ ~
+ ~ Unless required by applicable law or agreed to in writing,
+ ~ software distributed under the License is distributed on an
+ ~ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ ~ KIND, either express or implied. See the License for the
+ ~ specific language governing permissions and limitations
+ ~ under the License.
+ -->
+<project>
+ <custom>
+ <fluidoSkin>
+ <topBarEnabled>false</topBarEnabled>
+ <navBarStyle>navbar-inverse</navBarStyle>
+ <sideBarEnabled>true</sideBarEnabled>
+ </fluidoSkin>
+ </custom>
+</project>
\ No newline at end of file
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/3c1fbdee/local/twitter-follow-neo4j/src/site/site.xml
----------------------------------------------------------------------
diff --git a/local/twitter-follow-neo4j/src/site/site.xml b/local/twitter-follow-neo4j/src/site/site.xml
index a25bae0..1d1471a 100644
--- a/local/twitter-follow-neo4j/src/site/site.xml
+++ b/local/twitter-follow-neo4j/src/site/site.xml
@@ -17,29 +17,12 @@
~ under the License.
-->
<project>
- <custom>
- <fluidoSkin>
- <topBarEnabled>false</topBarEnabled>
- <navBarStyle>navbar-inverse</navBarStyle>
- <sideBarEnabled>true</sideBarEnabled>
- <!--<gitHub>-->
- <!--<projectId>apache/incubator-streams-examples</projectId>-->
- <!--<ribbonOrientation>right</ribbonOrientation>-->
- <!--<ribbonColor>black</ribbonColor>-->
- <!--</gitHub>-->
- <!--<twitter>-->
- <!--<user>ApacheStreams</user>-->
- <!--<showUser>true</showUser>-->
- <!--<showFollowers>true</showFollowers>-->
- <!--</twitter>-->
- </fluidoSkin>
- </custom>
<body>
- <menu name="Configuration">
- <item name="Neo4j" href="../../services/neo4j.html"/>
- </menu>
<menu name="Credentials">
<item name="Twitter" href="../../credentials/twitter.html"/>
</menu>
+ <menu name="Services">
+ <item name="Neo4j" href="../../services/neo4j.html"/>
+ </menu>
</body>
</project>
\ No newline at end of file
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/3c1fbdee/local/twitter-history-elasticsearch/src/site/site.xml
----------------------------------------------------------------------
diff --git a/local/twitter-history-elasticsearch/src/site/site.xml b/local/twitter-history-elasticsearch/src/site/site.xml
new file mode 100644
index 0000000..0fafb0e
--- /dev/null
+++ b/local/twitter-history-elasticsearch/src/site/site.xml
@@ -0,0 +1,45 @@
+<?xml version="1.0" encoding="ISO-8859-1"?>
+<!--
+ ~ Licensed to the Apache Software Foundation (ASF) under one
+ ~ or more contributor license agreements. See the NOTICE file
+ ~ distributed with this work for additional information
+ ~ regarding copyright ownership. The ASF licenses this file
+ ~ to you under the Apache License, Version 2.0 (the
+ ~ "License"); you may not use this file except in compliance
+ ~ with the License. You may obtain a copy of the License at
+ ~ http://www.apache.org/licenses/LICENSE-2.0
+ ~
+ ~ Unless required by applicable law or agreed to in writing,
+ ~ software distributed under the License is distributed on an
+ ~ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ ~ KIND, either express or implied. See the License for the
+ ~ specific language governing permissions and limitations
+ ~ under the License.
+ -->
+<project>
+ <custom>
+ <fluidoSkin>
+ <topBarEnabled>false</topBarEnabled>
+ <navBarStyle>navbar-inverse</navBarStyle>
+ <sideBarEnabled>true</sideBarEnabled>
+ <!--<gitHub>-->
+ <!--<projectId>apache/incubator-streams-examples</projectId>-->
+ <!--<ribbonOrientation>right</ribbonOrientation>-->
+ <!--<ribbonColor>black</ribbonColor>-->
+ <!--</gitHub>-->
+ <!--<twitter>-->
+ <!--<user>ApacheStreams</user>-->
+ <!--<showUser>true</showUser>-->
+ <!--<showFollowers>true</showFollowers>-->
+ <!--</twitter>-->
+ </fluidoSkin>
+ </custom>
+ <body>
+ <menu name="Credentials">
+ <item name="Twitter" href="../../credentials/twitter.html"/>
+ </menu>
+ <menu name="Services">
+ <item name="Elasticsearch" href="../../services/elasticsearch.html"/>
+ </menu>
+ </body>
+</project>
\ No newline at end of file
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/3c1fbdee/local/twitter-userstream-elasticsearch/src/site/site.xml
----------------------------------------------------------------------
diff --git a/local/twitter-userstream-elasticsearch/src/site/site.xml b/local/twitter-userstream-elasticsearch/src/site/site.xml
new file mode 100644
index 0000000..0fafb0e
--- /dev/null
+++ b/local/twitter-userstream-elasticsearch/src/site/site.xml
@@ -0,0 +1,45 @@
+<?xml version="1.0" encoding="ISO-8859-1"?>
+<!--
+ ~ Licensed to the Apache Software Foundation (ASF) under one
+ ~ or more contributor license agreements. See the NOTICE file
+ ~ distributed with this work for additional information
+ ~ regarding copyright ownership. The ASF licenses this file
+ ~ to you under the Apache License, Version 2.0 (the
+ ~ "License"); you may not use this file except in compliance
+ ~ with the License. You may obtain a copy of the License at
+ ~ http://www.apache.org/licenses/LICENSE-2.0
+ ~
+ ~ Unless required by applicable law or agreed to in writing,
+ ~ software distributed under the License is distributed on an
+ ~ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ ~ KIND, either express or implied. See the License for the
+ ~ specific language governing permissions and limitations
+ ~ under the License.
+ -->
+<project>
+ <custom>
+ <fluidoSkin>
+ <topBarEnabled>false</topBarEnabled>
+ <navBarStyle>navbar-inverse</navBarStyle>
+ <sideBarEnabled>true</sideBarEnabled>
+ <!--<gitHub>-->
+ <!--<projectId>apache/incubator-streams-examples</projectId>-->
+ <!--<ribbonOrientation>right</ribbonOrientation>-->
+ <!--<ribbonColor>black</ribbonColor>-->
+ <!--</gitHub>-->
+ <!--<twitter>-->
+ <!--<user>ApacheStreams</user>-->
+ <!--<showUser>true</showUser>-->
+ <!--<showFollowers>true</showFollowers>-->
+ <!--</twitter>-->
+ </fluidoSkin>
+ </custom>
+ <body>
+ <menu name="Credentials">
+ <item name="Twitter" href="../../credentials/twitter.html"/>
+ </menu>
+ <menu name="Services">
+ <item name="Elasticsearch" href="../../services/elasticsearch.html"/>
+ </menu>
+ </body>
+</project>
\ No newline at end of file
[3/9] incubator-streams-examples git commit: normalize package names
in streams-examples/local
Posted by sb...@apache.org.
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/elasticsearch-reindex/src/test/java/org/apache/streams/example/test/ElasticsearchReindexIT.java
----------------------------------------------------------------------
diff --git a/local/elasticsearch-reindex/src/test/java/org/apache/streams/example/test/ElasticsearchReindexIT.java b/local/elasticsearch-reindex/src/test/java/org/apache/streams/example/test/ElasticsearchReindexIT.java
new file mode 100644
index 0000000..3fee0d7
--- /dev/null
+++ b/local/elasticsearch-reindex/src/test/java/org/apache/streams/example/test/ElasticsearchReindexIT.java
@@ -0,0 +1,120 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+package org.apache.streams.example.test;
+
+import com.fasterxml.jackson.databind.ObjectMapper;
+import com.typesafe.config.Config;
+import com.typesafe.config.ConfigFactory;
+import com.typesafe.config.ConfigParseOptions;
+import org.apache.streams.config.ComponentConfigurator;
+import org.apache.streams.config.StreamsConfiguration;
+import org.apache.streams.config.StreamsConfigurator;
+import org.apache.streams.elasticsearch.ElasticsearchClientManager;
+import org.apache.streams.example.ElasticsearchReindex;
+import org.apache.streams.example.ElasticsearchReindexConfiguration;
+import org.apache.streams.jackson.StreamsJacksonMapper;
+import org.elasticsearch.action.admin.cluster.health.ClusterHealthRequest;
+import org.elasticsearch.action.admin.cluster.health.ClusterHealthResponse;
+import org.elasticsearch.action.admin.indices.exists.indices.IndicesExistsRequest;
+import org.elasticsearch.action.admin.indices.exists.indices.IndicesExistsResponse;
+import org.elasticsearch.action.search.SearchRequestBuilder;
+import org.elasticsearch.action.search.SearchResponse;
+import org.elasticsearch.client.Client;
+import org.elasticsearch.client.Requests;
+import org.elasticsearch.cluster.health.ClusterHealthStatus;
+import org.junit.Before;
+import org.junit.Test;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.File;
+import java.io.FileInputStream;
+import java.io.InputStream;
+import java.util.Properties;
+
+import static junit.framework.TestCase.assertTrue;
+import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertNotEquals;
+
+/**
+ * Test copying documents between two indexes on same cluster
+ */
+public class ElasticsearchReindexIT {
+
+ private final static Logger LOGGER = LoggerFactory.getLogger(ElasticsearchReindexIT.class);
+
+ ObjectMapper MAPPER = StreamsJacksonMapper.getInstance();
+
+ protected ElasticsearchReindexConfiguration testConfiguration;
+ protected Client testClient;
+
+ private int count = 0;
+
+ @Before
+ public void prepareTest() throws Exception {
+
+ Config reference = ConfigFactory.load();
+ File conf_file = new File("target/test-classes/ElasticsearchReindexIT.conf");
+ assert(conf_file.exists());
+ Config testResourceConfig = ConfigFactory.parseFileAnySyntax(conf_file, ConfigParseOptions.defaults().setAllowMissing(false));
+ Properties es_properties = new Properties();
+ InputStream es_stream = new FileInputStream("elasticsearch.properties");
+ es_properties.load(es_stream);
+ Config esProps = ConfigFactory.parseProperties(es_properties);
+ Config typesafe = testResourceConfig.withFallback(esProps).withFallback(reference).resolve();
+ StreamsConfiguration streams = StreamsConfigurator.detectConfiguration(typesafe);
+ testConfiguration = new ComponentConfigurator<>(ElasticsearchReindexConfiguration.class).detectConfiguration(typesafe);
+ testClient = new ElasticsearchClientManager(testConfiguration.getSource()).getClient();
+
+ ClusterHealthRequest clusterHealthRequest = Requests.clusterHealthRequest();
+ ClusterHealthResponse clusterHealthResponse = testClient.admin().cluster().health(clusterHealthRequest).actionGet();
+ assertNotEquals(clusterHealthResponse.getStatus(), ClusterHealthStatus.RED);
+
+ IndicesExistsRequest indicesExistsRequest = Requests.indicesExistsRequest(testConfiguration.getSource().getIndexes().get(0));
+ IndicesExistsResponse indicesExistsResponse = testClient.admin().indices().exists(indicesExistsRequest).actionGet();
+ assertTrue(indicesExistsResponse.isExists());
+
+ SearchRequestBuilder countRequest = testClient
+ .prepareSearch(testConfiguration.getSource().getIndexes().get(0))
+ .setTypes(testConfiguration.getSource().getTypes().get(0));
+ SearchResponse countResponse = countRequest.execute().actionGet();
+
+ count = (int)countResponse.getHits().getTotalHits();
+
+ assertNotEquals(count, 0);
+
+ }
+
+ @Test
+ public void testReindex() throws Exception {
+
+ ElasticsearchReindex reindex = new ElasticsearchReindex(testConfiguration);
+
+ reindex.run();
+
+ // assert lines in file
+ SearchRequestBuilder countRequest = testClient
+ .prepareSearch(testConfiguration.getDestination().getIndex())
+ .setTypes(testConfiguration.getDestination().getType());
+ SearchResponse countResponse = countRequest.execute().actionGet();
+
+ assertEquals(count, (int)countResponse.getHits().getTotalHits());
+
+ }
+}
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/elasticsearch-reindex/src/test/java/org/apache/streams/example/test/ElasticsearchReindexParentIT.java
----------------------------------------------------------------------
diff --git a/local/elasticsearch-reindex/src/test/java/org/apache/streams/example/test/ElasticsearchReindexParentIT.java b/local/elasticsearch-reindex/src/test/java/org/apache/streams/example/test/ElasticsearchReindexParentIT.java
new file mode 100644
index 0000000..fc80453
--- /dev/null
+++ b/local/elasticsearch-reindex/src/test/java/org/apache/streams/example/test/ElasticsearchReindexParentIT.java
@@ -0,0 +1,133 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+package org.apache.streams.example.test;
+
+import com.fasterxml.jackson.databind.ObjectMapper;
+import com.fasterxml.jackson.databind.node.ObjectNode;
+import com.typesafe.config.Config;
+import com.typesafe.config.ConfigFactory;
+import com.typesafe.config.ConfigParseOptions;
+import org.apache.streams.config.ComponentConfigurator;
+import org.apache.streams.config.StreamsConfiguration;
+import org.apache.streams.config.StreamsConfigurator;
+import org.apache.streams.elasticsearch.ElasticsearchClientManager;
+import org.apache.streams.example.ElasticsearchReindex;
+import org.apache.streams.example.ElasticsearchReindexConfiguration;
+import org.apache.streams.elasticsearch.test.ElasticsearchParentChildWriterIT;
+import org.apache.streams.jackson.StreamsJacksonMapper;
+import org.elasticsearch.action.admin.cluster.health.ClusterHealthRequest;
+import org.elasticsearch.action.admin.cluster.health.ClusterHealthResponse;
+import org.elasticsearch.action.admin.indices.exists.indices.IndicesExistsRequest;
+import org.elasticsearch.action.admin.indices.exists.indices.IndicesExistsResponse;
+import org.elasticsearch.action.admin.indices.template.put.PutIndexTemplateRequestBuilder;
+import org.elasticsearch.action.search.SearchRequestBuilder;
+import org.elasticsearch.action.search.SearchResponse;
+import org.elasticsearch.client.Client;
+import org.elasticsearch.client.Requests;
+import org.elasticsearch.cluster.health.ClusterHealthStatus;
+import org.junit.Before;
+import org.junit.Test;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.File;
+import java.io.FileInputStream;
+import java.io.InputStream;
+import java.net.URL;
+import java.util.Properties;
+
+import static junit.framework.TestCase.assertTrue;
+import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertNotEquals;
+
+/**
+ * Test copying parent/child associated documents between two indexes on same cluster
+ */
+public class ElasticsearchReindexParentIT {
+
+ private final static Logger LOGGER = LoggerFactory.getLogger(ElasticsearchReindexParentIT.class);
+
+ ObjectMapper MAPPER = StreamsJacksonMapper.getInstance();
+
+ protected ElasticsearchReindexConfiguration testConfiguration;
+ protected Client testClient;
+
+ private int count = 0;
+
+ @Before
+ public void prepareTest() throws Exception {
+
+ Config reference = ConfigFactory.load();
+ File conf_file = new File("target/test-classes/ElasticsearchReindexParentIT.conf");
+ assert(conf_file.exists());
+ Config testResourceConfig = ConfigFactory.parseFileAnySyntax(conf_file, ConfigParseOptions.defaults().setAllowMissing(false));
+ Properties es_properties = new Properties();
+ InputStream es_stream = new FileInputStream("elasticsearch.properties");
+ es_properties.load(es_stream);
+ Config esProps = ConfigFactory.parseProperties(es_properties);
+ Config typesafe = testResourceConfig.withFallback(esProps).withFallback(reference).resolve();
+ StreamsConfiguration streams = StreamsConfigurator.detectConfiguration(typesafe);
+ testConfiguration = new ComponentConfigurator<>(ElasticsearchReindexConfiguration.class).detectConfiguration(typesafe);
+ testClient = new ElasticsearchClientManager(testConfiguration.getSource()).getClient();
+
+ ClusterHealthRequest clusterHealthRequest = Requests.clusterHealthRequest();
+ ClusterHealthResponse clusterHealthResponse = testClient.admin().cluster().health(clusterHealthRequest).actionGet();
+ assertNotEquals(clusterHealthResponse.getStatus(), ClusterHealthStatus.RED);
+
+ IndicesExistsRequest indicesExistsRequest = Requests.indicesExistsRequest(testConfiguration.getSource().getIndexes().get(0));
+ IndicesExistsResponse indicesExistsResponse = testClient.admin().indices().exists(indicesExistsRequest).actionGet();
+ assertTrue(indicesExistsResponse.isExists());
+
+ SearchRequestBuilder countRequest = testClient
+ .prepareSearch(testConfiguration.getSource().getIndexes().get(0))
+ .setTypes(testConfiguration.getSource().getTypes().get(0));
+ SearchResponse countResponse = countRequest.execute().actionGet();
+
+ count = (int)countResponse.getHits().getTotalHits();
+
+ PutIndexTemplateRequestBuilder putTemplateRequestBuilder = testClient.admin().indices().preparePutTemplate("mappings");
+ URL templateURL = ElasticsearchParentChildWriterIT.class.getResource("/ActivityChildObjectParent.json");
+ ObjectNode template = MAPPER.readValue(templateURL, ObjectNode.class);
+ String templateSource = MAPPER.writeValueAsString(template);
+ putTemplateRequestBuilder.setSource(templateSource);
+
+ testClient.admin().indices().putTemplate(putTemplateRequestBuilder.request()).actionGet();
+
+ assertNotEquals(count, 0);
+
+ }
+
+ @Test
+ public void testReindex() throws Exception {
+
+ ElasticsearchReindex reindex = new ElasticsearchReindex(testConfiguration);
+
+ reindex.run();
+
+ // assert that the destination index contains the expected number of documents
+ SearchRequestBuilder countRequest = testClient
+ .prepareSearch(testConfiguration.getDestination().getIndex())
+ .setTypes(testConfiguration.getDestination().getType());
+ SearchResponse countResponse = countRequest.execute().actionGet();
+
+ assertEquals(count, (int)countResponse.getHits().getTotalHits());
+
+ }
+
+}
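The prepareTest() method above layers several Typesafe Config sources with withFallback, so that the test resource file overrides elasticsearch.properties, which in turn overrides the classpath reference config. A minimal standalone sketch of that precedence follows; the class name and keys here are illustrative, not part of this commit:

```java
import com.typesafe.config.Config;
import com.typesafe.config.ConfigFactory;

import java.util.Properties;

public class ConfigLayeringSketch {
    public static void main(String[] args) {
        // Highest priority: settings parsed from a test resource
        Config testResource = ConfigFactory.parseString("source.index = \"activity\"");

        // Middle priority: settings loaded from a .properties file
        Properties props = new Properties();
        props.setProperty("source.port", "9300");
        Config fileProps = ConfigFactory.parseProperties(props);

        // Lowest priority: reference.conf defaults from the classpath
        Config reference = ConfigFactory.load();

        // withFallback merges left-to-right: earlier configs win on key conflicts
        Config merged = testResource.withFallback(fileProps).withFallback(reference).resolve();

        System.out.println(merged.getString("source.index")); // activity
        System.out.println(merged.getInt("source.port"));     // 9300
    }
}
```

Note that resolve() must be called after the final withFallback so that any substitutions can see values from every layer.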
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/elasticsearch-reindex/src/test/java/org/apache/streams/example/test/ReindexITs.java
----------------------------------------------------------------------
diff --git a/local/elasticsearch-reindex/src/test/java/org/apache/streams/example/test/ReindexITs.java b/local/elasticsearch-reindex/src/test/java/org/apache/streams/example/test/ReindexITs.java
new file mode 100644
index 0000000..ee79224
--- /dev/null
+++ b/local/elasticsearch-reindex/src/test/java/org/apache/streams/example/test/ReindexITs.java
@@ -0,0 +1,20 @@
+package org.apache.streams.example.test;
+
+import org.apache.streams.elasticsearch.test.ElasticsearchParentChildWriterIT;
+import org.apache.streams.elasticsearch.test.ElasticsearchPersistWriterIT;
+import org.junit.runner.RunWith;
+import org.junit.runners.Suite;
+
+@RunWith(Suite.class)
+@Suite.SuiteClasses({
+ ElasticsearchPersistWriterIT.class,
+ ElasticsearchParentChildWriterIT.class,
+ ElasticsearchReindexIT.class,
+ ElasticsearchReindexParentIT.class,
+ ElasticsearchReindexChildIT.class
+})
+
+public class ReindexITs {
+ // the class remains empty,
+ // used only as a holder for the above annotations
+}
\ No newline at end of file
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/mongo-elasticsearch-sync/src/main/java/org/apache/streams/elasticsearch/example/MongoElasticsearchSync.java
----------------------------------------------------------------------
diff --git a/local/mongo-elasticsearch-sync/src/main/java/org/apache/streams/elasticsearch/example/MongoElasticsearchSync.java b/local/mongo-elasticsearch-sync/src/main/java/org/apache/streams/elasticsearch/example/MongoElasticsearchSync.java
deleted file mode 100644
index f77ecce..0000000
--- a/local/mongo-elasticsearch-sync/src/main/java/org/apache/streams/elasticsearch/example/MongoElasticsearchSync.java
+++ /dev/null
@@ -1,79 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements. See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership. The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied. See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.streams.elasticsearch.example;
-
-import com.google.common.collect.Maps;
-import org.apache.streams.config.ComponentConfigurator;
-import org.apache.streams.config.StreamsConfigurator;
-import org.apache.streams.elasticsearch.*;
-import org.apache.streams.core.StreamBuilder;
-import org.apache.streams.example.elasticsearch.MongoElasticsearchSyncConfiguration;
-import org.apache.streams.local.builders.LocalStreamBuilder;
-import org.apache.streams.mongo.MongoPersistReader;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
-import java.util.Map;
-
-/**
- * Copies documents into a new index
- */
-public class MongoElasticsearchSync implements Runnable {
-
- public final static String STREAMS_ID = "MongoElasticsearchSync";
-
- private final static Logger LOGGER = LoggerFactory.getLogger(MongoElasticsearchSync.class);
-
- MongoElasticsearchSyncConfiguration config;
-
- public MongoElasticsearchSync() {
- this(new ComponentConfigurator<MongoElasticsearchSyncConfiguration>(MongoElasticsearchSyncConfiguration.class).detectConfiguration(StreamsConfigurator.getConfig()));
- }
-
- public MongoElasticsearchSync(MongoElasticsearchSyncConfiguration config) {
- this.config = config;
- }
-
- public static void main(String[] args)
- {
- LOGGER.info(StreamsConfigurator.config.toString());
-
- MongoElasticsearchSync sync = new MongoElasticsearchSync();
-
- new Thread(sync).start();
-
- }
-
- @Override
- public void run() {
-
- MongoPersistReader mongoPersistReader = new MongoPersistReader(config.getSource());
-
- ElasticsearchPersistWriter elasticsearchPersistWriter = new ElasticsearchPersistWriter(config.getDestination());
-
- Map<String, Object> streamConfig = Maps.newHashMap();
- streamConfig.put(LocalStreamBuilder.STREAM_IDENTIFIER_KEY, STREAMS_ID);
- streamConfig.put(LocalStreamBuilder.TIMEOUT_KEY, 7 * 24 * 60 * 1000);
- StreamBuilder builder = new LocalStreamBuilder(1000, streamConfig);
-
- builder.newPerpetualStream(MongoPersistReader.STREAMS_ID, mongoPersistReader);
- builder.addStreamsPersistWriter(ElasticsearchPersistWriter.STREAMS_ID, elasticsearchPersistWriter, 1, MongoPersistReader.STREAMS_ID);
- builder.start();
- }
-}
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/mongo-elasticsearch-sync/src/main/java/org/apache/streams/example/MongoElasticsearchSync.java
----------------------------------------------------------------------
diff --git a/local/mongo-elasticsearch-sync/src/main/java/org/apache/streams/example/MongoElasticsearchSync.java b/local/mongo-elasticsearch-sync/src/main/java/org/apache/streams/example/MongoElasticsearchSync.java
new file mode 100644
index 0000000..e89318c
--- /dev/null
+++ b/local/mongo-elasticsearch-sync/src/main/java/org/apache/streams/example/MongoElasticsearchSync.java
@@ -0,0 +1,79 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+package org.apache.streams.example;
+
+import com.google.common.collect.Maps;
+import org.apache.streams.config.ComponentConfigurator;
+import org.apache.streams.config.StreamsConfigurator;
+import org.apache.streams.elasticsearch.*;
+import org.apache.streams.core.StreamBuilder;
+import org.apache.streams.example.MongoElasticsearchSyncConfiguration;
+import org.apache.streams.local.builders.LocalStreamBuilder;
+import org.apache.streams.mongo.MongoPersistReader;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.util.Map;
+
+/**
+ * Copies documents from a MongoDB collection into an Elasticsearch index
+ */
+public class MongoElasticsearchSync implements Runnable {
+
+ public final static String STREAMS_ID = "MongoElasticsearchSync";
+
+ private final static Logger LOGGER = LoggerFactory.getLogger(MongoElasticsearchSync.class);
+
+ MongoElasticsearchSyncConfiguration config;
+
+ public MongoElasticsearchSync() {
+ this(new ComponentConfigurator<MongoElasticsearchSyncConfiguration>(MongoElasticsearchSyncConfiguration.class).detectConfiguration(StreamsConfigurator.getConfig()));
+ }
+
+ public MongoElasticsearchSync(MongoElasticsearchSyncConfiguration config) {
+ this.config = config;
+ }
+
+ public static void main(String[] args)
+ {
+ LOGGER.info(StreamsConfigurator.config.toString());
+
+ MongoElasticsearchSync sync = new MongoElasticsearchSync();
+
+ new Thread(sync).start();
+
+ }
+
+ @Override
+ public void run() {
+
+ MongoPersistReader mongoPersistReader = new MongoPersistReader(config.getSource());
+
+ ElasticsearchPersistWriter elasticsearchPersistWriter = new ElasticsearchPersistWriter(config.getDestination());
+
+ Map<String, Object> streamConfig = Maps.newHashMap();
+ streamConfig.put(LocalStreamBuilder.STREAM_IDENTIFIER_KEY, STREAMS_ID);
+ streamConfig.put(LocalStreamBuilder.TIMEOUT_KEY, 7 * 24 * 60 * 1000);
+ StreamBuilder builder = new LocalStreamBuilder(1000, streamConfig);
+
+ builder.newPerpetualStream(MongoPersistReader.STREAMS_ID, mongoPersistReader);
+ builder.addStreamsPersistWriter(ElasticsearchPersistWriter.STREAMS_ID, elasticsearchPersistWriter, 1, MongoPersistReader.STREAMS_ID);
+ builder.start();
+ }
+}
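The run() method above wires a MongoPersistReader into an ElasticsearchPersistWriter, so MongoElasticsearchSyncConfiguration pairs a Mongo source with an Elasticsearch destination. A configuration document of roughly this shape satisfies it; the values mirror the testSync.json resource that this commit removes and are illustrative only:

```json
{
  "source": {
    "host": "localhost",
    "port": 37017,
    "db": "local",
    "collection": "activities"
  },
  "destination": {
    "hosts": [ "localhost" ],
    "port": 9300,
    "clusterName": "elasticsearch",
    "index": "destination",
    "type": "activity",
    "forceUseConfig": true
  }
}
```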
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/mongo-elasticsearch-sync/src/main/jsonschema/MongoElasticsearchSyncConfiguration.json
----------------------------------------------------------------------
diff --git a/local/mongo-elasticsearch-sync/src/main/jsonschema/MongoElasticsearchSyncConfiguration.json b/local/mongo-elasticsearch-sync/src/main/jsonschema/MongoElasticsearchSyncConfiguration.json
index 8f9fed2..0065468 100644
--- a/local/mongo-elasticsearch-sync/src/main/jsonschema/MongoElasticsearchSyncConfiguration.json
+++ b/local/mongo-elasticsearch-sync/src/main/jsonschema/MongoElasticsearchSyncConfiguration.json
@@ -4,7 +4,7 @@
"http://www.apache.org/licenses/LICENSE-2.0"
],
"type": "object",
- "javaType" : "org.apache.streams.example.elasticsearch.MongoElasticsearchSyncConfiguration",
+ "javaType" : "org.apache.streams.example.MongoElasticsearchSyncConfiguration",
"javaInterfaces": ["java.io.Serializable"],
"properties": {
"source": { "javaType": "org.apache.streams.mongo.MongoConfiguration", "type": "object", "required": true },
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/mongo-elasticsearch-sync/src/test/java/org/apache/streams/example/mongodb/test/MongoElasticsearchSyncIT.java
----------------------------------------------------------------------
diff --git a/local/mongo-elasticsearch-sync/src/test/java/org/apache/streams/example/mongodb/test/MongoElasticsearchSyncIT.java b/local/mongo-elasticsearch-sync/src/test/java/org/apache/streams/example/mongodb/test/MongoElasticsearchSyncIT.java
deleted file mode 100644
index 5ebc204..0000000
--- a/local/mongo-elasticsearch-sync/src/test/java/org/apache/streams/example/mongodb/test/MongoElasticsearchSyncIT.java
+++ /dev/null
@@ -1,121 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements. See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership. The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied. See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.streams.example.mongodb.test;
-
-import com.fasterxml.jackson.databind.ObjectMapper;
-import com.typesafe.config.Config;
-import com.typesafe.config.ConfigFactory;
-import com.typesafe.config.ConfigParseOptions;
-import org.apache.commons.io.Charsets;
-import org.apache.commons.io.IOUtils;
-import org.apache.streams.config.ComponentConfigurator;
-import org.apache.streams.config.StreamsConfiguration;
-import org.apache.streams.config.StreamsConfigurator;
-import org.apache.streams.core.StreamsDatum;
-import org.apache.streams.elasticsearch.ElasticsearchClientManager;
-import org.apache.streams.elasticsearch.example.MongoElasticsearchSync;
-import org.apache.streams.example.elasticsearch.MongoElasticsearchSyncConfiguration;
-import org.apache.streams.jackson.StreamsJacksonMapper;
-import org.elasticsearch.action.admin.cluster.health.ClusterHealthRequest;
-import org.elasticsearch.action.admin.cluster.health.ClusterHealthResponse;
-import org.elasticsearch.action.admin.indices.exists.indices.IndicesExistsRequest;
-import org.elasticsearch.action.admin.indices.exists.indices.IndicesExistsResponse;
-import org.elasticsearch.action.search.SearchRequestBuilder;
-import org.elasticsearch.action.search.SearchResponse;
-import org.elasticsearch.client.Client;
-import org.elasticsearch.client.Requests;
-import org.elasticsearch.cluster.health.ClusterHealthStatus;
-import org.junit.Before;
-import org.junit.Test;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
-import java.io.File;
-import java.io.FileInputStream;
-import java.io.InputStream;
-import java.util.List;
-import java.util.Properties;
-
-import static junit.framework.TestCase.assertTrue;
-import static org.junit.Assert.assertEquals;
-import static org.junit.Assert.assertFalse;
-import static org.junit.Assert.assertNotEquals;
-
-/**
- * Test copying documents between two indexes on same cluster
- */
-public class MongoElasticsearchSyncIT {
-
- private final static Logger LOGGER = LoggerFactory.getLogger(MongoElasticsearchSyncIT.class);
-
- ObjectMapper MAPPER = StreamsJacksonMapper.getInstance();
-
- protected MongoElasticsearchSyncConfiguration testConfiguration;
- protected Client testClient;
-
- @Before
- public void prepareTest() throws Exception {
-
- Config reference = ConfigFactory.load();
- File conf_file = new File("target/test-classes/MongoElasticsearchSyncIT.conf");
- assert(conf_file.exists());
- Config testResourceConfig = ConfigFactory.parseFileAnySyntax(conf_file, ConfigParseOptions.defaults().setAllowMissing(false));
- Properties mongo_properties = new Properties();
- InputStream mongo_stream = new FileInputStream("mongo.properties");
- mongo_properties.load(mongo_stream);
- Config mongoProps = ConfigFactory.parseProperties(mongo_properties);
- Properties es_properties = new Properties();
- InputStream es_stream = new FileInputStream("elasticsearch.properties");
- es_properties.load(es_stream);
- Config esProps = ConfigFactory.parseProperties(es_properties);
- Config typesafe = testResourceConfig.withFallback(mongoProps).withFallback(esProps).withFallback(reference).resolve();
- StreamsConfiguration streams = StreamsConfigurator.detectConfiguration(typesafe);
- testConfiguration = new ComponentConfigurator<>(MongoElasticsearchSyncConfiguration.class).detectConfiguration(typesafe);
- testClient = new ElasticsearchClientManager(testConfiguration.getDestination()).getClient();
-
- ClusterHealthRequest clusterHealthRequest = Requests.clusterHealthRequest();
- ClusterHealthResponse clusterHealthResponse = testClient.admin().cluster().health(clusterHealthRequest).actionGet();
- assertNotEquals(clusterHealthResponse.getStatus(), ClusterHealthStatus.RED);
-
- IndicesExistsRequest indicesExistsRequest = Requests.indicesExistsRequest(testConfiguration.getDestination().getIndex());
- IndicesExistsResponse indicesExistsResponse = testClient.admin().indices().exists(indicesExistsRequest).actionGet();
- assertFalse(indicesExistsResponse.isExists());
- }
-
- @Test
- public void testSync() throws Exception {
-
- MongoElasticsearchSync sync = new MongoElasticsearchSync(testConfiguration);
-
- sync.run();
-
- IndicesExistsRequest indicesExistsRequest = Requests.indicesExistsRequest(testConfiguration.getDestination().getIndex());
- IndicesExistsResponse indicesExistsResponse = testClient.admin().indices().exists(indicesExistsRequest).actionGet();
- assertTrue(indicesExistsResponse.isExists());
-
- // assert lines in file
- SearchRequestBuilder countRequest = testClient
- .prepareSearch(testConfiguration.getDestination().getIndex())
- .setTypes(testConfiguration.getDestination().getType());
- SearchResponse countResponse = countRequest.execute().actionGet();
-
- assertEquals(89, (int)countResponse.getHits().getTotalHits());
-
- }
-}
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/mongo-elasticsearch-sync/src/test/java/org/apache/streams/example/mongodb/test/SyncITs.java
----------------------------------------------------------------------
diff --git a/local/mongo-elasticsearch-sync/src/test/java/org/apache/streams/example/mongodb/test/SyncITs.java b/local/mongo-elasticsearch-sync/src/test/java/org/apache/streams/example/mongodb/test/SyncITs.java
deleted file mode 100644
index 7ba67a5..0000000
--- a/local/mongo-elasticsearch-sync/src/test/java/org/apache/streams/example/mongodb/test/SyncITs.java
+++ /dev/null
@@ -1,16 +0,0 @@
-package org.apache.streams.example.mongodb.test;
-
-import org.apache.streams.mongo.test.MongoPersistIT;
-import org.junit.runner.RunWith;
-import org.junit.runners.Suite;
-
-@RunWith(Suite.class)
-@Suite.SuiteClasses({
- MongoPersistIT.class,
- MongoElasticsearchSyncIT.class
-})
-
-public class SyncITs {
- // the class remains empty,
- // used only as a holder for the above annotations
-}
\ No newline at end of file
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/mongo-elasticsearch-sync/src/test/java/org/apache/streams/example/test/MongoElasticsearchSyncIT.java
----------------------------------------------------------------------
diff --git a/local/mongo-elasticsearch-sync/src/test/java/org/apache/streams/example/test/MongoElasticsearchSyncIT.java b/local/mongo-elasticsearch-sync/src/test/java/org/apache/streams/example/test/MongoElasticsearchSyncIT.java
new file mode 100644
index 0000000..47851f3
--- /dev/null
+++ b/local/mongo-elasticsearch-sync/src/test/java/org/apache/streams/example/test/MongoElasticsearchSyncIT.java
@@ -0,0 +1,117 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+package org.apache.streams.example.test;
+
+import com.fasterxml.jackson.databind.ObjectMapper;
+import com.typesafe.config.Config;
+import com.typesafe.config.ConfigFactory;
+import com.typesafe.config.ConfigParseOptions;
+import org.apache.streams.config.ComponentConfigurator;
+import org.apache.streams.config.StreamsConfiguration;
+import org.apache.streams.config.StreamsConfigurator;
+import org.apache.streams.elasticsearch.ElasticsearchClientManager;
+import org.apache.streams.example.MongoElasticsearchSync;
+import org.apache.streams.example.MongoElasticsearchSyncConfiguration;
+import org.apache.streams.jackson.StreamsJacksonMapper;
+import org.elasticsearch.action.admin.cluster.health.ClusterHealthRequest;
+import org.elasticsearch.action.admin.cluster.health.ClusterHealthResponse;
+import org.elasticsearch.action.admin.indices.exists.indices.IndicesExistsRequest;
+import org.elasticsearch.action.admin.indices.exists.indices.IndicesExistsResponse;
+import org.elasticsearch.action.search.SearchRequestBuilder;
+import org.elasticsearch.action.search.SearchResponse;
+import org.elasticsearch.client.Client;
+import org.elasticsearch.client.Requests;
+import org.elasticsearch.cluster.health.ClusterHealthStatus;
+import org.junit.Before;
+import org.junit.Test;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.File;
+import java.io.FileInputStream;
+import java.io.InputStream;
+import java.util.Properties;
+
+import static junit.framework.TestCase.assertTrue;
+import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertFalse;
+import static org.junit.Assert.assertNotEquals;
+
+/**
+ * Test syncing documents from a MongoDB collection to an Elasticsearch index
+ */
+public class MongoElasticsearchSyncIT {
+
+ private final static Logger LOGGER = LoggerFactory.getLogger(MongoElasticsearchSyncIT.class);
+
+ ObjectMapper MAPPER = StreamsJacksonMapper.getInstance();
+
+ protected MongoElasticsearchSyncConfiguration testConfiguration;
+ protected Client testClient;
+
+ @Before
+ public void prepareTest() throws Exception {
+
+ Config reference = ConfigFactory.load();
+ File conf_file = new File("target/test-classes/MongoElasticsearchSyncIT.conf");
+ assert(conf_file.exists());
+ Config testResourceConfig = ConfigFactory.parseFileAnySyntax(conf_file, ConfigParseOptions.defaults().setAllowMissing(false));
+ Properties mongo_properties = new Properties();
+ InputStream mongo_stream = new FileInputStream("mongo.properties");
+ mongo_properties.load(mongo_stream);
+ Config mongoProps = ConfigFactory.parseProperties(mongo_properties);
+ Properties es_properties = new Properties();
+ InputStream es_stream = new FileInputStream("elasticsearch.properties");
+ es_properties.load(es_stream);
+ Config esProps = ConfigFactory.parseProperties(es_properties);
+ Config typesafe = testResourceConfig.withFallback(mongoProps).withFallback(esProps).withFallback(reference).resolve();
+ StreamsConfiguration streams = StreamsConfigurator.detectConfiguration(typesafe);
+ testConfiguration = new ComponentConfigurator<>(MongoElasticsearchSyncConfiguration.class).detectConfiguration(typesafe);
+ testClient = new ElasticsearchClientManager(testConfiguration.getDestination()).getClient();
+
+ ClusterHealthRequest clusterHealthRequest = Requests.clusterHealthRequest();
+ ClusterHealthResponse clusterHealthResponse = testClient.admin().cluster().health(clusterHealthRequest).actionGet();
+ assertNotEquals(clusterHealthResponse.getStatus(), ClusterHealthStatus.RED);
+
+ IndicesExistsRequest indicesExistsRequest = Requests.indicesExistsRequest(testConfiguration.getDestination().getIndex());
+ IndicesExistsResponse indicesExistsResponse = testClient.admin().indices().exists(indicesExistsRequest).actionGet();
+ assertFalse(indicesExistsResponse.isExists());
+ }
+
+ @Test
+ public void testSync() throws Exception {
+
+ MongoElasticsearchSync sync = new MongoElasticsearchSync(testConfiguration);
+
+ sync.run();
+
+ IndicesExistsRequest indicesExistsRequest = Requests.indicesExistsRequest(testConfiguration.getDestination().getIndex());
+ IndicesExistsResponse indicesExistsResponse = testClient.admin().indices().exists(indicesExistsRequest).actionGet();
+ assertTrue(indicesExistsResponse.isExists());
+
+ // assert that the destination index contains the expected number of documents
+ SearchRequestBuilder countRequest = testClient
+ .prepareSearch(testConfiguration.getDestination().getIndex())
+ .setTypes(testConfiguration.getDestination().getType());
+ SearchResponse countResponse = countRequest.execute().actionGet();
+
+ assertEquals(89, (int)countResponse.getHits().getTotalHits());
+
+ }
+}
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/mongo-elasticsearch-sync/src/test/java/org/apache/streams/example/test/SyncITs.java
----------------------------------------------------------------------
diff --git a/local/mongo-elasticsearch-sync/src/test/java/org/apache/streams/example/test/SyncITs.java b/local/mongo-elasticsearch-sync/src/test/java/org/apache/streams/example/test/SyncITs.java
new file mode 100644
index 0000000..cb8af91
--- /dev/null
+++ b/local/mongo-elasticsearch-sync/src/test/java/org/apache/streams/example/test/SyncITs.java
@@ -0,0 +1,16 @@
+package org.apache.streams.example.test;
+
+import org.apache.streams.mongo.test.MongoPersistIT;
+import org.junit.runner.RunWith;
+import org.junit.runners.Suite;
+
+@RunWith(Suite.class)
+@Suite.SuiteClasses({
+ MongoPersistIT.class,
+ MongoElasticsearchSyncIT.class
+})
+
+public class SyncITs {
+ // the class remains empty,
+ // used only as a holder for the above annotations
+}
\ No newline at end of file
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/mongo-elasticsearch-sync/src/test/resources/testSync.json
----------------------------------------------------------------------
diff --git a/local/mongo-elasticsearch-sync/src/test/resources/testSync.json b/local/mongo-elasticsearch-sync/src/test/resources/testSync.json
deleted file mode 100644
index 8a77262..0000000
--- a/local/mongo-elasticsearch-sync/src/test/resources/testSync.json
+++ /dev/null
@@ -1,21 +0,0 @@
-{
- "$license": [
- "http://www.apache.org/licenses/LICENSE-2.0"
- ],
- "source": {
- "host": "localhost",
- "port": 37017,
- "db": "local",
- "collection": "activities"
- },
- "destination": {
- "hosts": [
- "localhost"
- ],
- "port": 9300,
- "clusterName": "elasticsearch",
- "index": "destination",
- "type": "activity",
- "forceUseConfig": true
- }
-}
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/pom.xml
----------------------------------------------------------------------
diff --git a/local/pom.xml b/local/pom.xml
index fbde938..e515606 100644
--- a/local/pom.xml
+++ b/local/pom.xml
@@ -42,12 +42,9 @@
<module>elasticsearch-hdfs</module>
<module>elasticsearch-reindex</module>
<module>mongo-elasticsearch-sync</module>
- <module>twitter-follow-graph</module>
+ <module>twitter-follow-neo4j</module>
<module>twitter-history-elasticsearch</module>
<module>twitter-userstream-elasticsearch</module>
</modules>
- <build>
-
- </build>
</project>
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/twitter-follow-graph/README.md
----------------------------------------------------------------------
diff --git a/local/twitter-follow-graph/README.md b/local/twitter-follow-graph/README.md
deleted file mode 100644
index 3e63a53..0000000
--- a/local/twitter-follow-graph/README.md
+++ /dev/null
@@ -1,8 +0,0 @@
-Apache Streams (incubating)
-Licensed under Apache License 2.0 - http://www.apache.org/licenses/LICENSE-2.0
---------------------------------------------------------------------------------
-
-org.apache.streams:twitter-follow-graph
-=======================================
-
-[README.md](src/site/markdown/index.md "README")
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/twitter-follow-graph/pom.xml
----------------------------------------------------------------------
diff --git a/local/twitter-follow-graph/pom.xml b/local/twitter-follow-graph/pom.xml
deleted file mode 100644
index 4ce6a64..0000000
--- a/local/twitter-follow-graph/pom.xml
+++ /dev/null
@@ -1,316 +0,0 @@
-<?xml version="1.0" encoding="UTF-8"?>
-<!--
- Licensed to the Apache Software Foundation (ASF) under one
- or more contributor license agreements. See the NOTICE file
- distributed with this work for additional information
- regarding copyright ownership. The ASF licenses this file
- to you under the Apache License, Version 2.0 (the
- "License"); you may not use this file except in compliance
- with the License. You may obtain a copy of the License at
-
- http://www.apache.org/licenses/LICENSE-2.0
-
- Unless required by applicable law or agreed to in writing,
- software distributed under the License is distributed on an
- "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- KIND, either express or implied. See the License for the
- specific language governing permissions and limitations
- under the License.
--->
-<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
- <parent>
- <groupId>org.apache.streams</groupId>
- <artifactId>streams-examples-local</artifactId>
- <version>0.4-incubating-SNAPSHOT</version>
- <relativePath>..</relativePath>
- </parent>
- <modelVersion>4.0.0</modelVersion>
-
- <artifactId>twitter-follow-graph</artifactId>
- <name>twitter-follow-graph</name>
-
- <description>
- Collects friend or follower connections for a set of twitter users to build a graph database in neo4j.
- </description>
-
- <properties>
- <docker.repo>apachestreams</docker.repo>
- </properties>
-
- <dependencies>
- <dependency>
- <groupId>com.typesafe</groupId>
- <artifactId>config</artifactId>
- </dependency>
- <dependency>
- <groupId>org.apache.streams</groupId>
- <artifactId>streams-core</artifactId>
- </dependency>
- <dependency>
- <groupId>org.apache.streams</groupId>
- <artifactId>streams-config</artifactId>
- <version>0.4-incubating-SNAPSHOT</version>
- </dependency>
- <dependency>
- <groupId>org.apache.streams</groupId>
- <artifactId>streams-runtime-local</artifactId>
- <version>0.4-incubating-SNAPSHOT</version>
- </dependency>
- <dependency>
- <groupId>org.apache.streams</groupId>
- <artifactId>streams-provider-twitter</artifactId>
- <version>0.4-incubating-SNAPSHOT</version>
- <exclusions>
- <exclusion>
- <groupId>commons-logging</groupId>
- <artifactId>commons-logging</artifactId>
- </exclusion>
- </exclusions>
- </dependency>
- <dependency>
- <groupId>org.apache.streams</groupId>
- <artifactId>streams-persist-graph</artifactId>
- <version>0.4-incubating-SNAPSHOT</version>
- </dependency>
- <dependency>
- <groupId>org.apache.streams</groupId>
- <artifactId>streams-pojo</artifactId>
- <version>0.4-incubating-SNAPSHOT</version>
- <type>test-jar</type>
- </dependency>
- <dependency>
- <groupId>org.slf4j</groupId>
- <artifactId>log4j-over-slf4j</artifactId>
- <version>${slf4j.version}</version>
- </dependency>
- <dependency>
- <groupId>org.slf4j</groupId>
- <artifactId>jcl-over-slf4j</artifactId>
- <version>${slf4j.version}</version>
- </dependency>
- <dependency>
- <groupId>org.slf4j</groupId>
- <artifactId>jul-to-slf4j</artifactId>
- <version>${slf4j.version}</version>
- </dependency>
- <dependency>
- <groupId>ch.qos.logback</groupId>
- <artifactId>logback-classic</artifactId>
- <version>${logback.version}</version>
- </dependency>
- <dependency>
- <groupId>ch.qos.logback</groupId>
- <artifactId>logback-core</artifactId>
- <version>${logback.version}</version>
- </dependency>
- </dependencies>
-
- <build>
- <sourceDirectory>src/main/java</sourceDirectory>
- <testSourceDirectory>src/test/java</testSourceDirectory>
- <resources>
- <resource>
- <directory>src/main/resources</directory>
- </resource>
- </resources>
- <testResources>
- <testResource>
- <directory>src/test/resources</directory>
- </testResource>
- </testResources>
- <plugins>
- <!-- This binary runs with logback -->
- <!-- Keep log4j out -->
- <plugin>
- <groupId>org.apache.maven.plugins</groupId>
- <artifactId>maven-enforcer-plugin</artifactId>
- <version>1.3.1</version>
- <executions>
- <execution>
- <id>enforce-banned-dependencies</id>
- <goals>
- <goal>enforce</goal>
- </goals>
- <configuration>
- <rules>
- <bannedDependencies>
- <excludes>
- <exclude>org.slf4j:slf4j-log4j12</exclude>
- <exclude>org.slf4j:slf4j-jcl</exclude>
- <exclude>org.slf4j:slf4j-jdk14</exclude>
- <exclude>org.log4j:log4j</exclude>
- <exclude>commons-logging:commons-logging</exclude>
- </excludes>
- </bannedDependencies>
- </rules>
- <fail>true</fail>
- </configuration>
- </execution>
- </executions>
- </plugin>
- <plugin>
- <artifactId>maven-clean-plugin</artifactId>
- <configuration>
- <filesets>
- <fileset>
- <directory>data</directory>
- <followSymlinks>false</followSymlinks>
- </fileset>
- </filesets>
- </configuration>
- </plugin>
- <plugin>
- <groupId>org.apache.maven.plugins</groupId>
- <artifactId>maven-shade-plugin</artifactId>
- </plugin>
- <plugin>
- <groupId>org.jsonschema2pojo</groupId>
- <artifactId>jsonschema2pojo-maven-plugin</artifactId>
- <version>0.4.6</version>
- <configuration>
- <addCompileSourceRoot>true</addCompileSourceRoot>
- <generateBuilders>true</generateBuilders>
- <sourcePaths>
- <sourcePath>src/main/jsonschema</sourcePath>
- </sourcePaths>
- <outputDirectory>target/generated-sources/jsonschema2pojo</outputDirectory>
- <targetPackage>org.apache.streams.example.graph</targetPackage>
- <useJodaDates>false</useJodaDates>
- </configuration>
- <executions>
- <execution>
- <goals>
- <goal>generate</goal>
- </goals>
- </execution>
- </executions>
- </plugin>
- <plugin>
- <groupId>org.codehaus.mojo</groupId>
- <artifactId>build-helper-maven-plugin</artifactId>
- <executions>
- <execution>
- <id>add-source</id>
- <phase>generate-sources</phase>
- <goals>
- <goal>add-source</goal>
- </goals>
- <configuration>
- <sources>
- <source>target/generated-sources/jsonschema2pojo</source>
- </sources>
- </configuration>
- </execution>
- </executions>
- </plugin>
- <plugin>
- <groupId>org.apache.maven.plugins</groupId>
- <artifactId>maven-dependency-plugin</artifactId>
- <version>2.4</version>
- <executions>
- <execution>
- <id>resource-dependencies</id>
- <phase>process-test-resources</phase>
- <goals>
- <goal>unpack-dependencies</goal>
- </goals>
- <configuration>
- <includeArtifactIds>streams-pojo</includeArtifactIds>
- <includes>**/*.json</includes>
- <outputDirectory>${project.build.directory}/test-classes</outputDirectory>
- </configuration>
- </execution>
- </executions>
- </plugin>
- <plugin>
- <groupId>org.apache.maven.plugins</groupId>
- <artifactId>maven-failsafe-plugin</artifactId>
- <version>2.12.4</version>
- <executions>
- <execution>
- <id>integration-tests</id>
- <goals>
- <goal>integration-test</goal>
- <goal>verify</goal>
- </goals>
- </execution>
- </executions>
- </plugin>
- </plugins>
- </build>
-
- <profiles>
- <profile>
- <id>dockerITs</id>
- <activation>
- <activeByDefault>false</activeByDefault>
- <property>
- <name>skipITs</name>
- <value>false</value>
- </property>
- </activation>
- <build>
- <plugins>
- <plugin>
- <groupId>io.fabric8</groupId>
- <artifactId>docker-maven-plugin</artifactId>
- <version>${docker.plugin.version}</version>
- <configuration combine.self="override">
- <watchInterval>500</watchInterval>
- <logDate>default</logDate>
- <verbose>true</verbose>
- <autoPull>on</autoPull>
- <images>
- <image>
- <name>neo4j</name>
- <alias>graph</alias>
- <run>
- <env>
- <NEO4J_AUTH>none</NEO4J_AUTH>
- </env>
- <namingStrategy>none</namingStrategy>
- <ports>
- <port>${graph.http.host}:${graph.http.port}:7474</port>
- <port>${graph.tcp.host}:${graph.tcp.port}:7687</port>
- </ports>
- <portPropertyFile>graph.properties</portPropertyFile>
- <wait>
- <log>graph startup</log>
- <http>
- <url>http://${graph.http.host}:${graph.http.port}</url>
- <method>GET</method>
- <status>200</status>
- </http>
- <time>20000</time>
- <kill>1000</kill>
- <shutdown>500</shutdown>
- <!--<tcp>-->
- <!--<host>${es.transport.host}</host>-->
- <!--<ports>-->
- <!--<port>${es.transport.port}</port>-->
- <!--</ports>-->
- <!--</tcp>-->
- </wait>
- <log>
- <enabled>true</enabled>
- <date>default</date>
- <color>cyan</color>
- </log>
- </run>
- <watch>
- <mode>none</mode>
- </watch>
- </image>
-
- </images>
- </configuration>
-
- </plugin>
-
- </plugins>
- </build>
-
- </profile>
- </profiles>
-
-</project>
\ No newline at end of file
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/twitter-follow-graph/src/main/java/org/apache/streams/example/graph/TwitterFollowGraph.java
----------------------------------------------------------------------
diff --git a/local/twitter-follow-graph/src/main/java/org/apache/streams/example/graph/TwitterFollowGraph.java b/local/twitter-follow-graph/src/main/java/org/apache/streams/example/graph/TwitterFollowGraph.java
deleted file mode 100644
index 11c52bb..0000000
--- a/local/twitter-follow-graph/src/main/java/org/apache/streams/example/graph/TwitterFollowGraph.java
+++ /dev/null
@@ -1,103 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements. See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership. The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied. See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.streams.example.graph;
-
-import com.google.common.collect.Lists;
-import com.google.common.collect.Maps;
-import org.apache.streams.config.ComponentConfigurator;
-import org.apache.streams.config.StreamsConfiguration;
-import org.apache.streams.config.StreamsConfigurator;
-import org.apache.streams.converter.ActivityConverterProcessor;
-import org.apache.streams.converter.ActivityConverterProcessorConfiguration;
-import org.apache.streams.converter.TypeConverterProcessor;
-import org.apache.streams.core.StreamBuilder;
-import org.apache.streams.data.ActivityConverter;
-import org.apache.streams.data.DocumentClassifier;
-import org.apache.streams.graph.GraphHttpConfiguration;
-import org.apache.streams.graph.GraphHttpPersistWriter;
-import org.apache.streams.local.builders.LocalStreamBuilder;
-import org.apache.streams.twitter.TwitterFollowingConfiguration;
-import org.apache.streams.twitter.TwitterUserInformationConfiguration;
-import org.apache.streams.twitter.converter.TwitterFollowActivityConverter;
-import org.apache.streams.twitter.pojo.Follow;
-import org.apache.streams.twitter.provider.TwitterFollowingProvider;
-import org.apache.streams.twitter.converter.TwitterDocumentClassifier;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
-import java.util.Map;
-
-/**
- * Collects friend and follow connections for a set of twitter users and builds a graph
- * database in neo4j.
- */
-public class TwitterFollowGraph implements Runnable {
-
- private final static Logger LOGGER = LoggerFactory.getLogger(TwitterFollowGraph.class);
-
- TwitterFollowGraphConfiguration config;
-
- public TwitterFollowGraph() {
- this(new ComponentConfigurator<>(TwitterFollowGraphConfiguration.class).detectConfiguration(StreamsConfigurator.getConfig()));
- }
-
- public TwitterFollowGraph(TwitterFollowGraphConfiguration config) {
- this.config = config;
- }
-
- public void run() {
-
- TwitterFollowingConfiguration twitterFollowingConfiguration = config.getTwitter();
- TwitterFollowingProvider followingProvider = new TwitterFollowingProvider(twitterFollowingConfiguration);
- TypeConverterProcessor converter = new TypeConverterProcessor(String.class);
-
- ActivityConverterProcessorConfiguration activityConverterProcessorConfiguration =
- new ActivityConverterProcessorConfiguration()
- .withClassifiers(Lists.newArrayList((DocumentClassifier) new TwitterDocumentClassifier()))
- .withConverters(Lists.newArrayList((ActivityConverter) new TwitterFollowActivityConverter()));
- ActivityConverterProcessor activity = new ActivityConverterProcessor(activityConverterProcessorConfiguration);
-
- GraphHttpConfiguration graphWriterConfiguration = config.getGraph();
- GraphHttpPersistWriter graphPersistWriter = new GraphHttpPersistWriter(graphWriterConfiguration);
-
- StreamBuilder builder = new LocalStreamBuilder();
- builder.newPerpetualStream(TwitterFollowingProvider.STREAMS_ID, followingProvider);
- builder.addStreamsProcessor("converter", converter, 1, TwitterFollowingProvider.STREAMS_ID);
- builder.addStreamsProcessor("activity", activity, 1, "converter");
- builder.addStreamsPersistWriter("graph", graphPersistWriter, 1, "activity");
-
- builder.start();
- }
-
- public static void main(String[] args) {
-
- LOGGER.info(StreamsConfigurator.config.toString());
-
- TwitterFollowGraph stream = new TwitterFollowGraph();
-
- stream.run();
-
- LOGGER.info(StreamsConfigurator.config.toString());
-
- StreamsConfiguration streams = StreamsConfigurator.detectConfiguration();
-
-
- }
-
-}
\ No newline at end of file
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/twitter-follow-graph/src/main/jsonschema/TwitterFollowGraphConfiguration.json
----------------------------------------------------------------------
diff --git a/local/twitter-follow-graph/src/main/jsonschema/TwitterFollowGraphConfiguration.json b/local/twitter-follow-graph/src/main/jsonschema/TwitterFollowGraphConfiguration.json
deleted file mode 100644
index f9c4ac1..0000000
--- a/local/twitter-follow-graph/src/main/jsonschema/TwitterFollowGraphConfiguration.json
+++ /dev/null
@@ -1,13 +0,0 @@
-{
- "$schema": "http://json-schema.org/draft-03/schema",
- "$license": [
- "http://www.apache.org/licenses/LICENSE-2.0"
- ],
- "type": "object",
- "javaType" : "org.apache.streams.example.graph.TwitterFollowGraphConfiguration",
- "javaInterfaces": ["java.io.Serializable"],
- "properties": {
- "twitter": { "javaType": "org.apache.streams.twitter.TwitterFollowingConfiguration", "type": "object", "required": true },
- "graph": { "javaType": "org.apache.streams.graph.GraphHttpConfiguration", "type": "object", "required": true }
- }
-}
\ No newline at end of file
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/twitter-follow-graph/src/main/resources/TwitterFollowGraph.dot
----------------------------------------------------------------------
diff --git a/local/twitter-follow-graph/src/main/resources/TwitterFollowGraph.dot b/local/twitter-follow-graph/src/main/resources/TwitterFollowGraph.dot
deleted file mode 100644
index 2d9e495..0000000
--- a/local/twitter-follow-graph/src/main/resources/TwitterFollowGraph.dot
+++ /dev/null
@@ -1,39 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements. See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership. The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied. See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
- digraph g {
-
- //providers
- TwitterFollowingProvider [label="TwitterFollowingProvider",shape=ellipse,URL="https://github.com/apache/incubator-streams/blob/master/streams-contrib/streams-provider-twitter/src/main/java/org/apache/streams/twitter/provider/TwitterFollowingProvider.java"];
-
- //processors
- TypeConverterProcessor [label="TypeConverterProcessor",shape=ellipse,URL="https://github.com/apache/incubator-streams/blob/master/streams-components/streams-converters/src/main/java/org/apache/streams/converters/TypeConverterProcessor.java"];
- ActivityConverterProcessor [label="ActivityConverterProcessor",shape=ellipse,URL="https://github.com/apache/incubator-streams/blob/master/streams-components/streams-converters/src/main/java/org/apache/streams/converters/ActivityConverterProcessor.java"];
-
- //persisters
- GraphPersistWriter [label="GraphPersistWriter",shape=ellipse,URL="https://github.com/apache/incubator-streams/blob/master/streams-contrib/streams-persist-graph/src/main/java/org/apache/streams/graph/GraphPersistWriter.java"];
-
- //data
- destination [label="http://{host}:{port}/db/data",shape=box];
-
- //stream
- TwitterFollowingProvider -> TypeConverterProcessor [label="Follow",URL="https://github.com/apache/incubator-streams/blob/master/streams-contrib/streams-provider-twitter/src/main/jsonschema/com/twitter/Follow.json"];
- TypeConverterProcessor -> ActivityConverterProcessor [label="String"];
- ActivityConverterProcessor -> GraphPersistWriter [label="Activity",URL="https://github.com/apache/incubator-streams/blob/master/streams-pojo/src/main/jsonschema/org/apache/streams/pojo/json/activity.json"];
- GraphPersistWriter -> destination
-}
\ No newline at end of file
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/twitter-follow-graph/src/site/markdown/index.md
----------------------------------------------------------------------
diff --git a/local/twitter-follow-graph/src/site/markdown/index.md b/local/twitter-follow-graph/src/site/markdown/index.md
deleted file mode 100644
index 3991688..0000000
--- a/local/twitter-follow-graph/src/site/markdown/index.md
+++ /dev/null
@@ -1,75 +0,0 @@
-twitter-follow-graph
-==============================
-
-Requirements:
--------------
- - Authorized Twitter API credentials
- - A running Neo4J 1.9.0+ instance
-
-Description:
-------------
-Collects friend or follower connections for a set of twitter users to build a graph database in neo4j.
-
-Specification:
------------------
-
-[TwitterFollowGraph.dot](TwitterFollowGraph.dot "TwitterFollowGraph.dot" )
-
-Diagram:
------------------
-
-![TwitterFollowGraph.dot.svg](./TwitterFollowGraph.dot.svg)
-
-Example Configuration:
-----------------------
-
-[testGraph.json](testGraph.json "testGraph.json" )
-
-Build:
----------
-
- mvn clean package verify
-
-Test:
------
-Create a local file `application.conf` with valid twitter credentials
-
- twitter {
- oauth {
- consumerKey = ""
- consumerSecret = ""
- accessToken = ""
- accessTokenSecret = ""
- }
- }
-
-Start up neo4j with docker:
-
- mvn -PdockerITs docker:start
-
-Build with integration testing enabled, using your credentials
-
- mvn clean test verify -DskipITs=false -DargLine="-Dconfig.file=`pwd`/application.conf"
-
-Shutdown neo4j when finished:
-
- mvn -PdockerITs docker:stop
-
-Run (Local):
-------------
-
- java -cp dist/twitter-follow-graph-jar-with-dependencies.jar -Dconfig.file=file://<location_of_config_file> org.apache.streams.example.graph.TwitterFollowGraph
-
-Deploy (Docker):
-----------------
-
- mvn -Pdocker -Ddocker.repo=<your docker host>:<your docker repo> docker:build docker:push
-
-Run (Docker):
--------------
-
- docker run twitter-follow-graph java -cp twitter-follow-graph-jar-with-dependencies.jar -Dconfig.url=http://<location_of_config_file> org.apache.streams.example.graph.TwitterFollowGraph
-
-[JavaDocs](apidocs/index.html "JavaDocs")
-
-###### Licensed under Apache License 2.0 - http://www.apache.org/licenses/LICENSE-2.0
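The Test section above shows only the twitter credentials; when running the stream directly (outside the integration tests, where the graph block is supplied by the test resources), the graph writer must also be configured. A minimal sketch of a complete `application.conf`, assuming a local neo4j listening on port 7474 (the hostname, port, and user id are illustrative — substitute your own):

```hocon
twitter {
  oauth {
    consumerKey = ""
    consumerSecret = ""
    accessToken = ""
    accessTokenSecret = ""
  }
  endpoint = "friends"
  info = [
    18055613
  ]
}
graph {
  hostname = "localhost"
  port = 7474
  type = "neo4j"
  graph = "data"
}
```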
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/twitter-follow-graph/src/test/java/org/apache/streams/twitter/example/TwitterFollowGraphIT.java
----------------------------------------------------------------------
diff --git a/local/twitter-follow-graph/src/test/java/org/apache/streams/twitter/example/TwitterFollowGraphIT.java b/local/twitter-follow-graph/src/test/java/org/apache/streams/twitter/example/TwitterFollowGraphIT.java
deleted file mode 100644
index c5254fe..0000000
--- a/local/twitter-follow-graph/src/test/java/org/apache/streams/twitter/example/TwitterFollowGraphIT.java
+++ /dev/null
@@ -1,79 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements. See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership. The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied. See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.streams.twitter.example;
-
-import com.typesafe.config.Config;
-import com.typesafe.config.ConfigFactory;
-import com.typesafe.config.ConfigParseOptions;
-import org.apache.streams.config.ComponentConfigurator;
-import org.apache.streams.config.StreamsConfiguration;
-import org.apache.streams.config.StreamsConfigurator;
-import org.apache.streams.example.graph.TwitterFollowGraph;
-import org.apache.streams.example.graph.TwitterFollowGraphConfiguration;
-import org.junit.Before;
-import org.junit.Test;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
-import java.io.File;
-import java.io.FileInputStream;
-import java.io.InputStream;
-import java.util.Properties;
-
-import static org.junit.Assert.assertFalse;
-import static org.junit.Assert.assertNotEquals;
-
-/**
- * Integration test verifying that the TwitterFollowGraph stream populates a neo4j graph database with twitter follow connections
- */
-public class TwitterFollowGraphIT {
-
- private final static Logger LOGGER = LoggerFactory.getLogger(TwitterFollowGraphIT.class);
-
- protected TwitterFollowGraphConfiguration testConfiguration;
-
- private int count = 0;
-
- @Before
- public void prepareTest() throws Exception {
-
- Config reference = ConfigFactory.load();
- File conf_file = new File("target/test-classes/TwitterFollowGraphIT.conf");
- assert(conf_file.exists());
- Config testResourceConfig = ConfigFactory.parseFileAnySyntax(conf_file, ConfigParseOptions.defaults().setAllowMissing(false));
- Properties graph_properties = new Properties();
- InputStream graph_stream = new FileInputStream("graph.properties");
- graph_properties.load(graph_stream);
- Config graphProps = ConfigFactory.parseProperties(graph_properties);
- Config typesafe = testResourceConfig.withFallback(graphProps).withFallback(reference).resolve();
- StreamsConfiguration streams = StreamsConfigurator.detectConfiguration(typesafe);
- testConfiguration = new ComponentConfigurator<>(TwitterFollowGraphConfiguration.class).detectConfiguration(typesafe);
-
- }
-
- @Test
- public void testTwitterFollowGraph() throws Exception {
-
- TwitterFollowGraph stream = new TwitterFollowGraph(testConfiguration);
-
- stream.run();
-
- }
-
-}
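The `prepareTest()` method above discovers the docker-mapped neo4j ports by loading `graph.properties`, the `portPropertyFile` written by the docker-maven-plugin on `mvn -PdockerITs docker:start`. A stdlib-only sketch of that lookup, assuming a hypothetical class name and file contents (the real file's keys come from the pom's `<ports>` configuration):

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.FileWriter;
import java.io.IOException;
import java.io.InputStream;
import java.io.Writer;
import java.util.Properties;

public class PortPropertiesExample {

    // Read one mapped port from a docker-maven-plugin portPropertyFile,
    // as prepareTest() does with graph.properties.
    static int mappedPort(File propFile, String key) throws IOException {
        Properties props = new Properties();
        try (InputStream in = new FileInputStream(propFile)) {
            props.load(in);
        }
        return Integer.parseInt(props.getProperty(key));
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical contents; the real file is produced by docker:start.
        File file = File.createTempFile("graph", ".properties");
        file.deleteOnExit();
        try (Writer w = new FileWriter(file)) {
            w.write("graph.http.host=localhost\n");
            w.write("graph.http.port=32768\n");
        }
        System.out.println(mappedPort(file, "graph.http.port")); // prints 32768
    }
}
```

The IT then layers these properties under the test config with `ConfigFactory.parseProperties(...).withFallback(...)`, so the `${graph.http.port}` substitutions in `TwitterFollowGraphIT.conf` resolve to whatever ephemeral port docker assigned.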
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/twitter-follow-graph/src/test/resources/TwitterFollowGraphIT.conf
----------------------------------------------------------------------
diff --git a/local/twitter-follow-graph/src/test/resources/TwitterFollowGraphIT.conf b/local/twitter-follow-graph/src/test/resources/TwitterFollowGraphIT.conf
deleted file mode 100644
index ecd4fd4..0000000
--- a/local/twitter-follow-graph/src/test/resources/TwitterFollowGraphIT.conf
+++ /dev/null
@@ -1,28 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements. See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership. The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License. You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied. See the License for the
-# specific language governing permissions and limitations
-# under the License.
-twitter {
- endpoint = "friends"
- info = [
- 18055613
- ]
- max_items = 1000
-}
-graph {
- hostname = ${graph.http.host}
- port = ${graph.http.port}
- type = "neo4j"
- graph = "data"
-}
\ No newline at end of file
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/twitter-follow-neo4j/README.md
----------------------------------------------------------------------
diff --git a/local/twitter-follow-neo4j/README.md b/local/twitter-follow-neo4j/README.md
new file mode 100644
index 0000000..3e63a53
--- /dev/null
+++ b/local/twitter-follow-neo4j/README.md
@@ -0,0 +1,8 @@
+Apache Streams (incubating)
+Licensed under Apache License 2.0 - http://www.apache.org/licenses/LICENSE-2.0
+--------------------------------------------------------------------------------
+
+org.apache.streams:twitter-follow-graph
+=======================================
+
+[README.md](src/site/markdown/index.md "README")
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/twitter-follow-neo4j/dist/twitter-follow-graph-jar-with-dependencies.jar
----------------------------------------------------------------------
diff --git a/local/twitter-follow-neo4j/dist/twitter-follow-graph-jar-with-dependencies.jar b/local/twitter-follow-neo4j/dist/twitter-follow-graph-jar-with-dependencies.jar
new file mode 100644
index 0000000..758e5cf
Binary files /dev/null and b/local/twitter-follow-neo4j/dist/twitter-follow-graph-jar-with-dependencies.jar differ
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/twitter-follow-neo4j/pom.xml
----------------------------------------------------------------------
diff --git a/local/twitter-follow-neo4j/pom.xml b/local/twitter-follow-neo4j/pom.xml
new file mode 100644
index 0000000..e644c3c
--- /dev/null
+++ b/local/twitter-follow-neo4j/pom.xml
@@ -0,0 +1,316 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+ Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements. See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership. The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License. You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied. See the License for the
+ specific language governing permissions and limitations
+ under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+ <parent>
+ <groupId>org.apache.streams</groupId>
+ <artifactId>streams-examples-local</artifactId>
+ <version>0.4-incubating-SNAPSHOT</version>
+ <relativePath>..</relativePath>
+ </parent>
+ <modelVersion>4.0.0</modelVersion>
+
+ <artifactId>twitter-follow-graph</artifactId>
+ <name>twitter-follow-graph</name>
+
+ <description>
+ Collects friend or follower connections for a set of twitter users to build a graph database in neo4j.
+ </description>
+
+ <properties>
+ <docker.repo>apachestreams</docker.repo>
+ </properties>
+
+ <dependencies>
+ <dependency>
+ <groupId>com.typesafe</groupId>
+ <artifactId>config</artifactId>
+ </dependency>
+ <dependency>
+ <groupId>org.apache.streams</groupId>
+ <artifactId>streams-core</artifactId>
+ </dependency>
+ <dependency>
+ <groupId>org.apache.streams</groupId>
+ <artifactId>streams-config</artifactId>
+ <version>0.4-incubating-SNAPSHOT</version>
+ </dependency>
+ <dependency>
+ <groupId>org.apache.streams</groupId>
+ <artifactId>streams-runtime-local</artifactId>
+ <version>0.4-incubating-SNAPSHOT</version>
+ </dependency>
+ <dependency>
+ <groupId>org.apache.streams</groupId>
+ <artifactId>streams-provider-twitter</artifactId>
+ <version>0.4-incubating-SNAPSHOT</version>
+ <exclusions>
+ <exclusion>
+ <groupId>commons-logging</groupId>
+ <artifactId>commons-logging</artifactId>
+ </exclusion>
+ </exclusions>
+ </dependency>
+ <dependency>
+ <groupId>org.apache.streams</groupId>
+ <artifactId>streams-persist-graph</artifactId>
+ <version>0.4-incubating-SNAPSHOT</version>
+ </dependency>
+ <dependency>
+ <groupId>org.apache.streams</groupId>
+ <artifactId>streams-pojo</artifactId>
+ <version>0.4-incubating-SNAPSHOT</version>
+ <type>test-jar</type>
+ </dependency>
+ <dependency>
+ <groupId>org.slf4j</groupId>
+ <artifactId>log4j-over-slf4j</artifactId>
+ <version>${slf4j.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>org.slf4j</groupId>
+ <artifactId>jcl-over-slf4j</artifactId>
+ <version>${slf4j.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>org.slf4j</groupId>
+ <artifactId>jul-to-slf4j</artifactId>
+ <version>${slf4j.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>ch.qos.logback</groupId>
+ <artifactId>logback-classic</artifactId>
+ <version>${logback.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>ch.qos.logback</groupId>
+ <artifactId>logback-core</artifactId>
+ <version>${logback.version}</version>
+ </dependency>
+ </dependencies>
+
+ <build>
+ <sourceDirectory>src/main/java</sourceDirectory>
+ <testSourceDirectory>src/test/java</testSourceDirectory>
+ <resources>
+ <resource>
+ <directory>src/main/resources</directory>
+ </resource>
+ </resources>
+ <testResources>
+ <testResource>
+ <directory>src/test/resources</directory>
+ </testResource>
+ </testResources>
+ <plugins>
+ <!-- This binary runs with logback -->
+ <!-- Keep log4j out -->
+ <plugin>
+ <groupId>org.apache.maven.plugins</groupId>
+ <artifactId>maven-enforcer-plugin</artifactId>
+ <version>1.3.1</version>
+ <executions>
+ <execution>
+ <id>enforce-banned-dependencies</id>
+ <goals>
+ <goal>enforce</goal>
+ </goals>
+ <configuration>
+ <rules>
+ <bannedDependencies>
+ <excludes>
+ <exclude>org.slf4j:slf4j-log4j12</exclude>
+ <exclude>org.slf4j:slf4j-jcl</exclude>
+ <exclude>org.slf4j:slf4j-jdk14</exclude>
+ <exclude>org.log4j:log4j</exclude>
+ <exclude>commons-logging:commons-logging</exclude>
+ </excludes>
+ </bannedDependencies>
+ </rules>
+ <fail>true</fail>
+ </configuration>
+ </execution>
+ </executions>
+ </plugin>
+ <plugin>
+ <artifactId>maven-clean-plugin</artifactId>
+ <configuration>
+ <filesets>
+ <fileset>
+ <directory>data</directory>
+ <followSymlinks>false</followSymlinks>
+ </fileset>
+ </filesets>
+ </configuration>
+ </plugin>
+ <plugin>
+ <groupId>org.apache.maven.plugins</groupId>
+ <artifactId>maven-shade-plugin</artifactId>
+ </plugin>
+ <plugin>
+ <groupId>org.jsonschema2pojo</groupId>
+ <artifactId>jsonschema2pojo-maven-plugin</artifactId>
+ <version>0.4.6</version>
+ <configuration>
+ <addCompileSourceRoot>true</addCompileSourceRoot>
+ <generateBuilders>true</generateBuilders>
+ <sourcePaths>
+ <sourcePath>src/main/jsonschema</sourcePath>
+ </sourcePaths>
+ <outputDirectory>target/generated-sources/jsonschema2pojo</outputDirectory>
+ <targetPackage>org.apache.streams.example</targetPackage>
+ <useJodaDates>false</useJodaDates>
+ </configuration>
+ <executions>
+ <execution>
+ <goals>
+ <goal>generate</goal>
+ </goals>
+ </execution>
+ </executions>
+ </plugin>
+ <plugin>
+ <groupId>org.codehaus.mojo</groupId>
+ <artifactId>build-helper-maven-plugin</artifactId>
+ <executions>
+ <execution>
+ <id>add-source</id>
+ <phase>generate-sources</phase>
+ <goals>
+ <goal>add-source</goal>
+ </goals>
+ <configuration>
+ <sources>
+ <source>target/generated-sources/jsonschema2pojo</source>
+ </sources>
+ </configuration>
+ </execution>
+ </executions>
+ </plugin>
+ <plugin>
+ <groupId>org.apache.maven.plugins</groupId>
+ <artifactId>maven-dependency-plugin</artifactId>
+ <version>2.4</version>
+ <executions>
+ <execution>
+ <id>resource-dependencies</id>
+ <phase>process-test-resources</phase>
+ <goals>
+ <goal>unpack-dependencies</goal>
+ </goals>
+ <configuration>
+ <includeArtifactIds>streams-pojo</includeArtifactIds>
+ <includes>**/*.json</includes>
+ <outputDirectory>${project.build.directory}/test-classes</outputDirectory>
+ </configuration>
+ </execution>
+ </executions>
+ </plugin>
+ <plugin>
+ <groupId>org.apache.maven.plugins</groupId>
+ <artifactId>maven-failsafe-plugin</artifactId>
+ <version>2.12.4</version>
+ <executions>
+ <execution>
+ <id>integration-tests</id>
+ <goals>
+ <goal>integration-test</goal>
+ <goal>verify</goal>
+ </goals>
+ </execution>
+ </executions>
+ </plugin>
+ </plugins>
+ </build>
+
+ <profiles>
+ <profile>
+ <id>dockerITs</id>
+ <activation>
+ <activeByDefault>false</activeByDefault>
+ <property>
+ <name>skipITs</name>
+ <value>false</value>
+ </property>
+ </activation>
+ <build>
+ <plugins>
+ <plugin>
+ <groupId>io.fabric8</groupId>
+ <artifactId>docker-maven-plugin</artifactId>
+ <version>${docker.plugin.version}</version>
+ <configuration combine.self="override">
+ <watchInterval>500</watchInterval>
+ <logDate>default</logDate>
+ <verbose>true</verbose>
+ <autoPull>on</autoPull>
+ <images>
+ <image>
+ <name>neo4j</name>
+ <alias>graph</alias>
+ <run>
+ <env>
+ <NEO4J_AUTH>none</NEO4J_AUTH>
+ </env>
+ <namingStrategy>none</namingStrategy>
+ <ports>
+ <port>${neo4j.http.host}:${neo4j.http.port}:7474</port>
+ <port>${neo4j.tcp.host}:${neo4j.tcp.port}:7687</port>
+ </ports>
+ <portPropertyFile>neo4j.properties</portPropertyFile>
+ <wait>
+ <log>graph startup</log>
+ <http>
+ <url>http://${neo4j.http.host}:${neo4j.http.port}</url>
+ <method>GET</method>
+ <status>200</status>
+ </http>
+ <time>20000</time>
+ <kill>1000</kill>
+ <shutdown>500</shutdown>
+ <!--<tcp>-->
+ <!--<host>${es.transport.host}</host>-->
+ <!--<ports>-->
+ <!--<port>${es.transport.port}</port>-->
+ <!--</ports>-->
+ <!--</tcp>-->
+ </wait>
+ <log>
+ <enabled>true</enabled>
+ <date>default</date>
+ <color>cyan</color>
+ </log>
+ </run>
+ <watch>
+ <mode>none</mode>
+ </watch>
+ </image>
+
+ </images>
+ </configuration>
+
+ </plugin>
+
+ </plugins>
+ </build>
+
+ </profile>
+ </profiles>
+
+</project>
\ No newline at end of file
[2/9] incubator-streams-examples git commit: normalize package names
in streams-examples/local
Posted by sb...@apache.org.
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/twitter-follow-neo4j/src/main/java/org/apache/streams/example/TwitterFollowNeo4j.java
----------------------------------------------------------------------
diff --git a/local/twitter-follow-neo4j/src/main/java/org/apache/streams/example/TwitterFollowNeo4j.java b/local/twitter-follow-neo4j/src/main/java/org/apache/streams/example/TwitterFollowNeo4j.java
new file mode 100644
index 0000000..34ac8c4
--- /dev/null
+++ b/local/twitter-follow-neo4j/src/main/java/org/apache/streams/example/TwitterFollowNeo4j.java
@@ -0,0 +1,93 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+package org.apache.streams.example;
+
+import com.google.common.collect.Lists;
+import org.apache.streams.config.ComponentConfigurator;
+import org.apache.streams.config.StreamsConfigurator;
+import org.apache.streams.converter.ActivityConverterProcessor;
+import org.apache.streams.converter.ActivityConverterProcessorConfiguration;
+import org.apache.streams.converter.TypeConverterProcessor;
+import org.apache.streams.core.StreamBuilder;
+import org.apache.streams.data.ActivityConverter;
+import org.apache.streams.data.DocumentClassifier;
+import org.apache.streams.example.TwitterFollowNeo4jConfiguration;
+import org.apache.streams.graph.GraphHttpConfiguration;
+import org.apache.streams.graph.GraphHttpPersistWriter;
+import org.apache.streams.local.builders.LocalStreamBuilder;
+import org.apache.streams.twitter.TwitterFollowingConfiguration;
+import org.apache.streams.twitter.converter.TwitterFollowActivityConverter;
+import org.apache.streams.twitter.provider.TwitterFollowingProvider;
+import org.apache.streams.twitter.converter.TwitterDocumentClassifier;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Collects friend and follower connections for a set of Twitter users and builds a graph
+ * database in Neo4j.
+ */
+public class TwitterFollowNeo4j implements Runnable {
+
+ private final static Logger LOGGER = LoggerFactory.getLogger(TwitterFollowNeo4j.class);
+
+ TwitterFollowNeo4jConfiguration config;
+
+ public TwitterFollowNeo4j() {
+ this(new ComponentConfigurator<>(TwitterFollowNeo4jConfiguration.class).detectConfiguration(StreamsConfigurator.getConfig()));
+ }
+
+ public TwitterFollowNeo4j(TwitterFollowNeo4jConfiguration config) {
+ this.config = config;
+ }
+
+ public void run() {
+
+ TwitterFollowingConfiguration twitterFollowingConfiguration = config.getTwitter();
+ TwitterFollowingProvider followingProvider = new TwitterFollowingProvider(twitterFollowingConfiguration);
+ TypeConverterProcessor converter = new TypeConverterProcessor(String.class);
+
+ ActivityConverterProcessorConfiguration activityConverterProcessorConfiguration =
+ new ActivityConverterProcessorConfiguration()
+ .withClassifiers(Lists.newArrayList((DocumentClassifier) new TwitterDocumentClassifier()))
+ .withConverters(Lists.newArrayList((ActivityConverter) new TwitterFollowActivityConverter()));
+ ActivityConverterProcessor activity = new ActivityConverterProcessor(activityConverterProcessorConfiguration);
+
+ GraphHttpConfiguration graphWriterConfiguration = config.getGraph();
+ GraphHttpPersistWriter graphPersistWriter = new GraphHttpPersistWriter(graphWriterConfiguration);
+
+ StreamBuilder builder = new LocalStreamBuilder();
+ builder.newPerpetualStream(TwitterFollowingProvider.STREAMS_ID, followingProvider);
+ builder.addStreamsProcessor("converter", converter, 1, TwitterFollowingProvider.STREAMS_ID);
+ builder.addStreamsProcessor("activity", activity, 1, "converter");
+ builder.addStreamsPersistWriter("graph", graphPersistWriter, 1, "activity");
+
+ builder.start();
+ }
+
+ public static void main(String[] args) {
+
+ LOGGER.info(StreamsConfigurator.config.toString());
+
+ TwitterFollowNeo4j stream = new TwitterFollowNeo4j();
+
+ stream.run();
+
+ }
+
+}
\ No newline at end of file
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/twitter-follow-neo4j/src/main/jsonschema/TwitterFollowNeo4jConfiguration.json
----------------------------------------------------------------------
diff --git a/local/twitter-follow-neo4j/src/main/jsonschema/TwitterFollowNeo4jConfiguration.json b/local/twitter-follow-neo4j/src/main/jsonschema/TwitterFollowNeo4jConfiguration.json
new file mode 100644
index 0000000..ffbd39d
--- /dev/null
+++ b/local/twitter-follow-neo4j/src/main/jsonschema/TwitterFollowNeo4jConfiguration.json
@@ -0,0 +1,13 @@
+{
+ "$schema": "http://json-schema.org/draft-03/schema",
+ "$license": [
+ "http://www.apache.org/licenses/LICENSE-2.0"
+ ],
+ "type": "object",
+ "javaType" : "org.apache.streams.example.TwitterFollowNeo4jConfiguration",
+ "javaInterfaces": ["java.io.Serializable"],
+ "properties": {
+ "twitter": { "javaType": "org.apache.streams.twitter.TwitterFollowingConfiguration", "type": "object", "required": true },
+ "graph": { "javaType": "org.apache.streams.graph.GraphHttpConfiguration", "type": "object", "required": true }
+ }
+}
\ No newline at end of file
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/twitter-follow-neo4j/src/main/resources/TwitterFollowNeo4j.dot
----------------------------------------------------------------------
diff --git a/local/twitter-follow-neo4j/src/main/resources/TwitterFollowNeo4j.dot b/local/twitter-follow-neo4j/src/main/resources/TwitterFollowNeo4j.dot
new file mode 100644
index 0000000..2d9e495
--- /dev/null
+++ b/local/twitter-follow-neo4j/src/main/resources/TwitterFollowNeo4j.dot
@@ -0,0 +1,39 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+ digraph g {
+
+ //providers
+ TwitterFollowingProvider [label="TwitterFollowingProvider",shape=ellipse,URL="https://github.com/apache/incubator-streams/blob/master/streams-contrib/streams-provider-twitter/src/main/java/org/apache/streams/twitter/provider/TwitterFollowingProvider.java"];
+
+ //processors
+ TypeConverterProcessor [label="TypeConverterProcessor",shape=ellipse,URL="https://github.com/apache/incubator-streams/blob/master/streams-components/streams-converters/src/main/java/org/apache/streams/converters/TypeConverterProcessor.java"];
+ ActivityConverterProcessor [label="ActivityConverterProcessor",shape=ellipse,URL="https://github.com/apache/incubator-streams/blob/master/streams-components/streams-converters/src/main/java/org/apache/streams/converters/ActivityConverterProcessor.java"];
+
+ //persisters
+ GraphPersistWriter [label="GraphPersistWriter",shape=ellipse,URL="https://github.com/apache/incubator-streams/blob/master/streams-contrib/streams-persist-graph/src/main/java/org/apache/streams/graph/GraphPersistWriter.java"];
+
+ //data
+ destination [label="http://{host}:{port}/db/data",shape=box];
+
+ //stream
+ TwitterFollowingProvider -> TypeConverterProcessor [label="Follow",URL="https://github.com/apache/incubator-streams/blob/master/streams-contrib/streams-provider-twitter/src/main/jsonschema/com/twitter/Follow.java"];
+ TypeConverterProcessor -> ActivityConverterProcessor [label="String"];
+ ActivityConverterProcessor -> GraphPersistWriter [label="Activity",URL="https://github.com/apache/incubator-streams/blob/master/streams-pojo/src/main/jsonschema/org/apache/streams/pojo/json/activity.json"];
+ GraphPersistWriter -> destination
+}
\ No newline at end of file
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/twitter-follow-neo4j/src/site/markdown/TwitterFollowNeo4j.md
----------------------------------------------------------------------
diff --git a/local/twitter-follow-neo4j/src/site/markdown/TwitterFollowNeo4j.md b/local/twitter-follow-neo4j/src/site/markdown/TwitterFollowNeo4j.md
new file mode 100644
index 0000000..936efb4
--- /dev/null
+++ b/local/twitter-follow-neo4j/src/site/markdown/TwitterFollowNeo4j.md
@@ -0,0 +1,33 @@
+### TwitterFollowNeo4j
+
+#### Description:
+
+Collects friend and follower connections for a set of Twitter users to build a graph database in Neo4j.
+
+#### Configuration:
+
+[TwitterFollowNeo4jIT.conf](TwitterFollowNeo4jIT.conf "TwitterFollowNeo4jIT.conf" )
+
+#### Run (SBT):
+
+ sbtx -210 -sbt-create
+ set resolvers += "Local Maven Repository" at "file://"+Path.userHome.absolutePath+"/.m2/repository"
+ set libraryDependencies += "org.apache.streams" % "twitter-follow-neo4j" % "0.4-incubating-SNAPSHOT"
+ set fork := true
+ set javaOptions +="-Dconfig.file=application.conf"
+ run org.apache.streams.example.TwitterFollowNeo4j
+
+#### Run (Docker):
+
+ docker run apachestreams/twitter-follow-neo4j java -cp twitter-follow-neo4j-jar-with-dependencies.jar org.apache.streams.example.TwitterFollowNeo4j
+
+#### Specification:
+
+[TwitterFollowNeo4j.dot](TwitterFollowNeo4j.dot "TwitterFollowNeo4j.dot" )
+
+#### Diagram:
+
+![TwitterFollowNeo4j.dot.svg](./TwitterFollowNeo4j.dot.svg)
+
+
+###### Licensed under Apache License 2.0 - http://www.apache.org/licenses/LICENSE-2.0
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/twitter-follow-neo4j/src/site/markdown/index.md
----------------------------------------------------------------------
diff --git a/local/twitter-follow-neo4j/src/site/markdown/index.md b/local/twitter-follow-neo4j/src/site/markdown/index.md
new file mode 100644
index 0000000..3efdc5b
--- /dev/null
+++ b/local/twitter-follow-neo4j/src/site/markdown/index.md
@@ -0,0 +1,42 @@
+### twitter-follow-neo4j
+
+#### Requirements:
+ - Authorized Twitter API credentials
+ - A running Neo4j 3.0.0+ instance
+
+#### Streams:
+
+<a href="TwitterFollowNeo4j.html" target="_self">TwitterFollowNeo4j</a>
+
+#### Build:
+
+ mvn clean package verify
+
+#### Test:
+
+Create a local file `application.conf` with valid Twitter credentials:
+
+ twitter {
+ oauth {
+ consumerKey = ""
+ consumerSecret = ""
+ accessToken = ""
+ accessTokenSecret = ""
+ }
+ }
+
+Start up Neo4j with Docker:
+
+ mvn -PdockerITs docker:start
+
+Build with integration testing enabled, using your credentials:
+
+ mvn clean test verify -DskipITs=false -DargLine="-Dconfig.file=`pwd`/application.conf"
+
+Shut down Neo4j when finished:
+
+ mvn -PdockerITs docker:stop
+
+[JavaDocs](apidocs/index.html "JavaDocs")
+
+###### Licensed under Apache License 2.0 - http://www.apache.org/licenses/LICENSE-2.0
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/twitter-follow-neo4j/src/site/resources/TwitterFollowGraph.dot
----------------------------------------------------------------------
diff --git a/local/twitter-follow-neo4j/src/site/resources/TwitterFollowGraph.dot b/local/twitter-follow-neo4j/src/site/resources/TwitterFollowGraph.dot
new file mode 100644
index 0000000..2d9e495
--- /dev/null
+++ b/local/twitter-follow-neo4j/src/site/resources/TwitterFollowGraph.dot
@@ -0,0 +1,39 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+ digraph g {
+
+ //providers
+ TwitterFollowingProvider [label="TwitterFollowingProvider",shape=ellipse,URL="https://github.com/apache/incubator-streams/blob/master/streams-contrib/streams-provider-twitter/src/main/java/org/apache/streams/twitter/provider/TwitterFollowingProvider.java"];
+
+ //processors
+ TypeConverterProcessor [label="TypeConverterProcessor",shape=ellipse,URL="https://github.com/apache/incubator-streams/blob/master/streams-components/streams-converters/src/main/java/org/apache/streams/converters/TypeConverterProcessor.java"];
+ ActivityConverterProcessor [label="ActivityConverterProcessor",shape=ellipse,URL="https://github.com/apache/incubator-streams/blob/master/streams-components/streams-converters/src/main/java/org/apache/streams/converters/ActivityConverterProcessor.java"];
+
+ //persisters
+ GraphPersistWriter [label="GraphPersistWriter",shape=ellipse,URL="https://github.com/apache/incubator-streams/blob/master/streams-contrib/streams-persist-graph/src/main/java/org/apache/streams/graph/GraphPersistWriter.java"];
+
+ //data
+ destination [label="http://{host}:{port}/db/data",shape=box];
+
+ //stream
+ TwitterFollowingProvider -> TypeConverterProcessor [label="Follow",URL="https://github.com/apache/incubator-streams/blob/master/streams-contrib/streams-provider-twitter/src/main/jsonschema/com/twitter/Follow.java"];
+ TypeConverterProcessor -> ActivityConverterProcessor [label="String"];
+ ActivityConverterProcessor -> GraphPersistWriter [label="Activity",URL="https://github.com/apache/incubator-streams/blob/master/streams-pojo/src/main/jsonschema/org/apache/streams/pojo/json/activity.json"];
+ GraphPersistWriter -> destination
+}
\ No newline at end of file
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/twitter-follow-neo4j/src/site/resources/TwitterFollowGraphConfiguration.json
----------------------------------------------------------------------
diff --git a/local/twitter-follow-neo4j/src/site/resources/TwitterFollowGraphConfiguration.json b/local/twitter-follow-neo4j/src/site/resources/TwitterFollowGraphConfiguration.json
new file mode 100644
index 0000000..6025640
--- /dev/null
+++ b/local/twitter-follow-neo4j/src/site/resources/TwitterFollowGraphConfiguration.json
@@ -0,0 +1,13 @@
+{
+ "$schema": "http://json-schema.org/draft-03/schema",
+ "$license": [
+ "http://www.apache.org/licenses/LICENSE-2.0"
+ ],
+ "type": "object",
+ "javaType" : "org.apache.streams.example.TwitterFollowNeo4jConfiguration",
+ "javaInterfaces": ["java.io.Serializable"],
+ "properties": {
+ "twitter": { "javaType": "org.apache.streams.twitter.TwitterFollowingConfiguration", "type": "object", "required": true },
+ "graph": { "javaType": "org.apache.streams.graph.GraphHttpConfiguration", "type": "object", "required": true }
+ }
+}
\ No newline at end of file
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/twitter-follow-neo4j/src/site/resources/TwitterFollowNeo4j.dot
----------------------------------------------------------------------
diff --git a/local/twitter-follow-neo4j/src/site/resources/TwitterFollowNeo4j.dot b/local/twitter-follow-neo4j/src/site/resources/TwitterFollowNeo4j.dot
new file mode 100644
index 0000000..2d9e495
--- /dev/null
+++ b/local/twitter-follow-neo4j/src/site/resources/TwitterFollowNeo4j.dot
@@ -0,0 +1,39 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+ digraph g {
+
+ //providers
+ TwitterFollowingProvider [label="TwitterFollowingProvider",shape=ellipse,URL="https://github.com/apache/incubator-streams/blob/master/streams-contrib/streams-provider-twitter/src/main/java/org/apache/streams/twitter/provider/TwitterFollowingProvider.java"];
+
+ //processors
+ TypeConverterProcessor [label="TypeConverterProcessor",shape=ellipse,URL="https://github.com/apache/incubator-streams/blob/master/streams-components/streams-converters/src/main/java/org/apache/streams/converters/TypeConverterProcessor.java"];
+ ActivityConverterProcessor [label="ActivityConverterProcessor",shape=ellipse,URL="https://github.com/apache/incubator-streams/blob/master/streams-components/streams-converters/src/main/java/org/apache/streams/converters/ActivityConverterProcessor.java"];
+
+ //persisters
+ GraphPersistWriter [label="GraphPersistWriter",shape=ellipse,URL="https://github.com/apache/incubator-streams/blob/master/streams-contrib/streams-persist-graph/src/main/java/org/apache/streams/graph/GraphPersistWriter.java"];
+
+ //data
+ destination [label="http://{host}:{port}/db/data",shape=box];
+
+ //stream
+ TwitterFollowingProvider -> TypeConverterProcessor [label="Follow",URL="https://github.com/apache/incubator-streams/blob/master/streams-contrib/streams-provider-twitter/src/main/jsonschema/com/twitter/Follow.java"];
+ TypeConverterProcessor -> ActivityConverterProcessor [label="String"];
+ ActivityConverterProcessor -> GraphPersistWriter [label="Activity",URL="https://github.com/apache/incubator-streams/blob/master/streams-pojo/src/main/jsonschema/org/apache/streams/pojo/json/activity.json"];
+ GraphPersistWriter -> destination
+}
\ No newline at end of file
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/twitter-follow-neo4j/src/site/resources/TwitterFollowNeo4jConfiguration.json
----------------------------------------------------------------------
diff --git a/local/twitter-follow-neo4j/src/site/resources/TwitterFollowNeo4jConfiguration.json b/local/twitter-follow-neo4j/src/site/resources/TwitterFollowNeo4jConfiguration.json
new file mode 100644
index 0000000..ffbd39d
--- /dev/null
+++ b/local/twitter-follow-neo4j/src/site/resources/TwitterFollowNeo4jConfiguration.json
@@ -0,0 +1,13 @@
+{
+ "$schema": "http://json-schema.org/draft-03/schema",
+ "$license": [
+ "http://www.apache.org/licenses/LICENSE-2.0"
+ ],
+ "type": "object",
+ "javaType" : "org.apache.streams.example.TwitterFollowNeo4jConfiguration",
+ "javaInterfaces": ["java.io.Serializable"],
+ "properties": {
+ "twitter": { "javaType": "org.apache.streams.twitter.TwitterFollowingConfiguration", "type": "object", "required": true },
+ "graph": { "javaType": "org.apache.streams.graph.GraphHttpConfiguration", "type": "object", "required": true }
+ }
+}
\ No newline at end of file
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/twitter-follow-neo4j/src/site/site.xml
----------------------------------------------------------------------
diff --git a/local/twitter-follow-neo4j/src/site/site.xml b/local/twitter-follow-neo4j/src/site/site.xml
new file mode 100644
index 0000000..a25bae0
--- /dev/null
+++ b/local/twitter-follow-neo4j/src/site/site.xml
@@ -0,0 +1,45 @@
+<?xml version="1.0" encoding="ISO-8859-1"?>
+<!--
+ ~ Licensed to the Apache Software Foundation (ASF) under one
+ ~ or more contributor license agreements. See the NOTICE file
+ ~ distributed with this work for additional information
+ ~ regarding copyright ownership. The ASF licenses this file
+ ~ to you under the Apache License, Version 2.0 (the
+ ~ "License"); you may not use this file except in compliance
+ ~ with the License. You may obtain a copy of the License at
+ ~
+ ~ http://www.apache.org/licenses/LICENSE-2.0
+ ~
+ ~ Unless required by applicable law or agreed to in writing,
+ ~ software distributed under the License is distributed on an
+ ~ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ ~ KIND, either express or implied. See the License for the
+ ~ specific language governing permissions and limitations
+ ~ under the License.
+ -->
+<project>
+ <custom>
+ <fluidoSkin>
+ <topBarEnabled>false</topBarEnabled>
+ <navBarStyle>navbar-inverse</navBarStyle>
+ <sideBarEnabled>true</sideBarEnabled>
+ <!--<gitHub>-->
+ <!--<projectId>apache/incubator-streams-examples</projectId>-->
+ <!--<ribbonOrientation>right</ribbonOrientation>-->
+ <!--<ribbonColor>black</ribbonColor>-->
+ <!--</gitHub>-->
+ <!--<twitter>-->
+ <!--<user>ApacheStreams</user>-->
+ <!--<showUser>true</showUser>-->
+ <!--<showFollowers>true</showFollowers>-->
+ <!--</twitter>-->
+ </fluidoSkin>
+ </custom>
+ <body>
+ <menu name="Configuration">
+ <item name="Neo4j" href="../../services/neo4j.html"/>
+ </menu>
+ <menu name="Credentials">
+ <item name="Twitter" href="../../credentials/twitter.html"/>
+ </menu>
+ </body>
+</project>
\ No newline at end of file
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/twitter-follow-neo4j/src/test/java/org/apache/streams/example/test/TwitterFollowNeo4jIT.java
----------------------------------------------------------------------
diff --git a/local/twitter-follow-neo4j/src/test/java/org/apache/streams/example/test/TwitterFollowNeo4jIT.java b/local/twitter-follow-neo4j/src/test/java/org/apache/streams/example/test/TwitterFollowNeo4jIT.java
new file mode 100644
index 0000000..2813b08
--- /dev/null
+++ b/local/twitter-follow-neo4j/src/test/java/org/apache/streams/example/test/TwitterFollowNeo4jIT.java
@@ -0,0 +1,79 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+package org.apache.streams.example.test;
+
+import com.typesafe.config.Config;
+import com.typesafe.config.ConfigFactory;
+import com.typesafe.config.ConfigParseOptions;
+import org.apache.streams.config.ComponentConfigurator;
+import org.apache.streams.config.StreamsConfiguration;
+import org.apache.streams.config.StreamsConfigurator;
+import org.apache.streams.example.TwitterFollowNeo4j;
+import org.apache.streams.example.TwitterFollowNeo4jConfiguration;
+import org.junit.Before;
+import org.junit.Test;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.File;
+import java.io.FileInputStream;
+import java.io.InputStream;
+import java.util.Properties;
+
+import static org.junit.Assert.assertFalse;
+import static org.junit.Assert.assertNotEquals;
+
+/**
+ * Integration test that collects Twitter friend and follower connections and
+ * writes them into a Neo4j graph database via TwitterFollowNeo4j.
+ */
+public class TwitterFollowNeo4jIT {
+
+ private final static Logger LOGGER = LoggerFactory.getLogger(TwitterFollowNeo4jIT.class);
+
+ protected TwitterFollowNeo4jConfiguration testConfiguration;
+
+ private int count = 0;
+
+ @Before
+ public void prepareTest() throws Exception {
+
+ Config reference = ConfigFactory.load();
+ File conf_file = new File("target/test-classes/TwitterFollowGraphIT.conf");
+ assert(conf_file.exists());
+ Config testResourceConfig = ConfigFactory.parseFileAnySyntax(conf_file, ConfigParseOptions.defaults().setAllowMissing(false));
+ Properties graph_properties = new Properties();
+ InputStream graph_stream = new FileInputStream("neo4j.properties");
+ graph_properties.load(graph_stream);
+ Config graphProps = ConfigFactory.parseProperties(graph_properties);
+ Config typesafe = testResourceConfig.withFallback(graphProps).withFallback(reference).resolve();
+ StreamsConfiguration streams = StreamsConfigurator.detectConfiguration(typesafe);
+ testConfiguration = new ComponentConfigurator<>(TwitterFollowNeo4jConfiguration.class).detectConfiguration(typesafe);
+
+ }
+
+ @Test
+ public void testTwitterFollowGraph() throws Exception {
+
+ TwitterFollowNeo4j stream = new TwitterFollowNeo4j(testConfiguration);
+
+ stream.run();
+
+ }
+
+}
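The prepareTest method above layers three configurations with withFallback: the test resource file wins over the Docker-generated neo4j.properties, which wins over the classpath reference. The fallback semantics can be sketched with plain maps; this is a conceptual model of the merge order, not the Typesafe Config implementation, and the keys are illustrative:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class FallbackSketch {

    /** Merge layers so that earlier maps win over later (fallback) maps,
     *  mirroring testResourceConfig.withFallback(graphProps).withFallback(reference). */
    @SafeVarargs
    static Map<String, String> withFallback(Map<String, String>... layers) {
        Map<String, String> merged = new LinkedHashMap<>();
        // Apply lowest-priority layers first so higher-priority puts overwrite them
        for (int i = layers.length - 1; i >= 0; i--) {
            merged.putAll(layers[i]);
        }
        return merged;
    }

    public static void main(String[] args) {
        Map<String, String> testResource = Map.of("graph.type", "neo4j");
        Map<String, String> dockerProps  = Map.of("neo4j.http.port", "32768");
        Map<String, String> reference    = Map.of("neo4j.http.port", "7474",
                                                  "graph.graph", "data");

        Map<String, String> merged = withFallback(testResource, dockerProps, reference);
        // The Docker-assigned port overrides the reference default,
        // while untouched reference keys survive the merge
        System.out.println(merged);
    }
}
```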
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/twitter-follow-neo4j/src/test/resources/TwitterFollowGraphIT.conf
----------------------------------------------------------------------
diff --git a/local/twitter-follow-neo4j/src/test/resources/TwitterFollowGraphIT.conf b/local/twitter-follow-neo4j/src/test/resources/TwitterFollowGraphIT.conf
new file mode 100644
index 0000000..d4b4aeb
--- /dev/null
+++ b/local/twitter-follow-neo4j/src/test/resources/TwitterFollowGraphIT.conf
@@ -0,0 +1,28 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied. See the License for the
+# specific language governing permissions and limitations
+# under the License.
+twitter {
+ endpoint = "friends"
+ info = [
+ 18055613
+ ]
+ max_items = 1000
+}
+graph {
+ hostname = ${neo4j.http.host}
+ port = ${neo4j.http.port}
+ type = "neo4j"
+ graph = "data"
+}
\ No newline at end of file
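The ${neo4j.http.host} and ${neo4j.http.port} references in the conf above are HOCON substitutions, filled in at resolve() time from the neo4j.properties file the docker-maven-plugin writes out. A minimal sketch of that substitution step, using a plain regex rather than the real Typesafe Config resolver:

```java
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class SubstitutionSketch {

    private static final Pattern REF = Pattern.compile("\\$\\{([^}]+)}");

    /** Replace ${key} references with values from props, as HOCON resolve() would. */
    static String resolve(String text, Map<String, String> props) {
        Matcher m = REF.matcher(text);
        StringBuilder out = new StringBuilder();
        while (m.find()) {
            String value = props.get(m.group(1));
            if (value == null) {
                // Typesafe Config likewise fails fast on an unresolved substitution
                throw new IllegalArgumentException("unresolved substitution: " + m.group(1));
            }
            m.appendReplacement(out, Matcher.quoteReplacement(value));
        }
        m.appendTail(out);
        return out.toString();
    }

    public static void main(String[] args) {
        // Stand-ins for the values docker-maven-plugin writes to neo4j.properties
        Map<String, String> props = Map.of("neo4j.http.host", "localhost",
                                           "neo4j.http.port", "32768");
        String conf = "graph { hostname = ${neo4j.http.host}, port = ${neo4j.http.port} }";
        System.out.println(resolve(conf, props));
    }
}
```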
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/twitter-history-elasticsearch/src/main/java/org/apache/streams/example/TwitterHistoryElasticsearch.java
----------------------------------------------------------------------
diff --git a/local/twitter-history-elasticsearch/src/main/java/org/apache/streams/example/TwitterHistoryElasticsearch.java b/local/twitter-history-elasticsearch/src/main/java/org/apache/streams/example/TwitterHistoryElasticsearch.java
new file mode 100644
index 0000000..7d87f36
--- /dev/null
+++ b/local/twitter-history-elasticsearch/src/main/java/org/apache/streams/example/TwitterHistoryElasticsearch.java
@@ -0,0 +1,81 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+package org.apache.streams.example;
+
+import com.fasterxml.jackson.databind.ObjectMapper;
+import org.apache.streams.config.ComponentConfigurator;
+import org.apache.streams.config.StreamsConfigurator;
+import org.apache.streams.converter.ActivityConverterProcessor;
+import org.apache.streams.core.StreamBuilder;
+import org.apache.streams.elasticsearch.ElasticsearchPersistWriter;
+import org.apache.streams.local.builders.LocalStreamBuilder;
+import org.apache.streams.twitter.provider.TwitterTimelineProvider;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Retrieves as many posts from a known list of users as the Twitter API allows.
+ *
+ * Converts them to activities, and writes them in activity format to Elasticsearch.
+ */
+
+public class TwitterHistoryElasticsearch implements Runnable {
+
+ public final static String STREAMS_ID = "TwitterHistoryElasticsearch";
+
+ private final static Logger LOGGER = LoggerFactory.getLogger(TwitterHistoryElasticsearch.class);
+
+ private static final ObjectMapper mapper = new ObjectMapper();
+
+ TwitterHistoryElasticsearchConfiguration config;
+
+ public TwitterHistoryElasticsearch() {
+ this(new ComponentConfigurator<>(TwitterHistoryElasticsearchConfiguration.class).detectConfiguration(StreamsConfigurator.getConfig()));
+
+ }
+
+ public TwitterHistoryElasticsearch(TwitterHistoryElasticsearchConfiguration config) {
+ this.config = config;
+ }
+
+ public static void main(String[] args)
+ {
+ LOGGER.info(StreamsConfigurator.config.toString());
+
+ TwitterHistoryElasticsearch history = new TwitterHistoryElasticsearch();
+
+ new Thread(history).start();
+
+ }
+
+
+ public void run() {
+
+ TwitterTimelineProvider provider = new TwitterTimelineProvider(config.getTwitter());
+ ActivityConverterProcessor converter = new ActivityConverterProcessor();
+ ElasticsearchPersistWriter writer = new ElasticsearchPersistWriter(config.getElasticsearch());
+
+ StreamBuilder builder = new LocalStreamBuilder(500);
+
+ builder.newPerpetualStream("provider", provider);
+ builder.addStreamsProcessor("converter", converter, 2, "provider");
+ builder.addStreamsPersistWriter("writer", writer, 1, "converter");
+ builder.start();
+ }
+}
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/twitter-history-elasticsearch/src/main/java/org/apache/streams/example/twitter/TwitterHistoryElasticsearch.java
----------------------------------------------------------------------
diff --git a/local/twitter-history-elasticsearch/src/main/java/org/apache/streams/example/twitter/TwitterHistoryElasticsearch.java b/local/twitter-history-elasticsearch/src/main/java/org/apache/streams/example/twitter/TwitterHistoryElasticsearch.java
deleted file mode 100644
index 090b9ed..0000000
--- a/local/twitter-history-elasticsearch/src/main/java/org/apache/streams/example/twitter/TwitterHistoryElasticsearch.java
+++ /dev/null
@@ -1,90 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements. See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership. The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied. See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.streams.example.twitter;
-
-import com.fasterxml.jackson.databind.ObjectMapper;
-import com.fasterxml.jackson.databind.node.ObjectNode;
-import com.typesafe.config.Config;
-import org.apache.streams.config.ComponentConfigurator;
-import org.apache.streams.config.StreamsConfigurator;
-import org.apache.streams.converter.ActivityConverterProcessor;
-import org.apache.streams.core.StreamBuilder;
-import org.apache.streams.elasticsearch.ElasticsearchConfigurator;
-import org.apache.streams.elasticsearch.ElasticsearchPersistWriter;
-import org.apache.streams.elasticsearch.ElasticsearchWriterConfiguration;
-import org.apache.streams.local.builders.LocalStreamBuilder;
-import org.apache.streams.pojo.json.Activity;
-import org.apache.streams.twitter.TwitterStreamConfiguration;
-import org.apache.streams.twitter.TwitterUserInformationConfiguration;
-import org.apache.streams.twitter.processor.TwitterTypeConverter;
-import org.apache.streams.twitter.provider.TwitterConfigurator;
-import org.apache.streams.twitter.provider.TwitterTimelineProvider;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
-/**
- * Retrieves as many posts from a known list of users as twitter API allows.
- *
- * Converts them to activities, and writes them in activity format to Elasticsearch.
- */
-
-public class TwitterHistoryElasticsearch implements Runnable {
-
- public final static String STREAMS_ID = "TwitterHistoryElasticsearch";
-
- private final static Logger LOGGER = LoggerFactory.getLogger(TwitterHistoryElasticsearch.class);
-
- private static final ObjectMapper mapper = new ObjectMapper();
-
- TwitterHistoryElasticsearchConfiguration config;
-
- public TwitterHistoryElasticsearch() {
- this(new ComponentConfigurator<>(TwitterHistoryElasticsearchConfiguration.class).detectConfiguration(StreamsConfigurator.getConfig()));
-
- }
-
- public TwitterHistoryElasticsearch(TwitterHistoryElasticsearchConfiguration config) {
- this.config = config;
- }
-
- public static void main(String[] args)
- {
- LOGGER.info(StreamsConfigurator.config.toString());
-
- TwitterHistoryElasticsearch history = new TwitterHistoryElasticsearch();
-
- new Thread(history).start();
-
- }
-
-
- public void run() {
-
- TwitterTimelineProvider provider = new TwitterTimelineProvider(config.getTwitter());
- ActivityConverterProcessor converter = new ActivityConverterProcessor();
- ElasticsearchPersistWriter writer = new ElasticsearchPersistWriter(config.getElasticsearch());
-
- StreamBuilder builder = new LocalStreamBuilder(500);
-
- builder.newPerpetualStream("provider", provider);
- builder.addStreamsProcessor("converter", converter, 2, "provider");
- builder.addStreamsPersistWriter("writer", writer, 1, "converter");
- builder.start();
- }
-}
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/twitter-history-elasticsearch/src/main/jsonschema/org/apache/streams/example/twitter/TwitterHistoryElasticsearchConfiguration.json
----------------------------------------------------------------------
diff --git a/local/twitter-history-elasticsearch/src/main/jsonschema/org/apache/streams/example/twitter/TwitterHistoryElasticsearchConfiguration.json b/local/twitter-history-elasticsearch/src/main/jsonschema/org/apache/streams/example/twitter/TwitterHistoryElasticsearchConfiguration.json
index ea9b165..eaf8028 100644
--- a/local/twitter-history-elasticsearch/src/main/jsonschema/org/apache/streams/example/twitter/TwitterHistoryElasticsearchConfiguration.json
+++ b/local/twitter-history-elasticsearch/src/main/jsonschema/org/apache/streams/example/twitter/TwitterHistoryElasticsearchConfiguration.json
@@ -4,7 +4,7 @@
"http://www.apache.org/licenses/LICENSE-2.0"
],
"type": "object",
- "javaType" : "org.apache.streams.example.twitter.TwitterHistoryElasticsearchConfiguration",
+ "javaType" : "org.apache.streams.example.TwitterHistoryElasticsearchConfiguration",
"javaInterfaces": ["java.io.Serializable"],
"properties": {
"twitter": { "javaType": "org.apache.streams.twitter.TwitterUserInformationConfiguration", "type": "object", "required": true },
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/twitter-history-elasticsearch/src/test/java/org/apache/streams/example/test/TwitterHistoryElasticsearchIT.java
----------------------------------------------------------------------
diff --git a/local/twitter-history-elasticsearch/src/test/java/org/apache/streams/example/test/TwitterHistoryElasticsearchIT.java b/local/twitter-history-elasticsearch/src/test/java/org/apache/streams/example/test/TwitterHistoryElasticsearchIT.java
new file mode 100644
index 0000000..b0c9155
--- /dev/null
+++ b/local/twitter-history-elasticsearch/src/test/java/org/apache/streams/example/test/TwitterHistoryElasticsearchIT.java
@@ -0,0 +1,108 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+package org.apache.streams.example.test;
+
+import com.typesafe.config.Config;
+import com.typesafe.config.ConfigFactory;
+import com.typesafe.config.ConfigParseOptions;
+import org.apache.streams.config.ComponentConfigurator;
+import org.apache.streams.config.StreamsConfiguration;
+import org.apache.streams.config.StreamsConfigurator;
+import org.apache.streams.elasticsearch.ElasticsearchClientManager;
+import org.apache.streams.example.TwitterHistoryElasticsearch;
+import org.apache.streams.example.TwitterHistoryElasticsearchConfiguration;
+import org.elasticsearch.action.admin.cluster.health.ClusterHealthRequest;
+import org.elasticsearch.action.admin.cluster.health.ClusterHealthResponse;
+import org.elasticsearch.action.admin.indices.exists.indices.IndicesExistsRequest;
+import org.elasticsearch.action.admin.indices.exists.indices.IndicesExistsResponse;
+import org.elasticsearch.action.search.SearchRequestBuilder;
+import org.elasticsearch.action.search.SearchResponse;
+import org.elasticsearch.client.Client;
+import org.elasticsearch.client.Requests;
+import org.elasticsearch.cluster.health.ClusterHealthStatus;
+import org.junit.Before;
+import org.junit.Test;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.File;
+import java.io.FileInputStream;
+import java.io.InputStream;
+import java.util.Properties;
+
+import static org.junit.Assert.assertFalse;
+import static org.junit.Assert.assertNotEquals;
+
+/**
+ * Integration test for TwitterHistoryElasticsearch: verifies that timeline posts are converted to activities and written to Elasticsearch.
+ */
+public class TwitterHistoryElasticsearchIT {
+
+ private final static Logger LOGGER = LoggerFactory.getLogger(TwitterHistoryElasticsearchIT.class);
+
+ protected TwitterHistoryElasticsearchConfiguration testConfiguration;
+ protected Client testClient;
+
+ private int count = 0;
+
+ @Before
+ public void prepareTest() throws Exception {
+
+ Config reference = ConfigFactory.load();
+ File conf_file = new File("target/test-classes/TwitterHistoryElasticsearchIT.conf");
+ assert(conf_file.exists());
+ Config testResourceConfig = ConfigFactory.parseFileAnySyntax(conf_file, ConfigParseOptions.defaults().setAllowMissing(false));
+ Properties es_properties = new Properties();
+ InputStream es_stream = new FileInputStream("elasticsearch.properties");
+ es_properties.load(es_stream);
+ Config esProps = ConfigFactory.parseProperties(es_properties);
+ Config typesafe = testResourceConfig.withFallback(esProps).withFallback(reference).resolve();
+ StreamsConfiguration streams = StreamsConfigurator.detectConfiguration(typesafe);
+ testConfiguration = new ComponentConfigurator<>(TwitterHistoryElasticsearchConfiguration.class).detectConfiguration(typesafe);
+ testClient = new ElasticsearchClientManager(testConfiguration.getElasticsearch()).getClient();
+
+ ClusterHealthRequest clusterHealthRequest = Requests.clusterHealthRequest();
+ ClusterHealthResponse clusterHealthResponse = testClient.admin().cluster().health(clusterHealthRequest).actionGet();
+ assertNotEquals(clusterHealthResponse.getStatus(), ClusterHealthStatus.RED);
+
+ IndicesExistsRequest indicesExistsRequest = Requests.indicesExistsRequest(testConfiguration.getElasticsearch().getIndex());
+ IndicesExistsResponse indicesExistsResponse = testClient.admin().indices().exists(indicesExistsRequest).actionGet();
+ assertFalse(indicesExistsResponse.isExists());
+
+ }
+
+ @Test
+ public void testTwitterHistoryElasticsearch() throws Exception {
+
+ TwitterHistoryElasticsearch stream = new TwitterHistoryElasticsearch(testConfiguration);
+
+ stream.run();
+
+ // count documents written to the target index
+ SearchRequestBuilder countRequest = testClient
+ .prepareSearch(testConfiguration.getElasticsearch().getIndex())
+ .setTypes(testConfiguration.getElasticsearch().getType());
+ SearchResponse countResponse = countRequest.execute().actionGet();
+
+ count = (int)countResponse.getHits().getTotalHits();
+
+ assertNotEquals(count, 0);
+ }
+
+}
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/twitter-history-elasticsearch/src/test/java/org/apache/streams/twitter/example/TwitterHistoryElasticsearchIT.java
----------------------------------------------------------------------
diff --git a/local/twitter-history-elasticsearch/src/test/java/org/apache/streams/twitter/example/TwitterHistoryElasticsearchIT.java b/local/twitter-history-elasticsearch/src/test/java/org/apache/streams/twitter/example/TwitterHistoryElasticsearchIT.java
deleted file mode 100644
index b8e1b64..0000000
--- a/local/twitter-history-elasticsearch/src/test/java/org/apache/streams/twitter/example/TwitterHistoryElasticsearchIT.java
+++ /dev/null
@@ -1,108 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements. See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership. The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied. See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.streams.twitter.example;
-
-import com.typesafe.config.Config;
-import com.typesafe.config.ConfigFactory;
-import com.typesafe.config.ConfigParseOptions;
-import org.apache.streams.config.ComponentConfigurator;
-import org.apache.streams.config.StreamsConfiguration;
-import org.apache.streams.config.StreamsConfigurator;
-import org.apache.streams.elasticsearch.ElasticsearchClientManager;
-import org.apache.streams.example.twitter.TwitterHistoryElasticsearch;
-import org.apache.streams.example.twitter.TwitterHistoryElasticsearchConfiguration;
-import org.elasticsearch.action.admin.cluster.health.ClusterHealthRequest;
-import org.elasticsearch.action.admin.cluster.health.ClusterHealthResponse;
-import org.elasticsearch.action.admin.indices.exists.indices.IndicesExistsRequest;
-import org.elasticsearch.action.admin.indices.exists.indices.IndicesExistsResponse;
-import org.elasticsearch.action.search.SearchRequestBuilder;
-import org.elasticsearch.action.search.SearchResponse;
-import org.elasticsearch.client.Client;
-import org.elasticsearch.client.Requests;
-import org.elasticsearch.cluster.health.ClusterHealthStatus;
-import org.junit.Before;
-import org.junit.Test;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
-import java.io.File;
-import java.io.FileInputStream;
-import java.io.InputStream;
-import java.util.Properties;
-
-import static org.junit.Assert.assertFalse;
-import static org.junit.Assert.assertNotEquals;
-
-/**
- * Example stream that populates elasticsearch with activities from twitter userstream in real-time
- */
-public class TwitterHistoryElasticsearchIT {
-
- private final static Logger LOGGER = LoggerFactory.getLogger(TwitterHistoryElasticsearchIT.class);
-
- protected TwitterHistoryElasticsearchConfiguration testConfiguration;
- protected Client testClient;
-
- private int count = 0;
-
- @Before
- public void prepareTest() throws Exception {
-
- Config reference = ConfigFactory.load();
- File conf_file = new File("target/test-classes/TwitterHistoryElasticsearchIT.conf");
- assert(conf_file.exists());
- Config testResourceConfig = ConfigFactory.parseFileAnySyntax(conf_file, ConfigParseOptions.defaults().setAllowMissing(false));
- Properties es_properties = new Properties();
- InputStream es_stream = new FileInputStream("elasticsearch.properties");
- es_properties.load(es_stream);
- Config esProps = ConfigFactory.parseProperties(es_properties);
- Config typesafe = testResourceConfig.withFallback(esProps).withFallback(reference).resolve();
- StreamsConfiguration streams = StreamsConfigurator.detectConfiguration(typesafe);
- testConfiguration = new ComponentConfigurator<>(TwitterHistoryElasticsearchConfiguration.class).detectConfiguration(typesafe);
- testClient = new ElasticsearchClientManager(testConfiguration.getElasticsearch()).getClient();
-
- ClusterHealthRequest clusterHealthRequest = Requests.clusterHealthRequest();
- ClusterHealthResponse clusterHealthResponse = testClient.admin().cluster().health(clusterHealthRequest).actionGet();
- assertNotEquals(clusterHealthResponse.getStatus(), ClusterHealthStatus.RED);
-
- IndicesExistsRequest indicesExistsRequest = Requests.indicesExistsRequest(testConfiguration.getElasticsearch().getIndex());
- IndicesExistsResponse indicesExistsResponse = testClient.admin().indices().exists(indicesExistsRequest).actionGet();
- assertFalse(indicesExistsResponse.isExists());
-
- }
-
- @Test
- public void testTwitterHistoryElasticsearch() throws Exception {
-
- TwitterHistoryElasticsearch stream = new TwitterHistoryElasticsearch(testConfiguration);
-
- stream.run();
-
- // assert lines in file
- SearchRequestBuilder countRequest = testClient
- .prepareSearch(testConfiguration.getElasticsearch().getIndex())
- .setTypes(testConfiguration.getElasticsearch().getType());
- SearchResponse countResponse = countRequest.execute().actionGet();
-
- count = (int)countResponse.getHits().getTotalHits();
-
- assertNotEquals(count, 0);
- }
-
-}
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/twitter-userstream-elasticsearch/src/main/java/org/apache/streams/example/TwitterUserstreamElasticsearch.java
----------------------------------------------------------------------
diff --git a/local/twitter-userstream-elasticsearch/src/main/java/org/apache/streams/example/TwitterUserstreamElasticsearch.java b/local/twitter-userstream-elasticsearch/src/main/java/org/apache/streams/example/TwitterUserstreamElasticsearch.java
new file mode 100644
index 0000000..f1e776a
--- /dev/null
+++ b/local/twitter-userstream-elasticsearch/src/main/java/org/apache/streams/example/TwitterUserstreamElasticsearch.java
@@ -0,0 +1,146 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+package org.apache.streams.example;
+
+import com.google.common.base.Preconditions;
+import com.google.common.collect.Lists;
+import com.google.common.collect.Maps;
+import com.google.common.collect.Sets;
+import org.apache.streams.config.ComponentConfigurator;
+import org.apache.streams.config.StreamsConfigurator;
+import org.apache.streams.converter.ActivityConverterProcessor;
+import org.apache.streams.core.StreamsDatum;
+import org.apache.streams.core.StreamsProcessor;
+import org.apache.streams.elasticsearch.ElasticsearchPersistDeleter;
+import org.apache.streams.elasticsearch.ElasticsearchPersistWriter;
+import org.apache.streams.elasticsearch.ElasticsearchWriterConfiguration;
+import org.apache.streams.example.TwitterUserstreamElasticsearchConfiguration;
+import org.apache.streams.filters.VerbDefinitionDropFilter;
+import org.apache.streams.filters.VerbDefinitionKeepFilter;
+import org.apache.streams.local.builders.LocalStreamBuilder;
+import org.apache.streams.core.StreamBuilder;
+import org.apache.streams.pojo.json.Activity;
+import org.apache.streams.twitter.TwitterStreamConfiguration;
+import org.apache.streams.twitter.provider.TwitterStreamProvider;
+import org.apache.streams.verbs.ObjectCombination;
+import org.apache.streams.verbs.VerbDefinition;
+import org.elasticsearch.common.Strings;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.util.List;
+import java.util.Map;
+
+/**
+ * Example stream that populates Elasticsearch with activities from the Twitter userstream in real-time.
+ */
+public class TwitterUserstreamElasticsearch implements Runnable {
+
+ public final static String STREAMS_ID = "TwitterUserstreamElasticsearch";
+
+ private final static Logger LOGGER = LoggerFactory.getLogger(TwitterUserstreamElasticsearch.class);
+
+ /* this pattern will match any/only deletes */
+ private static VerbDefinition deleteVerbDefinition =
+ new VerbDefinition()
+ .withValue("delete")
+ .withObjects(Lists.newArrayList(new ObjectCombination()));
+
+ TwitterUserstreamElasticsearchConfiguration config;
+
+ public TwitterUserstreamElasticsearch() {
+ this(new ComponentConfigurator<>(TwitterUserstreamElasticsearchConfiguration.class).detectConfiguration(StreamsConfigurator.getConfig()));
+
+ }
+
+ public TwitterUserstreamElasticsearch(TwitterUserstreamElasticsearchConfiguration config) {
+ this.config = config;
+ }
+
+ public static void main(String[] args)
+ {
+ LOGGER.info(StreamsConfigurator.config.toString());
+
+ TwitterUserstreamElasticsearch userstream = new TwitterUserstreamElasticsearch();
+ new Thread(userstream).start();
+
+ }
+
+ @Override
+ public void run() {
+
+ TwitterStreamConfiguration twitterStreamConfiguration = config.getTwitter();
+ ElasticsearchWriterConfiguration elasticsearchWriterConfiguration = config.getElasticsearch();
+
+ TwitterStreamProvider stream = new TwitterStreamProvider(twitterStreamConfiguration);
+ ActivityConverterProcessor converter = new ActivityConverterProcessor();
+ VerbDefinitionDropFilter noDeletesProcessor = new VerbDefinitionDropFilter(Sets.newHashSet(deleteVerbDefinition));
+ ElasticsearchPersistWriter writer = new ElasticsearchPersistWriter(elasticsearchWriterConfiguration);
+ VerbDefinitionKeepFilter deleteOnlyProcessor = new VerbDefinitionKeepFilter(Sets.newHashSet(deleteVerbDefinition));
+ SetDeleteIdProcessor setDeleteIdProcessor = new SetDeleteIdProcessor();
+ ElasticsearchPersistDeleter deleter = new ElasticsearchPersistDeleter(elasticsearchWriterConfiguration);
+
+ Map<String, Object> streamConfig = Maps.newHashMap();
+ streamConfig.put(LocalStreamBuilder.TIMEOUT_KEY, 12 * 60 * 1000);
+ StreamBuilder builder = new LocalStreamBuilder(25, streamConfig);
+
+ builder.newPerpetualStream(TwitterStreamProvider.STREAMS_ID, stream);
+ builder.addStreamsProcessor("converter", converter, 2, TwitterStreamProvider.STREAMS_ID);
+ builder.addStreamsProcessor("NoDeletesProcessor", noDeletesProcessor, 1, "converter");
+ builder.addStreamsPersistWriter(ElasticsearchPersistWriter.STREAMS_ID, writer, 1, "NoDeletesProcessor");
+ builder.addStreamsProcessor("DeleteOnlyProcessor", deleteOnlyProcessor, 1, "converter");
+ builder.addStreamsProcessor("SetDeleteIdProcessor", setDeleteIdProcessor, 1, "DeleteOnlyProcessor");
+ builder.addStreamsPersistWriter("deleter", deleter, 1, "SetDeleteIdProcessor");
+
+ builder.start();
+
+ }
+
+ protected class SetDeleteIdProcessor implements StreamsProcessor {
+
+ public String getId() {
+ return "TwitterUserstreamElasticsearch.SetDeleteIdProcessor";
+ }
+
+ @Override
+ public List<StreamsDatum> process(StreamsDatum entry) {
+
+ Preconditions.checkArgument(entry.getDocument() instanceof Activity);
+ String id = entry.getId();
+ // replace delete with post in id
+ // ensure ElasticsearchPersistDeleter will remove original post if present
+ id = Strings.replace(id, "delete", "post");
+ entry.setId(id);
+
+ return Lists.newArrayList(entry);
+ }
+
+ @Override
+ public void prepare(Object configurationObject) {
+
+
+ }
+
+ @Override
+ public void cleanUp() {
+
+ }
+ }
+
+}
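The SetDeleteIdProcessor above hinges on a simple id rewrite: a delete activity's id is assumed to differ from the original post's id only by the verb token, so swapping `delete` for `post` yields the document id the ElasticsearchPersistDeleter must remove. A standalone sketch of that rewrite — the id format shown is an assumption for illustration, not taken from this repository:

```java
public class DeleteIdRewrite {

    // Mirrors the rewrite inside SetDeleteIdProcessor: swap the verb token so
    // the delete datum carries the id of the previously indexed post.
    static String toPostId(String deleteId) {
        return deleteId.replace("delete", "post");
    }

    public static void main(String[] args) {
        // Hypothetical activity id format for illustration only.
        String deleteId = "id:twitter:delete:123456";
        System.out.println(toPostId(deleteId)); // prints "id:twitter:post:123456"
    }
}
```

Because the delete branch of the stream keeps only datums matching the delete VerbDefinition, every id reaching this processor is expected to contain the `delete` token.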
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/twitter-userstream-elasticsearch/src/main/java/org/apache/streams/twitter/example/TwitterUserstreamElasticsearch.java
----------------------------------------------------------------------
diff --git a/local/twitter-userstream-elasticsearch/src/main/java/org/apache/streams/twitter/example/TwitterUserstreamElasticsearch.java b/local/twitter-userstream-elasticsearch/src/main/java/org/apache/streams/twitter/example/TwitterUserstreamElasticsearch.java
deleted file mode 100644
index c483742..0000000
--- a/local/twitter-userstream-elasticsearch/src/main/java/org/apache/streams/twitter/example/TwitterUserstreamElasticsearch.java
+++ /dev/null
@@ -1,146 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements. See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership. The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied. See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.streams.twitter.example;
-
-import com.google.common.base.Preconditions;
-import com.google.common.collect.Lists;
-import com.google.common.collect.Maps;
-import com.google.common.collect.Sets;
-import org.apache.streams.config.ComponentConfigurator;
-import org.apache.streams.config.StreamsConfigurator;
-import org.apache.streams.converter.ActivityConverterProcessor;
-import org.apache.streams.core.StreamsDatum;
-import org.apache.streams.core.StreamsProcessor;
-import org.apache.streams.elasticsearch.ElasticsearchPersistDeleter;
-import org.apache.streams.elasticsearch.ElasticsearchPersistWriter;
-import org.apache.streams.elasticsearch.ElasticsearchWriterConfiguration;
-import org.apache.streams.example.twitter.TwitterUserstreamElasticsearchConfiguration;
-import org.apache.streams.filters.VerbDefinitionDropFilter;
-import org.apache.streams.filters.VerbDefinitionKeepFilter;
-import org.apache.streams.local.builders.LocalStreamBuilder;
-import org.apache.streams.core.StreamBuilder;
-import org.apache.streams.pojo.json.Activity;
-import org.apache.streams.twitter.TwitterStreamConfiguration;
-import org.apache.streams.twitter.provider.TwitterStreamProvider;
-import org.apache.streams.verbs.ObjectCombination;
-import org.apache.streams.verbs.VerbDefinition;
-import org.elasticsearch.common.Strings;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
-import java.util.List;
-import java.util.Map;
-
-/**
- * Example stream that populates elasticsearch with activities from twitter userstream in real-time
- */
-public class TwitterUserstreamElasticsearch implements Runnable {
-
- public final static String STREAMS_ID = "TwitterUserstreamElasticsearch";
-
- private final static Logger LOGGER = LoggerFactory.getLogger(TwitterUserstreamElasticsearch.class);
-
- /* this pattern will match any/only deletes */
- private static VerbDefinition deleteVerbDefinition =
- new VerbDefinition()
- .withValue("delete")
- .withObjects(Lists.newArrayList(new ObjectCombination()));
-
- TwitterUserstreamElasticsearchConfiguration config;
-
- public TwitterUserstreamElasticsearch() {
- this(new ComponentConfigurator<>(TwitterUserstreamElasticsearchConfiguration.class).detectConfiguration(StreamsConfigurator.getConfig()));
-
- }
-
- public TwitterUserstreamElasticsearch(TwitterUserstreamElasticsearchConfiguration config) {
- this.config = config;
- }
-
- public static void main(String[] args)
- {
- LOGGER.info(StreamsConfigurator.config.toString());
-
- TwitterUserstreamElasticsearch userstream = new TwitterUserstreamElasticsearch();
- new Thread(userstream).start();
-
- }
-
- @Override
- public void run() {
-
- TwitterStreamConfiguration twitterStreamConfiguration = config.getTwitter();
- ElasticsearchWriterConfiguration elasticsearchWriterConfiguration = config.getElasticsearch();
-
- TwitterStreamProvider stream = new TwitterStreamProvider(twitterStreamConfiguration);
- ActivityConverterProcessor converter = new ActivityConverterProcessor();
- VerbDefinitionDropFilter noDeletesProcessor = new VerbDefinitionDropFilter(Sets.newHashSet(deleteVerbDefinition));
- ElasticsearchPersistWriter writer = new ElasticsearchPersistWriter(elasticsearchWriterConfiguration);
- VerbDefinitionKeepFilter deleteOnlyProcessor = new VerbDefinitionKeepFilter(Sets.newHashSet(deleteVerbDefinition));
- SetDeleteIdProcessor setDeleteIdProcessor = new SetDeleteIdProcessor();
- ElasticsearchPersistDeleter deleter = new ElasticsearchPersistDeleter(elasticsearchWriterConfiguration);
-
- Map<String, Object> streamConfig = Maps.newHashMap();
- streamConfig.put(LocalStreamBuilder.TIMEOUT_KEY, 12 * 60 * 1000);
- StreamBuilder builder = new LocalStreamBuilder(25, streamConfig);
-
- builder.newPerpetualStream(TwitterStreamProvider.STREAMS_ID, stream);
- builder.addStreamsProcessor("converter", converter, 2, TwitterStreamProvider.STREAMS_ID);
- builder.addStreamsProcessor("NoDeletesProcessor", noDeletesProcessor, 1, "converter");
- builder.addStreamsPersistWriter(ElasticsearchPersistWriter.STREAMS_ID, writer, 1, "NoDeletesProcessor");
- builder.addStreamsProcessor("DeleteOnlyProcessor", deleteOnlyProcessor, 1, "converter");
- builder.addStreamsProcessor("SetDeleteIdProcessor", setDeleteIdProcessor, 1, "DeleteOnlyProcessor");
- builder.addStreamsPersistWriter("deleter", deleter, 1, "SetDeleteIdProcessor");
-
- builder.start();
-
- }
-
- protected class SetDeleteIdProcessor implements StreamsProcessor {
-
- public String getId() {
- return "TwitterUserstreamElasticsearch.SetDeleteIdProcessor";
- }
-
- @Override
- public List<StreamsDatum> process(StreamsDatum entry) {
-
- Preconditions.checkArgument(entry.getDocument() instanceof Activity);
- String id = entry.getId();
- // replace delete with post in id
- // ensure ElasticsearchPersistDeleter will remove original post if present
- id = Strings.replace(id, "delete", "post");
- entry.setId(id);
-
- return Lists.newArrayList(entry);
- }
-
- @Override
- public void prepare(Object configurationObject) {
-
-
- }
-
- @Override
- public void cleanUp() {
-
- }
- }
-
-}
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/twitter-userstream-elasticsearch/src/main/jsonschema/org/apache/streams/example/twitter/TwitterUserstreamElasticsearchConfiguration.json
----------------------------------------------------------------------
diff --git a/local/twitter-userstream-elasticsearch/src/main/jsonschema/org/apache/streams/example/twitter/TwitterUserstreamElasticsearchConfiguration.json b/local/twitter-userstream-elasticsearch/src/main/jsonschema/org/apache/streams/example/twitter/TwitterUserstreamElasticsearchConfiguration.json
index 6a25850..7261439 100644
--- a/local/twitter-userstream-elasticsearch/src/main/jsonschema/org/apache/streams/example/twitter/TwitterUserstreamElasticsearchConfiguration.json
+++ b/local/twitter-userstream-elasticsearch/src/main/jsonschema/org/apache/streams/example/twitter/TwitterUserstreamElasticsearchConfiguration.json
@@ -4,7 +4,7 @@
"http://www.apache.org/licenses/LICENSE-2.0"
],
"type": "object",
- "javaType" : "org.apache.streams.example.twitter.TwitterUserstreamElasticsearchConfiguration",
+ "javaType" : "org.apache.streams.example.TwitterUserstreamElasticsearchConfiguration",
"javaInterfaces": ["java.io.Serializable"],
"properties": {
"twitter": { "javaType": "org.apache.streams.twitter.TwitterStreamConfiguration", "type": "object", "required": true },
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/twitter-userstream-elasticsearch/src/test/java/org/apache/streams/example/test/TwitterUserstreamElasticsearchIT.java
----------------------------------------------------------------------
diff --git a/local/twitter-userstream-elasticsearch/src/test/java/org/apache/streams/example/test/TwitterUserstreamElasticsearchIT.java b/local/twitter-userstream-elasticsearch/src/test/java/org/apache/streams/example/test/TwitterUserstreamElasticsearchIT.java
new file mode 100644
index 0000000..7ba9940
--- /dev/null
+++ b/local/twitter-userstream-elasticsearch/src/test/java/org/apache/streams/example/test/TwitterUserstreamElasticsearchIT.java
@@ -0,0 +1,109 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+package org.apache.streams.example.test;
+
+import com.typesafe.config.Config;
+import com.typesafe.config.ConfigFactory;
+import com.typesafe.config.ConfigParseOptions;
+import org.apache.streams.config.ComponentConfigurator;
+import org.apache.streams.config.StreamsConfiguration;
+import org.apache.streams.config.StreamsConfigurator;
+import org.apache.streams.elasticsearch.ElasticsearchClientManager;
+import org.apache.streams.example.TwitterUserstreamElasticsearchConfiguration;
+import org.apache.streams.example.TwitterUserstreamElasticsearch;
+import org.elasticsearch.action.admin.cluster.health.ClusterHealthRequest;
+import org.elasticsearch.action.admin.cluster.health.ClusterHealthResponse;
+import org.elasticsearch.action.admin.indices.exists.indices.IndicesExistsRequest;
+import org.elasticsearch.action.admin.indices.exists.indices.IndicesExistsResponse;
+import org.elasticsearch.action.search.SearchRequestBuilder;
+import org.elasticsearch.action.search.SearchResponse;
+import org.elasticsearch.client.Client;
+import org.elasticsearch.client.Requests;
+import org.elasticsearch.cluster.health.ClusterHealthStatus;
+import org.junit.Before;
+import org.junit.Test;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.File;
+import java.io.FileInputStream;
+import java.io.InputStream;
+import java.util.Properties;
+
+import static junit.framework.TestCase.assertTrue;
+import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertFalse;
+import static org.junit.Assert.assertNotEquals;
+
+/**
+ * Test copying documents between two indexes on same cluster
+ */
+public class TwitterUserstreamElasticsearchIT {
+
+ private final static Logger LOGGER = LoggerFactory.getLogger(TwitterUserstreamElasticsearchIT.class);
+
+ protected TwitterUserstreamElasticsearchConfiguration testConfiguration;
+ protected Client testClient;
+
+ private int count = 0;
+
+ @Before
+ public void prepareTest() throws Exception {
+
+ Config reference = ConfigFactory.load();
+ File conf_file = new File("target/test-classes/TwitterUserstreamElasticsearchIT.conf");
+ assert(conf_file.exists());
+ Config testResourceConfig = ConfigFactory.parseFileAnySyntax(conf_file, ConfigParseOptions.defaults().setAllowMissing(false));
+ Properties es_properties = new Properties();
+ InputStream es_stream = new FileInputStream("elasticsearch.properties");
+ es_properties.load(es_stream);
+ Config esProps = ConfigFactory.parseProperties(es_properties);
+ Config typesafe = testResourceConfig.withFallback(esProps).withFallback(reference).resolve();
+ StreamsConfiguration streams = StreamsConfigurator.detectConfiguration(typesafe);
+ testConfiguration = new ComponentConfigurator<>(TwitterUserstreamElasticsearchConfiguration.class).detectConfiguration(typesafe);
+ testClient = new ElasticsearchClientManager(testConfiguration.getElasticsearch()).getClient();
+
+ ClusterHealthRequest clusterHealthRequest = Requests.clusterHealthRequest();
+ ClusterHealthResponse clusterHealthResponse = testClient.admin().cluster().health(clusterHealthRequest).actionGet();
+ assertNotEquals(clusterHealthResponse.getStatus(), ClusterHealthStatus.RED);
+
+ IndicesExistsRequest indicesExistsRequest = Requests.indicesExistsRequest(testConfiguration.getElasticsearch().getIndex());
+ IndicesExistsResponse indicesExistsResponse = testClient.admin().indices().exists(indicesExistsRequest).actionGet();
+ assertFalse(indicesExistsResponse.isExists());
+
+ }
+
+ @Test
+ public void testReindex() throws Exception {
+
+ TwitterUserstreamElasticsearch stream = new TwitterUserstreamElasticsearch(testConfiguration);
+
+ stream.run();
+
+ // assert lines in file
+ SearchRequestBuilder countRequest = testClient
+ .prepareSearch(testConfiguration.getElasticsearch().getIndex())
+ .setTypes(testConfiguration.getElasticsearch().getType());
+ SearchResponse countResponse = countRequest.execute().actionGet();
+
+ count = (int)countResponse.getHits().getTotalHits();
+
+ assertNotEquals(count, 0);
+ }
+}
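The `prepareTest` method in the new integration test above layers configuration sources with Typesafe Config's `withFallback`: the test resource file takes priority, then `elasticsearch.properties`, then the reference config. A minimal stdlib-only sketch of that fallback-resolution idea, assuming nothing about the Typesafe Config or Streams APIs (all names here are hypothetical):

```java
import java.util.HashMap;
import java.util.LinkedHashMap;
import java.util.Map;

/**
 * Stdlib-only sketch of "withFallback" layering: layers are supplied
 * highest-priority first, and a key wins in the first layer that
 * defines it. Illustrative only; not the Typesafe Config API.
 */
public class ConfigLayering {

    /** Merge layers highest-priority first via first-writer-wins. */
    @SafeVarargs
    public static Map<String, String> resolve(Map<String, String>... layers) {
        Map<String, String> merged = new LinkedHashMap<>();
        for (Map<String, String> layer : layers) {
            for (Map.Entry<String, String> e : layer.entrySet()) {
                merged.putIfAbsent(e.getKey(), e.getValue());
            }
        }
        return merged;
    }

    public static void main(String[] args) {
        Map<String, String> testResource = new HashMap<>();
        testResource.put("twitter.track", "apache");

        Map<String, String> esProps = new HashMap<>();
        esProps.put("elasticsearch.port", "9300");

        Map<String, String> reference = new HashMap<>();
        reference.put("elasticsearch.port", "9200"); // shadowed by esProps
        reference.put("batchSize", "100");           // survives: defined nowhere else

        // Mirrors: testResourceConfig.withFallback(esProps).withFallback(reference)
        System.out.println(resolve(testResource, esProps, reference));
    }
}
```

The same priority order holds in the test: a key set in `TwitterUserstreamElasticsearchIT.conf` shadows the same key in `elasticsearch.properties`, which in turn shadows the reference config.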
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/twitter-userstream-elasticsearch/src/test/java/org/apache/streams/example/twitter/test/TwitterUserstreamElasticsearchIT.java
----------------------------------------------------------------------
diff --git a/local/twitter-userstream-elasticsearch/src/test/java/org/apache/streams/example/twitter/test/TwitterUserstreamElasticsearchIT.java b/local/twitter-userstream-elasticsearch/src/test/java/org/apache/streams/example/twitter/test/TwitterUserstreamElasticsearchIT.java
deleted file mode 100644
index 2f524f0..0000000
--- a/local/twitter-userstream-elasticsearch/src/test/java/org/apache/streams/example/twitter/test/TwitterUserstreamElasticsearchIT.java
+++ /dev/null
@@ -1,111 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements. See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership. The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied. See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.streams.example.twitter.test;
-
-import com.fasterxml.jackson.databind.ObjectMapper;
-import com.typesafe.config.Config;
-import com.typesafe.config.ConfigFactory;
-import com.typesafe.config.ConfigParseOptions;
-import org.apache.streams.config.ComponentConfigurator;
-import org.apache.streams.config.StreamsConfiguration;
-import org.apache.streams.config.StreamsConfigurator;
-import org.apache.streams.elasticsearch.ElasticsearchClientManager;
-import org.apache.streams.example.twitter.TwitterUserstreamElasticsearchConfiguration;
-import org.apache.streams.jackson.StreamsJacksonMapper;
-import org.apache.streams.twitter.example.TwitterUserstreamElasticsearch;
-import org.elasticsearch.action.admin.cluster.health.ClusterHealthRequest;
-import org.elasticsearch.action.admin.cluster.health.ClusterHealthResponse;
-import org.elasticsearch.action.admin.indices.exists.indices.IndicesExistsRequest;
-import org.elasticsearch.action.admin.indices.exists.indices.IndicesExistsResponse;
-import org.elasticsearch.action.search.SearchRequestBuilder;
-import org.elasticsearch.action.search.SearchResponse;
-import org.elasticsearch.client.Client;
-import org.elasticsearch.client.Requests;
-import org.elasticsearch.cluster.health.ClusterHealthStatus;
-import org.junit.Before;
-import org.junit.Test;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
-import java.io.File;
-import java.io.FileInputStream;
-import java.io.InputStream;
-import java.util.Properties;
-
-import static junit.framework.TestCase.assertTrue;
-import static org.junit.Assert.assertEquals;
-import static org.junit.Assert.assertFalse;
-import static org.junit.Assert.assertNotEquals;
-
-/**
- * Test copying documents between two indexes on same cluster
- */
-public class TwitterUserstreamElasticsearchIT {
-
- private final static Logger LOGGER = LoggerFactory.getLogger(TwitterUserstreamElasticsearchIT.class);
-
- protected TwitterUserstreamElasticsearchConfiguration testConfiguration;
- protected Client testClient;
-
- private int count = 0;
-
- @Before
- public void prepareTest() throws Exception {
-
- Config reference = ConfigFactory.load();
- File conf_file = new File("target/test-classes/TwitterUserstreamElasticsearchIT.conf");
- assert(conf_file.exists());
- Config testResourceConfig = ConfigFactory.parseFileAnySyntax(conf_file, ConfigParseOptions.defaults().setAllowMissing(false));
- Properties es_properties = new Properties();
- InputStream es_stream = new FileInputStream("elasticsearch.properties");
- es_properties.load(es_stream);
- Config esProps = ConfigFactory.parseProperties(es_properties);
- Config typesafe = testResourceConfig.withFallback(esProps).withFallback(reference).resolve();
- StreamsConfiguration streams = StreamsConfigurator.detectConfiguration(typesafe);
- testConfiguration = new ComponentConfigurator<>(TwitterUserstreamElasticsearchConfiguration.class).detectConfiguration(typesafe);
- testClient = new ElasticsearchClientManager(testConfiguration.getElasticsearch()).getClient();
-
- ClusterHealthRequest clusterHealthRequest = Requests.clusterHealthRequest();
- ClusterHealthResponse clusterHealthResponse = testClient.admin().cluster().health(clusterHealthRequest).actionGet();
- assertNotEquals(clusterHealthResponse.getStatus(), ClusterHealthStatus.RED);
-
- IndicesExistsRequest indicesExistsRequest = Requests.indicesExistsRequest(testConfiguration.getElasticsearch().getIndex());
- IndicesExistsResponse indicesExistsResponse = testClient.admin().indices().exists(indicesExistsRequest).actionGet();
- assertFalse(indicesExistsResponse.isExists());
-
- }
-
- @Test
- public void testReindex() throws Exception {
-
- TwitterUserstreamElasticsearch stream = new TwitterUserstreamElasticsearch(testConfiguration);
-
- stream.run();
-
- // assert lines in file
- SearchRequestBuilder countRequest = testClient
- .prepareSearch(testConfiguration.getElasticsearch().getIndex())
- .setTypes(testConfiguration.getElasticsearch().getType());
- SearchResponse countResponse = countRequest.execute().actionGet();
-
- count = (int)countResponse.getHits().getTotalHits();
-
- assertNotEquals(count, 0);
- }
-}
[4/9] incubator-streams-examples git commit: normalize package names in streams-examples/local
Posted by sb...@apache.org.
normalize package names in streams-examples/local
Project: http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/commit/5b96588c
Tree: http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/tree/5b96588c
Diff: http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/diff/5b96588c
Branch: refs/heads/master
Commit: 5b96588c492cfafa1732e5d5a680df8485bdb491
Parents: e949f58
Author: Steve Blackmon @steveblackmon <sb...@apache.org>
Authored: Tue Oct 11 16:30:11 2016 -0500
Committer: Steve Blackmon @steveblackmon <sb...@apache.org>
Committed: Tue Oct 11 16:39:07 2016 -0500
----------------------------------------------------------------------
.../example/ElasticsearchHdfs.java | 85 -----
.../example/HdfsElasticsearch.java | 86 -----
.../streams/example/ElasticsearchHdfs.java | 80 +++++
.../streams/example/HdfsElasticsearch.java | 80 +++++
.../ElasticsearchHdfsConfiguration.json | 2 +-
.../HdfsElasticsearchConfiguration.json | 2 +-
.../elasticsearch/test/ElasticsearchHdfsIT.java | 128 --------
.../example/elasticsearch/test/ExampleITs.java | 17 -
.../elasticsearch/test/HdfsElasticsearchIT.java | 132 --------
.../example/test/ElasticsearchHdfsIT.java | 112 +++++++
.../apache/streams/example/test/ExampleITs.java | 17 +
.../example/test/HdfsElasticsearchIT.java | 118 +++++++
.../example/ElasticsearchReindex.java | 84 -----
.../streams/example/ElasticsearchReindex.java | 80 +++++
.../ElasticsearchReindexConfiguration.json | 2 +-
.../test/ElasticsearchReindexChildIT.java | 121 -------
.../test/ElasticsearchReindexIT.java | 120 -------
.../test/ElasticsearchReindexParentIT.java | 133 --------
.../example/elasticsearch/test/ReindexITs.java | 20 --
.../test/ElasticsearchReindexChildIT.java | 121 +++++++
.../example/test/ElasticsearchReindexIT.java | 120 +++++++
.../test/ElasticsearchReindexParentIT.java | 133 ++++++++
.../apache/streams/example/test/ReindexITs.java | 20 ++
.../example/MongoElasticsearchSync.java | 79 -----
.../streams/example/MongoElasticsearchSync.java | 79 +++++
.../MongoElasticsearchSyncConfiguration.json | 2 +-
.../mongodb/test/MongoElasticsearchSyncIT.java | 121 -------
.../streams/example/mongodb/test/SyncITs.java | 16 -
.../example/test/MongoElasticsearchSyncIT.java | 117 +++++++
.../apache/streams/example/test/SyncITs.java | 16 +
.../src/test/resources/testSync.json | 21 --
local/pom.xml | 5 +-
local/twitter-follow-graph/README.md | 8 -
local/twitter-follow-graph/pom.xml | 316 -------------------
.../example/graph/TwitterFollowGraph.java | 103 ------
.../TwitterFollowGraphConfiguration.json | 13 -
.../src/main/resources/TwitterFollowGraph.dot | 39 ---
.../src/site/markdown/index.md | 75 -----
.../twitter/example/TwitterFollowGraphIT.java | 79 -----
.../test/resources/TwitterFollowGraphIT.conf | 28 --
local/twitter-follow-neo4j/README.md | 8 +
...itter-follow-graph-jar-with-dependencies.jar | Bin 0 -> 25829072 bytes
local/twitter-follow-neo4j/pom.xml | 316 +++++++++++++++++++
.../streams/example/TwitterFollowNeo4j.java | 93 ++++++
.../TwitterFollowNeo4jConfiguration.json | 13 +
.../src/main/resources/TwitterFollowNeo4j.dot | 39 +++
.../src/site/markdown/TwitterFollowNeo4j.md | 33 ++
.../src/site/markdown/index.md | 42 +++
.../src/site/resources/TwitterFollowGraph.dot | 39 +++
.../TwitterFollowGraphConfiguration.json | 13 +
.../src/site/resources/TwitterFollowNeo4j.dot | 39 +++
.../TwitterFollowNeo4jConfiguration.json | 13 +
local/twitter-follow-neo4j/src/site/site.xml | 45 +++
.../example/test/TwitterFollowNeo4jIT.java | 79 +++++
.../test/resources/TwitterFollowGraphIT.conf | 28 ++
.../example/TwitterHistoryElasticsearch.java | 81 +++++
.../twitter/TwitterHistoryElasticsearch.java | 90 ------
...witterHistoryElasticsearchConfiguration.json | 2 +-
.../test/TwitterHistoryElasticsearchIT.java | 108 +++++++
.../example/TwitterHistoryElasticsearchIT.java | 108 -------
.../example/TwitterUserstreamElasticsearch.java | 146 +++++++++
.../example/TwitterUserstreamElasticsearch.java | 146 ---------
...terUserstreamElasticsearchConfiguration.json | 2 +-
.../test/TwitterUserstreamElasticsearchIT.java | 109 +++++++
.../test/TwitterUserstreamElasticsearchIT.java | 111 -------
65 files changed, 2344 insertions(+), 2289 deletions(-)
----------------------------------------------------------------------
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/elasticsearch-hdfs/src/main/java/org/apache/streams/elasticsearch/example/ElasticsearchHdfs.java
----------------------------------------------------------------------
diff --git a/local/elasticsearch-hdfs/src/main/java/org/apache/streams/elasticsearch/example/ElasticsearchHdfs.java b/local/elasticsearch-hdfs/src/main/java/org/apache/streams/elasticsearch/example/ElasticsearchHdfs.java
deleted file mode 100644
index da0acbd..0000000
--- a/local/elasticsearch-hdfs/src/main/java/org/apache/streams/elasticsearch/example/ElasticsearchHdfs.java
+++ /dev/null
@@ -1,85 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements. See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership. The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied. See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.streams.elasticsearch.example;
-
-import com.fasterxml.jackson.databind.ObjectMapper;
-import com.google.common.collect.Maps;
-import com.google.common.util.concurrent.ListeningExecutorService;
-import com.google.common.util.concurrent.MoreExecutors;
-import com.typesafe.config.Config;
-import org.apache.streams.config.ComponentConfigurator;
-import org.apache.streams.config.StreamsConfigurator;
-import org.apache.streams.core.StreamsDatum;
-import org.apache.streams.elasticsearch.ElasticsearchPersistReader;
-import org.apache.streams.hdfs.WebHdfsPersistWriter;
-import org.apache.streams.core.StreamBuilder;
-import org.apache.streams.local.builders.LocalStreamBuilder;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
-import java.util.Map;
-import java.util.concurrent.*;
-
-/**
- * Copies documents into a new index
- */
-public class ElasticsearchHdfs implements Runnable {
-
- public final static String STREAMS_ID = "ElasticsearchHdfs";
-
- private final static Logger LOGGER = LoggerFactory.getLogger(ElasticsearchHdfs.class);
-
- ElasticsearchHdfsConfiguration config;
-
- public ElasticsearchHdfs() {
- this(new ComponentConfigurator<>(ElasticsearchHdfsConfiguration.class).detectConfiguration(StreamsConfigurator.getConfig()));
-
- }
-
- public ElasticsearchHdfs(ElasticsearchHdfsConfiguration reindex) {
- this.config = reindex;
- }
-
- public static void main(String[] args)
- {
- LOGGER.info(StreamsConfigurator.config.toString());
-
- ElasticsearchHdfs backup = new ElasticsearchHdfs();
-
- new Thread(backup).start();
-
- }
-
- @Override
- public void run() {
-
- ElasticsearchPersistReader elasticsearchPersistReader = new ElasticsearchPersistReader(config.getSource());
-
- WebHdfsPersistWriter hdfsPersistWriter = new WebHdfsPersistWriter(config.getDestination());
-
- Map<String, Object> streamConfig = Maps.newHashMap();
- streamConfig.put(LocalStreamBuilder.STREAM_IDENTIFIER_KEY, STREAMS_ID);
- streamConfig.put(LocalStreamBuilder.TIMEOUT_KEY, 7 * 24 * 60 * 1000);
- StreamBuilder builder = new LocalStreamBuilder(1000, streamConfig);
-
- builder.newPerpetualStream(ElasticsearchPersistReader.STREAMS_ID, elasticsearchPersistReader);
- builder.addStreamsPersistWriter(WebHdfsPersistWriter.STREAMS_ID, hdfsPersistWriter, 1, ElasticsearchPersistReader.STREAMS_ID);
- builder.start();
- }
-}
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/elasticsearch-hdfs/src/main/java/org/apache/streams/elasticsearch/example/HdfsElasticsearch.java
----------------------------------------------------------------------
diff --git a/local/elasticsearch-hdfs/src/main/java/org/apache/streams/elasticsearch/example/HdfsElasticsearch.java b/local/elasticsearch-hdfs/src/main/java/org/apache/streams/elasticsearch/example/HdfsElasticsearch.java
deleted file mode 100644
index 0a65479..0000000
--- a/local/elasticsearch-hdfs/src/main/java/org/apache/streams/elasticsearch/example/HdfsElasticsearch.java
+++ /dev/null
@@ -1,86 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements. See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership. The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied. See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.streams.elasticsearch.example;
-
-import com.fasterxml.jackson.databind.ObjectMapper;
-import com.google.common.collect.Maps;
-import com.google.common.util.concurrent.ListeningExecutorService;
-import com.google.common.util.concurrent.MoreExecutors;
-import com.typesafe.config.Config;
-import org.apache.streams.config.ComponentConfigurator;
-import org.apache.streams.config.StreamsConfigurator;
-import org.apache.streams.core.StreamsDatum;
-import org.apache.streams.hdfs.WebHdfsPersistReader;
-import org.apache.streams.elasticsearch.ElasticsearchPersistWriter;
-import org.apache.streams.core.StreamBuilder;
-import org.apache.streams.local.builders.LocalStreamBuilder;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
-import java.math.BigInteger;
-import java.util.Map;
-import java.util.concurrent.*;
-
-/**
- * Copies documents into a new index
- */
-public class HdfsElasticsearch implements Runnable {
-
- public final static String STREAMS_ID = "HdfsElasticsearch";
-
- private final static Logger LOGGER = LoggerFactory.getLogger(HdfsElasticsearch.class);
-
- HdfsElasticsearchConfiguration config;
-
- public HdfsElasticsearch() {
- this(new ComponentConfigurator<>(HdfsElasticsearchConfiguration.class).detectConfiguration(StreamsConfigurator.getConfig()));
-
- }
-
- public HdfsElasticsearch(HdfsElasticsearchConfiguration reindex) {
- this.config = reindex;
- }
-
- public static void main(String[] args)
- {
- LOGGER.info(StreamsConfigurator.config.toString());
-
- HdfsElasticsearch restore = new HdfsElasticsearch();
-
- new Thread(restore).start();
-
- }
-
- @Override
- public void run() {
-
- WebHdfsPersistReader webHdfsPersistReader = new WebHdfsPersistReader(config.getSource());
-
- ElasticsearchPersistWriter elasticsearchPersistWriter = new ElasticsearchPersistWriter(config.getDestination());
-
- Map<String, Object> streamConfig = Maps.newHashMap();
- streamConfig.put(LocalStreamBuilder.STREAM_IDENTIFIER_KEY, STREAMS_ID);
- streamConfig.put(LocalStreamBuilder.TIMEOUT_KEY, 1000 * 1000);
- StreamBuilder builder = new LocalStreamBuilder(1000, streamConfig);
-
- builder.newPerpetualStream(WebHdfsPersistReader.STREAMS_ID, webHdfsPersistReader);
- builder.addStreamsPersistWriter(ElasticsearchPersistWriter.STREAMS_ID, elasticsearchPersistWriter, 1, WebHdfsPersistReader.STREAMS_ID);
- builder.start();
- }
-}
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/elasticsearch-hdfs/src/main/java/org/apache/streams/example/ElasticsearchHdfs.java
----------------------------------------------------------------------
diff --git a/local/elasticsearch-hdfs/src/main/java/org/apache/streams/example/ElasticsearchHdfs.java b/local/elasticsearch-hdfs/src/main/java/org/apache/streams/example/ElasticsearchHdfs.java
new file mode 100644
index 0000000..8d3cf36
--- /dev/null
+++ b/local/elasticsearch-hdfs/src/main/java/org/apache/streams/example/ElasticsearchHdfs.java
@@ -0,0 +1,80 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+package org.apache.streams.example;
+
+import com.google.common.collect.Maps;
+import org.apache.streams.config.ComponentConfigurator;
+import org.apache.streams.config.StreamsConfigurator;
+import org.apache.streams.elasticsearch.ElasticsearchPersistReader;
+import org.apache.streams.example.ElasticsearchHdfsConfiguration;
+import org.apache.streams.hdfs.WebHdfsPersistWriter;
+import org.apache.streams.core.StreamBuilder;
+import org.apache.streams.local.builders.LocalStreamBuilder;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.util.Map;
+
+/**
+ * Copies documents into a new index
+ */
+public class ElasticsearchHdfs implements Runnable {
+
+ public final static String STREAMS_ID = "ElasticsearchHdfs";
+
+ private final static Logger LOGGER = LoggerFactory.getLogger(ElasticsearchHdfs.class);
+
+ ElasticsearchHdfsConfiguration config;
+
+ public ElasticsearchHdfs() {
+ this(new ComponentConfigurator<>(ElasticsearchHdfsConfiguration.class).detectConfiguration(StreamsConfigurator.getConfig()));
+
+ }
+
+ public ElasticsearchHdfs(ElasticsearchHdfsConfiguration reindex) {
+ this.config = reindex;
+ }
+
+ public static void main(String[] args)
+ {
+ LOGGER.info(StreamsConfigurator.config.toString());
+
+ ElasticsearchHdfs backup = new ElasticsearchHdfs();
+
+ new Thread(backup).start();
+
+ }
+
+ @Override
+ public void run() {
+
+ ElasticsearchPersistReader elasticsearchPersistReader = new ElasticsearchPersistReader(config.getSource());
+
+ WebHdfsPersistWriter hdfsPersistWriter = new WebHdfsPersistWriter(config.getDestination());
+
+ Map<String, Object> streamConfig = Maps.newHashMap();
+ streamConfig.put(LocalStreamBuilder.STREAM_IDENTIFIER_KEY, STREAMS_ID);
+ streamConfig.put(LocalStreamBuilder.TIMEOUT_KEY, 7 * 24 * 60 * 1000);
+ StreamBuilder builder = new LocalStreamBuilder(1000, streamConfig);
+
+ builder.newPerpetualStream(ElasticsearchPersistReader.STREAMS_ID, elasticsearchPersistReader);
+ builder.addStreamsPersistWriter(WebHdfsPersistWriter.STREAMS_ID, hdfsPersistWriter, 1, ElasticsearchPersistReader.STREAMS_ID);
+ builder.start();
+ }
+}
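The `run()` method above wires a persist reader to a persist writer through `LocalStreamBuilder`, which connects the two components with a bounded queue. A stdlib-only sketch of that reader-to-writer wiring under stated assumptions (the class, queue size, and poison-pill convention are illustrative, not how `LocalStreamBuilder` is actually implemented):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

/**
 * Illustrative sketch of a reader -> writer stream: a bounded queue
 * provides backpressure between a producing thread (the "persist
 * reader") and a consuming thread (the "persist writer").
 */
public class PipelineSketch {

    private static final String POISON = "__EOF__"; // hypothetical end-of-stream marker

    public static List<String> run(List<String> source) throws InterruptedException {
        BlockingQueue<String> queue = new ArrayBlockingQueue<>(1000); // cf. the builder's queue size above
        List<String> sink = new ArrayList<>();

        Thread reader = new Thread(() -> {
            try {
                for (String datum : source) {
                    queue.put(datum);          // blocks when the writer falls behind
                }
                queue.put(POISON);             // signal completion
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        Thread writer = new Thread(() -> {
            try {
                String datum;
                while (!(datum = queue.take()).equals(POISON)) {
                    sink.add(datum);           // stand-in for a persist write
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        reader.start();
        writer.start();
        reader.join();
        writer.join();
        return sink;
    }
}
```

A perpetual stream, as configured by `newPerpetualStream` above, differs mainly in that the reader never emits an end-of-stream marker; shutdown is driven by the builder's timeout instead.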
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/elasticsearch-hdfs/src/main/java/org/apache/streams/example/HdfsElasticsearch.java
----------------------------------------------------------------------
diff --git a/local/elasticsearch-hdfs/src/main/java/org/apache/streams/example/HdfsElasticsearch.java b/local/elasticsearch-hdfs/src/main/java/org/apache/streams/example/HdfsElasticsearch.java
new file mode 100644
index 0000000..847ac48
--- /dev/null
+++ b/local/elasticsearch-hdfs/src/main/java/org/apache/streams/example/HdfsElasticsearch.java
@@ -0,0 +1,80 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+package org.apache.streams.example;
+
+import com.google.common.collect.Maps;
+import org.apache.streams.config.ComponentConfigurator;
+import org.apache.streams.config.StreamsConfigurator;
+import org.apache.streams.example.HdfsElasticsearchConfiguration;
+import org.apache.streams.hdfs.WebHdfsPersistReader;
+import org.apache.streams.elasticsearch.ElasticsearchPersistWriter;
+import org.apache.streams.core.StreamBuilder;
+import org.apache.streams.local.builders.LocalStreamBuilder;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.util.Map;
+
+/**
+ * Copies documents into a new index
+ */
+public class HdfsElasticsearch implements Runnable {
+
+ public final static String STREAMS_ID = "HdfsElasticsearch";
+
+ private final static Logger LOGGER = LoggerFactory.getLogger(HdfsElasticsearch.class);
+
+ HdfsElasticsearchConfiguration config;
+
+ public HdfsElasticsearch() {
+ this(new ComponentConfigurator<>(HdfsElasticsearchConfiguration.class).detectConfiguration(StreamsConfigurator.getConfig()));
+
+ }
+
+ public HdfsElasticsearch(HdfsElasticsearchConfiguration reindex) {
+ this.config = reindex;
+ }
+
+ public static void main(String[] args)
+ {
+ LOGGER.info(StreamsConfigurator.config.toString());
+
+ HdfsElasticsearch restore = new HdfsElasticsearch();
+
+ new Thread(restore).start();
+
+ }
+
+ @Override
+ public void run() {
+
+ WebHdfsPersistReader webHdfsPersistReader = new WebHdfsPersistReader(config.getSource());
+
+ ElasticsearchPersistWriter elasticsearchPersistWriter = new ElasticsearchPersistWriter(config.getDestination());
+
+ Map<String, Object> streamConfig = Maps.newHashMap();
+ streamConfig.put(LocalStreamBuilder.STREAM_IDENTIFIER_KEY, STREAMS_ID);
+ streamConfig.put(LocalStreamBuilder.TIMEOUT_KEY, 1000 * 1000);
+ StreamBuilder builder = new LocalStreamBuilder(1000, streamConfig);
+
+ builder.newPerpetualStream(WebHdfsPersistReader.STREAMS_ID, webHdfsPersistReader);
+ builder.addStreamsPersistWriter(ElasticsearchPersistWriter.STREAMS_ID, elasticsearchPersistWriter, 1, WebHdfsPersistReader.STREAMS_ID);
+ builder.start();
+ }
+}
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/elasticsearch-hdfs/src/main/jsonschema/ElasticsearchHdfsConfiguration.json
----------------------------------------------------------------------
diff --git a/local/elasticsearch-hdfs/src/main/jsonschema/ElasticsearchHdfsConfiguration.json b/local/elasticsearch-hdfs/src/main/jsonschema/ElasticsearchHdfsConfiguration.json
index 9ad7e54..ee17a3d 100644
--- a/local/elasticsearch-hdfs/src/main/jsonschema/ElasticsearchHdfsConfiguration.json
+++ b/local/elasticsearch-hdfs/src/main/jsonschema/ElasticsearchHdfsConfiguration.json
@@ -4,7 +4,7 @@
"http://www.apache.org/licenses/LICENSE-2.0"
],
"type": "object",
- "javaType" : "org.apache.streams.elasticsearch.example.ElasticsearchHdfsConfiguration",
+ "javaType" : "org.apache.streams.example.ElasticsearchHdfsConfiguration",
"javaInterfaces": ["java.io.Serializable"],
"properties": {
"source": { "javaType": "org.apache.streams.elasticsearch.ElasticsearchReaderConfiguration", "type": "object", "required": true },
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/elasticsearch-hdfs/src/main/jsonschema/HdfsElasticsearchConfiguration.json
----------------------------------------------------------------------
diff --git a/local/elasticsearch-hdfs/src/main/jsonschema/HdfsElasticsearchConfiguration.json b/local/elasticsearch-hdfs/src/main/jsonschema/HdfsElasticsearchConfiguration.json
index 8b77225..4239279 100644
--- a/local/elasticsearch-hdfs/src/main/jsonschema/HdfsElasticsearchConfiguration.json
+++ b/local/elasticsearch-hdfs/src/main/jsonschema/HdfsElasticsearchConfiguration.json
@@ -4,7 +4,7 @@
"http://www.apache.org/licenses/LICENSE-2.0"
],
"type": "object",
- "javaType" : "org.apache.streams.elasticsearch.example.HdfsElasticsearchConfiguration",
+ "javaType" : "org.apache.streams.example.HdfsElasticsearchConfiguration",
"javaInterfaces": ["java.io.Serializable"],
"properties": {
"source": { "javaType": "org.apache.streams.hdfs.HdfsReaderConfiguration", "type": "object", "required": true },
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/elasticsearch-hdfs/src/test/java/org/apache/streams/example/elasticsearch/test/ElasticsearchHdfsIT.java
----------------------------------------------------------------------
diff --git a/local/elasticsearch-hdfs/src/test/java/org/apache/streams/example/elasticsearch/test/ElasticsearchHdfsIT.java b/local/elasticsearch-hdfs/src/test/java/org/apache/streams/example/elasticsearch/test/ElasticsearchHdfsIT.java
deleted file mode 100644
index 8dfe244..0000000
--- a/local/elasticsearch-hdfs/src/test/java/org/apache/streams/example/elasticsearch/test/ElasticsearchHdfsIT.java
+++ /dev/null
@@ -1,128 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements. See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership. The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied. See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.streams.example.elasticsearch.test;
-
-import com.fasterxml.jackson.databind.ObjectMapper;
-import com.google.common.collect.Lists;
-import com.google.common.util.concurrent.Uninterruptibles;
-import com.typesafe.config.Config;
-import com.typesafe.config.ConfigFactory;
-import com.typesafe.config.ConfigParseOptions;
-import org.apache.commons.io.Charsets;
-import org.apache.commons.io.IOUtils;
-import org.apache.streams.config.ComponentConfigurator;
-import org.apache.streams.config.StreamsConfiguration;
-import org.apache.streams.config.StreamsConfigurator;
-import org.apache.streams.core.StreamsDatum;
-import org.apache.streams.elasticsearch.ElasticsearchClientManager;
-import org.apache.streams.elasticsearch.ElasticsearchConfiguration;
-import org.apache.streams.elasticsearch.ElasticsearchPersistWriter;
-import org.apache.streams.elasticsearch.ElasticsearchReaderConfiguration;
-import org.apache.streams.elasticsearch.ElasticsearchWriterConfiguration;
-import org.apache.streams.elasticsearch.example.ElasticsearchHdfs;
-import org.apache.streams.elasticsearch.example.ElasticsearchHdfsConfiguration;
-import org.apache.streams.elasticsearch.example.HdfsElasticsearch;
-import org.apache.streams.elasticsearch.example.HdfsElasticsearchConfiguration;
-import org.apache.streams.jackson.StreamsJacksonMapper;
-import org.apache.streams.pojo.json.Activity;
-import org.elasticsearch.action.admin.cluster.health.ClusterHealthRequest;
-import org.elasticsearch.action.admin.cluster.health.ClusterHealthResponse;
-import org.elasticsearch.action.admin.indices.delete.DeleteIndexRequest;
-import org.elasticsearch.action.admin.indices.delete.DeleteIndexResponse;
-import org.elasticsearch.action.admin.indices.exists.indices.IndicesExistsRequest;
-import org.elasticsearch.action.admin.indices.exists.indices.IndicesExistsResponse;
-import org.elasticsearch.action.search.SearchRequestBuilder;
-import org.elasticsearch.action.search.SearchResponse;
-import org.elasticsearch.client.Client;
-import org.elasticsearch.client.Requests;
-import org.elasticsearch.cluster.health.ClusterHealthStatus;
-import org.junit.Before;
-import org.junit.Test;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
-import java.io.File;
-import java.io.FileInputStream;
-import java.io.InputStream;
-import java.util.List;
-import java.util.Properties;
-import java.util.concurrent.TimeUnit;
-
-import static junit.framework.TestCase.assertTrue;
-import static org.junit.Assert.assertNotEquals;
-
-/**
- * Test copying documents between hdfs and elasticsearch
- */
-public class ElasticsearchHdfsIT {
-
- private final static Logger LOGGER = LoggerFactory.getLogger(ElasticsearchHdfsIT.class);
-
- ObjectMapper MAPPER = StreamsJacksonMapper.getInstance();
-
- protected ElasticsearchHdfsConfiguration testConfiguration;
- protected Client testClient;
-
- private int count = 0;
-
- @Before
- public void prepareTest() throws Exception {
-
- Config reference = ConfigFactory.load();
- File conf_file = new File("target/test-classes/ElasticsearchHdfsIT.conf");
- assert(conf_file.exists());
- Config testResourceConfig = ConfigFactory.parseFileAnySyntax(conf_file, ConfigParseOptions.defaults().setAllowMissing(false));
- Properties es_properties = new Properties();
- InputStream es_stream = new FileInputStream("elasticsearch.properties");
- es_properties.load(es_stream);
- Config esProps = ConfigFactory.parseProperties(es_properties);
- Config typesafe = testResourceConfig.withFallback(esProps).withFallback(reference).resolve();
- StreamsConfiguration streams = StreamsConfigurator.detectConfiguration(typesafe);
- testConfiguration = new ComponentConfigurator<>(ElasticsearchHdfsConfiguration.class).detectConfiguration(typesafe);
- testClient = new ElasticsearchClientManager(testConfiguration.getSource()).getClient();
-
- ClusterHealthRequest clusterHealthRequest = Requests.clusterHealthRequest();
- ClusterHealthResponse clusterHealthResponse = testClient.admin().cluster().health(clusterHealthRequest).actionGet();
- assertNotEquals(clusterHealthResponse.getStatus(), ClusterHealthStatus.RED);
-
- IndicesExistsRequest indicesExistsRequest = Requests.indicesExistsRequest(testConfiguration.getSource().getIndexes().get(0));
- IndicesExistsResponse indicesExistsResponse = testClient.admin().indices().exists(indicesExistsRequest).actionGet();
- assertTrue(indicesExistsResponse.isExists());
-
- SearchRequestBuilder countRequest = testClient
- .prepareSearch(testConfiguration.getSource().getIndexes().get(0))
- .setTypes(testConfiguration.getSource().getTypes().get(0));
- SearchResponse countResponse = countRequest.execute().actionGet();
-
- count = (int)countResponse.getHits().getTotalHits();
-
- assertNotEquals(count, 0);
- }
-
- @Test
- public void ElasticsearchHdfsIT() throws Exception {
-
- ElasticsearchHdfs backup = new ElasticsearchHdfs(testConfiguration);
-
- backup.run();
-
- // assert lines in file
- }
-
-}
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/elasticsearch-hdfs/src/test/java/org/apache/streams/example/elasticsearch/test/ExampleITs.java
----------------------------------------------------------------------
diff --git a/local/elasticsearch-hdfs/src/test/java/org/apache/streams/example/elasticsearch/test/ExampleITs.java b/local/elasticsearch-hdfs/src/test/java/org/apache/streams/example/elasticsearch/test/ExampleITs.java
deleted file mode 100644
index ab882c8..0000000
--- a/local/elasticsearch-hdfs/src/test/java/org/apache/streams/example/elasticsearch/test/ExampleITs.java
+++ /dev/null
@@ -1,17 +0,0 @@
-package org.apache.streams.example.elasticsearch.test;
-
-import org.apache.streams.elasticsearch.test.ElasticsearchPersistWriterIT;
-import org.junit.runner.RunWith;
-import org.junit.runners.Suite;
-
-@RunWith(Suite.class)
-@Suite.SuiteClasses({
- ElasticsearchPersistWriterIT.class,
- ElasticsearchHdfsIT.class,
- HdfsElasticsearchIT.class,
-})
-
-public class ExampleITs {
- // the class remains empty,
- // used only as a holder for the above annotations
-}
\ No newline at end of file
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/elasticsearch-hdfs/src/test/java/org/apache/streams/example/elasticsearch/test/HdfsElasticsearchIT.java
----------------------------------------------------------------------
diff --git a/local/elasticsearch-hdfs/src/test/java/org/apache/streams/example/elasticsearch/test/HdfsElasticsearchIT.java b/local/elasticsearch-hdfs/src/test/java/org/apache/streams/example/elasticsearch/test/HdfsElasticsearchIT.java
deleted file mode 100644
index 1a055f6..0000000
--- a/local/elasticsearch-hdfs/src/test/java/org/apache/streams/example/elasticsearch/test/HdfsElasticsearchIT.java
+++ /dev/null
@@ -1,132 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements. See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership. The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied. See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.streams.example.elasticsearch.test;
-
-import com.fasterxml.jackson.databind.ObjectMapper;
-import com.google.common.collect.Lists;
-import com.google.common.util.concurrent.Uninterruptibles;
-import com.typesafe.config.Config;
-import com.typesafe.config.ConfigFactory;
-import com.typesafe.config.ConfigParseOptions;
-import org.apache.commons.io.Charsets;
-import org.apache.commons.io.IOUtils;
-import org.apache.streams.config.ComponentConfigurator;
-import org.apache.streams.config.StreamsConfiguration;
-import org.apache.streams.config.StreamsConfigurator;
-import org.apache.streams.core.StreamsDatum;
-import org.apache.streams.elasticsearch.ElasticsearchClientManager;
-import org.apache.streams.elasticsearch.ElasticsearchConfiguration;
-import org.apache.streams.elasticsearch.ElasticsearchPersistWriter;
-import org.apache.streams.elasticsearch.ElasticsearchReaderConfiguration;
-import org.apache.streams.elasticsearch.ElasticsearchWriterConfiguration;
-import org.apache.streams.elasticsearch.example.ElasticsearchHdfs;
-import org.apache.streams.elasticsearch.example.ElasticsearchHdfsConfiguration;
-import org.apache.streams.elasticsearch.example.HdfsElasticsearch;
-import org.apache.streams.elasticsearch.example.HdfsElasticsearchConfiguration;
-import org.apache.streams.jackson.StreamsJacksonMapper;
-import org.apache.streams.pojo.json.Activity;
-import org.elasticsearch.action.admin.cluster.health.ClusterHealthRequest;
-import org.elasticsearch.action.admin.cluster.health.ClusterHealthResponse;
-import org.elasticsearch.action.admin.indices.delete.DeleteIndexRequest;
-import org.elasticsearch.action.admin.indices.delete.DeleteIndexResponse;
-import org.elasticsearch.action.admin.indices.exists.indices.IndicesExistsRequest;
-import org.elasticsearch.action.admin.indices.exists.indices.IndicesExistsResponse;
-import org.elasticsearch.action.search.SearchRequestBuilder;
-import org.elasticsearch.action.search.SearchResponse;
-import org.elasticsearch.client.Client;
-import org.elasticsearch.client.Requests;
-import org.elasticsearch.cluster.health.ClusterHealthStatus;
-import org.junit.Before;
-import org.junit.Test;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
-import java.io.File;
-import java.io.FileInputStream;
-import java.io.InputStream;
-import java.util.List;
-import java.util.Properties;
-import java.util.concurrent.TimeUnit;
-
-import static junit.framework.TestCase.assertTrue;
-import static org.junit.Assert.assertEquals;
-import static org.junit.Assert.assertNotEquals;
-
-/**
- * Test copying documents between hdfs and elasticsearch
- */
-public class HdfsElasticsearchIT {
-
- private final static Logger LOGGER = LoggerFactory.getLogger(HdfsElasticsearchIT.class);
-
- ObjectMapper MAPPER = StreamsJacksonMapper.getInstance();
-
- protected HdfsElasticsearchConfiguration testConfiguration;
- protected Client testClient;
-
- @Before
- public void prepareTest() throws Exception {
-
- Config reference = ConfigFactory.load();
- File conf_file = new File("target/test-classes/HdfsElasticsearchIT.conf");
- assert(conf_file.exists());
- Config testResourceConfig = ConfigFactory.parseFileAnySyntax(conf_file, ConfigParseOptions.defaults().setAllowMissing(false));
- Properties es_properties = new Properties();
- InputStream es_stream = new FileInputStream("elasticsearch.properties");
- es_properties.load(es_stream);
- Config esProps = ConfigFactory.parseProperties(es_properties);
- Config typesafe = testResourceConfig.withFallback(esProps).withFallback(reference).resolve();
- StreamsConfiguration streams = StreamsConfigurator.detectConfiguration(typesafe);
- testConfiguration = new ComponentConfigurator<>(HdfsElasticsearchConfiguration.class).detectConfiguration(typesafe);
- testClient = new ElasticsearchClientManager(testConfiguration.getDestination()).getClient();
-
- ClusterHealthRequest clusterHealthRequest = Requests.clusterHealthRequest();
- ClusterHealthResponse clusterHealthResponse = testClient.admin().cluster().health(clusterHealthRequest).actionGet();
- assertNotEquals(clusterHealthResponse.getStatus(), ClusterHealthStatus.RED);
-
- IndicesExistsRequest indicesExistsRequest = Requests.indicesExistsRequest(testConfiguration.getDestination().getIndex());
- IndicesExistsResponse indicesExistsResponse = testClient.admin().indices().exists(indicesExistsRequest).actionGet();
- if(indicesExistsResponse.isExists()) {
- DeleteIndexRequest deleteIndexRequest = Requests.deleteIndexRequest(testConfiguration.getDestination().getIndex());
- DeleteIndexResponse deleteIndexResponse = testClient.admin().indices().delete(deleteIndexRequest).actionGet();
- assertTrue(deleteIndexResponse.isAcknowledged());
- };
- }
-
- @Test
- public void ElasticsearchHdfsIT() throws Exception {
-
- HdfsElasticsearch restore = new HdfsElasticsearch(testConfiguration);
-
- restore.run();
-
- IndicesExistsRequest indicesExistsRequest = Requests.indicesExistsRequest(testConfiguration.getDestination().getIndex());
- IndicesExistsResponse indicesExistsResponse = testClient.admin().indices().exists(indicesExistsRequest).actionGet();
- assertTrue(indicesExistsResponse.isExists());
-
- SearchRequestBuilder countRequest = testClient
- .prepareSearch(testConfiguration.getDestination().getIndex())
- .setTypes(testConfiguration.getDestination().getType());
- SearchResponse countResponse = countRequest.execute().actionGet();
-
- assertEquals(89, countResponse.getHits().getTotalHits());
-
- }
-
-}
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/elasticsearch-hdfs/src/test/java/org/apache/streams/example/test/ElasticsearchHdfsIT.java
----------------------------------------------------------------------
diff --git a/local/elasticsearch-hdfs/src/test/java/org/apache/streams/example/test/ElasticsearchHdfsIT.java b/local/elasticsearch-hdfs/src/test/java/org/apache/streams/example/test/ElasticsearchHdfsIT.java
new file mode 100644
index 0000000..8e87f3a
--- /dev/null
+++ b/local/elasticsearch-hdfs/src/test/java/org/apache/streams/example/test/ElasticsearchHdfsIT.java
@@ -0,0 +1,112 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+package org.apache.streams.example.test;
+
+import com.fasterxml.jackson.databind.ObjectMapper;
+import com.typesafe.config.Config;
+import com.typesafe.config.ConfigFactory;
+import com.typesafe.config.ConfigParseOptions;
+import org.apache.streams.config.ComponentConfigurator;
+import org.apache.streams.config.StreamsConfiguration;
+import org.apache.streams.config.StreamsConfigurator;
+import org.apache.streams.elasticsearch.ElasticsearchClientManager;
+import org.apache.streams.example.ElasticsearchHdfs;
+import org.apache.streams.example.ElasticsearchHdfsConfiguration;
+import org.apache.streams.jackson.StreamsJacksonMapper;
+import org.elasticsearch.action.admin.cluster.health.ClusterHealthRequest;
+import org.elasticsearch.action.admin.cluster.health.ClusterHealthResponse;
+import org.elasticsearch.action.admin.indices.exists.indices.IndicesExistsRequest;
+import org.elasticsearch.action.admin.indices.exists.indices.IndicesExistsResponse;
+import org.elasticsearch.action.search.SearchRequestBuilder;
+import org.elasticsearch.action.search.SearchResponse;
+import org.elasticsearch.client.Client;
+import org.elasticsearch.client.Requests;
+import org.elasticsearch.cluster.health.ClusterHealthStatus;
+import org.junit.Before;
+import org.junit.Test;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.File;
+import java.io.FileInputStream;
+import java.io.InputStream;
+import java.util.Properties;
+
+import static junit.framework.TestCase.assertTrue;
+import static org.junit.Assert.assertNotEquals;
+
+/**
+ * Test copying documents between hdfs and elasticsearch
+ */
+public class ElasticsearchHdfsIT {
+
+ private final static Logger LOGGER = LoggerFactory.getLogger(ElasticsearchHdfsIT.class);
+
+ ObjectMapper MAPPER = StreamsJacksonMapper.getInstance();
+
+ protected ElasticsearchHdfsConfiguration testConfiguration;
+ protected Client testClient;
+
+ private int count = 0;
+
+ @Before
+ public void prepareTest() throws Exception {
+
+ Config reference = ConfigFactory.load();
+ File conf_file = new File("target/test-classes/ElasticsearchHdfsIT.conf");
+ assert(conf_file.exists());
+ Config testResourceConfig = ConfigFactory.parseFileAnySyntax(conf_file, ConfigParseOptions.defaults().setAllowMissing(false));
+ Properties es_properties = new Properties();
+ InputStream es_stream = new FileInputStream("elasticsearch.properties");
+ es_properties.load(es_stream);
+ Config esProps = ConfigFactory.parseProperties(es_properties);
+ Config typesafe = testResourceConfig.withFallback(esProps).withFallback(reference).resolve();
+ StreamsConfiguration streams = StreamsConfigurator.detectConfiguration(typesafe);
+ testConfiguration = new ComponentConfigurator<>(ElasticsearchHdfsConfiguration.class).detectConfiguration(typesafe);
+ testClient = new ElasticsearchClientManager(testConfiguration.getSource()).getClient();
+
+ ClusterHealthRequest clusterHealthRequest = Requests.clusterHealthRequest();
+ ClusterHealthResponse clusterHealthResponse = testClient.admin().cluster().health(clusterHealthRequest).actionGet();
+ assertNotEquals(clusterHealthResponse.getStatus(), ClusterHealthStatus.RED);
+
+ IndicesExistsRequest indicesExistsRequest = Requests.indicesExistsRequest(testConfiguration.getSource().getIndexes().get(0));
+ IndicesExistsResponse indicesExistsResponse = testClient.admin().indices().exists(indicesExistsRequest).actionGet();
+ assertTrue(indicesExistsResponse.isExists());
+
+ SearchRequestBuilder countRequest = testClient
+ .prepareSearch(testConfiguration.getSource().getIndexes().get(0))
+ .setTypes(testConfiguration.getSource().getTypes().get(0));
+ SearchResponse countResponse = countRequest.execute().actionGet();
+
+ count = (int)countResponse.getHits().getTotalHits();
+
+ assertNotEquals(count, 0);
+ }
+
+ @Test
+ public void ElasticsearchHdfsIT() throws Exception {
+
+ ElasticsearchHdfs backup = new ElasticsearchHdfs(testConfiguration);
+
+ backup.run();
+
+ // assert lines in file
+ }
+
+}
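The relocated ElasticsearchHdfsIT above loads target/test-classes/ElasticsearchHdfsIT.conf, layered over elasticsearch.properties and the reference config via Typesafe-config fallbacks, then binds it with ComponentConfigurator into an ElasticsearchHdfsConfiguration. A minimal HOCON sketch of such a file follows, assuming the field names implied by the schemas in this commit (source is an ElasticsearchReaderConfiguration, destination an HdfsWriterConfiguration); the concrete hosts, paths, and index names are illustrative, not taken from the repository:

```hocon
# Hypothetical ElasticsearchHdfsIT.conf — keys assumed from the
# ElasticsearchHdfsConfiguration schema; values are placeholders.
source {
  # ElasticsearchReaderConfiguration: the test asserts indexes[0]/types[0]
  # exist and contain at least one document before running the backup
  hosts = [ "localhost" ]
  port = 9300
  clusterName = "elasticsearch"
  indexes = [ "activity" ]
  types = [ "activity" ]
}
destination {
  # HdfsWriterConfiguration: where the ElasticsearchHdfs stream writes
  scheme = "file"
  path = "target/test-classes"
  writerPath = "elasticsearch_hdfs_it"
}
```

The @Before method only exercises the source block (cluster health, index existence, a non-zero hit count); the destination block is consumed by the ElasticsearchHdfs backup stream itself.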
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/elasticsearch-hdfs/src/test/java/org/apache/streams/example/test/ExampleITs.java
----------------------------------------------------------------------
diff --git a/local/elasticsearch-hdfs/src/test/java/org/apache/streams/example/test/ExampleITs.java b/local/elasticsearch-hdfs/src/test/java/org/apache/streams/example/test/ExampleITs.java
new file mode 100644
index 0000000..5965914
--- /dev/null
+++ b/local/elasticsearch-hdfs/src/test/java/org/apache/streams/example/test/ExampleITs.java
@@ -0,0 +1,17 @@
+package org.apache.streams.example.test;
+
+import org.apache.streams.elasticsearch.test.ElasticsearchPersistWriterIT;
+import org.junit.runner.RunWith;
+import org.junit.runners.Suite;
+
+@RunWith(Suite.class)
+@Suite.SuiteClasses({
+ ElasticsearchPersistWriterIT.class,
+ ElasticsearchHdfsIT.class,
+ HdfsElasticsearchIT.class,
+})
+
+public class ExampleITs {
+ // the class remains empty,
+ // used only as a holder for the above annotations
+}
\ No newline at end of file
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/elasticsearch-hdfs/src/test/java/org/apache/streams/example/test/HdfsElasticsearchIT.java
----------------------------------------------------------------------
diff --git a/local/elasticsearch-hdfs/src/test/java/org/apache/streams/example/test/HdfsElasticsearchIT.java b/local/elasticsearch-hdfs/src/test/java/org/apache/streams/example/test/HdfsElasticsearchIT.java
new file mode 100644
index 0000000..4eb7fc0
--- /dev/null
+++ b/local/elasticsearch-hdfs/src/test/java/org/apache/streams/example/test/HdfsElasticsearchIT.java
@@ -0,0 +1,118 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+package org.apache.streams.example.test;
+
+import com.fasterxml.jackson.databind.ObjectMapper;
+import com.typesafe.config.Config;
+import com.typesafe.config.ConfigFactory;
+import com.typesafe.config.ConfigParseOptions;
+import org.apache.streams.config.ComponentConfigurator;
+import org.apache.streams.config.StreamsConfiguration;
+import org.apache.streams.config.StreamsConfigurator;
+import org.apache.streams.elasticsearch.ElasticsearchClientManager;
+import org.apache.streams.example.HdfsElasticsearch;
+import org.apache.streams.example.HdfsElasticsearchConfiguration;
+import org.apache.streams.jackson.StreamsJacksonMapper;
+import org.elasticsearch.action.admin.cluster.health.ClusterHealthRequest;
+import org.elasticsearch.action.admin.cluster.health.ClusterHealthResponse;
+import org.elasticsearch.action.admin.indices.delete.DeleteIndexRequest;
+import org.elasticsearch.action.admin.indices.delete.DeleteIndexResponse;
+import org.elasticsearch.action.admin.indices.exists.indices.IndicesExistsRequest;
+import org.elasticsearch.action.admin.indices.exists.indices.IndicesExistsResponse;
+import org.elasticsearch.action.search.SearchRequestBuilder;
+import org.elasticsearch.action.search.SearchResponse;
+import org.elasticsearch.client.Client;
+import org.elasticsearch.client.Requests;
+import org.elasticsearch.cluster.health.ClusterHealthStatus;
+import org.junit.Before;
+import org.junit.Test;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.File;
+import java.io.FileInputStream;
+import java.io.InputStream;
+import java.util.Properties;
+
+import static junit.framework.TestCase.assertTrue;
+import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertNotEquals;
+
+/**
+ * Test copying documents between hdfs and elasticsearch
+ */
+public class HdfsElasticsearchIT {
+
+ private final static Logger LOGGER = LoggerFactory.getLogger(HdfsElasticsearchIT.class);
+
+ ObjectMapper MAPPER = StreamsJacksonMapper.getInstance();
+
+ protected HdfsElasticsearchConfiguration testConfiguration;
+ protected Client testClient;
+
+ @Before
+ public void prepareTest() throws Exception {
+
+ Config reference = ConfigFactory.load();
+ File conf_file = new File("target/test-classes/HdfsElasticsearchIT.conf");
+ assert(conf_file.exists());
+ Config testResourceConfig = ConfigFactory.parseFileAnySyntax(conf_file, ConfigParseOptions.defaults().setAllowMissing(false));
+ Properties es_properties = new Properties();
+ InputStream es_stream = new FileInputStream("elasticsearch.properties");
+ es_properties.load(es_stream);
+ Config esProps = ConfigFactory.parseProperties(es_properties);
+ Config typesafe = testResourceConfig.withFallback(esProps).withFallback(reference).resolve();
+ StreamsConfiguration streams = StreamsConfigurator.detectConfiguration(typesafe);
+ testConfiguration = new ComponentConfigurator<>(HdfsElasticsearchConfiguration.class).detectConfiguration(typesafe);
+ testClient = new ElasticsearchClientManager(testConfiguration.getDestination()).getClient();
+
+ ClusterHealthRequest clusterHealthRequest = Requests.clusterHealthRequest();
+ ClusterHealthResponse clusterHealthResponse = testClient.admin().cluster().health(clusterHealthRequest).actionGet();
+ assertNotEquals(clusterHealthResponse.getStatus(), ClusterHealthStatus.RED);
+
+ IndicesExistsRequest indicesExistsRequest = Requests.indicesExistsRequest(testConfiguration.getDestination().getIndex());
+ IndicesExistsResponse indicesExistsResponse = testClient.admin().indices().exists(indicesExistsRequest).actionGet();
+ if(indicesExistsResponse.isExists()) {
+ DeleteIndexRequest deleteIndexRequest = Requests.deleteIndexRequest(testConfiguration.getDestination().getIndex());
+ DeleteIndexResponse deleteIndexResponse = testClient.admin().indices().delete(deleteIndexRequest).actionGet();
+ assertTrue(deleteIndexResponse.isAcknowledged());
+ };
+ }
+
+ @Test
+ public void ElasticsearchHdfsIT() throws Exception {
+
+ HdfsElasticsearch restore = new HdfsElasticsearch(testConfiguration);
+
+ restore.run();
+
+ IndicesExistsRequest indicesExistsRequest = Requests.indicesExistsRequest(testConfiguration.getDestination().getIndex());
+ IndicesExistsResponse indicesExistsResponse = testClient.admin().indices().exists(indicesExistsRequest).actionGet();
+ assertTrue(indicesExistsResponse.isExists());
+
+ SearchRequestBuilder countRequest = testClient
+ .prepareSearch(testConfiguration.getDestination().getIndex())
+ .setTypes(testConfiguration.getDestination().getType());
+ SearchResponse countResponse = countRequest.execute().actionGet();
+
+ assertEquals(89, countResponse.getHits().getTotalHits());
+
+ }
+
+}
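HdfsElasticsearchIT is the mirror image: its conf reverses the direction, with an HdfsReaderConfiguration as source and an ElasticsearchWriterConfiguration as destination (note the singular getIndex()/getType() on the writer side, versus the reader's indexes/types lists). A hedged sketch, with placeholder values:

```hocon
# Hypothetical HdfsElasticsearchIT.conf — keys assumed from the
# HdfsElasticsearchConfiguration schema; values are placeholders.
source {
  # HdfsReaderConfiguration: reads the files the backup test produced
  scheme = "file"
  path = "target/test-classes"
  readerPath = "elasticsearch_hdfs_it"
}
destination {
  # ElasticsearchWriterConfiguration: single index/type for the restore
  hosts = [ "localhost" ]
  port = 9300
  clusterName = "elasticsearch"
  index = "hdfs_elasticsearch_it"
  type = "activity"
}
```

The test deletes the destination index in @Before if it already exists, runs the HdfsElasticsearch restore stream, then asserts the index exists and holds exactly 89 documents, so the source path must contain the fixture data that produces that count.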
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/elasticsearch-reindex/src/main/java/org/apache/streams/elasticsearch/example/ElasticsearchReindex.java
----------------------------------------------------------------------
diff --git a/local/elasticsearch-reindex/src/main/java/org/apache/streams/elasticsearch/example/ElasticsearchReindex.java b/local/elasticsearch-reindex/src/main/java/org/apache/streams/elasticsearch/example/ElasticsearchReindex.java
deleted file mode 100644
index dc94773..0000000
--- a/local/elasticsearch-reindex/src/main/java/org/apache/streams/elasticsearch/example/ElasticsearchReindex.java
+++ /dev/null
@@ -1,84 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements. See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership. The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied. See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.streams.elasticsearch.example;
-
-import com.fasterxml.jackson.databind.ObjectMapper;
-import com.google.common.collect.Maps;
-import com.google.common.util.concurrent.ListeningExecutorService;
-import com.google.common.util.concurrent.MoreExecutors;
-import com.typesafe.config.Config;
-import org.apache.streams.config.ComponentConfigurator;
-import org.apache.streams.config.StreamsConfigurator;
-import org.apache.streams.core.StreamsDatum;
-import org.apache.streams.elasticsearch.*;
-import org.apache.streams.core.StreamBuilder;
-import org.apache.streams.local.builders.LocalStreamBuilder;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
-import java.util.Map;
-import java.util.concurrent.*;
-
-/**
- * Copies documents into a new index
- */
-public class ElasticsearchReindex implements Runnable {
-
- public final static String STREAMS_ID = "ElasticsearchReindex";
-
- private final static Logger LOGGER = LoggerFactory.getLogger(ElasticsearchReindex.class);
-
- ElasticsearchReindexConfiguration config;
-
- public ElasticsearchReindex() {
- this(new ComponentConfigurator<>(ElasticsearchReindexConfiguration.class).detectConfiguration(StreamsConfigurator.getConfig()));
-
- }
-
- public ElasticsearchReindex(ElasticsearchReindexConfiguration reindex) {
- this.config = reindex;
- }
-
- public static void main(String[] args)
- {
- LOGGER.info(StreamsConfigurator.config.toString());
-
- ElasticsearchReindex reindex = new ElasticsearchReindex();
-
- new Thread(reindex).start();
-
- }
-
- @Override
- public void run() {
-
- ElasticsearchPersistReader elasticsearchPersistReader = new ElasticsearchPersistReader(config.getSource());
-
- ElasticsearchPersistWriter elasticsearchPersistWriter = new ElasticsearchPersistWriter(config.getDestination());
-
- Map<String, Object> streamConfig = Maps.newHashMap();
- streamConfig.put(LocalStreamBuilder.STREAM_IDENTIFIER_KEY, STREAMS_ID);
- streamConfig.put(LocalStreamBuilder.TIMEOUT_KEY, 7 * 24 * 60 * 1000);
- StreamBuilder builder = new LocalStreamBuilder(1000, streamConfig);
-
- builder.newPerpetualStream(ElasticsearchPersistReader.STREAMS_ID, elasticsearchPersistReader);
- builder.addStreamsPersistWriter(ElasticsearchPersistWriter.STREAMS_ID, elasticsearchPersistWriter, 1, ElasticsearchPersistReader.STREAMS_ID);
- builder.start();
- }
-}
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/elasticsearch-reindex/src/main/java/org/apache/streams/example/ElasticsearchReindex.java
----------------------------------------------------------------------
diff --git a/local/elasticsearch-reindex/src/main/java/org/apache/streams/example/ElasticsearchReindex.java b/local/elasticsearch-reindex/src/main/java/org/apache/streams/example/ElasticsearchReindex.java
new file mode 100644
index 0000000..dfb2a98
--- /dev/null
+++ b/local/elasticsearch-reindex/src/main/java/org/apache/streams/example/ElasticsearchReindex.java
@@ -0,0 +1,80 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+package org.apache.streams.example;
+
+import com.google.common.collect.Maps;
+import org.apache.streams.config.ComponentConfigurator;
+import org.apache.streams.config.StreamsConfigurator;
+import org.apache.streams.core.StreamBuilder;
+import org.apache.streams.elasticsearch.ElasticsearchPersistReader;
+import org.apache.streams.elasticsearch.ElasticsearchPersistWriter;
+import org.apache.streams.example.ElasticsearchReindexConfiguration;
+import org.apache.streams.local.builders.LocalStreamBuilder;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.util.Map;
+
+/**
+ * Copies documents into a new index
+ */
+public class ElasticsearchReindex implements Runnable {
+
+ public final static String STREAMS_ID = "ElasticsearchReindex";
+
+ private final static Logger LOGGER = LoggerFactory.getLogger(ElasticsearchReindex.class);
+
+ ElasticsearchReindexConfiguration config;
+
+ public ElasticsearchReindex() {
+ this(new ComponentConfigurator<>(ElasticsearchReindexConfiguration.class).detectConfiguration(StreamsConfigurator.getConfig()));
+
+ }
+
+ public ElasticsearchReindex(ElasticsearchReindexConfiguration reindex) {
+ this.config = reindex;
+ }
+
+ public static void main(String[] args)
+ {
+ LOGGER.info(StreamsConfigurator.config.toString());
+
+ ElasticsearchReindex reindex = new ElasticsearchReindex();
+
+ new Thread(reindex).start();
+
+ }
+
+ @Override
+ public void run() {
+
+ ElasticsearchPersistReader elasticsearchPersistReader = new ElasticsearchPersistReader(config.getSource());
+
+ ElasticsearchPersistWriter elasticsearchPersistWriter = new ElasticsearchPersistWriter(config.getDestination());
+
+ Map<String, Object> streamConfig = Maps.newHashMap();
+ streamConfig.put(LocalStreamBuilder.STREAM_IDENTIFIER_KEY, STREAMS_ID);
+ streamConfig.put(LocalStreamBuilder.TIMEOUT_KEY, 7 * 24 * 60 * 60 * 1000); // one week, in milliseconds
+ StreamBuilder builder = new LocalStreamBuilder(1000, streamConfig);
+
+ builder.newPerpetualStream(ElasticsearchPersistReader.STREAMS_ID, elasticsearchPersistReader);
+ builder.addStreamsPersistWriter(ElasticsearchPersistWriter.STREAMS_ID, elasticsearchPersistWriter, 1, ElasticsearchPersistReader.STREAMS_ID);
+ builder.start();
+ }
+}
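
The class above is wired entirely from configuration via `ComponentConfigurator`. As a rough illustration only, a minimal HOCON sketch for `ElasticsearchReindexConfiguration` might look like the following; the field names mirror how the code reads them (`getIndexes()`/`getTypes()` on the source, `getIndex()`/`getType()` on the destination), but every host, port, index, and type value here is a placeholder:

```hocon
# Hypothetical values throughout -- adjust to your cluster.
source {
  hosts = ["localhost"]
  port = 9300
  clusterName = "elasticsearch"
  indexes = ["source_index"]
  types = ["activity"]
}
destination {
  hosts = ["localhost"]
  port = 9300
  clusterName = "elasticsearch"
  index = "destination_index"
  type = "activity"
}
```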
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/elasticsearch-reindex/src/main/jsonschema/ElasticsearchReindexConfiguration.json
----------------------------------------------------------------------
diff --git a/local/elasticsearch-reindex/src/main/jsonschema/ElasticsearchReindexConfiguration.json b/local/elasticsearch-reindex/src/main/jsonschema/ElasticsearchReindexConfiguration.json
index 1237538..09bdf5b 100644
--- a/local/elasticsearch-reindex/src/main/jsonschema/ElasticsearchReindexConfiguration.json
+++ b/local/elasticsearch-reindex/src/main/jsonschema/ElasticsearchReindexConfiguration.json
@@ -4,7 +4,7 @@
"http://www.apache.org/licenses/LICENSE-2.0"
],
"type": "object",
- "javaType" : "org.apache.streams.elasticsearch.example.ElasticsearchReindexConfiguration",
+ "javaType" : "org.apache.streams.example.ElasticsearchReindexConfiguration",
"javaInterfaces": ["java.io.Serializable"],
"properties": {
"source": { "javaType": "org.apache.streams.elasticsearch.ElasticsearchReaderConfiguration", "type": "object", "required": true },
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/elasticsearch-reindex/src/test/java/org/apache/streams/example/elasticsearch/test/ElasticsearchReindexChildIT.java
----------------------------------------------------------------------
diff --git a/local/elasticsearch-reindex/src/test/java/org/apache/streams/example/elasticsearch/test/ElasticsearchReindexChildIT.java b/local/elasticsearch-reindex/src/test/java/org/apache/streams/example/elasticsearch/test/ElasticsearchReindexChildIT.java
deleted file mode 100644
index d033014..0000000
--- a/local/elasticsearch-reindex/src/test/java/org/apache/streams/example/elasticsearch/test/ElasticsearchReindexChildIT.java
+++ /dev/null
@@ -1,121 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements. See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership. The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied. See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.streams.example.elasticsearch.test;
-
-import com.fasterxml.jackson.databind.ObjectMapper;
-import com.typesafe.config.Config;
-import com.typesafe.config.ConfigFactory;
-import com.typesafe.config.ConfigParseOptions;
-import org.apache.streams.config.ComponentConfigurator;
-import org.apache.streams.config.StreamsConfiguration;
-import org.apache.streams.config.StreamsConfigurator;
-import org.apache.streams.elasticsearch.ElasticsearchClientManager;
-import org.apache.streams.elasticsearch.example.ElasticsearchReindex;
-import org.apache.streams.elasticsearch.example.ElasticsearchReindexConfiguration;
-import org.apache.streams.jackson.StreamsJacksonMapper;
-import org.elasticsearch.action.admin.cluster.health.ClusterHealthRequest;
-import org.elasticsearch.action.admin.cluster.health.ClusterHealthResponse;
-import org.elasticsearch.action.admin.indices.exists.indices.IndicesExistsRequest;
-import org.elasticsearch.action.admin.indices.exists.indices.IndicesExistsResponse;
-import org.elasticsearch.action.search.SearchRequestBuilder;
-import org.elasticsearch.action.search.SearchResponse;
-import org.elasticsearch.client.Client;
-import org.elasticsearch.client.Requests;
-import org.elasticsearch.cluster.health.ClusterHealthStatus;
-import org.junit.Before;
-import org.junit.Test;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
-import java.io.File;
-import java.io.FileInputStream;
-import java.io.InputStream;
-import java.util.Properties;
-
-import static junit.framework.TestCase.assertTrue;
-import static org.junit.Assert.assertEquals;
-import static org.junit.Assert.assertNotEquals;
-
-/**
- * Test copying parent/child associated documents between two indexes on same cluster
- */
-public class ElasticsearchReindexChildIT {
-
- private final static Logger LOGGER = LoggerFactory.getLogger(ElasticsearchReindexIT.class);
-
- ObjectMapper MAPPER = StreamsJacksonMapper.getInstance();
-
- protected ElasticsearchReindexConfiguration testConfiguration;
- protected Client testClient;
-
- private int count = 0;
-
- @Before
- public void prepareTest() throws Exception {
-
- Config reference = ConfigFactory.load();
- File conf_file = new File("target/test-classes/ElasticsearchReindexChildIT.conf");
- assert(conf_file.exists());
- Config testResourceConfig = ConfigFactory.parseFileAnySyntax(conf_file, ConfigParseOptions.defaults().setAllowMissing(false));
- Properties es_properties = new Properties();
- InputStream es_stream = new FileInputStream("elasticsearch.properties");
- es_properties.load(es_stream);
- Config esProps = ConfigFactory.parseProperties(es_properties);
- Config typesafe = testResourceConfig.withFallback(esProps).withFallback(reference).resolve();
- StreamsConfiguration streams = StreamsConfigurator.detectConfiguration(typesafe);
- testConfiguration = new ComponentConfigurator<>(ElasticsearchReindexConfiguration.class).detectConfiguration(typesafe);
- testClient = new ElasticsearchClientManager(testConfiguration.getSource()).getClient();
-
- ClusterHealthRequest clusterHealthRequest = Requests.clusterHealthRequest();
- ClusterHealthResponse clusterHealthResponse = testClient.admin().cluster().health(clusterHealthRequest).actionGet();
- assertNotEquals(clusterHealthResponse.getStatus(), ClusterHealthStatus.RED);
-
- IndicesExistsRequest indicesExistsRequest = Requests.indicesExistsRequest(testConfiguration.getSource().getIndexes().get(0));
- IndicesExistsResponse indicesExistsResponse = testClient.admin().indices().exists(indicesExistsRequest).actionGet();
- assertTrue(indicesExistsResponse.isExists());
-
- SearchRequestBuilder countRequest = testClient
- .prepareSearch(testConfiguration.getSource().getIndexes().get(0))
- .setTypes(testConfiguration.getSource().getTypes().get(0));
- SearchResponse countResponse = countRequest.execute().actionGet();
-
- count = (int)countResponse.getHits().getTotalHits();
-
- assertNotEquals(count, 0);
-
- }
-
- @Test
- public void testReindex() throws Exception {
-
- ElasticsearchReindex reindex = new ElasticsearchReindex(testConfiguration);
-
- reindex.run();
-
- // assert lines in file
- SearchRequestBuilder countRequest = testClient
- .prepareSearch(testConfiguration.getDestination().getIndex())
- .setTypes(testConfiguration.getDestination().getType());
- SearchResponse countResponse = countRequest.execute().actionGet();
-
- assertEquals(count, (int)countResponse.getHits().getTotalHits());
-
- }
-
-}
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/elasticsearch-reindex/src/test/java/org/apache/streams/example/elasticsearch/test/ElasticsearchReindexIT.java
----------------------------------------------------------------------
diff --git a/local/elasticsearch-reindex/src/test/java/org/apache/streams/example/elasticsearch/test/ElasticsearchReindexIT.java b/local/elasticsearch-reindex/src/test/java/org/apache/streams/example/elasticsearch/test/ElasticsearchReindexIT.java
deleted file mode 100644
index 5854ac0..0000000
--- a/local/elasticsearch-reindex/src/test/java/org/apache/streams/example/elasticsearch/test/ElasticsearchReindexIT.java
+++ /dev/null
@@ -1,120 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements. See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership. The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied. See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.streams.example.elasticsearch.test;
-
-import com.fasterxml.jackson.databind.ObjectMapper;
-import com.typesafe.config.Config;
-import com.typesafe.config.ConfigFactory;
-import com.typesafe.config.ConfigParseOptions;
-import org.apache.streams.config.ComponentConfigurator;
-import org.apache.streams.config.StreamsConfiguration;
-import org.apache.streams.config.StreamsConfigurator;
-import org.apache.streams.elasticsearch.ElasticsearchClientManager;
-import org.apache.streams.elasticsearch.example.ElasticsearchReindex;
-import org.apache.streams.elasticsearch.example.ElasticsearchReindexConfiguration;
-import org.apache.streams.jackson.StreamsJacksonMapper;
-import org.elasticsearch.action.admin.cluster.health.ClusterHealthRequest;
-import org.elasticsearch.action.admin.cluster.health.ClusterHealthResponse;
-import org.elasticsearch.action.admin.indices.exists.indices.IndicesExistsRequest;
-import org.elasticsearch.action.admin.indices.exists.indices.IndicesExistsResponse;
-import org.elasticsearch.action.search.SearchRequestBuilder;
-import org.elasticsearch.action.search.SearchResponse;
-import org.elasticsearch.client.Client;
-import org.elasticsearch.client.Requests;
-import org.elasticsearch.cluster.health.ClusterHealthStatus;
-import org.junit.Before;
-import org.junit.Test;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
-import java.io.File;
-import java.io.FileInputStream;
-import java.io.InputStream;
-import java.util.Properties;
-
-import static junit.framework.TestCase.assertTrue;
-import static org.junit.Assert.assertEquals;
-import static org.junit.Assert.assertNotEquals;
-
-/**
- * Test copying documents between two indexes on same cluster
- */
-public class ElasticsearchReindexIT {
-
- private final static Logger LOGGER = LoggerFactory.getLogger(ElasticsearchReindexIT.class);
-
- ObjectMapper MAPPER = StreamsJacksonMapper.getInstance();
-
- protected ElasticsearchReindexConfiguration testConfiguration;
- protected Client testClient;
-
- private int count = 0;
-
- @Before
- public void prepareTest() throws Exception {
-
- Config reference = ConfigFactory.load();
- File conf_file = new File("target/test-classes/ElasticsearchReindexIT.conf");
- assert(conf_file.exists());
- Config testResourceConfig = ConfigFactory.parseFileAnySyntax(conf_file, ConfigParseOptions.defaults().setAllowMissing(false));
- Properties es_properties = new Properties();
- InputStream es_stream = new FileInputStream("elasticsearch.properties");
- es_properties.load(es_stream);
- Config esProps = ConfigFactory.parseProperties(es_properties);
- Config typesafe = testResourceConfig.withFallback(esProps).withFallback(reference).resolve();
- StreamsConfiguration streams = StreamsConfigurator.detectConfiguration(typesafe);
- testConfiguration = new ComponentConfigurator<>(ElasticsearchReindexConfiguration.class).detectConfiguration(typesafe);
- testClient = new ElasticsearchClientManager(testConfiguration.getSource()).getClient();
-
- ClusterHealthRequest clusterHealthRequest = Requests.clusterHealthRequest();
- ClusterHealthResponse clusterHealthResponse = testClient.admin().cluster().health(clusterHealthRequest).actionGet();
- assertNotEquals(clusterHealthResponse.getStatus(), ClusterHealthStatus.RED);
-
- IndicesExistsRequest indicesExistsRequest = Requests.indicesExistsRequest(testConfiguration.getSource().getIndexes().get(0));
- IndicesExistsResponse indicesExistsResponse = testClient.admin().indices().exists(indicesExistsRequest).actionGet();
- assertTrue(indicesExistsResponse.isExists());
-
- SearchRequestBuilder countRequest = testClient
- .prepareSearch(testConfiguration.getSource().getIndexes().get(0))
- .setTypes(testConfiguration.getSource().getTypes().get(0));
- SearchResponse countResponse = countRequest.execute().actionGet();
-
- count = (int)countResponse.getHits().getTotalHits();
-
- assertNotEquals(count, 0);
-
- }
-
- @Test
- public void testReindex() throws Exception {
-
- ElasticsearchReindex reindex = new ElasticsearchReindex(testConfiguration);
-
- reindex.run();
-
- // assert lines in file
- SearchRequestBuilder countRequest = testClient
- .prepareSearch(testConfiguration.getDestination().getIndex())
- .setTypes(testConfiguration.getDestination().getType());
- SearchResponse countResponse = countRequest.execute().actionGet();
-
- assertEquals(count, (int)countResponse.getHits().getTotalHits());
-
- }
-}
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/elasticsearch-reindex/src/test/java/org/apache/streams/example/elasticsearch/test/ElasticsearchReindexParentIT.java
----------------------------------------------------------------------
diff --git a/local/elasticsearch-reindex/src/test/java/org/apache/streams/example/elasticsearch/test/ElasticsearchReindexParentIT.java b/local/elasticsearch-reindex/src/test/java/org/apache/streams/example/elasticsearch/test/ElasticsearchReindexParentIT.java
deleted file mode 100644
index 90924e7..0000000
--- a/local/elasticsearch-reindex/src/test/java/org/apache/streams/example/elasticsearch/test/ElasticsearchReindexParentIT.java
+++ /dev/null
@@ -1,133 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements. See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership. The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied. See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.streams.example.elasticsearch.test;
-
-import com.fasterxml.jackson.databind.ObjectMapper;
-import com.fasterxml.jackson.databind.node.ObjectNode;
-import com.typesafe.config.Config;
-import com.typesafe.config.ConfigFactory;
-import com.typesafe.config.ConfigParseOptions;
-import org.apache.streams.config.ComponentConfigurator;
-import org.apache.streams.config.StreamsConfiguration;
-import org.apache.streams.config.StreamsConfigurator;
-import org.apache.streams.elasticsearch.ElasticsearchClientManager;
-import org.apache.streams.elasticsearch.example.ElasticsearchReindex;
-import org.apache.streams.elasticsearch.example.ElasticsearchReindexConfiguration;
-import org.apache.streams.elasticsearch.test.ElasticsearchParentChildWriterIT;
-import org.apache.streams.jackson.StreamsJacksonMapper;
-import org.elasticsearch.action.admin.cluster.health.ClusterHealthRequest;
-import org.elasticsearch.action.admin.cluster.health.ClusterHealthResponse;
-import org.elasticsearch.action.admin.indices.exists.indices.IndicesExistsRequest;
-import org.elasticsearch.action.admin.indices.exists.indices.IndicesExistsResponse;
-import org.elasticsearch.action.admin.indices.template.put.PutIndexTemplateRequestBuilder;
-import org.elasticsearch.action.search.SearchRequestBuilder;
-import org.elasticsearch.action.search.SearchResponse;
-import org.elasticsearch.client.Client;
-import org.elasticsearch.client.Requests;
-import org.elasticsearch.cluster.health.ClusterHealthStatus;
-import org.junit.Before;
-import org.junit.Test;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
-import java.io.File;
-import java.io.FileInputStream;
-import java.io.InputStream;
-import java.net.URL;
-import java.util.Properties;
-
-import static junit.framework.TestCase.assertTrue;
-import static org.junit.Assert.assertEquals;
-import static org.junit.Assert.assertNotEquals;
-
-/**
- * Test copying parent/child associated documents between two indexes on same cluster
- */
-public class ElasticsearchReindexParentIT {
-
- private final static Logger LOGGER = LoggerFactory.getLogger(ElasticsearchReindexIT.class);
-
- ObjectMapper MAPPER = StreamsJacksonMapper.getInstance();
-
- protected ElasticsearchReindexConfiguration testConfiguration;
- protected Client testClient;
-
- private int count = 0;
-
- @Before
- public void prepareTest() throws Exception {
-
- Config reference = ConfigFactory.load();
- File conf_file = new File("target/test-classes/ElasticsearchReindexParentIT.conf");
- assert(conf_file.exists());
- Config testResourceConfig = ConfigFactory.parseFileAnySyntax(conf_file, ConfigParseOptions.defaults().setAllowMissing(false));
- Properties es_properties = new Properties();
- InputStream es_stream = new FileInputStream("elasticsearch.properties");
- es_properties.load(es_stream);
- Config esProps = ConfigFactory.parseProperties(es_properties);
- Config typesafe = testResourceConfig.withFallback(esProps).withFallback(reference).resolve();
- StreamsConfiguration streams = StreamsConfigurator.detectConfiguration(typesafe);
- testConfiguration = new ComponentConfigurator<>(ElasticsearchReindexConfiguration.class).detectConfiguration(typesafe);
- testClient = new ElasticsearchClientManager(testConfiguration.getSource()).getClient();
-
- ClusterHealthRequest clusterHealthRequest = Requests.clusterHealthRequest();
- ClusterHealthResponse clusterHealthResponse = testClient.admin().cluster().health(clusterHealthRequest).actionGet();
- assertNotEquals(clusterHealthResponse.getStatus(), ClusterHealthStatus.RED);
-
- IndicesExistsRequest indicesExistsRequest = Requests.indicesExistsRequest(testConfiguration.getSource().getIndexes().get(0));
- IndicesExistsResponse indicesExistsResponse = testClient.admin().indices().exists(indicesExistsRequest).actionGet();
- assertTrue(indicesExistsResponse.isExists());
-
- SearchRequestBuilder countRequest = testClient
- .prepareSearch(testConfiguration.getSource().getIndexes().get(0))
- .setTypes(testConfiguration.getSource().getTypes().get(0));
- SearchResponse countResponse = countRequest.execute().actionGet();
-
- count = (int)countResponse.getHits().getTotalHits();
-
- PutIndexTemplateRequestBuilder putTemplateRequestBuilder = testClient.admin().indices().preparePutTemplate("mappings");
- URL templateURL = ElasticsearchParentChildWriterIT.class.getResource("/ActivityChildObjectParent.json");
- ObjectNode template = MAPPER.readValue(templateURL, ObjectNode.class);
- String templateSource = MAPPER.writeValueAsString(template);
- putTemplateRequestBuilder.setSource(templateSource);
-
- testClient.admin().indices().putTemplate(putTemplateRequestBuilder.request()).actionGet();
-
- assertNotEquals(count, 0);
-
- }
-
- @Test
- public void testReindex() throws Exception {
-
- ElasticsearchReindex reindex = new ElasticsearchReindex(testConfiguration);
-
- reindex.run();
-
- // assert lines in file
- SearchRequestBuilder countRequest = testClient
- .prepareSearch(testConfiguration.getDestination().getIndex())
- .setTypes(testConfiguration.getDestination().getType());
- SearchResponse countResponse = countRequest.execute().actionGet();
-
- assertEquals(count, (int)countResponse.getHits().getTotalHits());
-
- }
-
-}
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/elasticsearch-reindex/src/test/java/org/apache/streams/example/elasticsearch/test/ReindexITs.java
----------------------------------------------------------------------
diff --git a/local/elasticsearch-reindex/src/test/java/org/apache/streams/example/elasticsearch/test/ReindexITs.java b/local/elasticsearch-reindex/src/test/java/org/apache/streams/example/elasticsearch/test/ReindexITs.java
deleted file mode 100644
index 9c46d31..0000000
--- a/local/elasticsearch-reindex/src/test/java/org/apache/streams/example/elasticsearch/test/ReindexITs.java
+++ /dev/null
@@ -1,20 +0,0 @@
-package org.apache.streams.example.elasticsearch.test;
-
-import org.apache.streams.elasticsearch.test.ElasticsearchParentChildWriterIT;
-import org.apache.streams.elasticsearch.test.ElasticsearchPersistWriterIT;
-import org.junit.runner.RunWith;
-import org.junit.runners.Suite;
-
-@RunWith(Suite.class)
-@Suite.SuiteClasses({
- ElasticsearchPersistWriterIT.class,
- ElasticsearchParentChildWriterIT.class,
- ElasticsearchReindexIT.class,
- ElasticsearchReindexParentIT.class,
- ElasticsearchReindexChildIT.class
-})
-
-public class ReindexITs {
- // the class remains empty,
- // used only as a holder for the above annotations
-}
\ No newline at end of file
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/5b96588c/local/elasticsearch-reindex/src/test/java/org/apache/streams/example/test/ElasticsearchReindexChildIT.java
----------------------------------------------------------------------
diff --git a/local/elasticsearch-reindex/src/test/java/org/apache/streams/example/test/ElasticsearchReindexChildIT.java b/local/elasticsearch-reindex/src/test/java/org/apache/streams/example/test/ElasticsearchReindexChildIT.java
new file mode 100644
index 0000000..47c8f51
--- /dev/null
+++ b/local/elasticsearch-reindex/src/test/java/org/apache/streams/example/test/ElasticsearchReindexChildIT.java
@@ -0,0 +1,121 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+package org.apache.streams.example.test;
+
+import com.fasterxml.jackson.databind.ObjectMapper;
+import com.typesafe.config.Config;
+import com.typesafe.config.ConfigFactory;
+import com.typesafe.config.ConfigParseOptions;
+import org.apache.streams.config.ComponentConfigurator;
+import org.apache.streams.config.StreamsConfiguration;
+import org.apache.streams.config.StreamsConfigurator;
+import org.apache.streams.elasticsearch.ElasticsearchClientManager;
+import org.apache.streams.example.ElasticsearchReindex;
+import org.apache.streams.example.ElasticsearchReindexConfiguration;
+import org.apache.streams.jackson.StreamsJacksonMapper;
+import org.elasticsearch.action.admin.cluster.health.ClusterHealthRequest;
+import org.elasticsearch.action.admin.cluster.health.ClusterHealthResponse;
+import org.elasticsearch.action.admin.indices.exists.indices.IndicesExistsRequest;
+import org.elasticsearch.action.admin.indices.exists.indices.IndicesExistsResponse;
+import org.elasticsearch.action.search.SearchRequestBuilder;
+import org.elasticsearch.action.search.SearchResponse;
+import org.elasticsearch.client.Client;
+import org.elasticsearch.client.Requests;
+import org.elasticsearch.cluster.health.ClusterHealthStatus;
+import org.junit.Before;
+import org.junit.Test;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.File;
+import java.io.FileInputStream;
+import java.io.InputStream;
+import java.util.Properties;
+
+import static junit.framework.TestCase.assertTrue;
+import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertNotEquals;
+
+/**
+ * Test copying parent/child associated documents between two indexes on same cluster
+ */
+public class ElasticsearchReindexChildIT {
+
+ private final static Logger LOGGER = LoggerFactory.getLogger(ElasticsearchReindexChildIT.class);
+
+ ObjectMapper MAPPER = StreamsJacksonMapper.getInstance();
+
+ protected ElasticsearchReindexConfiguration testConfiguration;
+ protected Client testClient;
+
+ private int count = 0;
+
+ @Before
+ public void prepareTest() throws Exception {
+
+ Config reference = ConfigFactory.load();
+ File conf_file = new File("target/test-classes/ElasticsearchReindexChildIT.conf");
+ assert(conf_file.exists());
+ Config testResourceConfig = ConfigFactory.parseFileAnySyntax(conf_file, ConfigParseOptions.defaults().setAllowMissing(false));
+ Properties es_properties = new Properties();
+ InputStream es_stream = new FileInputStream("elasticsearch.properties");
+ es_properties.load(es_stream);
+ Config esProps = ConfigFactory.parseProperties(es_properties);
+ Config typesafe = testResourceConfig.withFallback(esProps).withFallback(reference).resolve();
+ StreamsConfiguration streams = StreamsConfigurator.detectConfiguration(typesafe);
+ testConfiguration = new ComponentConfigurator<>(ElasticsearchReindexConfiguration.class).detectConfiguration(typesafe);
+ testClient = new ElasticsearchClientManager(testConfiguration.getSource()).getClient();
+
+ ClusterHealthRequest clusterHealthRequest = Requests.clusterHealthRequest();
+ ClusterHealthResponse clusterHealthResponse = testClient.admin().cluster().health(clusterHealthRequest).actionGet();
+ assertNotEquals(clusterHealthResponse.getStatus(), ClusterHealthStatus.RED);
+
+ IndicesExistsRequest indicesExistsRequest = Requests.indicesExistsRequest(testConfiguration.getSource().getIndexes().get(0));
+ IndicesExistsResponse indicesExistsResponse = testClient.admin().indices().exists(indicesExistsRequest).actionGet();
+ assertTrue(indicesExistsResponse.isExists());
+
+ SearchRequestBuilder countRequest = testClient
+ .prepareSearch(testConfiguration.getSource().getIndexes().get(0))
+ .setTypes(testConfiguration.getSource().getTypes().get(0));
+ SearchResponse countResponse = countRequest.execute().actionGet();
+
+ count = (int)countResponse.getHits().getTotalHits();
+
+ assertNotEquals(count, 0);
+
+ }
+
+ @Test
+ public void testReindex() throws Exception {
+
+ ElasticsearchReindex reindex = new ElasticsearchReindex(testConfiguration);
+
+ reindex.run();
+
+ // assert all documents were copied to the destination index
+ SearchRequestBuilder countRequest = testClient
+ .prepareSearch(testConfiguration.getDestination().getIndex())
+ .setTypes(testConfiguration.getDestination().getType());
+ SearchResponse countResponse = countRequest.execute().actionGet();
+
+ assertEquals(count, (int)countResponse.getHits().getTotalHits());
+
+ }
+
+}
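
The integration test above needs a live cluster. As a self-contained illustration of the underlying pattern — a perpetual reader feeding a persist writer through a bounded queue, which is roughly what `LocalStreamBuilder` assembles around `ElasticsearchPersistReader` and `ElasticsearchPersistWriter` — a sketch with invented stand-in types (`ReindexSketch`, `Reader`, `Writer` are not the Apache Streams API) might look like:

```java
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.LinkedBlockingQueue;

// Stand-in sketch: these interfaces are NOT the Apache Streams API; they only
// mimic the reader -> bounded queue -> writer flow the builder wires up.
public class ReindexSketch {

    interface Reader { String read(); }            // returns null once drained
    interface Writer { void write(String doc); }

    static void run(Reader reader, Writer writer) {
        BlockingQueue<String> queue = new LinkedBlockingQueue<>(1000); // bounded, like the builder's queue capacity
        Thread producer = new Thread(() -> {
            String doc;
            while ((doc = reader.read()) != null) {
                try {
                    queue.put(doc);                // blocks when the writer falls behind
                } catch (InterruptedException e) {
                    return;
                }
            }
        });
        producer.start();
        // Drain until the reader thread has finished AND the queue is empty.
        while (producer.isAlive() || !queue.isEmpty()) {
            String doc = queue.poll();
            if (doc != null) {
                writer.write(doc);
            }
        }
        try {
            producer.join();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }

    public static void main(String[] args) {
        List<String> source = List.of("doc1", "doc2", "doc3");
        List<String> destination = new CopyOnWriteArrayList<>();
        int[] next = {0};
        run(() -> next[0] < source.size() ? source.get(next[0]++) : null, destination::add);
        System.out.println(destination);           // documents arrive in source order
    }
}
```

The count check the ITs perform (source hit count equals destination hit count) corresponds here to comparing the sizes of the two lists.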
[7/9] incubator-streams-examples git commit: configuration examples in stream markdowns
Posted by sb...@apache.org.
configuration examples in stream markdowns
Project: http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/commit/52b5f1ca
Tree: http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/tree/52b5f1ca
Diff: http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/diff/52b5f1ca
Branch: refs/heads/master
Commit: 52b5f1ca010c5f3366e0ab42417883076c9b2307
Parents: 3c1fbde
Author: Steve Blackmon @steveblackmon <sb...@apache.org>
Authored: Tue Oct 11 17:54:57 2016 -0500
Committer: Steve Blackmon @steveblackmon <sb...@apache.org>
Committed: Tue Oct 11 17:54:57 2016 -0500
----------------------------------------------------------------------
.../markdown/FlinkTwitterFollowingPipeline.md | 61 +++++++++++---------
.../site/markdown/FlinkTwitterPostsPipeline.md | 55 ++++++++++--------
.../markdown/FlinkTwitterSpritzerPipeline.md | 55 ++++++++++--------
.../FlinkTwitterUserInformationPipeline.md | 55 ++++++++++--------
.../src/site/markdown/index.md | 38 ++++--------
.../src/site/markdown/ElasticsearchHdfs.md | 23 +++++++-
.../src/site/markdown/HdfsElasticsearch.md | 27 +++++++--
.../src/site/markdown/index.md | 4 +-
.../src/site/markdown/ElasticsearchReindex.md | 26 +++++++--
.../src/site/markdown/index.md | 2 +-
.../src/site/markdown/MongoElasticsearchSync.md | 22 ++++++-
.../src/site/markdown/TwitterFollowNeo4j.md | 18 +++++-
.../src/site/markdown/index.md | 15 +----
.../markdown/TwitterHistoryElasticsearch.md | 25 ++++++--
.../src/site/markdown/index.md | 2 +-
.../markdown/TwitterUserstreamElasticsearch.md | 13 ++++-
.../src/site/markdown/index.md | 2 +-
src/site/markdown/credentials/twitter.md | 2 +-
18 files changed, 278 insertions(+), 167 deletions(-)
----------------------------------------------------------------------
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/52b5f1ca/flink/flink-twitter-collection/src/site/markdown/FlinkTwitterFollowingPipeline.md
----------------------------------------------------------------------
diff --git a/flink/flink-twitter-collection/src/site/markdown/FlinkTwitterFollowingPipeline.md b/flink/flink-twitter-collection/src/site/markdown/FlinkTwitterFollowingPipeline.md
index 3ad23d3..f9f39e1 100644
--- a/flink/flink-twitter-collection/src/site/markdown/FlinkTwitterFollowingPipeline.md
+++ b/flink/flink-twitter-collection/src/site/markdown/FlinkTwitterFollowingPipeline.md
@@ -1,42 +1,51 @@
-FlinkTwitterFollowingPipeline
-=============================
+### FlinkTwitterFollowingPipeline
-Description:
------------------
+#### Description:
Collects twitter friends or followers with flink.
-Specification:
------------------
+#### Configuration:
+
+[TwitterFollowingPipelineConfiguration.json](TwitterFollowingPipelineConfiguration.json "TwitterFollowingPipelineConfiguration.json" )
+
+ include "flink.conf"
+ include "twitter.oauth.conf"
+ source {
+ fields = ["ID"]
+ scheme = file
+ path = "target/test-classes"
+ readerPath = "asf.txt"
+ }
+ destination {
+ fields = ["DOC"]
+ scheme = file
+ path = "target/test-classes"
+ writerPath = "FlinkTwitterFollowingPipelineFriendsIT"
+ }
+ twitter {
+ endpoint = friends
+ ids_only = true
+ }
+
+#### Run (Local):
-[FlinkTwitterFollowingPipeline.dot](FlinkTwitterFollowingPipeline.dot "FlinkTwitterFollowingPipeline.dot" )
-
-Diagram:
------------------
-
-![FlinkTwitterFollowingPipeline.dot.svg](./FlinkTwitterFollowingPipeline.dot.svg)
-
-Example Configuration:
-----------------------
+ java -cp dist/flink-twitter-collection-jar-with-dependencies.jar -Dconfig.file=file://<location_of_config_file> org.apache.streams.examples.flink.twitter.collection.FlinkTwitterFollowingPipeline
-[FlinkTwitterFollowingPipelineFollowersIT.conf](FlinkTwitterFollowingPipelineFollowersIT.conf "FlinkTwitterFollowingPipelineFollowersIT.conf" )
+#### Run (Flink):
-[FlinkTwitterFollowingPipelineFriendsIT.conf](FlinkTwitterFollowingPipelineFriendsIT.conf "FlinkTwitterFollowingPipelineFriendsIT.conf" )
+ flink-run.sh dist/flink-twitter-collection-jar-with-dependencies.jar org.apache.streams.examples.flink.twitter.collection.FlinkTwitterFollowingPipeline http://<location_of_config_file>
-Run (Local):
-------------
+#### Run (YARN):
- java -cp dist/flink-twitter-collection-jar-with-dependencies.jar -Dconfig.file=file://<location_of_config_file> org.apache.streams.examples.flink.twitter.collection.FlinkTwitterFollowingPipeline
+ flink-run.sh yarn dist/flink-twitter-collection-jar-with-dependencies.jar org.apache.streams.examples.flink.twitter.collection.FlinkTwitterFollowingPipeline http://<location_of_config_file>
-Run (Flink):
-------------
+#### Specification:
- flink-run.sh dist/flink-twitter-collection-jar-with-dependencies.jar org.apache.streams.examples.flink.twitter.collection.FlinkTwitterFollowingPipeline http://<location_of_config_file>
+[FlinkTwitterFollowingPipeline.dot](FlinkTwitterFollowingPipeline.dot "FlinkTwitterFollowingPipeline.dot" )
-Run (YARN):
------------
+#### Diagram:
- flink-run.sh yarn dist/flink-twitter-collection-jar-with-dependencies.jar org.apache.streams.examples.flink.twitter.collection.FlinkTwitterFollowingPipeline http://<location_of_config_file>
+![FlinkTwitterFollowingPipeline.dot.svg](./FlinkTwitterFollowingPipeline.dot.svg)
[JavaDocs](apidocs/index.html "JavaDocs")
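The `source` block in the config above points the reader at a plain seed file (`asf.txt` under `target/test-classes`), presumably one numeric Twitter ID per line. A minimal Python sketch of that reader contract (the file name comes from the config; the one-ID-per-line format is an assumption, not confirmed by this commit):

```python
def read_seed_ids(text):
    """Parse a seed file body: one numeric Twitter ID per line,
    skipping blank lines and surrounding whitespace."""
    ids = []
    for line in text.splitlines():
        line = line.strip()
        if line:
            ids.append(int(line))
    return ids

# A two-line seed file (hypothetical contents) would parse as:
sample = "18055613\n42\n"
seed_ids = read_seed_ids(sample)
```

Each parsed ID then becomes one `friends`/`followers` request in the pipeline.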
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/52b5f1ca/flink/flink-twitter-collection/src/site/markdown/FlinkTwitterPostsPipeline.md
----------------------------------------------------------------------
diff --git a/flink/flink-twitter-collection/src/site/markdown/FlinkTwitterPostsPipeline.md b/flink/flink-twitter-collection/src/site/markdown/FlinkTwitterPostsPipeline.md
index fe6b544..0b4c8bd 100644
--- a/flink/flink-twitter-collection/src/site/markdown/FlinkTwitterPostsPipeline.md
+++ b/flink/flink-twitter-collection/src/site/markdown/FlinkTwitterPostsPipeline.md
@@ -1,40 +1,47 @@
-FlinkTwitterPostsPipeline
-=========================
+### FlinkTwitterPostsPipeline
-Description:
------------------
+#### Description:
Collects twitter posts with flink.
-Specification:
------------------
+#### Configuration:
+
+[TwitterPostsPipelineConfiguration.json](TwitterPostsPipelineConfiguration.json "TwitterPostsPipelineConfiguration.json" )
+
+ include "flink.conf"
+ include "twitter.oauth.conf"
+ source {
+ fields = ["ID"]
+ scheme = file
+ path = "target/test-classes"
+ readerPath = "asf.txt"
+ }
+ destination {
+ fields = ["DOC"]
+ scheme = file
+ path = "target/test-classes"
+ writerPath = "FlinkTwitterPostsPipelineIT"
+ }
+
+#### Run (Local):
-[FlinkTwitterPostsPipeline.dot](FlinkTwitterPostsPipeline.dot "FlinkTwitterPostsPipeline.dot" )
-
-Diagram:
------------------
-
-![FlinkTwitterPostsPipeline.dot.svg](./FlinkTwitterPostsPipeline.dot.svg)
+ java -cp dist/flink-twitter-collection-jar-with-dependencies.jar -Dconfig.file=file://<location_of_config_file> org.apache.streams.examples.flink.twitter.collection.FlinkTwitterPostsPipeline
-Example Configuration:
-----------------------
+#### Run (Flink):
-[FlinkTwitterPostsPipelineIT.conf](FlinkTwitterPostsPipelineIT.conf "FlinkTwitterPostsPipelineIT.conf" )
+ flink-run.sh dist/flink-twitter-collection-jar-with-dependencies.jar org.apache.streams.examples.flink.twitter.collection.FlinkTwitterPostsPipeline http://<location_of_config_file>
-Run (Local):
-------------
+#### Run (YARN):
- java -cp dist/flink-twitter-collection-jar-with-dependencies.jar -Dconfig.file=file://<location_of_config_file> org.apache.streams.examples.flink.twitter.collection.FlinkTwitterPostsPipeline
+ flink-run.sh yarn dist/flink-twitter-collection-jar-with-dependencies.jar org.apache.streams.examples.flink.twitter.collection.FlinkTwitterPostsPipeline http://<location_of_config_file>
-Run (Flink):
-------------
+#### Specification:
- flink-run.sh dist/flink-twitter-collection-jar-with-dependencies.jar org.apache.streams.examples.flink.twitter.collection.FlinkTwitterPostsPipeline http://<location_of_config_file>
+[FlinkTwitterPostsPipeline.dot](FlinkTwitterPostsPipeline.dot "FlinkTwitterPostsPipeline.dot" )
-Run (YARN):
------------
+#### Diagram:
- flink-run.sh yarn dist/flink-twitter-collection-jar-with-dependencies.jar org.apache.streams.examples.flink.twitter.collection.FlinkTwitterPostsPipeline http://<location_of_config_file>
+![FlinkTwitterPostsPipeline.dot.svg](./FlinkTwitterPostsPipeline.dot.svg)
[JavaDocs](apidocs/index.html "JavaDocs")
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/52b5f1ca/flink/flink-twitter-collection/src/site/markdown/FlinkTwitterSpritzerPipeline.md
----------------------------------------------------------------------
diff --git a/flink/flink-twitter-collection/src/site/markdown/FlinkTwitterSpritzerPipeline.md b/flink/flink-twitter-collection/src/site/markdown/FlinkTwitterSpritzerPipeline.md
index 1e59039..0a82321 100644
--- a/flink/flink-twitter-collection/src/site/markdown/FlinkTwitterSpritzerPipeline.md
+++ b/flink/flink-twitter-collection/src/site/markdown/FlinkTwitterSpritzerPipeline.md
@@ -1,40 +1,47 @@
-FlinkTwitterSpritzerPipeline
-============================
+### FlinkTwitterSpritzerPipeline
-Description:
------------------
+#### Description:
Collects twitter posts in real-time from the sample endpoint with flink.
-Specification:
------------------
+#### Configuration:
+
+[TwitterSpritzerPipelineConfiguration.json](TwitterSpritzerPipelineConfiguration.json "TwitterSpritzerPipelineConfiguration.json" )
+
+ include "flink.conf"
+ include "twitter.oauth.conf"
+ destination {
+ fields = ["DOC"]
+ scheme = file
+ path = "target/test-classes"
+ writerPath = "FlinkTwitterSpritzerPipelineIT"
+ }
+ twitter {
+ endpoint = sample
+ track = [
+ "data"
+ ]
+ }
+
+#### Run (Local):
-[FlinkTwitterSpritzerPipeline.dot](FlinkTwitterSpritzerPipeline.dot "FlinkTwitterSpritzerPipeline.dot" )
-
-Diagram:
------------------
-
-![FlinkTwitterSpritzerPipeline.dot.svg](./FlinkTwitterSpritzerPipeline.dot.svg)
+ java -cp dist/flink-twitter-collection-jar-with-dependencies.jar -Dconfig.file=file://<location_of_config_file> org.apache.streams.examples.flink.twitter.collection.FlinkTwitterSpritzerPipeline
-Example Configuration:
-----------------------
+#### Run (Flink):
-[FlinkTwitterSpritzerPipelineIT.conf](FlinkTwitterSpritzerPipelineIT.conf "FlinkTwitterSpritzerPipelineIT.conf" )
+ flink-run.sh dist/flink-twitter-collection-jar-with-dependencies.jar org.apache.streams.examples.flink.twitter.collection.FlinkTwitterSpritzerPipeline http://<location_of_config_file>
-Run (Local):
-------------
+#### Run (YARN):
- java -cp dist/flink-twitter-collection-jar-with-dependencies.jar -Dconfig.file=file://<location_of_config_file> org.apache.streams.examples.flink.twitter.collection.FlinkTwitterSpritzerPipeline
+ flink-run.sh yarn dist/flink-twitter-collection-jar-with-dependencies.jar org.apache.streams.examples.flink.twitter.collection.FlinkTwitterSpritzerPipeline http://<location_of_config_file>
-Run (Flink):
-------------
+#### Specification:
- flink-run.sh dist/flink-twitter-collection-jar-with-dependencies.jar org.apache.streams.examples.flink.twitter.collection.FlinkTwitterSpritzerPipeline http://<location_of_config_file>
+[FlinkTwitterSpritzerPipeline.dot](FlinkTwitterSpritzerPipeline.dot "FlinkTwitterSpritzerPipeline.dot" )
-Run (YARN):
------------
+#### Diagram:
- flink-run.sh yarn dist/flink-twitter-collection-jar-with-dependencies.jar org.apache.streams.examples.flink.twitter.collection.FlinkTwitterSpritzerPipeline http://<location_of_config_file>
+![FlinkTwitterSpritzerPipeline.dot.svg](./FlinkTwitterSpritzerPipeline.dot.svg)
[JavaDocs](apidocs/index.html "JavaDocs")
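The `twitter { endpoint = sample, track = [...] }` block above suggests the sampled stream is filtered down to posts matching the `track` terms. A rough Python stand-in for that keyword matching (Twitter's real track rules tokenize more carefully; this is only an illustration of the config's intent):

```python
def matches_track(text, track_terms):
    """Case-insensitive substring match against track terms --
    a simplified stand-in for Twitter's track filtering."""
    lowered = text.lower()
    return any(term.lower() in lowered for term in track_terms)

# With track = ["data"], only the first post would be kept:
posts = ["Big data is everywhere", "hello world"]
kept = [p for p in posts if matches_track(p, ["data"])]
```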
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/52b5f1ca/flink/flink-twitter-collection/src/site/markdown/FlinkTwitterUserInformationPipeline.md
----------------------------------------------------------------------
diff --git a/flink/flink-twitter-collection/src/site/markdown/FlinkTwitterUserInformationPipeline.md b/flink/flink-twitter-collection/src/site/markdown/FlinkTwitterUserInformationPipeline.md
index a465de9..ad90fab 100644
--- a/flink/flink-twitter-collection/src/site/markdown/FlinkTwitterUserInformationPipeline.md
+++ b/flink/flink-twitter-collection/src/site/markdown/FlinkTwitterUserInformationPipeline.md
@@ -1,40 +1,47 @@
-FlinkTwitterUserInformationPipeline
-===================================
+### FlinkTwitterUserInformationPipeline
-Description:
------------------
+#### Description:
Collects twitter user profiles with flink.
-Specification:
------------------
+#### Configuration:
+
+[TwitterUserInformationPipelineConfiguration.json](TwitterUserInformationPipelineConfiguration.json "TwitterUserInformationPipelineConfiguration.json" )
+
+ include "flink.conf"
+ include "twitter.oauth.conf"
+ source {
+ fields = ["ID"]
+ scheme = file
+ path = "target/test-classes"
+ readerPath = "1000twitterids.txt"
+ }
+ destination {
+ fields = ["DOC"]
+ scheme = file
+ path = "target/test-classes"
+ writerPath = "FlinkTwitterUserInformationPipelineIT"
+ }
+
+#### Run (Local):
-[FlinkTwitterUserInformationPipeline.dot](FlinkTwitterUserInformationPipeline.dot "FlinkTwitterUserInformationPipeline.dot" )
-
-Diagram:
------------------
-
-![TwitterUserInformationPipeline.dot.svg](./TwitterUserInformationPipeline.dot.svg)
+ java -cp dist/flink-twitter-collection-jar-with-dependencies.jar -Dconfig.file=file://<location_of_config_file> org.apache.streams.examples.flink.twitter.collection.FlinkTwitterUserInformationPipeline
-Example Configuration:
-----------------------
+#### Run (Flink):
-[FlinkTwitterUserInformationPipelineIT.conf](FlinkTwitterUserInformationPipelineIT.conf "FlinkTwitterUserInformationPipelineIT.conf" )
+ flink-run.sh dist/flink-twitter-collection-jar-with-dependencies.jar org.apache.streams.examples.flink.twitter.collection.FlinkTwitterUserInformationPipeline http://<location_of_config_file>
-Run (Local):
-------------
+#### Run (YARN):
- java -cp dist/flink-twitter-collection-jar-with-dependencies.jar -Dconfig.file=file://<location_of_config_file> org.apache.streams.examples.flink.twitter.collection.FlinkTwitterUserInformationPipeline
+ flink-run.sh yarn dist/flink-twitter-collection-jar-with-dependencies.jar org.apache.streams.examples.flink.twitter.collection.FlinkTwitterUserInformationPipeline http://<location_of_config_file>
-Run (Flink):
-------------
+#### Specification:
- flink-run.sh dist/flink-twitter-collection-jar-with-dependencies.jar org.apache.streams.examples.flink.twitter.collection.FlinkTwitterUserInformationPipeline http://<location_of_config_file>
+[FlinkTwitterUserInformationPipeline.dot](FlinkTwitterUserInformationPipeline.dot "FlinkTwitterUserInformationPipeline.dot" )
-Run (YARN):
------------
+#### Diagram:
- flink-run.sh yarn dist/flink-twitter-collection-jar-with-dependencies.jar org.apache.streams.examples.flink.twitter.collection.FlinkTwitterUserInformationPipeline http://<location_of_config_file>
+![TwitterUserInformationPipeline.dot.svg](./TwitterUserInformationPipeline.dot.svg)
[JavaDocs](apidocs/index.html "JavaDocs")
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/52b5f1ca/flink/flink-twitter-collection/src/site/markdown/index.md
----------------------------------------------------------------------
diff --git a/flink/flink-twitter-collection/src/site/markdown/index.md b/flink/flink-twitter-collection/src/site/markdown/index.md
index 24783be..616bdd7 100644
--- a/flink/flink-twitter-collection/src/site/markdown/index.md
+++ b/flink/flink-twitter-collection/src/site/markdown/index.md
@@ -1,20 +1,13 @@
-Apache Streams (incubating)
-Licensed under Apache License 2.0 - http://www.apache.org/licenses/LICENSE-2.0
---------------------------------------------------------------------------------
+### flink-twitter-collection
-flink-twitter-collection
-========================
-
-Requirements:
--------------
+#### Requirements:
- Authorized Twitter API credentials
-Description:
-------------
+#### Description:
+
Collects large batches of documents from api.twitter.com from a seed set of ids.
-Streams:
---------
+#### Streams:
<a href="FlinkTwitterFollowingPipeline.html" target="_self">FlinkTwitterFollowingPipeline</a>
@@ -24,24 +17,15 @@ Streams:
<a href="FlinkTwitterUserInformationPipeline.html" target="_self">FlinkTwitterUserInformationPipeline</a>
-Test:
------
+#### Build:
+
+ mvn clean package
-Create a local file `application.conf` with valid twitter credentials
+#### Test:
- twitter {
- oauth {
- consumerKey = ""
- consumerSecret = ""
- accessToken = ""
- accessTokenSecret = ""
- }
- }
-
-Build:
----------
+Build with integration testing enabled, using your credentials
- mvn clean test verify -DskipITs=false -DargLine="-Dconfig.file=`pwd`/application.conf"
+ mvn clean test verify -DskipITs=false -DargLine="-Dconfig.file=./twitter.conf"
[JavaDocs](apidocs/index.html "JavaDocs")
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/52b5f1ca/local/elasticsearch-hdfs/src/site/markdown/ElasticsearchHdfs.md
----------------------------------------------------------------------
diff --git a/local/elasticsearch-hdfs/src/site/markdown/ElasticsearchHdfs.md b/local/elasticsearch-hdfs/src/site/markdown/ElasticsearchHdfs.md
index ad8ad4a..3294e09 100644
--- a/local/elasticsearch-hdfs/src/site/markdown/ElasticsearchHdfs.md
+++ b/local/elasticsearch-hdfs/src/site/markdown/ElasticsearchHdfs.md
@@ -6,8 +6,25 @@ Copies documents from elasticsearch to hdfs.
#### Configuration:
-[ElasticsearchHdfsIT.conf](ElasticsearchHdfsIT.conf "ElasticsearchHdfsIT.conf" )
-
+[ElasticsearchHdfs.json](ElasticsearchHdfs.json "ElasticsearchHdfs.json" )
+
+##### application.conf
+
+ include "elasticsearch.properties"
+ include "elasticsearch.conf"
+ source = ${elasticsearch}
+ source {
+ indexes += "elasticsearch_persist_writer_it"
+ types += "activity"
+ }
+ destination {
+ fields = ["ID","DOC"]
+ scheme = file
+ user = hadoop
+ path = "target/test-classes"
+ writerPath = "elasticsearch_hdfs_it"
+ }
+
#### Run (SBT):
sbtx -210 -sbt-create
@@ -15,7 +32,7 @@ Copies documents from elasticsearch to hdfs.
set libraryDependencies += "org.apache.streams" % "elasticsearch-hdfs" % "0.4-incubating-SNAPSHOT"
set fork := true
set javaOptions +="-Dconfig.file=application.conf"
- run elasticsearch-hdfs org.apache.streams.example.ElasticsearchHdfs
+ run org.apache.streams.example.ElasticsearchHdfs
#### Run (Docker):
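The `source = ${elasticsearch}` line followed by a `source { ... }` block relies on HOCON semantics: the substitution copies the shared `elasticsearch` object, then the second block merges overrides on top of it. A small Python sketch of that merge behavior using plain dicts (the field names are illustrative; this is not a HOCON parser, and `+=` list appends are modeled here as plain replacement):

```python
def hocon_like_merge(base, override):
    """Illustrate HOCON object merging: override keys win,
    nested objects merge recursively."""
    merged = dict(base)
    for key, value in override.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = hocon_like_merge(merged[key], value)
        else:
            merged[key] = value
    return merged

# `source = ${elasticsearch}` then `source { indexes += ... }` behaves like:
elasticsearch = {"hosts": ["localhost"], "port": 9300, "indexes": []}
source = hocon_like_merge(
    elasticsearch,
    {"indexes": ["elasticsearch_persist_writer_it"], "types": ["activity"]},
)
```

The merged `source` keeps the shared connection settings while gaining the index and type specific to this stream.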
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/52b5f1ca/local/elasticsearch-hdfs/src/site/markdown/HdfsElasticsearch.md
----------------------------------------------------------------------
diff --git a/local/elasticsearch-hdfs/src/site/markdown/HdfsElasticsearch.md b/local/elasticsearch-hdfs/src/site/markdown/HdfsElasticsearch.md
index 136b110..ecd8445 100644
--- a/local/elasticsearch-hdfs/src/site/markdown/HdfsElasticsearch.md
+++ b/local/elasticsearch-hdfs/src/site/markdown/HdfsElasticsearch.md
@@ -6,16 +6,35 @@ Copies documents from hdfs to elasticsearch.
#### Configuration:
-[HdfsElasticsearchIT.conf](HdfsElasticsearchIT.conf "HdfsElasticsearchIT.conf" )
-
+[HdfsElasticsearch.json](HdfsElasticsearch.json "HdfsElasticsearch.json" )
+
+##### application.conf
+
+ include "elasticsearch.properties"
+ include "elasticsearch.conf"
+ source {
+ fields = ["ID","DOC"]
+ scheme = file
+ user = hadoop
+ path = "target/test-classes"
+ readerPath = "elasticsearch_hdfs_it"
+ }
+ destination = ${elasticsearch}
+ destination {
+ index = "hdfs_elasticsearch_it"
+ type = "activity"
+ refresh = true
+ forceUseConfig = true
+ }
+
#### Run (SBT):
sbtx -210 -sbt-create
set resolvers += "Local Maven Repository" at "file://"+Path.userHome.absolutePath+"/.m2/repository"
set libraryDependencies += "org.apache.streams" % "elasticsearch-hdfs" % "0.4-incubating-SNAPSHOT"
set fork := true
- set javaOptions +="-Dconfig.file=HdfsElasticsearchIT.conf"
- run elasticsearch-hdfs org.apache.streams.example.ElasticsearchHdfs
+ set javaOptions +="-Dconfig.file=application.conf"
+    run org.apache.streams.example.HdfsElasticsearch
#### Run (Docker):
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/52b5f1ca/local/elasticsearch-hdfs/src/site/markdown/index.md
----------------------------------------------------------------------
diff --git a/local/elasticsearch-hdfs/src/site/markdown/index.md b/local/elasticsearch-hdfs/src/site/markdown/index.md
index d789a2f..4be1820 100644
--- a/local/elasticsearch-hdfs/src/site/markdown/index.md
+++ b/local/elasticsearch-hdfs/src/site/markdown/index.md
@@ -19,9 +19,9 @@ Start up elasticsearch with docker:
mvn -PdockerITs docker:start
-Build with integration testing enabled, using your credentials
+Build with integration testing enabled:
- mvn clean test verify -DskipITs=false -DargLine="-Dconfig.file=`pwd`/application.conf"
+ mvn clean test verify -DskipITs=false
Shutdown elasticsearch when finished:
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/52b5f1ca/local/elasticsearch-reindex/src/site/markdown/ElasticsearchReindex.md
----------------------------------------------------------------------
diff --git a/local/elasticsearch-reindex/src/site/markdown/ElasticsearchReindex.md b/local/elasticsearch-reindex/src/site/markdown/ElasticsearchReindex.md
index 2a2a6b2..9bd37d4 100644
--- a/local/elasticsearch-reindex/src/site/markdown/ElasticsearchReindex.md
+++ b/local/elasticsearch-reindex/src/site/markdown/ElasticsearchReindex.md
@@ -6,20 +6,36 @@ Copies documents into a different index
#### Configuration:
-[ElasticsearchReindexIT.conf](ElasticsearchReindexIT.conf "ElasticsearchReindexIT.conf" )
-
+[ElasticsearchReindex.json](ElasticsearchReindex.json "ElasticsearchReindex.json")
+
+##### application.conf
+
+ include "elasticsearch.properties"
+ include "elasticsearch.conf"
+ source = ${elasticsearch}
+ source {
+ indexes += "elasticsearch_persist_writer_it"
+ types += "activity"
+ }
+ destination = ${elasticsearch}
+ destination {
+ index: "elasticsearch_reindex_it",
+ type: "activity",
+      forceUseConfig: true
+ }
+
#### Run (SBT):
sbtx -210 -sbt-create
set resolvers += "Local Maven Repository" at "file://"+Path.userHome.absolutePath+"/.m2/repository"
set libraryDependencies += "org.apache.streams" % "elasticsearch-reindex" % "0.4-incubating-SNAPSHOT"
set fork := true
- set javaOptions +="-Dconfig.file=ElasticsearchReindexIT.conf"
- run elasticsearch-hdfs org.apache.streams.example.ElasticsearchReindex
+ set javaOptions +="-Dconfig.file=application.conf"
+ run org.apache.streams.example.ElasticsearchReindex
#### Run (Docker):
- docker run elasticsearch-reindex java -cp elasticsearch-reindex-jar-with-dependencies.jar -Dconfig.file=`pwd`/HdfsElasticsearchIT.conf org.apache.streams.example.ElasticsearchReindex
+ docker run elasticsearch-reindex java -cp elasticsearch-reindex-jar-with-dependencies.jar -Dconfig.file=./application.conf org.apache.streams.example.ElasticsearchReindex
#### Specification:
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/52b5f1ca/local/elasticsearch-reindex/src/site/markdown/index.md
----------------------------------------------------------------------
diff --git a/local/elasticsearch-reindex/src/site/markdown/index.md b/local/elasticsearch-reindex/src/site/markdown/index.md
index 87c3e04..66e4a92 100644
--- a/local/elasticsearch-reindex/src/site/markdown/index.md
+++ b/local/elasticsearch-reindex/src/site/markdown/index.md
@@ -19,7 +19,7 @@ Start up elasticsearch with docker:
mvn -PdockerITs docker:start
-Build with integration testing enabled, using your credentials
+Build with integration testing enabled:
mvn clean test verify -DskipITs=false
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/52b5f1ca/local/mongo-elasticsearch-sync/src/site/markdown/MongoElasticsearchSync.md
----------------------------------------------------------------------
diff --git a/local/mongo-elasticsearch-sync/src/site/markdown/MongoElasticsearchSync.md b/local/mongo-elasticsearch-sync/src/site/markdown/MongoElasticsearchSync.md
index cdbdce1..d1f8f8d 100644
--- a/local/mongo-elasticsearch-sync/src/site/markdown/MongoElasticsearchSync.md
+++ b/local/mongo-elasticsearch-sync/src/site/markdown/MongoElasticsearchSync.md
@@ -6,7 +6,25 @@ Copies documents from mongodb to elasticsearch
#### Configuration:
-[MongoElasticsearchSyncIT.conf](MongoElasticsearchSyncIT.conf "MongoElasticsearchSyncIT.conf" )
+[MongoElasticsearchSync.json](MongoElasticsearchSync.json "MongoElasticsearchSync.json") for _
+
+##### application.conf
+
+ include "mongo.properties"
+ include "mongo.conf"
+ include "elasticsearch.properties"
+ include "elasticsearch.conf"
+ source = ${mongo}
+ source {
+ db: mongo_persist_it
+ collection: activity
+ }
+ destination = ${elasticsearch}
+ destination {
+ index: mongo_elasticsearch_sync_it
+ type: activity
+      forceUseConfig: true
+ }
#### Run (SBT):
@@ -15,7 +33,7 @@ Copies documents from mongodb to elasticsearch
set libraryDependencies += "org.apache.streams" % "mongo-elasticsearch-sync" % "0.4-incubating-SNAPSHOT"
set fork := true
set javaOptions +="-Dconfig.file=application.conf"
- run mongo-elasticsearch-sync org.apache.streams.example.MongoElasticsearchSync
+ run org.apache.streams.example.MongoElasticsearchSync
#### Run (Docker):
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/52b5f1ca/local/twitter-follow-neo4j/src/site/markdown/TwitterFollowNeo4j.md
----------------------------------------------------------------------
diff --git a/local/twitter-follow-neo4j/src/site/markdown/TwitterFollowNeo4j.md b/local/twitter-follow-neo4j/src/site/markdown/TwitterFollowNeo4j.md
index 936efb4..c241b60 100644
--- a/local/twitter-follow-neo4j/src/site/markdown/TwitterFollowNeo4j.md
+++ b/local/twitter-follow-neo4j/src/site/markdown/TwitterFollowNeo4j.md
@@ -6,7 +6,20 @@ Collects friend or follower connections for a set of twitter users to build a gr
#### Configuration:
-[TwitterFollowNeo4jIT.conf](TwitterFollowNeo4jIT.conf "TwitterFollowNeo4jIT.conf" )
+[TwitterFollowNeo4j.json](TwitterFollowNeo4j.json "TwitterFollowNeo4j.json") for _
+
+##### application.conf
+
+ include "neo4j.properties"
+ include "neo4j.conf"
+ include "twitter.oauth.conf"
+ twitter {
+ endpoint = "friends"
+ info = [
+ 18055613
+ ]
+      max_items = 1000
+ }
#### Run (SBT):
@@ -15,7 +28,7 @@ Collects friend or follower connections for a set of twitter users to build a gr
set libraryDependencies += "org.apache.streams" % "twitter-follow-neo4j" % "0.4-incubating-SNAPSHOT"
set fork := true
set javaOptions +="-Dconfig.file=application.conf"
- run org.apache.streams.example.graph.TwitterFollowNeo4j
+ run org.apache.streams.example.TwitterFollowNeo4j
#### Run (Docker):
@@ -29,5 +42,4 @@ Collects friend or follower connections for a set of twitter users to build a gr
![TwitterFollowNeo4j.dot.svg](./TwitterFollowNeo4j.dot.svg)
-
###### Licensed under Apache License 2.0 - http://www.apache.org/licenses/LICENSE-2.0
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/52b5f1ca/local/twitter-follow-neo4j/src/site/markdown/index.md
----------------------------------------------------------------------
diff --git a/local/twitter-follow-neo4j/src/site/markdown/index.md b/local/twitter-follow-neo4j/src/site/markdown/index.md
index 3efdc5b..50a7456 100644
--- a/local/twitter-follow-neo4j/src/site/markdown/index.md
+++ b/local/twitter-follow-neo4j/src/site/markdown/index.md
@@ -10,28 +10,17 @@
#### Build:
- mvn clean package verify
+ mvn clean package
#### Test:
-Create a local file `application.conf` with valid twitter credentials
-
- twitter {
- oauth {
- consumerKey = ""
- consumerSecret = ""
- accessToken = ""
- accessTokenSecret = ""
- }
- }
-
Start up neo4j with docker:
mvn -PdockerITs docker:start
Build with integration testing enabled, using your credentials
- mvn clean test verify -DskipITs=false -DargLine="-Dconfig.file=`pwd`/application.conf"
+ mvn clean test verify -DskipITs=false -DargLine="-Dconfig.file=./twitter.conf"
Shutdown neo4j when finished:
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/52b5f1ca/local/twitter-history-elasticsearch/src/site/markdown/TwitterHistoryElasticsearch.md
----------------------------------------------------------------------
diff --git a/local/twitter-history-elasticsearch/src/site/markdown/TwitterHistoryElasticsearch.md b/local/twitter-history-elasticsearch/src/site/markdown/TwitterHistoryElasticsearch.md
index 9b696c2..09d7f5a 100644
--- a/local/twitter-history-elasticsearch/src/site/markdown/TwitterHistoryElasticsearch.md
+++ b/local/twitter-history-elasticsearch/src/site/markdown/TwitterHistoryElasticsearch.md
@@ -8,11 +8,26 @@ Converts them to activities, and writes them in activity format to Elasticsearch
#### Configuration:
-[TwitterHistoryElasticsearchIT.conf](TwitterHistoryElasticsearchIT.conf "TwitterHistoryElasticsearchIT.conf" )
-
-In the Twitter section you should place all of your relevant authentication keys and whichever Twitter IDs you want to pull history for.
-
-Twitter IDs can be converted from screennames at http://www.gettwitterid.com
+[TwitterHistoryElasticsearch.json](TwitterHistoryElasticsearch.json "TwitterHistoryElasticsearch.json") for _
+
+##### application.conf
+
+ include "elasticsearch.properties"
+ include "elasticsearch.conf"
+ include "twitter.oauth.conf"
+ twitter {
+ info = [
+ 18055613
+ ]
+      max_items = 1000
+ }
+ elasticsearch {
+ index = twitter_history
+ type = activity
+ forceUseConfig = true
+ }
+
+[TwitterHistoryElasticsearchIT.conf](TwitterHistoryElasticsearchIT.conf "TwitterHistoryElasticsearchIT.conf")
#### Run (SBT):
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/52b5f1ca/local/twitter-history-elasticsearch/src/site/markdown/index.md
----------------------------------------------------------------------
diff --git a/local/twitter-history-elasticsearch/src/site/markdown/index.md b/local/twitter-history-elasticsearch/src/site/markdown/index.md
index e737a12..28154cb 100644
--- a/local/twitter-history-elasticsearch/src/site/markdown/index.md
+++ b/local/twitter-history-elasticsearch/src/site/markdown/index.md
@@ -31,7 +31,7 @@ Start up elasticsearch with docker:
Build with integration testing enabled, using your credentials
- mvn clean test verify -DskipITs=false -DargLine="-Dconfig.file=`pwd`/application.conf"
+ mvn clean test verify -DskipITs=false -DargLine="-Dconfig.file=./twitter.conf"
Shutdown elasticsearch when finished:
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/52b5f1ca/local/twitter-userstream-elasticsearch/src/site/markdown/TwitterUserstreamElasticsearch.md
----------------------------------------------------------------------
diff --git a/local/twitter-userstream-elasticsearch/src/site/markdown/TwitterUserstreamElasticsearch.md b/local/twitter-userstream-elasticsearch/src/site/markdown/TwitterUserstreamElasticsearch.md
index 36f4244..c812749 100644
--- a/local/twitter-userstream-elasticsearch/src/site/markdown/TwitterUserstreamElasticsearch.md
+++ b/local/twitter-userstream-elasticsearch/src/site/markdown/TwitterUserstreamElasticsearch.md
@@ -6,7 +6,18 @@ This example connects to an active twitter account and stores the userstream as
#### Configuration:
-[TwitterUserstreamElasticsearchIT.conf](TwitterUserstreamElasticsearchIT.conf "TwitterUserstreamElasticsearchIT.conf" )
+[TwitterUserstreamElasticsearch.json](TwitterUserstreamElasticsearch.json "TwitterUserstreamElasticsearch.json") for _
+
+##### application.conf
+
+ include "elasticsearch.properties"
+ include "elasticsearch.conf"
+ include "twitter.oauth.conf"
+ elasticsearch {
+ index = twitter_userstream
+ type = activity
+ forceUseConfig = true
+ }
#### Run (SBT):
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/52b5f1ca/local/twitter-userstream-elasticsearch/src/site/markdown/index.md
----------------------------------------------------------------------
diff --git a/local/twitter-userstream-elasticsearch/src/site/markdown/index.md b/local/twitter-userstream-elasticsearch/src/site/markdown/index.md
index 833efde..10575d3 100644
--- a/local/twitter-userstream-elasticsearch/src/site/markdown/index.md
+++ b/local/twitter-userstream-elasticsearch/src/site/markdown/index.md
@@ -31,7 +31,7 @@ Start up elasticsearch with docker:
Build with integration testing enabled, using your credentials
- mvn clean test verify -DskipITs=false -DargLine="-Dconfig.file=`pwd`/application.conf"
+ mvn clean test verify -DskipITs=false -DargLine="-Dconfig.file=./twitter.conf"
Shutdown elasticsearch when finished:
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/52b5f1ca/src/site/markdown/credentials/twitter.md
----------------------------------------------------------------------
diff --git a/src/site/markdown/credentials/twitter.md b/src/site/markdown/credentials/twitter.md
index 098dabd..b716115 100644
--- a/src/site/markdown/credentials/twitter.md
+++ b/src/site/markdown/credentials/twitter.md
@@ -1,6 +1,6 @@
## Twitter Credentials
-Create a local file `twitter.conf` with valid twitter credentials
+Create a local file `twitter.oauth.conf` with valid twitter credentials
twitter {
oauth {
[9/9] incubator-streams-examples git commit: typos, tweaks
Posted by sb...@apache.org.
typos, tweaks
Project: http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/commit/34c1a7be
Tree: http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/tree/34c1a7be
Diff: http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/diff/34c1a7be
Branch: refs/heads/master
Commit: 34c1a7be2b3d163a8e2b4b67cf5d5e2590b79197
Parents: 34c94b7
Author: Steve Blackmon @steveblackmon <sb...@apache.org>
Authored: Tue Oct 11 18:02:55 2016 -0500
Committer: Steve Blackmon @steveblackmon <sb...@apache.org>
Committed: Tue Oct 11 18:02:55 2016 -0500
----------------------------------------------------------------------
.../src/site/markdown/index.md | 2 +-
flink/src/site/markdown/flink.md | 2 +-
local/twitter-follow-neo4j/src/site/markdown/index.md | 2 +-
.../src/site/markdown/index.md | 2 +-
.../src/site/markdown/index.md | 13 +------------
5 files changed, 5 insertions(+), 16 deletions(-)
----------------------------------------------------------------------
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/34c1a7be/flink/flink-twitter-collection/src/site/markdown/index.md
----------------------------------------------------------------------
diff --git a/flink/flink-twitter-collection/src/site/markdown/index.md b/flink/flink-twitter-collection/src/site/markdown/index.md
index 616bdd7..4d534e5 100644
--- a/flink/flink-twitter-collection/src/site/markdown/index.md
+++ b/flink/flink-twitter-collection/src/site/markdown/index.md
@@ -25,7 +25,7 @@ Collects large batches of documents from api.twitter.com from a seed set of ids.
Build with integration testing enabled, using your credentials
- mvn clean test verify -DskipITs=false -DargLine="-Dconfig.file=./twitter.conf"
+ mvn clean test verify -DskipITs=false -DargLine="-Dconfig.file=twitter.oauth.conf"
[JavaDocs](apidocs/index.html "JavaDocs")
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/34c1a7be/flink/src/site/markdown/flink.md
----------------------------------------------------------------------
diff --git a/flink/src/site/markdown/flink.md b/flink/src/site/markdown/flink.md
index ed96496..6926fa4 100644
--- a/flink/src/site/markdown/flink.md
+++ b/flink/src/site/markdown/flink.md
@@ -5,7 +5,7 @@ Create a local file `flink.conf`
local = true
test = true
-When configuring a stream, include this files:
+When configuring a stream, include this file:
include "flink.conf"
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/34c1a7be/local/twitter-follow-neo4j/src/site/markdown/index.md
----------------------------------------------------------------------
diff --git a/local/twitter-follow-neo4j/src/site/markdown/index.md b/local/twitter-follow-neo4j/src/site/markdown/index.md
index 50a7456..aad8305 100644
--- a/local/twitter-follow-neo4j/src/site/markdown/index.md
+++ b/local/twitter-follow-neo4j/src/site/markdown/index.md
@@ -20,7 +20,7 @@ Start up neo4j with docker:
Build with integration testing enabled, using your credentials
- mvn clean test verify -DskipITs=false -DargLine="-Dconfig.file=./twitter.conf"
+ mvn clean test verify -DskipITs=false -DargLine="-Dconfig.file=twitter.oauth.conf"
Shutdown neo4j when finished:
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/34c1a7be/local/twitter-history-elasticsearch/src/site/markdown/index.md
----------------------------------------------------------------------
diff --git a/local/twitter-history-elasticsearch/src/site/markdown/index.md b/local/twitter-history-elasticsearch/src/site/markdown/index.md
index 28154cb..a56819a 100644
--- a/local/twitter-history-elasticsearch/src/site/markdown/index.md
+++ b/local/twitter-history-elasticsearch/src/site/markdown/index.md
@@ -31,7 +31,7 @@ Start up elasticsearch with docker:
Build with integration testing enabled, using your credentials
- mvn clean test verify -DskipITs=false -DargLine="-Dconfig.file=./twitter.conf"
+ mvn clean test verify -DskipITs=false -DargLine="-Dconfig.file=twitter.oauth.conf"
Shutdown elasticsearch when finished:
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/34c1a7be/local/twitter-userstream-elasticsearch/src/site/markdown/index.md
----------------------------------------------------------------------
diff --git a/local/twitter-userstream-elasticsearch/src/site/markdown/index.md b/local/twitter-userstream-elasticsearch/src/site/markdown/index.md
index 10575d3..6e0b931 100644
--- a/local/twitter-userstream-elasticsearch/src/site/markdown/index.md
+++ b/local/twitter-userstream-elasticsearch/src/site/markdown/index.md
@@ -14,24 +14,13 @@
#### Test:
-Create a local file `application.conf` with valid twitter credentials
-
- twitter {
- oauth {
- consumerKey = ""
- consumerSecret = ""
- accessToken = ""
- accessTokenSecret = ""
- }
- }
-
Start up elasticsearch with docker:
mvn -PdockerITs docker:start
Build with integration testing enabled, using your credentials
- mvn clean test verify -DskipITs=false -DargLine="-Dconfig.file=./twitter.conf"
+ mvn clean test verify -DskipITs=false -DargLine="-Dconfig.file=twitter.oauth.conf"
Shutdown elasticsearch when finished:
[8/9] incubator-streams-examples git commit: no jar files in source control!
Posted by sb...@apache.org.
no jar files in source control!
Project: http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/commit/34c94b73
Tree: http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/tree/34c94b73
Diff: http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/diff/34c94b73
Branch: refs/heads/master
Commit: 34c94b73c17eb781dc463c894bfafbd80ffe6f0f
Parents: 52b5f1c
Author: Steve Blackmon @steveblackmon <sb...@apache.org>
Authored: Tue Oct 11 17:55:23 2016 -0500
Committer: Steve Blackmon @steveblackmon <sb...@apache.org>
Committed: Tue Oct 11 17:55:23 2016 -0500
----------------------------------------------------------------------
...itter-follow-graph-jar-with-dependencies.jar | Bin 25829072 -> 0 bytes
1 file changed, 0 insertions(+), 0 deletions(-)
----------------------------------------------------------------------
http://git-wip-us.apache.org/repos/asf/incubator-streams-examples/blob/34c94b73/local/twitter-follow-neo4j/dist/twitter-follow-graph-jar-with-dependencies.jar
----------------------------------------------------------------------
diff --git a/local/twitter-follow-neo4j/dist/twitter-follow-graph-jar-with-dependencies.jar b/local/twitter-follow-neo4j/dist/twitter-follow-graph-jar-with-dependencies.jar
deleted file mode 100644
index 758e5cf..0000000
Binary files a/local/twitter-follow-neo4j/dist/twitter-follow-graph-jar-with-dependencies.jar and /dev/null differ