Posted to commits@camel.apache.org by df...@apache.org on 2021/04/20 18:16:42 UTC
[camel-examples] 06/35: Added example for AWS2 S3 Stream mode
This is an automated email from the ASF dual-hosted git repository.
dfoulks pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/camel-examples.git
commit de8cd7a8cae739cb7ec2bd97367c0d334d6ce5db
Author: Andrea Cosentino <an...@gmail.com>
AuthorDate: Thu Apr 1 18:33:53 2021 +0200
Added example for AWS2 S3 Stream mode
---
.../main-endpointdsl-kafka-aws2-s3/README.adoc | 35 ++++++++++++++++++++--
.../src/main/resources/application.properties | 2 +-
2 files changed, 33 insertions(+), 4 deletions(-)
diff --git a/examples/main-endpointdsl-kafka-aws2-s3/README.adoc b/examples/main-endpointdsl-kafka-aws2-s3/README.adoc
index 88abc6e..f107f64 100644
--- a/examples/main-endpointdsl-kafka-aws2-s3/README.adoc
+++ b/examples/main-endpointdsl-kafka-aws2-s3/README.adoc
@@ -2,13 +2,27 @@
This example shows how to use the endpoint DSL in your Camel routes
to define endpoints using type safe fluent builders, which are Java methods
-that are compiled.
+that are compiled, and it demonstrates the AWS2-S3 stream mode.
-The example will poll an S3 bucket and send this to a Kafka topic.
+The example will poll two Kafka topics (s3.topic.1 and s3.topic.2) and upload batches of 25 messages, each batch as a single file, into an S3 bucket (mycamel-1).
+
+On your bucket you'll see:
+
+s3.topic.1/s3.topic.1.txt
+s3.topic.1/s3.topic.1-1.txt
+
+s3.topic.2/s3.topic.2.txt
+s3.topic.2/s3.topic.2-1.txt
+
+and so on.
+
+At the end you should have a total of 80 files (2 topics x 1000 messages each, with one file flushed per 25 messages).
Notice how you can configure Camel in the `application.properties` file.
-Don't forget to add your AWS Credentials and the bucket name and point to the correct topic.
+Don't forget to add your AWS credentials and the bucket name (the bucket must be created ahead of time), and point to the correct topics.
+You'll also need a running Kafka broker.
+You'll also need kafkacat installed.
=== How to run
@@ -16,9 +30,24 @@ You can run this example using
[source,sh]
----
+$ mvn compile
+----
+
+[source,sh]
+----
$ mvn camel:run
----
+Now run:
+
+[source,sh]
+----
+$ data/burst.sh s3.topic.1 1000 msg.txt
+$ data/burst.sh s3.topic.2 1000 msg.txt
+----
+
+You should see the bucket populated.
+
=== Help and contributions
If you hit any problem using Camel or have some feedback, then please
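The README invokes `data/burst.sh`, but the commit does not include its contents. Below is a plausible sketch only, assuming the script wraps kafkacat in producer mode; the broker address `localhost:9092` and the exact flags are assumptions, not taken from the commit:

```shell
#!/bin/sh
# Hypothetical sketch of data/burst.sh: send the contents of FILE to TOPIC,
# COUNT times, using kafkacat in producer mode (-P). Each line of the file
# becomes one Kafka message. Broker address localhost:9092 is an assumption.
burst() {
  topic=$1
  count=$2
  file=$3
  i=0
  while [ "$i" -lt "$count" ]; do
    # KAFKACAT can be overridden, e.g. for a dry run without a broker
    ${KAFKACAT:-kafkacat} -P -b localhost:9092 -t "$topic" "$file"
    i=$((i + 1))
  done
}

# Dry run: substitute echo for kafkacat so the invocations are printed
# instead of produced, letting the sketch be checked without a broker.
KAFKACAT="echo kafkacat"
burst s3.topic.1 2 msg.txt
```

The real script ships in the example's `data/` directory and may differ; treat this as an illustration of the producer loop, not the committed code.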
diff --git a/examples/main-endpointdsl-kafka-aws2-s3/src/main/resources/application.properties b/examples/main-endpointdsl-kafka-aws2-s3/src/main/resources/application.properties
index 6ed7ff6..2ae5fc7 100644
--- a/examples/main-endpointdsl-kafka-aws2-s3/src/main/resources/application.properties
+++ b/examples/main-endpointdsl-kafka-aws2-s3/src/main/resources/application.properties
@@ -21,7 +21,7 @@ camel.main.name = Kafka-to-AWS2-S3-Stream
# properties used in the route
camel.component.aws2-s3.accessKey=xxxxx
-camel.component.aws2-s3.secretKey=yyyyy
+camel.component.aws2-s3.secretKey=yyyy
camel.component.aws2-s3.region=region
bucketName=mycamel-1
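As a sanity check on the "80 files" figure in the README above: each burst.sh run sends 1000 messages, the stream mode flushes one file per batch of 25 messages, and there are two topics.

```shell
# 1000 messages per topic / 25 messages per file = 40 files per topic;
# two topics gives 80 files in total.
files_per_topic=$(( 1000 / 25 ))
total_files=$(( files_per_topic * 2 ))
echo "$total_files"   # prints 80
```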