Posted to commits@camel.apache.org by ac...@apache.org on 2020/09/23 06:02:20 UTC

[camel-kafka-connector-examples] 02/04: AWS2 S3 Sink with aggregation example: Referencing the correct secret file

This is an automated email from the ASF dual-hosted git repository.

acosentino pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/camel-kafka-connector-examples.git

commit d08effec93211abbca89b014f7e304dc010208dc
Author: Andrea Cosentino <an...@gmail.com>
AuthorDate: Wed Sep 23 07:57:41 2020 +0200

    AWS2 S3 Sink with aggregation example: Referencing the correct secret file
---
 aws2-s3/aws2-s3-sink-with-aggregation/README.adoc | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/aws2-s3/aws2-s3-sink-with-aggregation/README.adoc b/aws2-s3/aws2-s3-sink-with-aggregation/README.adoc
index 758e2f5..ae0e2ff 100644
--- a/aws2-s3/aws2-s3-sink-with-aggregation/README.adoc
+++ b/aws2-s3/aws2-s3-sink-with-aggregation/README.adoc
@@ -188,7 +188,7 @@ You should see something like this:
 
 ### Set the AWS credential as secret (optional)
 
-You can also set the aws creds option as secret, you'll need to edit the file config/aws-s3-cred.properties with the correct credentials and then execute the following command
+You can also set the aws creds option as secret, you'll need to edit the file config/aws2-s3-cred.properties with the correct credentials and then execute the following command
 
 [source,bash,options="nowrap"]
 ----
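
For context, the config/aws2-s3-cred.properties file referenced by the corrected line is a plain Java properties file holding the AWS credentials used by the connector. The exact key names are defined by the example itself, so the sketch below is illustrative only (the keys and placeholder values are assumptions, not copied from the repository); check the template file shipped with the example before creating the secret.

[source,properties,options="nowrap"]
----
# Illustrative placeholders only -- replace with your real AWS credentials.
# The actual key names may differ; verify them against the example's
# config/aws2-s3-cred.properties template before creating the secret.
accessKey=AKIAXXXXXXXXXXXXXXXX
secretKey=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
region=eu-west-1
----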