Posted to commits@camel.apache.org by ac...@apache.org on 2020/10/14 05:48:12 UTC

[camel-kafka-connector] branch master updated: Changes to documents - change version from 0.3.0-SNAPSHOT, give the right location of the properties file within docs/examples

This is an automated email from the ASF dual-hosted git repository.

acosentino pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/camel-kafka-connector.git


The following commit(s) were added to refs/heads/master by this push:
     new f881029  Changes to documents - change version from 0.3.0-SNAPSHOT, give the right location of the properties file within docs/examples
f881029 is described below

commit f881029674dc1dd53f8bcf61fd4b0154ca218a80
Author: Tom Cunningham <tc...@redhat.com>
AuthorDate: Tue Oct 13 21:29:43 2020 -0400

    Changes to documents - change version from 0.3.0-SNAPSHOT, give the right location of the properties file within docs/examples
---
 docs/modules/ROOT/pages/try-it-out-locally.adoc | 122 ++++++++++++------------
 1 file changed, 61 insertions(+), 61 deletions(-)

diff --git a/docs/modules/ROOT/pages/try-it-out-locally.adoc b/docs/modules/ROOT/pages/try-it-out-locally.adoc
index 0ab5677..f6c9ca7 100644
--- a/docs/modules/ROOT/pages/try-it-out-locally.adoc
+++ b/docs/modules/ROOT/pages/try-it-out-locally.adoc
@@ -34,7 +34,7 @@ $KAFKA_HOME/bin/kafka-topics.sh --create \
   --topic mytopic
 ----
 
-For using the quickstart we'll use the plugin.path property, so you'll have to add a path for your connectors.
+To follow the quickstart we'll use the `plugin.path` property, so you'll have to add a path for your connectors.
 
 Open your configuration file located at `$KAFKA_HOME/config/connect-standalone.properties`
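
For example, to point the worker at the `/home/connectors/` directory used by the examples below (a minimal sketch; use whichever directory you unpack the connector packages into):

[source,bash]
----
# Append the connector directory to the standalone worker configuration
echo "plugin.path=/home/connectors" >> $KAFKA_HOME/config/connect-standalone.properties
----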
 
@@ -75,36 +75,36 @@ $KAFKA_HOME/bin/kafka-console-producer.sh --broker-list localhost:9092 --topic m
 [[Tryitoutlocally-TryExamples]]
 == Try some examples
 
-For the following examples you need to fetch the `camel-kafka-connector` project and https://github.com/apache/camel-kafka-connector/blob/master/README.adoc#build-the-project[build] it locally by running `./mvnw package` from the root of the project. Look into the `config` and `examples` directories for the configuration files (`*.properties`) of the examples showcased here.
+For the following examples you need to fetch the `camel-kafka-connector` project and https://github.com/apache/camel-kafka-connector/blob/master/README.adoc#build-the-project[build] it locally by running `./mvnw package` from the root of the project. Look into the `config` and `docs/examples` directories for the configuration files (`*.properties`) of the examples showcased here.
 
 [[Tryitoutlocally-SimpleLogger]]
 === Simple logger (sink)
 
-First thing to do, is unzip or untar the camel-log-kafka-connector archive in the `plugin.path` location. After building the project you should have in `connectors/camel-log-kafka-connector/target/` a .zip file named `camel-log-kafka-connector-0.3.0-SNAPSHOT-package.zip`
+The first thing to do is to unzip or untar the camel-log-kafka-connector archive into the `plugin.path` location. After building the project you should have a .zip file named `camel-log-kafka-connector-0.6.0-SNAPSHOT-package.zip` in `connectors/camel-log-kafka-connector/target/`.
 
 [source,bash]
 ----
 > cd /home/connectors/
-> cp connectors/camel-log-kafka-connector/target/camel-log-kafka-connector-0.3.0-SNAPSHOT-package.zip .
-> unzip camel-log-kafka-connector-0.3.0-SNAPSHOT-package.zip
+> cp connectors/camel-log-kafka-connector/target/camel-log-kafka-connector-0.6.0-SNAPSHOT-package.zip .
+> unzip camel-log-kafka-connector-0.6.0-SNAPSHOT-package.zip
 ----
 
 .Run the default sink, just a camel logger:
 [source,bash]
 ----
-$KAFKA_HOME/bin/connect-standalone.sh $KAFKA_HOME/config/connect-standalone.properties config/CamelSinkConnector.properties 
+$KAFKA_HOME/bin/connect-standalone.sh $KAFKA_HOME/config/connect-standalone.properties docs/examples/CamelSinkConnector.properties 
 ----
 
 [[Tryitoutlocally-Timer]]
 === Timer (source)
 
-First thing to do, is unzip or untar the camel-timer-kafka-connector archive in the `plugin.path` location. After building the project you should have in `connectors/camel-timer-kafka-connector/target/` a .zip file named `camel-timer-kafka-connector-0.3.0-SNAPSHOT-package.zip`
+The first thing to do is to unzip or untar the camel-timer-kafka-connector archive into the `plugin.path` location. After building the project you should have a .zip file named `camel-timer-kafka-connector-0.6.0-SNAPSHOT-package.zip` in `connectors/camel-timer-kafka-connector/target/`.
 
 [source,bash]
 ----
 > cd /home/connectors/
-> cp connectors/camel-timer-kafka-connector/target/camel-log-kafka-connector-0.3.0-SNAPSHOT-package.zip .
-> unzip camel-timer-kafka-connector-0.3.0-SNAPSHOT-package.zip
+> cp connectors/camel-timer-kafka-connector/target/camel-timer-kafka-connector-0.6.0-SNAPSHOT-package.zip .
+> unzip camel-timer-kafka-connector-0.6.0-SNAPSHOT-package.zip
 ----
 
 This is an example of a _source_ that produces a message every second to `mytopic`.
@@ -112,19 +112,19 @@ This is an example of a _source_ that produces a message every second to `mytopi
 .Run the default source, just a camel timer:
 [source,bash]
 ----
-$KAFKA_HOME/bin/connect-standalone.sh $KAFKA_HOME/config/connect-standalone.properties config/CamelSourceConnector.properties
+$KAFKA_HOME/bin/connect-standalone.sh $KAFKA_HOME/config/connect-standalone.properties docs/examples/CamelSourceConnector.properties
 ----
 
 [[Tryitoutlocally-AwsKinesis]]
 === AWS Kinesis (source)
 
-First thing to do, is unzip or untar the camel-aws-kinesis-kafka-connector archive in the `plugin.path` location. After building the project you should have in `connectors/camel-aws-kinesis-kafka-connector/target/` a .zip file named `camel-aws-kinesis-kafka-connector-0.3.0-SNAPSHOT-package.zip`
+The first thing to do is to unzip or untar the camel-aws-kinesis-kafka-connector archive into the `plugin.path` location. After building the project you should have a .zip file named `camel-aws-kinesis-kafka-connector-0.6.0-SNAPSHOT-package.zip` in `connectors/camel-aws-kinesis-kafka-connector/target/`.
 
 [source,bash]
 ----
 > cd /home/connectors/
-> cp connectors/camel-aws-kinesis-kafka-connector/target/camel-aws-kinesis-kafka-connector-0.3.0-SNAPSHOT-package.zip .
-> unzip camel-aws-kinesis-kafka-connector-0.3.0-SNAPSHOT-package.zip
+> cp connectors/camel-aws-kinesis-kafka-connector/target/camel-aws-kinesis-kafka-connector-0.6.0-SNAPSHOT-package.zip .
+> unzip camel-aws-kinesis-kafka-connector-0.6.0-SNAPSHOT-package.zip
 ----
 
 This example consumes from AWS Kinesis data stream and transfers the payload to `mytopic` topic in Kafka.
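
For this and the other source examples on this page, a quick way to confirm that records are reaching `mytopic` is the console consumer that ships with Kafka:

[source,bash]
----
# Watch records arriving on the target topic
$KAFKA_HOME/bin/kafka-console-consumer.sh \
  --bootstrap-server localhost:9092 --topic mytopic --from-beginning
----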
@@ -134,19 +134,19 @@ Adjust properties in `examples/CamelAWSKinesisSourceConnector.properties` for yo
 .Run the AWS Kinesis source:
 [source,bash]
 ----
-$KAFKA_HOME/bin/connect-standalone.sh $KAFKA_HOME/config/connect-standalone.properties examples/CamelAWSKinesisSourceConnector.properties
+$KAFKA_HOME/bin/connect-standalone.sh $KAFKA_HOME/config/connect-standalone.properties docs/examples/CamelAWSKinesisSourceConnector.properties
 ----
 
 [[Tryitoutlocally-AWSSQSSink]]
 === AWS SQS (sink)
 
-First thing to do, is unzip or untar the camel-aws-sqs-kafka-connector archive in the `plugin.path` location. After building the project you should have in `connectors/camel-aws-sqs-kafka-connector/target/` a .zip file named `camel-aws-sqs-kafka-connector-0.3.0-SNAPSHOT-package.zip`
+The first thing to do is to unzip or untar the camel-aws-sqs-kafka-connector archive into the `plugin.path` location. After building the project you should have a .zip file named `camel-aws-sqs-kafka-connector-0.6.0-SNAPSHOT-package.zip` in `connectors/camel-aws-sqs-kafka-connector/target/`.
 
 [source,bash]
 ----
 > cd /home/connectors/
-> cp connectors/camel-aws-sqs-kafka-connector/target/camel-aws-sqs-kafka-connector-0.3.0-SNAPSHOT-package.zip .
-> unzip camel-aws-sqs-kafka-connector-0.3.0-SNAPSHOT-package.zip
+> cp connectors/camel-aws-sqs-kafka-connector/target/camel-aws-sqs-kafka-connector-0.6.0-SNAPSHOT-package.zip .
+> unzip camel-aws-sqs-kafka-connector-0.6.0-SNAPSHOT-package.zip
 ----
 
 This example consumes from Kafka topic `mytopic` and transfers the payload to AWS SQS.
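
Once the sink is running (see the run command just below), one way to confirm that messages reach your queue is the AWS CLI, assuming it is configured for the same account and region and substituting your real queue URL:

[source,bash]
----
# Pull a message back from the destination queue to confirm delivery
aws sqs receive-message --queue-url https://sqs.<region>.amazonaws.com/<account-id>/<your-queue>
----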
@@ -156,19 +156,19 @@ Adjust properties in `examples/CamelAWSSQSSinkConnector.properties` for your env
 .Run the AWS SQS sink:
 [source,bash]
 ----
-$KAFKA_HOME/bin/connect-standalone.sh $KAFKA_HOME/config/connect-standalone.properties examples/CamelAWSSQSSinkConnector.properties
+$KAFKA_HOME/bin/connect-standalone.sh $KAFKA_HOME/config/connect-standalone.properties docs/examples/CamelAWSSQSSinkConnector.properties
 ----
 
 [[Tryitoutlocally-AWSSQSSource]]
 === AWS SQS (source)
 
-First thing to do, is unzip or untar the camel-aws-sqs-kafka-connector archive in the `plugin.path` location. After building the project you should have in `connectors/camel-aws-sqs-kafka-connector/target/` a .zip file named `camel-aws-sqs-kafka-connector-0.3.0-SNAPSHOT-package.zip`
+The first thing to do is to unzip or untar the camel-aws-sqs-kafka-connector archive into the `plugin.path` location. After building the project you should have a .zip file named `camel-aws-sqs-kafka-connector-0.6.0-SNAPSHOT-package.zip` in `connectors/camel-aws-sqs-kafka-connector/target/`.
 
 [source,bash]
 ----
 > cd /home/connectors/
-> cp connectors/camel-aws-sqs-kafka-connector/target/camel-aws-sqs-kafka-connector-0.3.0-SNAPSHOT-package.zip .
-> unzip camel-aws-sqs-kafka-connector-0.3.0-SNAPSHOT-package.zip
+> cp connectors/camel-aws-sqs-kafka-connector/target/camel-aws-sqs-kafka-connector-0.6.0-SNAPSHOT-package.zip .
+> unzip camel-aws-sqs-kafka-connector-0.6.0-SNAPSHOT-package.zip
 ----
 
 This example consumes from AWS SQS queue `mysqs` and transfers the payload to `mytopic` topic in Kafka.
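
To give the source something to consume, you can put a test message on the `mysqs` queue with the AWS CLI (again assuming configured credentials; the queue URL below is a placeholder):

[source,bash]
----
# Put a test message on the queue the source polls
aws sqs send-message \
  --queue-url https://sqs.<region>.amazonaws.com/<account-id>/mysqs \
  --message-body "hello from camel-kafka-connector"
----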
@@ -178,19 +178,19 @@ Adjust properties in `examples/CamelAWSSQSSourceConnector.properties` for your e
 .Run the AWS SQS source:
 [source,bash]
 ----
-$KAFKA_HOME/bin/connect-standalone.sh $KAFKA_HOME/config/connect-standalone.properties examples/CamelAWSSQSSourceConnector.properties
+$KAFKA_HOME/bin/connect-standalone.sh $KAFKA_HOME/config/connect-standalone.properties docs/examples/CamelAWSSQSSourceConnector.properties
 ----
 
 [[Tryitoutlocally-AWSSNSSink]]
 === AWS SNS (sink)
 
-First thing to do, is unzip or untar the camel-aws-sqs-kafka-connector archive in the `plugin.path` location. After building the project you should have in `connectors/camel-aws-sns-kafka-connector/target/` a .zip file named `camel-aws-sns-kafka-connector-0.3.0-SNAPSHOT-package.zip`
+The first thing to do is to unzip or untar the camel-aws-sns-kafka-connector archive into the `plugin.path` location. After building the project you should have a .zip file named `camel-aws-sns-kafka-connector-0.6.0-SNAPSHOT-package.zip` in `connectors/camel-aws-sns-kafka-connector/target/`.
 
 [source,bash]
 ----
 > cd /home/connectors/
-> cp connectors/camel-aws-sns-kafka-connector/target/camel-aws-sns-kafka-connector-0.3.0-SNAPSHOT-package.zip .
-> unzip camel-aws-sns-kafka-connector-0.3.0-SNAPSHOT-package.zip
+> cp connectors/camel-aws-sns-kafka-connector/target/camel-aws-sns-kafka-connector-0.6.0-SNAPSHOT-package.zip .
+> unzip camel-aws-sns-kafka-connector-0.6.0-SNAPSHOT-package.zip
 ----
 
 This example consumes from `mytopic` Kafka topic and transfers the payload to AWS SNS `topic` topic.
@@ -200,19 +200,19 @@ Adjust properties in `examples/CamelAWSSNSSinkConnector.properties` for your env
 .Run the AWS SNS sink:
 [source,bash]
 ----
-$KAFKA_HOME/bin/connect-standalone.sh $KAFKA_HOME/config/connect-standalone.properties examples/CamelAWSSNSSinkConnector.properties
+$KAFKA_HOME/bin/connect-standalone.sh $KAFKA_HOME/config/connect-standalone.properties docs/examples/CamelAWSSNSSinkConnector.properties
 ----
 
 [[Tryitoutlocally-AWSSNSSource]]
 === AWS S3 (source)
 
-First thing to do, is unzip or untar the camel-aws-sqs-kafka-connector archive in the `plugin.path` location. After building the project you should have in `connectors/camel-aws-s3-kafka-connector/target/` a .zip file named `camel-aws-s3-kafka-connector-0.3.0-SNAPSHOT-package.zip`
+The first thing to do is to unzip or untar the camel-aws-s3-kafka-connector archive into the `plugin.path` location. After building the project you should have a .zip file named `camel-aws-s3-kafka-connector-0.6.0-SNAPSHOT-package.zip` in `connectors/camel-aws-s3-kafka-connector/target/`.
 
 [source,bash]
 ----
 > cd /home/connectors/
-> cp connectors/camel-aws-s3-kafka-connector/target/camel-aws-s3-kafka-connector-0.3.0-SNAPSHOT-package.zip .
-> unzip camel-aws-s3-kafka-connector-0.3.0-SNAPSHOT-package.zip
+> cp connectors/camel-aws-s3-kafka-connector/target/camel-aws-s3-kafka-connector-0.6.0-SNAPSHOT-package.zip .
+> unzip camel-aws-s3-kafka-connector-0.6.0-SNAPSHOT-package.zip
 ----
 
 This example fetches objects from AWS S3 in the `camel-kafka-connector` bucket and transfers the payload to `mytopic` Kafka topic. This example shows how to implement a custom converter converting from bytes received from S3 to Kafka's `SchemaAndValue`.
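
To exercise the source you can upload a small object to the bucket it watches, for instance with the AWS CLI (assuming credentials that can write to the `camel-kafka-connector` bucket):

[source,bash]
----
# Drop a test object into the bucket the source polls
echo "hello s3 source" > /tmp/s3-test.txt
aws s3 cp /tmp/s3-test.txt s3://camel-kafka-connector/
----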
@@ -222,19 +222,19 @@ Adjust properties in `examples/CamelAWSS3SourceConnector.properties` for your en
 .Run the AWS S3 source:
 [source,bash]
 ----
-$KAFKA_HOME/bin/connect-standalone.sh $KAFKA_HOME/config/connect-standalone.properties examples/CamelAWSS3SourceConnector.properties
+$KAFKA_HOME/bin/connect-standalone.sh $KAFKA_HOME/config/connect-standalone.properties docs/examples/CamelAWSS3SourceConnector.properties
 ----
 
 [[Tryitoutlocally-CassandraQL]]
 === Apache Cassandra
 
-First thing to do, is unzip or untar the camel-aws-sqs-kafka-connector archive in the `plugin.path` location. After building the project you should have in `connectors/camel-cql-kafka-connector/target/` a .zip file named `camel-cql-kafka-connector-0.3.0-SNAPSHOT-package.zip`
+The first thing to do is to unzip or untar the camel-cql-kafka-connector archive into the `plugin.path` location. After building the project you should have a .zip file named `camel-cql-kafka-connector-0.6.0-SNAPSHOT-package.zip` in `connectors/camel-cql-kafka-connector/target/`.
 
 [source,bash]
 ----
 > cd /home/connectors/
-> cp connectors/camel-cql-kafka-connector/target/camel-cql-kafka-connector-0.3.0-SNAPSHOT-package.zip .
-> unzip camel-cql-kafka-connector-0.3.0-SNAPSHOT-package.zip
+> cp connectors/camel-cql-kafka-connector/target/camel-cql-kafka-connector-0.6.0-SNAPSHOT-package.zip .
+> unzip camel-cql-kafka-connector-0.6.0-SNAPSHOT-package.zip
 ----
 
 These examples require a running Cassandra instance; for simplicity the steps below show how to start Cassandra using Docker. First you'll need to run a Cassandra instance:
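
For instance, with the official Docker image (a minimal sketch; any reachable Cassandra instance on the default CQL port 9042 will do):

[source,bash]
----
# Start a throwaway Cassandra instance and expose the CQL native port
docker run --name cassandra -d -p 9042:9042 cassandra:latest
----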
@@ -290,19 +290,19 @@ This example polls Cassandra via CSQL (`select * from users`) in the `test` keys
 .Run the Cassandra CQL source:
 [source,bash]
 ----
-$KAFKA_HOME/bin/connect-standalone.sh $KAFKA_HOME/config/connect-standalone.properties examples/CamelCassandraQLSourceConnector.properties
+$KAFKA_HOME/bin/connect-standalone.sh $KAFKA_HOME/config/connect-standalone.properties docs/examples/CamelCassandraQLSourceConnector.properties
 ----
 
 [[Tryitoutlocally-CassandraQLSink]]
 ==== Apache Cassandra (sink)
 
-First thing to do, is unzip or untar the camel-aws-sqs-kafka-connector archive in the `plugin.path` location. After building the project you should have in `connectors/camel-cql-kafka-connector/target/` a .zip file named `camel-cql-kafka-connector-0.3.0-SNAPSHOT-package.zip`
+The first thing to do is to unzip or untar the camel-cql-kafka-connector archive into the `plugin.path` location. After building the project you should have a .zip file named `camel-cql-kafka-connector-0.6.0-SNAPSHOT-package.zip` in `connectors/camel-cql-kafka-connector/target/`.
 
 [source,bash]
 ----
 > cd /home/connectors/
-> cp connectors/camel-cql-kafka-connector/target/camel-cql-kafka-connector-0.3.0-SNAPSHOT-package.zip .
-> unzip camel-cql-kafka-connector-0.3.0-SNAPSHOT-package.zip
+> cp connectors/camel-cql-kafka-connector/target/camel-cql-kafka-connector-0.6.0-SNAPSHOT-package.zip .
+> unzip camel-cql-kafka-connector-0.6.0-SNAPSHOT-package.zip
 ----
 
 This example adds data to the `users` table in Cassandra from the data consumed from the `mytopic` Kafka topic. Notice how the `name` column is populated from the Kafka message using CQL command `insert into users...`.
@@ -310,22 +310,22 @@ This example adds data to the `users` table in Cassandra from the data consumed
 .Run the Cassandra CQL sink:
 [source,bash]
 ----
-$KAFKA_HOME/bin/connect-standalone.sh $KAFKA_HOME/config/connect-standalone.properties examples/CamelCassandraQLSinkConnector.properties
+$KAFKA_HOME/bin/connect-standalone.sh $KAFKA_HOME/config/connect-standalone.properties docs/examples/CamelCassandraQLSinkConnector.properties
 ----
 
 [[Tryitoutlocally-ElasticsearchSink]]
 === Elasticsearch (sink)
 
-First thing to do, is unzip or untar the camel-aws-sqs-kafka-connector archive in the `plugin.path` location. After building the project you should have in `connectors/camel-elasticsearch-rest-kafka-connector/target/` a .zip file named `camel-elasticsearch-rest-kafka-connector-0.3.0-SNAPSHOT-package.zip`
+The first thing to do is to unzip or untar the camel-elasticsearch-rest-kafka-connector archive into the `plugin.path` location. After building the project you should have a .zip file named `camel-elasticsearch-rest-kafka-connector-0.6.0-SNAPSHOT-package.zip` in `connectors/camel-elasticsearch-rest-kafka-connector/target/`.
 
 [source,bash]
 ----
 > cd /home/connectors/
-> cp connectors/camel-elasticsearch-rest-kafka-connector/target/camel-elasticsearch-rest-kafka-connector-0.3.0-SNAPSHOT-package.zip .
-> unzip camel-elasticsearch-rest-kafka-connector-0.3.0-SNAPSHOT-package.zip
+> cp connectors/camel-elasticsearch-rest-kafka-connector/target/camel-elasticsearch-rest-kafka-connector-0.6.0-SNAPSHOT-package.zip .
+> unzip camel-elasticsearch-rest-kafka-connector-0.6.0-SNAPSHOT-package.zip
 ----
 
-This example passes data from `mytopic` Kafka topic to `sampleIndexName` index in Elasticsearch. Adjust properties in `examples/CamelElasticSearchSinkConnector.properties` to reflect your environment, for example change the `hostAddresses` to a valid Elasticsearch instance hostname and port.
+This example passes data from the `mytopic` Kafka topic to the `sampleIndexName` index in Elasticsearch. Adjust the properties in `docs/examples/CamelElasticSearchSinkConnector.properties` to reflect your environment, for example changing `hostAddresses` to a valid Elasticsearch instance hostname and port.
 
 For the index operation, it might be necessary to provide or implement a `transformer`. A sample configuration would be similar to the one below:
 
@@ -353,19 +353,19 @@ When the configuration is ready run the sink with:
 .Run the Elasticsearch sink:
 [source,bash]
 ----
-$KAFKA_HOME/bin/connect-standalone.sh $KAFKA_HOME/config/connect-standalone.properties examples/CamelElasticSearchSinkConnector.properties
+$KAFKA_HOME/bin/connect-standalone.sh $KAFKA_HOME/config/connect-standalone.properties docs/examples/CamelElasticSearchSinkConnector.properties
 ----
 
 [[Tryitoutlocally-FileSink]]
 === File (sink)
 
-First thing to do, is unzip or untar the camel-aws-sqs-kafka-connector archive in the `plugin.path` location. After building the project you should have in `connectors/camel-file-kafka-connector/target/` a .zip file named `camel-file-kafka-connector-0.3.0-SNAPSHOT-package.zip`
+The first thing to do is to unzip or untar the camel-file-kafka-connector archive into the `plugin.path` location. After building the project you should have a .zip file named `camel-file-kafka-connector-0.6.0-SNAPSHOT-package.zip` in `connectors/camel-file-kafka-connector/target/`.
 
 [source,bash]
 ----
 > cd /home/connectors/
-> cp connectors/camel-file-kafka-connector/target/camel-file-kafka-connector-0.3.0-SNAPSHOT-package.zip .
-> unzip camel-file-kafka-connector-0.3.0-SNAPSHOT-package.zip
+> cp connectors/camel-file-kafka-connector/target/camel-file-kafka-connector-0.6.0-SNAPSHOT-package.zip .
+> unzip camel-file-kafka-connector-0.6.0-SNAPSHOT-package.zip
 ----
 
 This example appends data from `mytopic` Kafka topic to a file in `/tmp/kafkaconnect.txt`.
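
With the sink running (see the run command just below) you can follow the file while producing messages to the topic:

[source,bash]
----
# Follow the file the sink appends to while producing messages to mytopic
tail -f /tmp/kafkaconnect.txt
----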
@@ -373,39 +373,39 @@ This example appends data from `mytopic` Kafka topic to a file in `/tmp/kafkacon
 .Run the file sink:
 [source,bash]
 ----
-$KAFKA_HOME/bin/connect-standalone.sh $KAFKA_HOME/config/connect-standalone.properties examples/CamelFileSinkConnector.properties
+$KAFKA_HOME/bin/connect-standalone.sh $KAFKA_HOME/config/connect-standalone.properties docs/examples/CamelFileSinkConnector.properties
 ----
 
 [[Tryitoutlocally-HttpSink]]
 === HTTP (sink)
 
-First thing to do, is unzip or untar the camel-aws-sqs-kafka-connector archive in the `plugin.path` location. After building the project you should have in `connectors/camel-http-kafka-connector/target/` a .zip file named `camel-http-kafka-connector-0.3.0-SNAPSHOT-package.zip`
+The first thing to do is to unzip or untar the camel-http-kafka-connector archive into the `plugin.path` location. After building the project you should have a .zip file named `camel-http-kafka-connector-0.6.0-SNAPSHOT-package.zip` in `connectors/camel-http-kafka-connector/target/`.
 
 [source,bash]
 ----
 > cd /home/connectors/
-> cp connectors/camel-http-kafka-connector/target/camel-http-kafka-connector-0.3.0-SNAPSHOT-package.zip .
-> unzip camel-http-kafka-connector-0.3.0-SNAPSHOT-package.zip
+> cp connectors/camel-http-kafka-connector/target/camel-http-kafka-connector-0.6.0-SNAPSHOT-package.zip .
+> unzip camel-http-kafka-connector-0.6.0-SNAPSHOT-package.zip
 ----
 
-This example sends data from `mytopic` Kafka topic to a HTTP service. Adjust properties in `examples/CamelHttpSinkConnector.properties` for your environment, for example configuring the `camel.sink.url`. 
+This example sends data from the `mytopic` Kafka topic to an HTTP service. Adjust the properties in `docs/examples/CamelHttpSinkConnector.properties` for your environment, for example configuring the `camel.sink.url`.
 
 .Run the http sink:
 [source,bash]
 ----
-$KAFKA_HOME/bin/connect-standalone.sh $KAFKA_HOME/config/connect-standalone.properties examples/CamelHttpSinkConnector.properties
+$KAFKA_HOME/bin/connect-standalone.sh $KAFKA_HOME/config/connect-standalone.properties docs/examples/CamelHttpSinkConnector.properties
 ----
 
 [[Tryitoutlocally-JMSSource]]
 === JMS (source)
 
-First thing to do, is unzip or untar the camel-aws-sqs-kafka-connector archive in the `plugin.path` location. After building the project you should have in `connectors/camel-sjms2-kafka-connector/target/` a .zip file named `camel-sjsm2-kafka-connector-0.3.0-SNAPSHOT-package.zip`
+The first thing to do is to unzip or untar the camel-sjms2-kafka-connector archive into the `plugin.path` location. After building the project you should have a .zip file named `camel-sjms2-kafka-connector-0.6.0-SNAPSHOT-package.zip` in `connectors/camel-sjms2-kafka-connector/target/`.
 
 [source,bash]
 ----
 > cd /home/connectors/
-> cp connectors/camel-sjsm2-kafka-connector/target/camel-sjms2-kafka-connector-0.3.0-SNAPSHOT-package.zip .
-> unzip camel-sjsm2-kafka-connector-0.3.0-SNAPSHOT-package.zip
+> cp connectors/camel-sjms2-kafka-connector/target/camel-sjms2-kafka-connector-0.6.0-SNAPSHOT-package.zip .
+> unzip camel-sjms2-kafka-connector-0.6.0-SNAPSHOT-package.zip
 ----
 
 These are the basic connectors. For camel-sjms2 we have a bunch of provided dependencies we need to add in our path, so run the following commands:
@@ -425,7 +425,7 @@ This example receives messages from a JMS queue named `myqueue` and transfers th
 .Run the JMS source:
 [source,bash]
 ----
-$KAFKA_HOME/bin/connect-standalone.sh $KAFKA_HOME/config/connect-standalone.properties examples/CamelJmsSourceConnector.properties
+$KAFKA_HOME/bin/connect-standalone.sh $KAFKA_HOME/config/connect-standalone.properties docs/examples/CamelJmsSourceConnector.properties
 ----
 
 [[Tryitoutlocally-JMSSink]]
@@ -436,19 +436,19 @@ This example receives messages from `mytopic` Kafka topic and transfers them to
 .Run the JMS sink:
 [source,bash]
 ----
-$KAFKA_HOME/bin/connect-standalone.sh $KAFKA_HOME/config/connect-standalone.properties examples/CamelJmsSinkConnector.properties
+$KAFKA_HOME/bin/connect-standalone.sh $KAFKA_HOME/config/connect-standalone.properties docs/examples/CamelJmsSinkConnector.properties
 ----
 
 [[Tryitoutlocally-TelegramSource]]
 === Telegram (source)
 
-First thing to do, is unzip or untar the camel-aws-sqs-kafka-connector archive in the `plugin.path` location. After building the project you should have in `connectors/camel-telegram-kafka-connector/target/` a .zip file named `camel-telegram-kafka-connector-0.3.0-SNAPSHOT-package.zip`
+The first thing to do is to unzip or untar the camel-telegram-kafka-connector archive into the `plugin.path` location. After building the project you should have a .zip file named `camel-telegram-kafka-connector-0.6.0-SNAPSHOT-package.zip` in `connectors/camel-telegram-kafka-connector/target/`.
 
 [source,bash]
 ----
 > cd /home/connectors/
-> cp connectors/camel-telegram-kafka-connector/target/camel-telegram-kafka-connector-0.3.0-SNAPSHOT-package.zip .
-> unzip camel-telegram-kafka-connector-0.3.0-SNAPSHOT-package.zip
+> cp connectors/camel-telegram-kafka-connector/target/camel-telegram-kafka-connector-0.6.0-SNAPSHOT-package.zip .
+> unzip camel-telegram-kafka-connector-0.6.0-SNAPSHOT-package.zip
 ----
 
 This example transfers messages sent to the Telegram bot to the `mytopic` Kafka topic. Adjust the telegram bot token in `examples/CamelTelegramSourceConnector.properties` to reflect your bot's token.
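
Before starting the connector it can help to sanity-check the token against the standard Telegram Bot API (substitute your own token for the placeholder):

[source,bash]
----
# Should return "ok":true and your bot's username if the token is valid
curl "https://api.telegram.org/bot<your-bot-token>/getMe"
----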
@@ -456,6 +456,6 @@ This example transfers messages sent to Telegram bot to the `mytopic` Kafka topi
 .Run the telegram source:
 [source,bash]
 ----
-$KAFKA_HOME/bin/connect-standalone.sh $KAFKA_HOME/config/connect-standalone.properties examples/CamelTelegramSourceConnector.properties
+$KAFKA_HOME/bin/connect-standalone.sh $KAFKA_HOME/config/connect-standalone.properties docs/examples/CamelTelegramSourceConnector.properties
 ----
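
Whichever example you run, the standalone worker also exposes the Kafka Connect REST API, on port 8083 by default, which is handy for checking that a connector started and is healthy:

[source,bash]
----
# List registered connectors and check the status of a given one
curl http://localhost:8083/connectors
curl http://localhost:8083/connectors/<connector-name>/status
----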