Posted to commits@camel.apache.org by ac...@apache.org on 2020/11/03 10:24:47 UTC

[camel-kafka-connector-examples] branch sql-sink created (now c9a6fe5)

This is an automated email from the ASF dual-hosted git repository.

acosentino pushed a change to branch sql-sink
in repository https://gitbox.apache.org/repos/asf/camel-kafka-connector-examples.git.


      at c9a6fe5  Added a SQL Sink example

This branch includes the following new commits:

     new c9a6fe5  Added a SQL Sink example

The 1 revision listed above as "new" is entirely new to this
repository and will be described in a separate email.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.



[camel-kafka-connector-examples] 01/01: Added a SQL Sink example

Posted by ac...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

acosentino pushed a commit to branch sql-sink
in repository https://gitbox.apache.org/repos/asf/camel-kafka-connector-examples.git

commit c9a6fe5d12397d41ba70446c2055ca34bee79a2a
Author: Andrea Cosentino <an...@gmail.com>
AuthorDate: Tue Nov 3 11:24:20 2020 +0100

    Added a SQL Sink example
---
 sql/sql-sink/README.adoc                           | 156 +++++++++++++++++++++
 .../config/CamelSqlSinkConnector.properties        |  30 ++++
 2 files changed, 186 insertions(+)

diff --git a/sql/sql-sink/README.adoc b/sql/sql-sink/README.adoc
new file mode 100644
index 0000000..c0dd1d2
--- /dev/null
+++ b/sql/sql-sink/README.adoc
@@ -0,0 +1,156 @@
+= Camel-Kafka-connector SQL Sink
+
+This is an example of the Camel-Kafka-connector SQL Sink.
+
+== Standalone
+
+=== What is needed
+
+- A running PostgreSQL instance through Docker
+- The PostgreSQL JDBC driver
+
+=== Running Kafka
+
+[source]
+----
+$KAFKA_HOME/bin/zookeeper-server-start.sh $KAFKA_HOME/config/zookeeper.properties
+$KAFKA_HOME/bin/kafka-server-start.sh $KAFKA_HOME/config/server.properties
+$KAFKA_HOME/bin/kafka-topics.sh --create --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1 --topic mytopic
+----
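+
+Optionally, you can verify that the topic was created (a quick sanity check, not required by the example):
+
+[source]
+----
+$KAFKA_HOME/bin/kafka-topics.sh --describe --bootstrap-server localhost:9092 --topic mytopic
+----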
+
+=== Download the connector package
+
+Download the connector package zip and extract the content to a directory. In this example we'll use `/home/oscerd/connectors/`.
+
+[source]
+----
+> cd /home/oscerd/connectors/
+> wget https://repo1.maven.org/maven2/org/apache/camel/kafkaconnector/camel-sql-kafka-connector/0.6.0/camel-sql-kafka-connector-0.6.0-package.zip
+> unzip camel-sql-kafka-connector-0.6.0-package.zip
+----
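+
+After extraction you should have a `camel-sql-kafka-connector` directory containing the connector jars, next to the zip:
+
+[source]
+----
+> ls /home/oscerd/connectors/
+camel-sql-kafka-connector  camel-sql-kafka-connector-0.6.0-package.zip
+----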
+
+You'll also need the PostgreSQL JDBC driver for this example:
+
+[source]
+----
+> cd /home/oscerd/connectors/camel-sql-kafka-connector/
+> wget https://repo1.maven.org/maven2/org/postgresql/postgresql/42.2.4/postgresql-42.2.4.jar
+----
+
+=== Configuring Kafka Connect
+
+You'll need to set up the `plugin.path` property in your Kafka Connect configuration.
+
+Open `$KAFKA_HOME/config/connect-standalone.properties` and set the `plugin.path` property to your chosen location:
+
+[source]
+----
+...
+plugin.path=/home/oscerd/connectors
+...
+----
+
+=== Set up the Docker image
+
+We'll need a running PostgreSQL instance.
+
+The first step is to run it:
+
+[source]
+----
+> docker run --name some-postgres -e POSTGRES_PASSWORD=mysecretpassword -d postgres
+6cd4ba4696f2e8872f3787faaa8d03d1dae5cb5f22986648adf132823f3690eb
+----
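+
+You can check that the container is up with `docker ps` (the name `some-postgres` matches the `--name` flag used above):
+
+[source]
+----
+> docker ps --filter "name=some-postgres"
+----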
+
+Take note of the container ID.
+Now we need to create the table we'll use:
+
+[source]
+----
+CREATE TABLE accounts (
+	user_id serial PRIMARY KEY,
+	username VARCHAR ( 50 ) UNIQUE NOT NULL,
+	city VARCHAR ( 50 ) NOT NULL
+);
+----
+
+We are now ready to create the table, connecting to psql inside the container:
+
+[source]
+----
+> docker exec -it 6cd4ba4696f2e8872f3787faaa8d03d1dae5cb5f22986648adf132823f3690eb psql -U postgres
+psql (13.0 (Debian 13.0-1.pgdg100+1))
+Type "help" for help.
+
+postgres=# CREATE TABLE accounts (
+postgres(# user_id serial PRIMARY KEY,
+postgres(# username VARCHAR ( 50 ) UNIQUE NOT NULL,
+postgres(# city VARCHAR ( 50 ) NOT NULL
+postgres(# );
+----
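+
+As an alternative sketch, the same table can be created non-interactively with `psql -c`:
+
+[source]
+----
+> docker exec 6cd4ba4696f2e8872f3787faaa8d03d1dae5cb5f22986648adf132823f3690eb psql -U postgres \
+  -c "CREATE TABLE accounts (user_id serial PRIMARY KEY, username VARCHAR (50) UNIQUE NOT NULL, city VARCHAR (50) NOT NULL);"
+----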
+
+We also need to take note of the container IP:
+
+[source]
+----
+> docker inspect -f '{{range.NetworkSettings.Networks}}{{.IPAddress}}{{end}}' 6cd4ba4696f2e8872f3787faaa8d03d1dae5cb5f22986648adf132823f3690eb
+172.17.0.2
+----
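+
+If you prefer, you can capture the IP in a shell variable for the next step (a convenience, not required by the example):
+
+[source]
+----
+> PG_IP=$(docker inspect -f '{{range.NetworkSettings.Networks}}{{.IPAddress}}{{end}}' 6cd4ba4696f2e8872f3787faaa8d03d1dae5cb5f22986648adf132823f3690eb)
+> echo $PG_IP
+172.17.0.2
+----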
+
+=== Set up the connector
+
+Open the SQL configuration file at `$EXAMPLES/sql/sql-sink/config/CamelSqlSinkConnector.properties`
+
+[source]
+----
+name=CamelSqlSinkConnector
+connector.class=org.apache.camel.kafkaconnector.sql.CamelSqlSinkConnector
+key.converter=org.apache.kafka.connect.storage.StringConverter
+value.converter=org.apache.kafka.connect.storage.StringConverter
+
+topics=mytopic
+
+camel.component.sql.dataSource.user=postgres
+camel.component.sql.dataSource.password=mysecretpassword
+camel.component.sql.dataSource.serverName=172.17.0.2
+camel.component.sql.dataSource=#class:org.postgresql.ds.PGSimpleDataSource
+
+camel.sink.path.query=INSERT INTO accounts (username,city) VALUES (:#username,:#city)
+----
+
+and set `camel.component.sql.dataSource.serverName` to the correct IP for the container. The `#class:` syntax tells Camel's property binding to instantiate a `PGSimpleDataSource` and wire the `dataSource.*` properties into it.
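+
+For example, assuming GNU sed and the `PG_IP` variable captured earlier, you can patch the file in place:
+
+[source]
+----
+> sed -i "s/172.17.0.2/$PG_IP/" $EXAMPLES/sql/sql-sink/config/CamelSqlSinkConnector.properties
+----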
+
+=== Running the example
+
+Run Kafka Connect with the SQL Sink connector:
+
+[source]
+----
+$KAFKA_HOME/bin/connect-standalone.sh $KAFKA_HOME/config/connect-standalone.properties $EXAMPLES/sql/sql-sink/config/CamelSqlSinkConnector.properties
+----
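+
+Once the worker is up, you can check the connector state through the Kafka Connect REST API (assuming the default REST port 8083):
+
+[source]
+----
+> curl -s localhost:8083/connectors/CamelSqlSinkConnector/status
+----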
+
+In a different terminal, run the kafkacat producer and send the following messages. The `CamelHeader.` prefixed headers are mapped to Camel message headers, which the `:#username` and `:#city` placeholders in the query refer to:
+
+[source]
+----
+> echo "test" | ./kafkacat -b localhost:9092 -t dbtest1 -H "CamelHeader.username=andrea" -H "CamelHeader.city=Roma"
+% Auto-selecting Producer mode (use -P or -C to override)
+> echo "test" | ./kafkacat -b localhost:9092 -t dbtest1 -H "CamelHeader.username=John" -H "CamelHeader.city=New York"
+% Auto-selecting Producer mode (use -P or -C to override)
+----
+
+Now you can query the inserted records through the psql command:
+
+[source]
+----
+> docker exec -it 6cd4ba4696f2e8872f3787faaa8d03d1dae5cb5f22986648adf132823f3690eb psql -U postgres
+psql (13.0 (Debian 13.0-1.pgdg100+1))
+Type "help" for help.
+
+postgres=# select * from accounts;
+ user_id | username |   city   
+---------+----------+----------
+       1 | andrea   | Roma
+       2 | John     | New York
+(2 rows)
+----
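+
+When you're done, you can stop and remove the PostgreSQL container:
+
+[source]
+----
+> docker stop 6cd4ba4696f2e8872f3787faaa8d03d1dae5cb5f22986648adf132823f3690eb
+> docker rm 6cd4ba4696f2e8872f3787faaa8d03d1dae5cb5f22986648adf132823f3690eb
+----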
diff --git a/sql/sql-sink/config/CamelSqlSinkConnector.properties b/sql/sql-sink/config/CamelSqlSinkConnector.properties
new file mode 100644
index 0000000..cdd42ab
--- /dev/null
+++ b/sql/sql-sink/config/CamelSqlSinkConnector.properties
@@ -0,0 +1,30 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+name=CamelSqlSinkConnector
+connector.class=org.apache.camel.kafkaconnector.sql.CamelSqlSinkConnector
+key.converter=org.apache.kafka.connect.storage.StringConverter
+value.converter=org.apache.kafka.connect.storage.StringConverter
+
+topics=mytopic
+
+camel.component.sql.dataSource.user=postgres
+camel.component.sql.dataSource.password=mysecretpassword
+camel.component.sql.dataSource.serverName=172.17.0.2
+camel.component.sql.dataSource=#class:org.postgresql.ds.PGSimpleDataSource
+
+camel.sink.path.query=INSERT INTO accounts (username,city) VALUES (:#username,:#city)