Posted to commits@camel.apache.org by ac...@apache.org on 2021/08/05 05:24:00 UTC

[camel-k-examples] branch main updated (23d05e0 -> a2cb78c)

This is an automated email from the ASF dual-hosted git repository.

acosentino pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/camel-k-examples.git.


    from 23d05e0  Added Example to list
     new 03f081c  Kafka to SQLServer Skeleton
     new a7b6f5a  Kafka to SQL Server Example: Improved README
     new b04c76f  Kafka to SQL Server Example: Added to example list
     new a2cb78c  Examples reordering

The 4 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 kamelets/README.md                                 |  3 +-
 kamelets/kafka-to-sqlserver/README.md              | 60 ++++++++++++++++++++++
 kamelets/kafka-to-sqlserver/flow-binding.yaml      | 28 ++++++++++
 .../log-sink.kamelet.yaml                          |  0
 4 files changed, 90 insertions(+), 1 deletion(-)
 create mode 100644 kamelets/kafka-to-sqlserver/README.md
 create mode 100644 kamelets/kafka-to-sqlserver/flow-binding.yaml
 copy kamelets/{postgresql-to-log => kafka-to-sqlserver}/log-sink.kamelet.yaml (100%)

[camel-k-examples] 01/04: Kafka to SQLServer Skeleton

Posted by ac...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

acosentino pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/camel-k-examples.git

commit 03f081cbe756756f9b4466f9c345ad94d6aec6b5
Author: Andrea Cosentino <an...@gmail.com>
AuthorDate: Thu Aug 5 07:09:42 2021 +0200

    Kafka to SQLServer Skeleton
---
 kamelets/kafka-to-sqlserver/README.md             | 18 +++++++++++++++
 kamelets/kafka-to-sqlserver/flow-binding.yaml     | 28 +++++++++++++++++++++++
 kamelets/kafka-to-sqlserver/log-sink.kamelet.yaml | 22 ++++++++++++++++++
 3 files changed, 68 insertions(+)

diff --git a/kamelets/kafka-to-sqlserver/README.md b/kamelets/kafka-to-sqlserver/README.md
new file mode 100644
index 0000000..af26143
--- /dev/null
+++ b/kamelets/kafka-to-sqlserver/README.md
@@ -0,0 +1,18 @@
+# Kafka to Kafka with Regex Router
+
+- Use the quickstart for https://strimzi.io/quickstarts/ and follow the minikube guide.
+
+- The Log Sink Kamelet is not available out of the box in 1.5.0 Camel-K release so you'll have to install it before installing the flow binding.
+
+- If Camel-K has been installed in a namespace other than the default one, add the -n <namespace_name> parameter to all the commands.
+
+- Run the following commands
+
+  - kubectl apply -f log-sink.kamelet.yaml -n kafka
+  - kubectl apply -f flow-binding.yaml -n kafka
+
+- Check logs
+
+kamel logs kafka-to-kafka-with-regex-router 
+
+You should see data being ingested into the topic-1 topic after the regex router overrides the topic name.
diff --git a/kamelets/kafka-to-sqlserver/flow-binding.yaml b/kamelets/kafka-to-sqlserver/flow-binding.yaml
new file mode 100644
index 0000000..aa156fb
--- /dev/null
+++ b/kamelets/kafka-to-sqlserver/flow-binding.yaml
@@ -0,0 +1,28 @@
+apiVersion: camel.apache.org/v1alpha1
+kind: KameletBinding
+metadata:
+  name: kafka-to-sqlserver
+spec:
+  integration:
+    dependencies:
+    - "mvn:com.microsoft.sqlserver:mssql-jdbc:9.2.1.jre11"
+  source:
+    ref:
+      kind: Kamelet
+      apiVersion: camel.apache.org/v1alpha1
+      name: kafka-not-secured-source
+    properties:
+      brokers: 'my-cluster-kafka-bootstrap:9092'
+      topic: 'test-topic-1'
+  sink:
+    ref:
+      kind: Kamelet
+      apiVersion: camel.apache.org/v1alpha1
+      name: sqlserver-sink
+    properties:
+      serverName: 172.17.0.10
+      username: sa
+      password: Password!
+      query: 'INSERT INTO master.dbo.accounts (user_id,username,city) VALUES (:#user_id,:#username,:#city)'
+      port: 1433
+      databaseName: master
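
The query above uses the Camel SQL component's named-parameter syntax (:#user_id, :#username, :#city), with values taken from the incoming message body. A minimal Python sketch of that mapping, assuming the message body is the JSON produced later in this example; the real sqlserver-sink binds prepared-statement parameters rather than doing string substitution, so this is illustration only:

```python
import json
import re

def bind_named_parameters(query: str, message: str) -> str:
    """Toy substitution of Camel-style :#name placeholders with values
    taken from a JSON message body. Camel actually binds these as
    prepared-statement parameters; this only illustrates the mapping."""
    payload = json.loads(message)

    def repl(match):
        value = payload[match.group(1)]
        # Quote strings, leave numbers bare (toy escaping, not SQL-safe).
        return f"'{value}'" if isinstance(value, str) else str(value)

    return re.sub(r":#(\w+)", repl, query)

query = ("INSERT INTO master.dbo.accounts (user_id,username,city) "
         "VALUES (:#user_id,:#username,:#city)")
message = '{ "user_id":"3", "username":"Vittorio", "city":"Roma" }'
print(bind_named_parameters(query, message))
# → INSERT INTO master.dbo.accounts (user_id,username,city) VALUES ('3','Vittorio','Roma')
```

With one of the sample messages from the README, this renders the same insert the sink would execute against the accounts table.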
diff --git a/kamelets/kafka-to-sqlserver/log-sink.kamelet.yaml b/kamelets/kafka-to-sqlserver/log-sink.kamelet.yaml
new file mode 100755
index 0000000..a48fb41
--- /dev/null
+++ b/kamelets/kafka-to-sqlserver/log-sink.kamelet.yaml
@@ -0,0 +1,22 @@
+apiVersion: camel.apache.org/v1alpha1
+kind: Kamelet
+metadata:
+  name: log-sink
+  annotations:
+    camel.apache.org/kamelet.icon: "data:image/svg+xml;base64,PHN2ZyB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciIHZpZXdCb3g9IjAgMCAyNDAgMjQwIj48ZGVmcz48bGluZWFyR3JhZGllbnQgaWQ9ImEiIHgxPSIuNjY3IiB4Mj0iLjQxNyIgeTE9Ii4xNjciIHkyPSIuNzUiPjxzdG9wIG9mZnNldD0iMCIgc3RvcC1jb2xvcj0iIzM3YWVlMiIvPjxzdG9wIG9mZnNldD0iMSIgc3RvcC1jb2xvcj0iIzFlOTZjOCIvPjwvbGluZWFyR3JhZGllbnQ+PGxpbmVhckdyYWRpZW50IGlkPSJiIiB4MT0iLjY2IiB4Mj0iLjg1MSIgeTE9Ii40MzciIHkyPSIuODAyIj48c3RvcCBvZmZzZXQ9IjAiIHN0b3AtY29sb3I9IiNlZmY3Zm [...]
+    camel.apache.org/provider: "Apache Software Foundation"
+  labels:
+    camel.apache.org/kamelet.type: "sink"
+    camel.apache.org/kamelet.group: "Log"
+spec:
+  definition:
+    title: "Log Sink"
+    description: |-
+      Log something
+    type: object
+  flow:
+    from:
+      uri: "kamelet:source"
+      steps:
+      - to:
+          uri: "log:info?showAll=true"

[camel-k-examples] 04/04: Examples reordering

Posted by ac...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

acosentino pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/camel-k-examples.git

commit a2cb78c7aacef3eb6c2650bf55684de2a7bfbfee
Author: Andrea Cosentino <an...@gmail.com>
AuthorDate: Thu Aug 5 07:23:15 2021 +0200

    Examples reordering
---
 kamelets/README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/kamelets/README.md b/kamelets/README.md
index e2ba0bd..81b5c3a 100644
--- a/kamelets/README.md
+++ b/kamelets/README.md
@@ -15,10 +15,10 @@ All the Kamelet examples in this folder have been tested on Camel-K 1.5.0.
 - [AWS S3 to Log](./aws-s3-to-log): Create a Kamelet binding between an AWS S3 Source Kamelet and a Log Sink Kamelet
 - [AWS S3 to Log with Secret](./aws-s3-to-log-with-secret): Create a Kamelet binding between an AWS S3 Source Kamelet and a Log Sink Kamelet and define S3 credentials through Kubernetes secret
 - [AWS S3 to Kafka with Timestamp router](./aws-s3-to-kafka-with-timestamp-router): Create a Kamelet binding between an AWS S3 Source Kamelet and a Kafka Sink Kamelet, with the usage of the Timestamp Router Action.
+- [Kafka to AWS S3 Streaming Upload](./kafka-to-s3-streaming-upload): Create a Kamelet binding between a Kafka Source Kamelet and a AWS S3 Streaming Upload Sink Kamelet.
 - [Kafka to Kafka with Regex router](./kafka-to-kafka-with-regex-router): Create a Kamelet binding between a Kafka Source Kamelet and a Kafka Sink Kamelet, with the usage of the Regex Router Action.
 - [Kafka to Kafka with Manual commit](./kafka-to-kafka-with-manual-commit): Create a Kamelet binding between a Kafka Source Kamelet and a Kafka Sink Kamelet, with the usage of the Manual Commit Action.
 - [Kafka to Kafka with Timestamp router](./kafka-to-kafka-with-timestamp-router): Create a Kamelet binding between a Kafka Source Kamelet and a Kafka Sink Kamelet, with the usage of the Timestamp Router Action.
 - [Kafka to Log with Value to Key](./kafka-to-log-with-value-to-key): Create a Kamelet binding between a Kafka Source Kamelet and a Log Sink Kamelet, with the usage of the Value to Key Action.
-- [Kafka to AWS S3 Streaming Upload](./kafka-to-s3-streaming-upload): Create a Kamelet binding between a Kafka Source Kamelet and a AWS S3 Streaming Upload Sink Kamelet.
 - [Kafka to SQL Server](./kafka-to-sqlserver): Create a Kamelet binding between a Kafka Source Kamelet and a SQL Sink Kamelet.
 - [PostgreSQL to Log](./postgresql-to-log): Create a Kamelet binding between a PostgreSQL Source Kamelet and a Log Sink Kamelet.

[camel-k-examples] 03/04: Kafka to SQL Server Example: Added to example list

Posted by ac...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

acosentino pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/camel-k-examples.git

commit b04c76ff842c2d2584e7b2b7e1fd04f0e86c9aab
Author: Andrea Cosentino <an...@gmail.com>
AuthorDate: Thu Aug 5 07:22:55 2021 +0200

    Kafka to SQL Server Example: Added to example list
---
 kamelets/README.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/kamelets/README.md b/kamelets/README.md
index 9d46b4f..e2ba0bd 100644
--- a/kamelets/README.md
+++ b/kamelets/README.md
@@ -20,4 +20,5 @@ All the Kamelet examples in this folder have been tested on Camel-K 1.5.0.
 - [Kafka to Kafka with Timestamp router](./kafka-to-kafka-with-timestamp-router): Create a Kamelet binding between a Kafka Source Kamelet and a Kafka Sink Kamelet, with the usage of the Timestamp Router Action.
 - [Kafka to Log with Value to Key](./kafka-to-log-with-value-to-key): Create a Kamelet binding between a Kafka Source Kamelet and a Log Sink Kamelet, with the usage of the Value to Key Action.
 - [Kafka to AWS S3 Streaming Upload](./kafka-to-s3-streaming-upload): Create a Kamelet binding between a Kafka Source Kamelet and a AWS S3 Streaming Upload Sink Kamelet.
+- [Kafka to SQL Server](./kafka-to-sqlserver): Create a Kamelet binding between a Kafka Source Kamelet and a SQL Sink Kamelet.
 - [PostgreSQL to Log](./postgresql-to-log): Create a Kamelet binding between a PostgreSQL Source Kamelet and a Log Sink Kamelet.

[camel-k-examples] 02/04: Kafka to SQL Server Example: Improved README

Posted by ac...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

acosentino pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/camel-k-examples.git

commit a7b6f5a039b2f7fda1951d7ceb4aa813ef4d8913
Author: Andrea Cosentino <an...@gmail.com>
AuthorDate: Thu Aug 5 07:20:10 2021 +0200

    Kafka to SQL Server Example: Improved README
---
 kamelets/kafka-to-sqlserver/README.md | 58 ++++++++++++++++++++++++++++++-----
 1 file changed, 50 insertions(+), 8 deletions(-)

diff --git a/kamelets/kafka-to-sqlserver/README.md b/kamelets/kafka-to-sqlserver/README.md
index af26143..cfe2972 100644
--- a/kamelets/kafka-to-sqlserver/README.md
+++ b/kamelets/kafka-to-sqlserver/README.md
@@ -1,18 +1,60 @@
-# Kafka to Kafka with Regex Router
+# Kafka to SQL Server
 
 - Use the quickstart for https://strimzi.io/quickstarts/ and follow the minikube guide.
 
-- The Log Sink Kamelet is not available out of the box in 1.5.0 Camel-K release so you'll have to install it before installing the flow binding.
-
 - If Camel-K has been installed in a namespace other than the default one, add the -n <namespace_name> parameter to all the commands.
 
+- Run the following command
+
+    > kubectl run mssql-1 --image=mcr.microsoft.com/mssql/server:2017-latest --port=1433 --env 'ACCEPT_EULA=Y' --env 'SA_PASSWORD=Password!' -n kafka
+
+- Once the pod is up and running, we'll need to create the table and populate it with some starting data
+
+    > kubectl -n kafka exec -it mssql-1 -- bash
+    > root@mssql-1:/# /opt/mssql-tools/bin/sqlcmd -S localhost -U SA -P "Password!"
+    1> CREATE TABLE accounts (user_id INT PRIMARY KEY, username VARCHAR ( 50 ) UNIQUE NOT NULL, city VARCHAR ( 50 ) NOT NULL );
+    2> GO
+    1> INSERT into accounts (user_id,username,city) VALUES (1, 'andrea', 'Roma');
+    2> GO
+    1> INSERT into accounts (user_id,username,city) VALUES (2, 'John', 'New York');
+    2> GO
+
+- So we now have two rows in the database.
+
+- Open a different terminal
+
+- Set the correct credentials and container address for the SQL Server database in the flow-binding.yaml file.
+
 - Run the following commands
 
-  - kubectl apply -f log-sink.kamelet.yaml -n kafka
-  - kubectl apply -f flow-binding.yaml -n kafka
+    kubectl apply -f flow-binding.yaml -n kafka
+
+- Open a different terminal and run the following command
+
+    kubectl -n kafka run kafka-producer -ti --image=quay.io/strimzi/kafka:0.24.0-kafka-2.8.0 --rm=true --restart=Never -- bin/kafka-console-producer.sh --broker-list my-cluster-kafka-bootstrap:9092 --topic test-topic-1
+
+- Send some messages to the Kafka topic, for example:
+
+    { "user_id":"3", "username":"Vittorio", "city":"Roma" } 
+    { "user_id":"4", "username":"Hugo", "city":"Paris" } 
+
+- Now we can check the database
+
+    > kubectl exec -it mssql-1 -- bash
+    > root@mssql-1:/# /opt/mssql-tools/bin/sqlcmd -S localhost -U SA -P "Password!"
+    1> SELECT * from accounts;
+    2> GO
+    user_id     username                                           city                                              
+    ----------- -------------------------------------------------- --------------------------------------------------
+                1 andrea                                             Roma                                              
+                2 John                                               New York                                          
+                3 Vittorio                                           Roma
+                4 Hugo                                               Paris                                               
+
+    (4 rows affected)
+
+- Check logs to see the integration running
 
-- Check logs
+    kamel logs kafka-to-sqlserver 
 
-kamel logs kafka-to-kafka-with-regex-router 
 
-You should data ingesting into the topic-1 topic, after regex router override the topic name.
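
Before producing to the topic, it can be useful to sanity-check locally that each sample message carries every field the sink query binds (a message missing a bound field would fail at insert time). A small Python sketch, assuming the :#name parameter syntax used in flow-binding.yaml:

```python
import json
import re

# The insert query from flow-binding.yaml and the sample messages above.
query = ("INSERT INTO master.dbo.accounts (user_id,username,city) "
         "VALUES (:#user_id,:#username,:#city)")
required = re.findall(r":#(\w+)", query)  # fields the sink will bind

samples = [
    '{ "user_id":"3", "username":"Vittorio", "city":"Roma" }',
    '{ "user_id":"4", "username":"Hugo", "city":"Paris" }',
]
for raw in samples:
    message = json.loads(raw)
    missing = [field for field in required if field not in message]
    assert not missing, f"message is missing bound fields: {missing}"

print(required)  # → ['user_id', 'username', 'city']
```

Both sample payloads carry all three bound fields, which is why the SELECT above shows them inserted as rows 3 and 4 alongside the two seed rows.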