Posted to commits@geode.apache.org by nn...@apache.org on 2020/04/01 04:25:21 UTC

[geode-kafka-connector.wiki] branch master updated: Updated Deploying Kafka connect Geode on Confluent Platform (markdown)

This is an automated email from the ASF dual-hosted git repository.

nnag pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/geode-kafka-connector.wiki.git


The following commit(s) were added to refs/heads/master by this push:
     new 0cd12f6  Updated Deploying Kafka connect Geode on Confluent Platform (markdown)
0cd12f6 is described below

commit 0cd12f624690707d5478b58d28c4a9e100015390
Author: Nabarun Nag <na...@users.noreply.github.com>
AuthorDate: Tue Mar 31 21:25:16 2020 -0700

    Updated Deploying Kafka connect Geode on Confluent Platform (markdown)
---
 ...ng-Kafka-connect-Geode-on-Confluent-Platform.md | 55 +++++++++++++---------
 1 file changed, 32 insertions(+), 23 deletions(-)

diff --git a/Deploying-Kafka-connect-Geode-on-Confluent-Platform.md b/Deploying-Kafka-connect-Geode-on-Confluent-Platform.md
index 5f0d77d..d959c93 100644
--- a/Deploying-Kafka-connect-Geode-on-Confluent-Platform.md
+++ b/Deploying-Kafka-connect-Geode-on-Confluent-Platform.md
@@ -13,16 +13,17 @@ Build the connector jar using the command `mvn package`
 Install the Confluent Platform 
 Confluent on prem [Download Confluent Platform](https://www.confluent.io/download/?utm_medium=sem&utm_source=google&utm_campaign=ch.sem_br.brand_tp.prs_tgt.confluent-brand_mt.mbm_rgn.namer_lng.eng_dv.all&utm_term=%2Bconfluent%20%2Binstall&creative=&device=c&placement=&gclid=EAIaIQobChMI8JHD0svF6AIVgcBkCh3Q3QFlEAAYASAAEgLUbPD_BwE)
 
-Confluent Platform Documentation: [documentation link](https://docs.confluent.io/current/installation/installing_cp/index.html#installation)
+Confluent Platform Documentation: [documentation link](https://docs.confluent.io/current/quickstart/ce-quickstart.html#ce-quickstart)
 
 ### Step 4:
 Open a terminal and navigate to the `share` folder in the Confluent Platform directory.
 
 ### Step 6:
-Create a new folder in the share folder called kafka-connect-geode.
+As a result of Step 2, there should be a folder with JARs created at  
+`./target/kafka-connect-geode-1.0-SNAPSHOT-package/share/java/kafka-connect-geode`
 
 ### Step 7:
-Copy the package generated in Step 1 to this shared folder
+Copy that folder to the `share/java` folder in the Confluent Platform directory.
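+
+A minimal sketch of Steps 6 and 7 on a Unix-like shell, run from the connector repository root and assuming a `CONFLUENT_HOME` variable that points at the Confluent Platform install (the variable and exact paths are placeholders to adjust):
+```sh
+# Build the connector package (Step 2) and copy the resulting plugin folder
+# into Confluent's share/java directory so Kafka Connect can discover it.
+mvn package
+cp -r target/kafka-connect-geode-1.0-SNAPSHOT-package/share/java/kafka-connect-geode \
+      "$CONFLUENT_HOME/share/java/"
+```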
 
 ### Step 8:
 Download the Apache Geode 1.9.2 binaries from https://geode.apache.org/releases/. 
@@ -30,25 +31,31 @@ Ensure that you are downloading 1.9.2 Binaries.
 
 ### Step 9: 
 Extract the binaries, navigate to the `bin` folder, and start the GemFire Shell (**_gfsh_**).
-
+![Starting gfsh](https://github.com/nabarunnag/kafkaConnectorScreenshots/blob/master/startgfsh.png)
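+
+A sketch of Steps 8 and 9 from a Unix-like shell; the archive URL below is an assumption (the releases page linked above lists the official download links):
+```sh
+# Download and extract the Apache Geode 1.9.2 binaries, then start gfsh.
+curl -LO https://archive.apache.org/dist/geode/1.9.2/apache-geode-1.9.2.tgz
+tar -xzf apache-geode-1.9.2.tgz
+cd apache-geode-1.9.2/bin
+./gfsh
+```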
 ### Step 10:
-Start the locators and servers.
-
+Start the locators and servers.  
+`start locator`  
+`start server`  
+![Start locator](https://github.com/nabarunnag/kafkaConnectorScreenshots/blob/master/startlocator.png)
+![Start server](https://github.com/nabarunnag/kafkaConnectorScreenshots/blob/master/startserver.png)
 ### Step 11:
 Create two regions 
 Region _gkcRegion_ to be used when Apache Geode is the Source
-Region _gkcSinkRegion_ to be used when we are using Apache Geode as a Sink
-
+Region _gkcSinkRegion_ to be used when Apache Geode is the Sink  
+`create region --name=gkcRegion --type=PARTITION`  
+`create region --name=gkcSinkRegion --type=PARTITION`
+![Create regions](https://github.com/nabarunnag/kafkaConnectorScreenshots/blob/master/createregion.png)
 ### Step 12:
-Start the Confluent Platform using the CLI
-`confluent local start`
-
+In a new terminal, navigate to the Confluent Platform directory and start the Confluent Platform using the CLI  
+`confluent local start`  
+Ensure that you have properly installed the platform using the quickstart guide mentioned in Step 3.
+![Confluent Platform](https://github.com/nabarunnag/kafkaConnectorScreenshots/blob/master/confluenthubup.png)
 ### Step 13:
 Open a browser and navigate to the Confluent Command Center WebUI (if running locally: `localhost:9021/clusters`).
-
+![Confluent Command Center](https://github.com/nabarunnag/kafkaConnectorScreenshots/blob/master/confluentCommandCenter.png)
 ### Step 14:
 Create a topic called _gkcTopic_ using the default settings.
-
+![Create Topic](https://github.com/nabarunnag/kafkaConnectorScreenshots/blob/master/createTopic.png)
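+
+The topic can also be created from the command line with the Kafka tooling bundled in the Confluent Platform. A sketch for a local single-broker setup; the broker address, partition count, and replication factor are assumptions rather than values from this guide:
+```sh
+# Create the topic that the source connector will publish to.
+kafka-topics --create --bootstrap-server localhost:9092 \
+  --topic gkcTopic --partitions 1 --replication-factor 1
+```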
 ### Step 15: 
 Navigate to Connect, select connect-default, and click Add Connector.
 We can see that the _GeodeKafkaSink_ and _GeodeKafkaSource_ Connectors are available in the list.
@@ -58,6 +65,7 @@ Start with the source connector. Click on the source connector and give it a nam
 _**Name**_: `Geode Source`  
 _**Key Converter Class**_ : `org.apache.kafka.connect.storage.StringConverter`  
 _**Value Converter Class**_ : `org.apache.kafka.connect.storage.StringConverter`  
+![Source 1](https://github.com/nabarunnag/kafkaConnectorScreenshots/blob/master/addconnector.png)
 
 ### Step 17:
 We use all the default settings for now. We now link the connector to the Apache Geode locator.  
@@ -66,7 +74,7 @@ _**locator**_: Address[port] of the Apache Geode locator. (If running locally wi
 ### Step 18:
 Set the Apache Geode region-to-topic binding. The region was created in Step 11 and the topic was created in Step 14.  
  _**region-to-topics**_ : `[gkcRegion:gkcTopic]`
-
+![Source Mapping](https://github.com/nabarunnag/kafkaConnectorScreenshots/blob/master/locatorSourceMapping.png)
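+
+The same source connector can also be registered through the Kafka Connect REST API instead of the web UI. A sketch, assuming Connect listens on the default `localhost:8083`; the connector class alias and property keys simply mirror the field names shown in Steps 16-18, so verify the exact keys against the connector documentation:
+```sh
+# Register a Geode source connector equivalent to the UI configuration above.
+# "GeodeKafkaSource", "locator" and "region-to-topics" are taken from the UI
+# labels and are assumptions to confirm.
+curl -X POST -H "Content-Type: application/json" http://localhost:8083/connectors -d '{
+  "name": "Geode Source",
+  "config": {
+    "connector.class": "GeodeKafkaSource",
+    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
+    "value.converter": "org.apache.kafka.connect.storage.StringConverter",
+    "locator": "localhost[10334]",
+    "region-to-topics": "[gkcRegion:gkcTopic]"
+  }
+}'
+```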
 ### Step 19:
 Launch the connector
 
@@ -87,16 +95,16 @@ _**locator**_: Address[port] of the Apache Geode locator. (If running locally wi
 
 ### Step 18b:
 Set the Apache Geode topic-to-region binding. The region was created in Step 11 and the topic was created in Step 14.  
-_**region-to-topics**_ : `[gkcTopic:gkcSinkRegion]`
+_**topics-to-region**_ : `[gkcTopic:gkcSinkRegion]`
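+
+A matching REST sketch for the sink side (Steps 16b-18b), under the same assumptions about the endpoint and property keys; the standard Connect `topics` setting is included because sink connectors require it:
+```sh
+# Register a Geode sink connector that writes gkcTopic records into gkcSinkRegion.
+# "GeodeKafkaSink", "locator" and "topics-to-region" mirror the UI labels and
+# are assumptions to confirm.
+curl -X POST -H "Content-Type: application/json" http://localhost:8083/connectors -d '{
+  "name": "Geode Sink",
+  "config": {
+    "connector.class": "GeodeKafkaSink",
+    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
+    "value.converter": "org.apache.kafka.connect.storage.StringConverter",
+    "locator": "localhost[10334]",
+    "topics": "gkcTopic",
+    "topics-to-region": "[gkcTopic:gkcSinkRegion]"
+  }
+}'
+```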
 
 ### Step 19b:
 Launch the connector. In the Connectors tab, you can see both the connectors up and running.
-
+![Connectors Running](https://github.com/nabarunnag/kafkaConnectorScreenshots/blob/master/connectorsRunning.png)
 ### Step 20:
-Open a terminal window and navigate to the Apache Geode extracted folder and inside bin directory.
-Start `gfsh`
-Connect to the locator 
-`connect` (if running locally)
+Open a terminal window and navigate to the `bin` directory inside the extracted Apache Geode folder.  
+Start `gfsh`  
+Connect to the locator  
+`connect` (if running locally)  
 `connect --locators=localhost[10334]` (replace localhost[10334] with the address of the locator you started)
 
 ### Step 21:
@@ -110,9 +118,10 @@ Put a region entry into the source region
 `put --region=gkcRegion --key="someKey" --value="someValue"`
 
 ### Step 23: 
-This data inserted into the source region `gkcRegion` with move through the Geode Source connector and land in the `gkcTopic` created in Kafka. Then this value will move from the topic to the Geode region called `gkcSinkregion` via the Geode Sink connector.  
+The data inserted into the source region `gkcRegion` will move through the Geode Source connector and land in the `gkcTopic` topic created in Kafka. This value will then move from the topic to the Geode region called `gkcSinkRegion` via the Geode Sink connector.  
 This can be verified by checking the contents of the gkcSinkRegion.  
 `query --query="SELECT * FROM /gkcSinkRegion"`
-
+![Values to sink](https://github.com/nabarunnag/kafkaConnectorScreenshots/blob/master/valueToSink.png)
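+
+The contents of the topic itself can also be checked from a terminal with the console consumer that ships with Kafka and the Confluent Platform (assuming the default broker address `localhost:9092`):
+```sh
+# Read everything currently in gkcTopic; the value put into gkcRegion earlier should appear.
+kafka-console-consumer --bootstrap-server localhost:9092 \
+  --topic gkcTopic --from-beginning
+```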
 ### Step 24:  
-This can also be verified using the Confluent Command Center under the messages tab at offset 0. 
\ No newline at end of file
+This can also be verified in the Confluent Command Center under the Messages tab at offset 0.  
+![Values in sink](https://github.com/nabarunnag/kafkaConnectorScreenshots/blob/master/valuesinsinkCommandCenter.png)
\ No newline at end of file