Posted to commits@geode.apache.org by nn...@apache.org on 2020/11/30 23:04:18 UTC

[geode-kafka-connector] branch master updated: Correcting Kafka spelling (#4)

This is an automated email from the ASF dual-hosted git repository.

nnag pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/geode-kafka-connector.git


The following commit(s) were added to refs/heads/master by this push:
     new f47ce7f  Correcting Kafka spelling (#4)
f47ce7f is described below

commit f47ce7fe4e1d6fafbb8e62dba73dc7156a2ee82a
Author: Ashish Choudhary <aa...@gmail.com>
AuthorDate: Tue Dec 1 04:29:57 2020 +0530

    Correcting Kafka spelling (#4)
---
 pom.xml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/pom.xml b/pom.xml
index 1b3a3a2..8c6956b 100644
--- a/pom.xml
+++ b/pom.xml
@@ -223,7 +223,7 @@
                             <title>Kafka Connect Apache Geode</title>
                             <documentationUrl>https://geode.apache.org/docs/</documentationUrl>
                             <description>
-                            The Apache Geode connector can be used to move data from Kakfa to Geode and vice versa. The Sink take data from a Kafka topic and puts in a region in Geode, while the Source will move any data inserted into Geode region, to Kafka topics .
+                            The Apache Geode connector can be used to move data from Kafka to Geode and vice versa. The Sink take data from a Kafka topic and puts in a region in Geode, while the Source will move any data inserted into Geode region, to Kafka topics .
 
                                 Apache Geode is an in-memory data grid which stores data in a key-value format. When the Geode acts as a Sink, the key value pair is extracted from the Sink Record from the Kafka topic and that key-value pair is stored in Geode regions. When Geode acts as a Source, whenever a key-value pair is inserted into the region, an event is sent to connector containing the data. This data is then placed into the Kafka topic.
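As context for the connector description above: a minimal Java sketch of the Sink-side flow it outlines, in which each Kafka SinkRecord's key/value pair is extracted and put into a Geode region. This is an illustration built only on the public Kafka Connect and Geode client APIs, not the connector's actual source; the class name, locator address, and region name are assumptions for the example.

import java.util.Collection;

import org.apache.geode.cache.Region;
import org.apache.geode.cache.client.ClientCache;
import org.apache.geode.cache.client.ClientCacheFactory;
import org.apache.geode.cache.client.ClientRegionShortcut;
import org.apache.kafka.connect.sink.SinkRecord;

// Hypothetical sketch; not part of geode-kafka-connector.
public class GeodeSinkSketch {

    private final Region<Object, Object> region;

    public GeodeSinkSketch(String locatorHost, int locatorPort, String regionName) {
        // Connect to the Geode cluster as a client; PROXY keeps no local copy,
        // so every put goes straight to the servers.
        ClientCache cache = new ClientCacheFactory()
                .addPoolLocator(locatorHost, locatorPort)
                .create();
        this.region = cache
                .<Object, Object>createClientRegionFactory(ClientRegionShortcut.PROXY)
                .create(regionName);
    }

    // Mirrors the shape of SinkTask.put(Collection<SinkRecord>): extract each
    // record's key/value pair from the Kafka topic and store it in the region.
    public void put(Collection<SinkRecord> records) {
        for (SinkRecord record : records) {
            if (record.key() != null) { // Geode requires a non-null key
                region.put(record.key(), record.value());
            }
        }
    }
}

On the Source side the direction reverses: a listener registered on the Geode region (for example a CacheListener or AsyncEventListener) would receive an event for each inserted key/value pair and hand it to Kafka as a SourceRecord.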