Posted to commits@druid.apache.org by GitBox <gi...@apache.org> on 2020/02/06 21:21:10 UTC

[GitHub] [druid] clintropolis commented on a change in pull request #9320: fix protobuf extension packaging and docs

clintropolis commented on a change in pull request #9320: fix protobuf extension packaging and docs
URL: https://github.com/apache/druid/pull/9320#discussion_r376088414
 
 

 ##########
 File path: docs/development/extensions-core/protobuf.md
 ##########
 @@ -165,54 +164,74 @@ Please make sure these keys are properly configured for successful ingestion.
 }
 ```
 
-## Kafka Producer
+## Adding Protobuf messages to Kafka
 
-Here is the sample script that publishes the metrics to Kafka in Protobuf format.
+If necessary, run the following command from your Kafka installation directory to create the Kafka topic:
 
-1. Run `protoc` again with the Python binding option.  This command generates `metrics_pb2.py` file.
- ```
-  protoc -o metrics.desc metrics.proto --python_out=.
- ```
+```
+./bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic metrics_pb
+```
 
-2. Create Kafka producer script.
+This example script requires the `protobuf` and `kafka-python` modules (for example, installed with `pip install protobuf kafka-python`). With the topic in place, messages can be inserted by running the following command from your Druid installation directory:
 
-This script requires `protobuf` and `kafka-python` modules.
+```
+./bin/generate-example-metrics | ./quickstart/protobuf/pb_publisher.py
+```
 
-```python
-#!/usr/bin/env python
+You can confirm that data has been inserted into your Kafka topic by running the following command from your Kafka installation directory:
+
+```
+./bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic metrics_pb
+```
+
+The command should print messages like this:
+
+```
+millisecondsGETR"2017-04-06T03:23:56Z*2002/list:request/latencyBwww1.example.com
+```
+
+If the supervisor you created in the previous step is running, the indexing tasks should begin consuming these messages, and the data will soon be available for querying in Druid.
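+
+Once some rows have arrived, you can spot-check them through the Druid SQL API. The snippet below is only a sketch: it assumes the quickstart Router is listening on `localhost:8888`, that your supervisor spec names the datasource `metrics-protobuf`, and that the `requests` module is installed; adjust these to match your setup.
+
+```python
+# Sketch of a Druid SQL check; the Router address and datasource name are assumptions.
+import requests
+
+resp = requests.post(
+    "http://localhost:8888/druid/v2/sql",
+    json={"query": 'SELECT COUNT(*) AS row_count FROM "metrics-protobuf"'},
+)
+print(resp.json())
+```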
+
+## Generating the example files
+
+The files provided in the example quickstart can be generated as follows, starting from only `metrics.proto`.
+
+### `metrics.desc`
+
+The descriptor file is generated with the `protoc` Protobuf compiler. Given a `.proto` file, a `.desc` file can be generated like so:
+
+```
+protoc -o metrics.desc metrics.proto
+```
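+
+If you want to sanity-check the output, the descriptor file is a serialized `FileDescriptorSet`, so it can be inspected with the Protobuf Python library (a quick sketch, assuming the `protobuf` module is installed):
+
+```python
+# Sketch: inspect the generated descriptor file with the protobuf library.
+from google.protobuf import descriptor_pb2
+
+with open("metrics.desc", "rb") as f:
+    fds = descriptor_pb2.FileDescriptorSet()
+    fds.ParseFromString(f.read())
+
+# Print the message types declared in metrics.proto.
+for proto_file in fds.file:
+    print([m.name for m in proto_file.message_type])
+```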
+
+### `metrics_pb2.py`
+
+`metrics_pb2.py` is also generated with `protoc`, by adding the `--python_out` option:
+
+```
+protoc -o metrics.desc metrics.proto --python_out=.
+```
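+
+The generated module can then be imported to build and serialize individual records. The sketch below is illustrative only; the message class and field names (`unit`, `metricType`, and so on) must match whatever is declared in your `metrics.proto`.
+
+```python
+# Sketch: build one record with the generated bindings and serialize it.
+# The Metrics class and its field names are assumptions based on metrics.proto.
+import metrics_pb2
+
+metric = metrics_pb2.Metrics()
+metric.unit = "milliseconds"
+metric.metricType = "request/latency"
+metric.server = "www1.example.com"
+serialized = metric.SerializeToString()  # raw Protobuf bytes, ready for Kafka
+```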
 
+### `pb_publisher.py`
+
+After `metrics_pb2.py` is generated, another script can be constructed to parse JSON data, convert it to Protobuf, and produce the messages to a Kafka topic:
+
+```python
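+#!/usr/bin/env python
+# A minimal sketch of such a publisher (the quickstart's pb_publisher.py may
+# differ): read JSON metrics from stdin, convert each record with the
+# generated metrics_pb2 module, and send the serialized bytes to the
+# metrics_pb topic. The broker address and the assumption that JSON keys
+# match the .proto field names are illustrative only.
+import json
+import sys
+
+from kafka import KafkaProducer
+
+import metrics_pb2
+
+producer = KafkaProducer(bootstrap_servers="localhost:9092")
+
+for line in sys.stdin:
+    record = json.loads(line)
+    metric = metrics_pb2.Metrics()
+    for field, value in record.items():
+        setattr(metric, field, value)
+    producer.send("metrics_pb", metric.SerializeToString())
+
+producer.flush()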
 
 Review comment:
   Oops, accidentally left that off when i moved stuff around :+1:

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@druid.apache.org
For additional commands, e-mail: commits-help@druid.apache.org