Posted to users@kafka.apache.org by Ram Vittal <ra...@gmail.com> on 2017/03/15 03:59:55 UTC

Kafka Topics best practice for logging data pipeline use case

We are using the latest Kafka and Logstash versions to ingest logs from
several business apps (a few now, but eventually 100+) into ELK. We have a
standardized logging structure that business apps use to log data into Kafka
topics, and we ingest that data into ELK via Logstash's Kafka input plugin.

Currently, we use one Kafka topic per business app for pushing data into
Logstash. Each topic has 3 partitions, and we run 3 Logstash consumers.
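
For reference, the per-app setup described above would look roughly like this in a Logstash pipeline (a sketch only; the broker address, topic name, and group id are hypothetical, while `bootstrap_servers`, `topics`, and `group_id` are real options of the Logstash kafka input plugin):

```conf
input {
  kafka {
    bootstrap_servers => "kafka1:9092"          # hypothetical broker address
    topics            => ["app-billing-logs"]   # one topic per business app
    group_id          => "elk-ingest"           # consumers sharing this group
                                                # split the 3 partitions
  }
}
```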

I am wondering about best practices for using Kafka with Logstash. Is the
above configuration a good approach, or is there a better one?

For example, instead of having one Kafka topic per app, should we have
one Kafka topic shared across all apps? What are the pros and cons?
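
One consideration for the shared-topic option: if each record is keyed by app name, all of an app's logs still land on the same partition, so per-app ordering is preserved. A simplified sketch of that routing (Kafka's default partitioner actually uses a murmur2 hash of the key bytes; CRC32 stands in here purely for illustration, and the app names are made up):

```python
import zlib

NUM_PARTITIONS = 3  # matches the 3-partition setup described above

def partition_for(app_name: str, num_partitions: int = NUM_PARTITIONS) -> int:
    """Map an app name (used as the record key) to a partition.

    CRC32 stands in for Kafka's murmur2 key hash; the point is only
    that the same key always maps to the same partition.
    """
    return zlib.crc32(app_name.encode("utf-8")) % num_partitions

# The same app always hashes to the same partition, so its log
# records stay ordered even on a single shared topic.
assert partition_for("billing-app") == partition_for("billing-app")
```

The trade-off is that one hot app can skew load onto a single partition, whereas the one-topic-per-app layout isolates each app's volume and retention settings.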

If you are not familiar with Logstash: it is part of the Elastic Stack, and
from Kafka's perspective it is just another consumer.

Would appreciate your input!
-- 
Thanks,
Ram Vittal