Posted to jira@kafka.apache.org by "Patrik Kleindl (JIRA)" <ji...@apache.org> on 2018/11/20 11:45:01 UTC
[jira] [Created] (KAFKA-7660) Stream Metrics - Memory Analysis
Patrik Kleindl created KAFKA-7660:
-------------------------------------
Summary: Stream Metrics - Memory Analysis
Key: KAFKA-7660
URL: https://issues.apache.org/jira/browse/KAFKA-7660
Project: Kafka
Issue Type: Bug
Components: metrics, streams
Affects Versions: 2.0.0
Reporter: Patrik Kleindl
Attachments: Mem_Collections.jpeg, Mem_DuplicateStrings.jpeg, Mem_DuplicateStrings2.jpeg, Mem_Hotspots.jpeg, Mem_KeepAliveSet.jpeg, Mem_References.jpeg
During an analysis of JVM memory, two possible issues surfaced which I would like to bring to your attention:
1) Duplicate strings
Top findings:
string_content="stream-processor-node-metrics" count="534,277"
string_content="processor-node-id" count="148,437"
string_content="stream-rocksdb-state-metrics" count="41,832"
string_content="punctuate-latency-avg" count="29,681"
"stream-processor-node-metrics" seems to be used in Sensors.java as a literal and not interned.
2) The HashMap parentSensors from org.apache.kafka.streams.processor.internals.StreamThread$StreamsMetricsThreadImpl was reported multiple times as suspicious for potentially keeping a large number of objects alive. In our case the reported size was 40-50 MB each.
I haven't looked too deeply into the code, but I noticed that the Sensor class, which is used as a key in this HashMap, does not override the equals or hashCode methods. I am not sure whether this is a problem, though.
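For context, a short sketch of why the missing overrides could matter (FakeSensor below is a hypothetical stand-in, not the real Sensor class): without equals/hashCode, a HashMap compares keys by object identity, so two logically identical sensors occupy two entries, and an entry can only be removed with the exact same instance that was inserted.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical stand-in for Sensor: no equals/hashCode overrides, so
// HashMap falls back to identity semantics from java.lang.Object.
class FakeSensor {
    final String name;
    FakeSensor(String name) { this.name = name; }
}

public class IdentityKeyDemo {
    public static void main(String[] args) {
        Map<FakeSensor, String> parentSensors = new HashMap<>();
        // Two logically equal sensors are two distinct identity keys:
        parentSensors.put(new FakeSensor("punctuate-latency-avg"), "parent");
        parentSensors.put(new FakeSensor("punctuate-latency-avg"), "parent");
        System.out.println(parentSensors.size()); // prints 2
    }
}
```

Whether this causes the retention depends on whether entries are always removed with the original Sensor instance; if any code path loses that reference, the map entry (and everything it points to) stays alive.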
The analysis was done with Dynatrace 7.0.
We are running Confluent 5.0 / Kafka 2.0-cp1 (brokers as well as clients).
Screenshots are attached.
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)