Posted to jira@kafka.apache.org by "Matthias J. Sax (JIRA)" <ji...@apache.org> on 2019/03/26 15:50:00 UTC

[jira] [Commented] (KAFKA-8157) Missing "key.serializer" exception when setting "segment index bytes"

    [ https://issues.apache.org/jira/browse/KAFKA-8157?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16801865#comment-16801865 ] 

Matthias J. Sax commented on KAFKA-8157:
----------------------------------------

This might affect versions older than 2.2.0, too. We should double-check all older versions and fix it in all of them.
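For reference, a minimal sketch of the configuration described in the report below (class and application names are hypothetical; the property names and the failing versions are taken from the report, and the `KafkaStreams` constructor is the step that reportedly throws):

```java
import java.util.Properties;

public class SegmentIndexBytesRepro {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("application.id", "repro-app");          // StreamsConfig.APPLICATION_ID_CONFIG
        props.put("bootstrap.servers", "localhost:9092");  // StreamsConfig.BOOTSTRAP_SERVERS_CONFIG
        // Topic-level override for internal topics: the "topic." prefix is what
        // StreamsConfig.topicPrefix("segment.index.bytes") produces.
        props.put("topic.segment.index.bytes", "10485760");
        System.out.println(props.getProperty("topic.segment.index.bytes"));
        // Per the report, passing these props to `new KafkaStreams(topology, props)`
        // on kafka-streams 2.0.0 / 2.2.0 fails with:
        //   ConfigException: Missing required configuration "key.serializer" which has no default value.
    }
}
```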

> Missing "key.serializer" exception when setting "segment index bytes"
> ---------------------------------------------------------------------
>
>                 Key: KAFKA-8157
>                 URL: https://issues.apache.org/jira/browse/KAFKA-8157
>             Project: Kafka
>          Issue Type: Bug
>          Components: streams
>    Affects Versions: 2.2.0
>         Environment: ubuntu 18.10, localhost and Aiven too
>            Reporter: Cristian D
>            Priority: Major
>              Labels: beginner, newbie
>
> As a `kafka-streams` user,
> When I set the "segment index bytes" property
> Then I would like internal topics to be created with the specified disk-space allocation
>  
> At the moment, setting the "topic.segment.index.bytes" property causes the application to exit with the following exception:
> {code:java}
> Exception in thread "main" org.apache.kafka.common.config.ConfigException: Missing required configuration "key.serializer" which has no default value.
> {code}
> Tested with `kafka-streams` v2.0.0 and v2.2.0.
>  
> Stack trace:
> {code:java}
> Exception in thread "main" org.apache.kafka.common.config.ConfigException: Missing required configuration "key.serializer" which has no default value.
>  at org.apache.kafka.common.config.ConfigDef.parseValue(ConfigDef.java:474)
>  at org.apache.kafka.common.config.ConfigDef.parse(ConfigDef.java:464)
>  at org.apache.kafka.common.config.AbstractConfig.<init>(AbstractConfig.java:62)
>  at org.apache.kafka.common.config.AbstractConfig.<init>(AbstractConfig.java:75)
>  at org.apache.kafka.clients.producer.ProducerConfig.<init>(ProducerConfig.java:392)
>  at org.apache.kafka.streams.StreamsConfig.getMainConsumerConfigs(StreamsConfig.java:1014)
>  at org.apache.kafka.streams.processor.internals.StreamThread.create(StreamThread.java:666)
>  at org.apache.kafka.streams.KafkaStreams.<init>(KafkaStreams.java:718)
>  at org.apache.kafka.streams.KafkaStreams.<init>(KafkaStreams.java:634)
>  at org.apache.kafka.streams.KafkaStreams.<init>(KafkaStreams.java:544)
>  at app.Main.main(Main.java:36)
> {code}
> A demo application reproducing the exception:
> https://github.com/razorcd/java-snippets-and-demo-projects/tree/master/kafkastreamsdemo
>  
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)