Posted to commits@beam.apache.org by "peay (JIRA)" <ji...@apache.org> on 2017/03/01 00:20:45 UTC

[jira] [Comment Edited] (BEAM-1573) KafkaIO does not allow using Kafka serializers and deserializers

    [ https://issues.apache.org/jira/browse/BEAM-1573?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15888717#comment-15888717 ] 

peay edited comment on BEAM-1573 at 3/1/17 12:20 AM:
-----------------------------------------------------

Actually, thinking more about it, just setting the serializer class in the Kafka properties doesn't work: it doesn't yield a `Write` with the correct key/value types.

Maybe we could instead have a method along the lines of:

{code:title=KafkaIO.java|borderStyle=solid}
    public <ValueT> Write<K, ValueT> withCustomKafkaValueSerializer(Serializer<ValueT> serializer) {
      Map<String, Object> configUpdates = ImmutableMap.of(
          ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, serializer.getClass());
      // VALUE_SERIALIZER_CLASS_CONFIG would have to be dropped from the ignored
      // properties here, so that this update is not silently discarded.
      Map<String, Object> config = updateKafkaProperties(producerConfig,
          TypedWrite.IGNORED_PRODUCER_PROPERTIES, configUpdates);

      return new Write<K, ValueT>(topic, keyCoder, null, config);
    }
{code}

edit: This is not as simple as that, since we still need a valid value coder anyway.
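
One way around that might be to accept the value coder explicitly as well. A rough sketch along the same lines; the extra `valueCoder` parameter is hypothetical, not an existing KafkaIO API:

{code:title=KafkaIO.java|borderStyle=solid}
    // Sketch only: take the Coder for ValueT alongside the Kafka serializer,
    // so the resulting transform still carries a valid value coder.
    public <ValueT> Write<K, ValueT> withCustomKafkaValueSerializer(
        Serializer<ValueT> serializer, Coder<ValueT> valueCoder) {
      Map<String, Object> configUpdates = ImmutableMap.of(
          ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, serializer.getClass());
      Map<String, Object> config = updateKafkaProperties(producerConfig,
          TypedWrite.IGNORED_PRODUCER_PROPERTIES, configUpdates);
      return new Write<K, ValueT>(topic, keyCoder, valueCoder, config);
    }
{code}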



> KafkaIO does not allow using Kafka serializers and deserializers
> ----------------------------------------------------------------
>
>                 Key: BEAM-1573
>                 URL: https://issues.apache.org/jira/browse/BEAM-1573
>             Project: Beam
>          Issue Type: Bug
>          Components: sdk-java-extensions
>    Affects Versions: 0.4.0, 0.5.0
>            Reporter: peay
>            Assignee: Davor Bonaci
>            Priority: Minor
>
> KafkaIO does not allow overriding the serializer and deserializer settings of the Kafka consumers and producers it uses internally. Instead, it only allows setting a `Coder`, and provides a simple Kafka serializer/deserializer wrapper class that calls the coder.
> I appreciate that supporting Beam coders is good and consistent with the rest of the system. However, is there a reason to completely disallow the use of custom Kafka serializers instead?
> This is a limitation when working with an Avro schema registry, for instance, which requires custom serializers. One can write a `Coder` that wraps a custom Kafka serializer, but that means two levels of unnecessary wrapping.
> In addition, the `Coder` abstraction is not equivalent to Kafka's `Serializer`, which receives the topic name as input. Using a `Coder` wrapper would require duplicating the output topic setting: once in the argument to `KafkaIO`, and once when building the wrapper. That is inelegant and error-prone.
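> For illustration, such a wrapper could look roughly like this (a hypothetical class, not part of KafkaIO; note how the topic has to be fixed at construction time, duplicating the topic already given to `KafkaIO`):
>
> {code:title=SerializerCoder.java|borderStyle=solid}
> import java.io.DataOutputStream;
> import java.io.IOException;
> import java.io.InputStream;
> import java.io.OutputStream;
>
> import org.apache.beam.sdk.coders.CustomCoder;
> import org.apache.kafka.common.serialization.Serializer;
>
> /** Hypothetical Coder that delegates encoding to a Kafka Serializer. */
> public class SerializerCoder<T> extends CustomCoder<T> {
>   private final Serializer<T> serializer;
>   // The topic must be duplicated here, because Coder.encode receives no
>   // topic argument while Serializer.serialize requires one.
>   private final String topic;
>
>   public SerializerCoder(Serializer<T> serializer, String topic) {
>     this.serializer = serializer;
>     this.topic = topic;
>   }
>
>   @Override
>   public void encode(T value, OutputStream outStream, Context context) throws IOException {
>     byte[] bytes = serializer.serialize(topic, value);
>     // Length-prefix the payload so the element boundary is recoverable.
>     DataOutputStream out = new DataOutputStream(outStream);
>     out.writeInt(bytes.length);
>     out.write(bytes);
>   }
>
>   @Override
>   public T decode(InputStream inStream, Context context) throws IOException {
>     // A complete coder would wrap a Deserializer symmetrically; omitted here.
>     throw new UnsupportedOperationException("Write-side illustration only.");
>   }
> }
> {code}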


