Posted to user@avro.apache.org by Paul Bakker <pa...@luminis.eu> on 2015/03/04 11:58:26 UTC

Problem deserialising using Ruby

Hi all,

I'm experimenting with Avro (and Kafka) and I'm struggling with a problem.
I'm using Java to serialize my data using the following code:

DatumWriter<Log> writer = new SpecificDatumWriter<>(Log.class);
DataFileWriter<Log> dataWriter = new DataFileWriter<>(writer);
ByteArrayOutputStream out = new ByteArrayOutputStream();
dataWriter.create(log.getSchema(), out);
dataWriter.append(log);
dataWriter.close();
//Send "out.toByteArray()" to Kafka

This seems to work well. If I use Java to deserialize the message again,
it's successful.
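For completeness, the Java-side read that succeeds is along these lines (a simplified sketch, not my exact code; DataFileStream reads back the same container format that DataFileWriter produces):

DatumReader<Log> reader = new SpecificDatumReader<>(Log.class);
try (DataFileStream<Log> dataReader =
         new DataFileStream<>(new ByteArrayInputStream(out.toByteArray()), reader)) {
    while (dataReader.hasNext()) {
        Log record = dataReader.next();
        // record comes back intact
    }
}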

On the receiving side I'm using Ruby, as part of a LogStash plugin (https://github.com/logstash-plugins/logstash-codec-avro/blob/master/lib/logstash/codecs/avro.rb).

The relevant code is the following:
  public
  def decode(data)
    datum = StringIO.new(data)
    decoder = Avro::IO::BinaryDecoder.new(datum)
    datum_reader = Avro::IO::DatumReader.new(@schema)
    yield LogStash::Event.new(datum_reader.read(decoder))
  end

This fails with an exception:

{:exception=>#<NoMethodError: undefined method `type_sym' for nil:NilClass>,
 :backtrace=>[
   "/Users/paulb/repo/other/logstash/vendor/bundle/jruby/1.9/gems/avro-1.7.7/lib/avro/io.rb:224:in `match_schemas'",
   "/Users/paulb/repo/other/logstash/vendor/bundle/jruby/1.9/gems/avro-1.7.7/lib/avro/io.rb:280:in `read_data'",
   "/Users/paulb/repo/other/logstash/vendor/bundle/jruby/1.9/gems/avro-1.7.7/lib/avro/io.rb:376:in `read_union'",
   "/Users/paulb/repo/other/logstash/vendor/bundle/jruby/1.9/gems/avro-1.7.7/lib/avro/io.rb:309:in `read_data'",
   "/Users/paulb/repo/other/logstash/vendor/bundle/jruby/1.9/gems/avro-1.7.7/lib/avro/io.rb:275:in `read'",
   "/Users/paulb/repo/other/logstash-codec-avro/lib/logstash/codecs/avro.rb:35:in `decode'",
   "/Users/paulb/repo/other/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-kafka-0.1.11/lib/logstash/inputs/kafka.rb:156:in `queue_event'",
   "/Users/paulb/repo/other/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-kafka-0.1.11/lib/logstash/inputs/kafka.rb:126:in `run'",
   "/Users/paulb/repo/other/logstash/lib/logstash/pipeline.rb:174:in `inputworker'",
   "/Users/paulb/repo/other/logstash/lib/logstash/pipeline.rb:168:in `start_input'"
 ], :level=>:error}

The @schema is set correctly, and I've tried a few things like forcing the
encoding, but the result remains the same.
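One thing I'm wondering about (just my understanding, so please correct me if it's off): the Ruby snippet wraps the bytes in a plain Avro::IO::BinaryDecoder, which reads a single binary-encoded datum, while DataFileWriter on the Java side produces the object container format (file header, embedded schema, sync markers). For comparison, a Java writer that emits only the raw datum would look roughly like this hypothetical sketch, which is not what I'm currently doing:

ByteArrayOutputStream rawOut = new ByteArrayOutputStream();
DatumWriter<Log> datumWriter = new SpecificDatumWriter<>(Log.class);
BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(rawOut, null);
datumWriter.write(log, encoder);   // just the record bytes, no file header
encoder.flush();
// rawOut.toByteArray() would hold the bare datum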

I've attached a file containing the output the Java code produces when writing
to a file.

Some pointers would be very much appreciated :-)

Cheers,

Paul