Posted to user@avro.apache.org by Manuel Simoni <ms...@gmail.com> on 2012/09/17 14:54:44 UTC

Segfault in StreamReader.read

Hi,

As the first step of implementing Avro for Node.js, I'm playing with a
naive implementation that takes a JSON-encoded schema and a
JSON-encoded object from JavaScript, and returns a Node binary buffer
containing the Avro-encoded data. Obviously, this implementation has
the high overhead of having to JSON.stringify every JS object (and
compiling the schema every time), but on the positive side it's dead
simple, and it reuses Avro's own logic for turning JSON into Avro. It
will be replaced with a more performant implementation later.

What I do is:
- compile the schema using compileJsonSchemaFromMemory
- decode the JSON datum into a memory input stream
- encode the datum to a binary output stream

The code is here:

https://github.com/collectivemedia/node-avro/blob/f793da4382b85d7fbe1516a2c55afa076dc23708/node_avro.cc
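
In outline, those steps look roughly like this with the Avro C++ API
(a simplified sketch, not the exact code in that file; the function
name encodeJsonToAvro is just for illustration):

#include <memory>
#include <string>

#include <avro/Compiler.hh>
#include <avro/Decoder.hh>
#include <avro/Encoder.hh>
#include <avro/Generic.hh>
#include <avro/Stream.hh>
#include <avro/ValidSchema.hh>

// JSON schema + JSON datum in, Avro binary in a memory OutputStream out.
static std::auto_ptr<avro::OutputStream>
encodeJsonToAvro(const std::string& schemaJson, const std::string& datumJson)
{
    // 1. compile the schema
    avro::ValidSchema schema = avro::compileJsonSchemaFromMemory(
        reinterpret_cast<const uint8_t*>(schemaJson.data()), schemaJson.size());

    // 2. decode the JSON datum from a memory input stream into a GenericDatum
    std::auto_ptr<avro::InputStream> jsonIn = avro::memoryInputStream(
        reinterpret_cast<const uint8_t*>(datumJson.data()), datumJson.size());
    avro::DecoderPtr jsonDec = avro::jsonDecoder(schema);
    jsonDec->init(*jsonIn);
    avro::GenericDatum datum;
    avro::GenericReader reader(schema, jsonDec);
    reader.read(datum);

    // 3. encode the datum with the binary encoder into a memory output stream
    std::auto_ptr<avro::OutputStream> out = avro::memoryOutputStream();
    avro::EncoderPtr binEnc = avro::binaryEncoder();
    binEnc->init(*out);
    avro::GenericWriter writer(schema, binEnc);
    writer.write(datum);
    binEnc->flush();
    return out;
}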

Then I try to copy the encoded data from the output stream into a
std::vector, so I can fill a Node buffer with it. I create a
StreamReader r and try to read uint8_t values from it into the vector,
but I get a segfault at r.read().
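
The read-back I'm attempting looks roughly like this (again a
simplified sketch, not the exact code from the file above; copyBytes
is just an illustrative name, and it assumes the output stream came
from memoryOutputStream() and the encoder was flushed):

#include <memory>
#include <vector>

#include <avro/Stream.hh>

// Copy the bytes written to a memory OutputStream into a std::vector.
static std::vector<uint8_t> copyBytes(const avro::OutputStream& out)
{
    std::auto_ptr<avro::InputStream> in = avro::memoryInputStream(out);
    avro::StreamReader r(*in);
    std::vector<uint8_t> bytes;
    while (r.hasMore()) {
        bytes.push_back(r.read()); // in my actual code, this is where the SIGSEGV below hits
    }
    return bytes;
}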

Program received signal SIGSEGV, Segmentation fault.
0x00007ffff6a8bc17 in avro::StreamReader::fill (this=0x7fffffffd630)
at /usr/local/include/avro/Stream.hh:268
268	        while (in_->next(&next_, &n)) {

(gdb) p in_
$1 = (avro::InputStream *) 0x1263180
(gdb) p next_
$2 = (const uint8_t *) 0x0
(gdb) p n
$3 = 0

I'm using Avro 1.7.1 on
Linux 3.2.0-3-amd64 #1 SMP Mon Jul 23 02:45:17 UTC 2012 x86_64 GNU/Linux

Any hints?

Thanks in advance,
Manuel