Posted to dev@avro.apache.org by "Hari Shreedharan (JIRA)" <ji...@apache.org> on 2012/06/09 00:50:23 UTC

[jira] [Updated] (AVRO-1111) Malformed data can cause OutOfMemoryError in Avro IPC

     [ https://issues.apache.org/jira/browse/AVRO-1111?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hari Shreedharan updated AVRO-1111:
-----------------------------------

    Description: 
If the data that comes in through the Netty channel buffer is not framed correctly, the incoming data can cause arbitrarily large ArrayLists to be created, causing an OutOfMemoryError.

The relevant code (org.apache.avro.ipc.NettyTransportCodec):

private boolean decodePackHeader(ChannelHandlerContext ctx, Channel channel,
                                 ChannelBuffer buffer) throws Exception {
  if (buffer.readableBytes() < 8) { return false; }

  int serial = buffer.readInt();
  // listSize is read straight off the wire and is never validated before the
  // list below is pre-allocated with that capacity.
  listSize = buffer.readInt();
  dataPack = new NettyDataPack(serial, new ArrayList<ByteBuffer>(listSize));
  return true;
}
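
To make the failure mode concrete, here is a standalone sketch using only the JDK (no Netty types). The class name and the crafted header below are purely illustrative, not part of Avro:

import java.nio.ByteBuffer;
import java.util.ArrayList;

// Illustrative only: decode a crafted 8-byte "header" the same way the codec
// does, then pre-allocate a list of the claimed size.
public class MalformedHeaderDemo {
  public static void main(String[] args) {
    ByteBuffer header = ByteBuffer.allocate(8);
    header.putInt(1);                 // serial
    header.putInt(Integer.MAX_VALUE); // bogus listSize from a misframed packet
    header.flip();

    int serial = header.getInt();
    int listSize = header.getInt();
    System.out.println("serial=" + serial + ", claimed listSize=" + listSize);
    // ArrayList(int) eagerly allocates an Object[listSize] backing array,
    // so this line fails with OutOfMemoryError instead of rejecting the frame.
    new ArrayList<ByteBuffer>(listSize);
  }
}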

If the buffer does not have valid Avro data, the listSize variable can have arbitrary values, causing massive ArrayLists to be created, leading to OutOfMemoryErrors.
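
One possible hardening (a sketch only, not the project's actual fix) is to bound the claimed size before anything is allocated. The helper below, its name, and the one-million-entry cap are hypothetical:

import java.io.IOException;

// Hypothetical guard: reject implausible list sizes read off the wire before
// the decoder pre-allocates anything.
final class FrameSizeGuard {
  // Illustrative cap; a real codec would likely make this configurable.
  static final int MAX_LIST_SIZE = 1 << 20;

  static int checkedListSize(int claimed) throws IOException {
    if (claimed < 0 || claimed > MAX_LIST_SIZE) {
      throw new IOException("Malformed data pack header: claimed list size " + claimed);
    }
    return claimed;
  }
}

With such a guard, decodePackHeader could allocate via new ArrayList<ByteBuffer>(FrameSizeGuard.checkedListSize(buffer.readInt())), turning a malformed frame into an ordinary decode error the channel handler can report instead of an OutOfMemoryError.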

> Malformed data can cause OutOfMemoryError in Avro IPC
> -----------------------------------------------------
>
>                 Key: AVRO-1111
>                 URL: https://issues.apache.org/jira/browse/AVRO-1111
>             Project: Avro
>          Issue Type: Bug
>          Components: java
>    Affects Versions: 1.6.3
>            Reporter: Hari Shreedharan
>

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators: https://issues.apache.org/jira/secure/ContactAdministrators!default.jspa
For more information on JIRA, see: http://www.atlassian.com/software/jira