Posted to common-dev@hadoop.apache.org by "Tom White (JIRA)" <ji...@apache.org> on 2008/11/13 07:11:44 UTC

[jira] Updated: (HADOOP-3788) Add serialization for Protocol Buffers

     [ https://issues.apache.org/jira/browse/HADOOP-3788?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Tom White updated HADOOP-3788:
------------------------------

    Attachment: hadoop-3788-v3.patch

A patch that correctly reads Protocol Buffers from a SequenceFile. MapReduce doesn't work yet: Protocol Buffers messages are immutable and always come back as a new object on deserialization, which breaks the object-reuse contract of the current API, so we need the new MapReduce interfaces from HADOOP-1230.

I also moved the code to protocol-buffers-serialization under contrib and updated it to the latest Protocol Buffers release.
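
For illustration, a rough sketch of what a protobuf Deserializer looks like under the Hadoop serialization framework, and why object reuse is impossible. This is not the attached patch; the class name and constructor are assumed, only the Hadoop and Protocol Buffers APIs are real.

import java.io.IOException;
import java.io.InputStream;

import org.apache.hadoop.io.serializer.Deserializer;

import com.google.protobuf.Message;

/** Illustrative only: deserializes one Protocol Buffers message of type T. */
public class ProtoBufDeserializer<T extends Message> implements Deserializer<T> {

  private final T defaultInstance;  // prototype used to obtain a builder for T
  private InputStream in;

  public ProtoBufDeserializer(T defaultInstance) {
    this.defaultInstance = defaultInstance;
  }

  public void open(InputStream in) throws IOException {
    this.in = in;
  }

  @SuppressWarnings("unchecked")
  public T deserialize(T t) throws IOException {
    // Protocol Buffers messages are immutable, so the object passed in for
    // reuse is ignored and a new message is built on every call; the old
    // MapReduce API assumes it can hand back the same object each time.
    return (T) defaultInstance.newBuilderForType().mergeFrom(in).build();
  }

  public void close() throws IOException {
    if (in != null) {
      in.close();
    }
  }
}

(The sketch glosses over record framing: Message.Builder.mergeFrom(InputStream) reads to the end of the stream, so a real implementation has to limit each read to the bytes of a single record.)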

> Add serialization for Protocol Buffers
> --------------------------------------
>
>                 Key: HADOOP-3788
>                 URL: https://issues.apache.org/jira/browse/HADOOP-3788
>             Project: Hadoop Core
>          Issue Type: Wish
>          Components: contrib/serialization, examples, mapred
>            Reporter: Tom White
>            Assignee: Alex Loddengaard
>             Fix For: 0.20.0
>
>         Attachments: hadoop-3788-v1.patch, hadoop-3788-v2.patch, hadoop-3788-v3.patch, protobuf-java-2.0.1.jar
>
>
> Protocol Buffers (http://code.google.com/p/protobuf/) are a way of encoding data in a compact binary format. This issue is to write a ProtocolBuffersSerialization to support using Protocol Buffers types in MapReduce programs, including an example program. This should probably go into contrib. 
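
For context on how such a serialization gets picked up, custom serializations are registered with Hadoop through the io.serializations configuration key. A minimal sketch follows; the protobuf serialization class name is an assumption for illustration, not necessarily the name used in the patch.

import org.apache.hadoop.conf.Configuration;

public class RegisterSerializationExample {
  public static void main(String[] args) {
    Configuration conf = new Configuration();
    // "io.serializations" is the standard Hadoop key; WritableSerialization
    // is the default. The protobuf class name below is assumed.
    conf.setStrings("io.serializations",
        "org.apache.hadoop.io.serializer.WritableSerialization",
        "org.apache.hadoop.contrib.serialization.protobuf.ProtocolBuffersSerialization");
  }
}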

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.