Posted to mapreduce-user@hadoop.apache.org by Li Li <fa...@gmail.com> on 2014/03/25 14:07:27 UTC

How to read a custom Writable object from HDFS using the Java API?

I have a map-reduce job that outputs my custom Writable objects. How can I
read them back using the plain Java API? I don't want to serialize them to
strings (Text) and deserialize them in Java.

Map-reduce code:
...
job.setOutputKeyClass(IntWritable.class);
job.setOutputValueClass(MyWritable.class);
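
A possible direction, sketched here as an assumption rather than the poster's
actual setup: if the job writes its output as a SequenceFile, the
IntWritable/MyWritable pairs stay in binary form and can be read back directly
with SequenceFile.Reader, with no Text round-trip. The class name and paths
below are hypothetical.

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat;

public class SequenceFileRoundTrip {

  // Driver side: keep the key/value classes, switch the output format to
  // SequenceFile so the Writables are stored in their binary encoding.
  static void configureOutput(Job job, Path outputDir) {
    job.setOutputFormatClass(SequenceFileOutputFormat.class);
    job.setOutputKeyClass(IntWritable.class);
    job.setOutputValueClass(MyWritable.class);   // the poster's custom Writable
    FileOutputFormat.setOutputPath(job, outputDir);
  }

  // Client side: plain Java, no MapReduce job needed to read the pairs back.
  public static void main(String[] args) throws IOException {
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(conf);
    // Hypothetical output file; a real job produces one part file per reducer.
    Path part = new Path("/user/lili/out/part-r-00000");
    SequenceFile.Reader reader = new SequenceFile.Reader(fs, part, conf);
    try {
      IntWritable key = new IntWritable();
      MyWritable value = new MyWritable();       // must be on the client classpath
      while (reader.next(key, value)) {          // deserializes directly into the objects
        System.out.println(key.get() + "\t" + value);
      }
    } finally {
      reader.close();
    }
  }
}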

Re: How to read a custom Writable object from HDFS using the Java API?

Posted by Stanley Shi <ss...@gopivotal.com>.
Since you know the format of your output, writing something to read it back
should be very easy.
What is your output format? If you didn't set one, please refer to the class
"org.apache.hadoop.mapreduce.lib.output.TextOutputFormat".
You can use the class
"org.apache.hadoop.mapreduce.lib.input.TextInputFormat" to read each record
from your output file.
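
For example, assuming the job kept the default TextOutputFormat (one
"key<TAB>value" line per record, with the value rendered via toString()), a
part file can be read back outside of MapReduce with the plain HDFS FileSystem
API rather than TextInputFormat; the path below is hypothetical.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;

public class ReadTextOutput {
  public static void main(String[] args) throws IOException {
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(conf);
    // Hypothetical reducer output file produced by TextOutputFormat.
    Path part = new Path("/user/lili/out/part-r-00000");
    BufferedReader in =
        new BufferedReader(new InputStreamReader(fs.open(part), "UTF-8"));
    try {
      String line;
      while ((line = in.readLine()) != null) {
        // TextOutputFormat separates key and value with a tab by default.
        String[] kv = line.split("\t", 2);
        IntWritable key = new IntWritable(Integer.parseInt(kv[0]));
        String valueText = kv.length > 1 ? kv[1] : "";
        System.out.println(key.get() + " -> " + valueText);
      }
    } finally {
      in.close();
    }
  }
}

If the value's toString() loses information, the SequenceFile approach sketched
above avoids the text round-trip entirely.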

Regards,
Stanley Shi



On Tue, Mar 25, 2014 at 9:07 PM, Li Li <fa...@gmail.com> wrote:

> I have a map-reduce job that outputs my custom Writable objects. How can I
> read them back using the plain Java API? I don't want to serialize them to
> strings (Text) and deserialize them in Java.
>
> Map-reduce code:
> ...
> job.setOutputKeyClass(IntWritable.class);
> job.setOutputValueClass(MyWritable.class);
>
