Posted to general@hadoop.apache.org by Alessandro Binhara <bi...@gmail.com> on 2011/02/17 19:47:56 UTC
simple map reduce
Hello all,
I created a simple map reduce job to sum the values from an input file like:
key, value
993, 3
993, 2
333, 2
and so on. It looks like this:
public void map(LongWritable key, Text value,
                OutputCollector<Text, IntWritable> output, Reporter reporter)
        throws IOException {
    // Split "key, value" lines and emit one (key, value) pair each.
    String line = value.toString();
    String[] fields = line.split(",");
    output.collect(new Text(fields[0].trim()),
            new IntWritable(Integer.parseInt(fields[1].trim())));
}

public void reduce(Text key, Iterator<IntWritable> values,
                   OutputCollector<Text, IntWritable> output, Reporter reporter)
        throws IOException {
    // Sum all values seen for this key.
    int sum = 0;
    while (values.hasNext()) {
        sum += values.next().get();
    }
    output.collect(key, new IntWritable(sum));
}
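Hadoop aside, the summing this map/reduce pair performs can be sketched in plain Java (a minimal, illustrative stand-in for the job above, not Hadoop API code):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class SumDemo {
    // Group "key, value" lines by key and sum the values,
    // mirroring what the map/reduce pair above computes.
    static Map<String, Integer> sumByKey(String[] lines) {
        Map<String, Integer> sums = new LinkedHashMap<>();
        for (String line : lines) {
            String[] fields = line.split(",");
            String key = fields[0].trim();
            int value = Integer.parseInt(fields[1].trim());
            sums.merge(key, value, Integer::sum);
        }
        return sums;
    }

    public static void main(String[] args) {
        String[] input = { "993, 3", "993, 2", "333, 2" };
        System.out.println(sumByKey(input)); // {993=5, 333=2}
    }
}
```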
Well, my question is: I have another input file structure:
key, value1, value2, value3
993, 3,2,3
993, 2,1,1
333, 2,2,1
How can I send a list of values to the reducer, so that it processes a list of numbers and not only one? Each value must be summed separately.
Thanks to all.
Re: simple map reduce
Posted by Owen O'Malley <om...@apache.org>.
Please keep user questions on mapreduce-user as per http://hadoop.apache.org/mailing_lists.html.
On Feb 18, 2011, at 2:42 AM, Alessandro Binhara wrote:
> Do I need to extend ArrayWritable? Like this:
>
> public class IntArrayWritable extends ArrayWritable {
>     public IntArrayWritable() {
>         super(IntWritable.class);
>     }
> }
Yes.
-- Owen
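To make the suggestion concrete: once each record carries several integers, the reducer must keep one running total per column. That per-column summation can be sketched in plain Java (an illustrative stand-in; a real job would carry the int[] values inside an ArrayWritable subclass like the IntArrayWritable quoted above):

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.Map;

public class ColumnSumDemo {
    // Group "key, v1, v2, v3" lines by key and sum each
    // value column separately, as the question asks.
    static Map<String, int[]> sumColumnsByKey(String[] lines) {
        Map<String, int[]> sums = new LinkedHashMap<>();
        for (String line : lines) {
            String[] fields = line.split(",");
            String key = fields[0].trim();
            int[] totals = sums.computeIfAbsent(key,
                    k -> new int[fields.length - 1]);
            for (int i = 1; i < fields.length; i++) {
                totals[i - 1] += Integer.parseInt(fields[i].trim());
            }
        }
        return sums;
    }

    public static void main(String[] args) {
        String[] input = { "993, 3,2,3", "993, 2,1,1", "333, 2,2,1" };
        sumColumnsByKey(input).forEach(
                (k, v) -> System.out.println(k + " -> " + Arrays.toString(v)));
        // 993 -> [5, 3, 4]
        // 333 -> [2, 2, 1]
    }
}
```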
Re: simple map reduce
Posted by Alessandro Binhara <bi...@gmail.com>.
Do I need to extend ArrayWritable? Like this:

public class IntArrayWritable extends ArrayWritable {
    public IntArrayWritable() {
        super(IntWritable.class);
    }
}
On Fri, Feb 18, 2011 at 1:15 AM, Harsh J <qw...@gmail.com> wrote:
> Hello,
>
> On Fri, Feb 18, 2011 at 12:17 AM, Alessandro Binhara <bi...@gmail.com>
> wrote:
> > I have another input file structure:
> > key, value1, value2, value3
> > 993, 3,2,3
> > 993, 2,1,1
> > 333, 2,2,1
> >
> > How can I send a list of values to the reducer, so that it processes a
> > list of numbers and not only one?
> > Each value must be summed separately.
>
> An IntWritable obviously can't hold a sequence of integers, so you
> need a different data structure to hold one. From the bank of
> Writables available in Hadoop, you can use an ArrayWritable
> initialized with IntWritable members.
>
> --
> Harsh J
> www.harshj.com
>
Re: simple map reduce
Posted by Harsh J <qw...@gmail.com>.
Hello,
On Fri, Feb 18, 2011 at 12:17 AM, Alessandro Binhara <bi...@gmail.com> wrote:
> I have another input file structure:
> key, value1, value2, value3
> 993, 3,2,3
> 993, 2,1,1
> 333, 2,2,1
>
> How can I send a list of values to the reducer, so that it processes a list
> of numbers and not only one?
> Each value must be summed separately.
An IntWritable obviously can't hold a sequence of integers, so you
need a different data structure to hold one. From the bank of
Writables available in Hadoop, you can use an ArrayWritable
initialized with IntWritable members.
--
Harsh J
www.harshj.com