Posted to user@hbase.apache.org by Mark Kerzner <ma...@gmail.com> on 2011/02/03 21:23:55 UTC

Type mismatch

Hi,

I have this code to read and write to HBase from MR, and it works fine with
0 reducers, but it gives a type mismatch error with 1 reducer. What
should I look at? *Thank you!*

*Code:*

    static class RowCounterMapper
            extends TableMapper<Text, IntWritable> {

        private static enum Counters {

            ROWS
        }

        @Override
        public void map(ImmutableBytesWritable row, Result values, Context
context)
                throws IOException, InterruptedException {
            for (KeyValue value : values.list()) {
                if (value.getValue().length > 0) {
                    Text key = new Text(value.getValue());
                    context.write(key, ONE);
                }
            }
        }
    }

*Error: *

java.io.IOException: Type mismatch in key from map: expected
org.apache.hadoop.hbase.io.ImmutableBytesWritable, recieved
org.apache.hadoop.io.Text
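
For reference (this sketch is not from the thread; the table name, job name, and
enclosing class are made up), a minimal driver showing where the map output
key/value classes are declared. The error above means the job is configured to
expect ImmutableBytesWritable map output keys while the posted mapper writes
Text; with 0 reducers the map output never goes through the intermediate sort
buffer, so that particular check never fires.

    // Assumed imports: org.apache.hadoop.conf.Configuration, org.apache.hadoop.hbase.HBaseConfiguration,
    // org.apache.hadoop.hbase.client.Scan, org.apache.hadoop.hbase.io.ImmutableBytesWritable,
    // org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil, org.apache.hadoop.io.IntWritable,
    // org.apache.hadoop.mapreduce.Job
    public class RowCounterJob {                         // hypothetical driver class
        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create();
            Job job = new Job(conf, "row-counter");      // pre-Hadoop-2 style constructor
            job.setJarByClass(RowCounterJob.class);

            Scan scan = new Scan();
            TableMapReduceUtil.initTableMapperJob(
                    "mytable", scan,                     // made-up source table
                    RowCounterMapper.class,
                    ImmutableBytesWritable.class,        // map output KEY class -- must match what map() writes
                    IntWritable.class,                   // map output VALUE class
                    job);
            TableMapReduceUtil.initTableReducerJob(
                    "mytable",                           // made-up target table
                    RowCounterReducer.class,             // a TableReducer keyed by ImmutableBytesWritable
                    job);
            job.setNumReduceTasks(1);
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }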

Re: Type mismatch

Posted by Mark Kerzner <ma...@gmail.com>.
I am on 0.89 from CDH3.
I tried IdentityTableReducer, but I get the same error.
I will try 0.90. Should I include the HBase code, so that I can step through
it?
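
(For anyone hitting the same thing: the "Type mismatch in key from map" check
fires on the map side, when the emitted key is serialized into the sort buffer,
so swapping in IdentityTableReducer by itself changes nothing. Below is a rough
sketch, not from the thread, of the mapper emitting the row key itself instead
of a Text key, along the lines of Stack's suggestion quoted further down to pass
'row' to the context.)

    // Assumed imports: org.apache.hadoop.hbase.client.Result,
    // org.apache.hadoop.hbase.io.ImmutableBytesWritable, org.apache.hadoop.io.IntWritable
    static class RowCounterMapper
            extends TableMapper<ImmutableBytesWritable, IntWritable> {

        private static final IntWritable ONE = new IntWritable(1);

        @Override
        public void map(ImmutableBytesWritable row, Result values, Context context)
                throws IOException, InterruptedException {
            // Write the row key itself, not a Text built from a cell value,
            // so the key class matches what the job declares for its map output.
            if (!values.isEmpty()) {
                context.write(row, ONE);
            }
        }
    }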

On Fri, Feb 4, 2011 at 12:38 AM, Stack <st...@duboce.net> wrote:

> I'm not sure whats up w/ your sample above.  Here's some observations
> that might help.
>
> Here is the code.  Our line numbers differ.  You are not on 0.90.0?
> Thats not important.  You are in this method it seems:
>
> http://hbase.apache.org/xref/org/apache/hadoop/hbase/mapreduce/TableOutputFormat.html#124
>  See the message on the end.  You should submit a patch where we add
> to the IOException message a toString on the value passed so we have a
> better clue as to where we are off here -- so you can see class of
> object submitted (debugging, I'd add this to the log message).
>
> Looking at how you declare TOF, it doesn't look right (This helps with
> that:
> http://hbase.apache.org/xref/org/apache/hadoop/hbase/mapreduce/TableReducer.html
> ).
>  It seems like the declaration should be <KEYIN, VALUEIN, KEYOUT>  but
> you are outputting a Text for KEYOUT, not the declared Put.  This is
> probably not your prob. though.
>
> Looking at IdentityTableReducer, it just passes Writables with the
> value a Delete or Put.
>
> St.Ack
>
>
>
>
>
> On Thu, Feb 3, 2011 at 10:00 PM, Mark Kerzner <ma...@gmail.com>
> wrote:
> > Thank you, St.Ack, it is very nice of you to keep helping me. Here is the
> > stack :) trace, but as you can see, it is the internal Hadoop code. I see
> > this code and I see the message - I am not passing it the right object -
> but
> > how DO I pass the right object?
> >
> > M
> >
> >
> >        at
> >
> org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:106)
> >        at
> >
> org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:65)
> >        at
> >
> org.apache.hadoop.mapred.ReduceTask$NewTrackingRecordWriter.write(ReduceTask.java:512)
> >        at
> >
> org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
> >        at org.apache.hadoop.mapreduce.Reducer.reduce(Reducer.java:156)
> >        at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:176)
> >        at
> > org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:570)
> >        at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:412)
> >        at
> > org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:258)
> >
> > On Thu, Feb 3, 2011 at 11:52 PM, Stack <st...@duboce.net> wrote:
> >
> >> Look at the stack trace.  See where its being thrown.  Look at that
> >> src code at that line offset.  Should give you a clue.
> >> St.Ack
> >>
> >> On Thu, Feb 3, 2011 at 9:36 PM, Mark Kerzner <ma...@gmail.com>
> >> wrote:
> >> > Thank you, that helped, but now I get this error on trying to write
> back
> >> to
> >> > HBase:
> >> >
> >> > java.io.IOException: Pass a Delete or a Put
> >> >
> >> > Here is a fragment on my code. Again, thanks a bunch!
> >> >
> >> >    public static class RowCounterReducer
> >> >            extends TableReducer <Text, IntWritable, Put>
> >> >    {
> >> >        public void reduce(Text key,
> >> >                Iterable<IntWritable> values,
> >> >                Reducer.Context context)
> >> >                throws IOException,
> >> >                InterruptedException {
> >> >            Iterator <IntWritable> iterator = values.iterator();
> >> >            while (iterator.hasNext()) {
> >> >                IntWritable value = iterator.next();
> >> >                Put put = new Put();
> >> >                context.write(key, put);
> >> >            }
> >> >        }
> >> >    }
> >> >
> >> >
> >> > On Thu, Feb 3, 2011 at 2:50 PM, Stack <st...@duboce.net> wrote:
> >> >
> >> >> You are emitting a Text type.  Try just passing 'row' to the context,
> >> >> the one passed in to your map.
> >> >> St.Ack
> >> >>
> >> >> On Thu, Feb 3, 2011 at 12:23 PM, Mark Kerzner <markkerzner@gmail.com
> >
> >> >> wrote:
> >> >> > Hi,
> >> >> >
> >> >> > I have this code to read and write to HBase from MR, and it works
> fine
> >> >> with
> >> >> > 0 reducers, but it gives a type mismatch error when with 1 reducer.
> >> What
> >> >> > should I look at? *Thank you!*
> >> >> >
> >> >> > *Code:*
> >> >> >
> >> >> >    static class RowCounterMapper
> >> >> >            extends TableMapper<Text, IntWritable> {
> >> >> >
> >> >> >        private static enum Counters {
> >> >> >
> >> >> >            ROWS
> >> >> >        }
> >> >> >
> >> >> >        @Override
> >> >> >        public void map(ImmutableBytesWritable row, Result values,
> >> Context
> >> >> > context)
> >> >> >                throws IOException, InterruptedException {
> >> >> >            for (KeyValue value : values.list()) {
> >> >> >                if (value.getValue().length > 0) {
> >> >> >                    Text key = new Text(value.getValue());
> >> >> >                    context.write(key, ONE);
> >> >> >                }
> >> >> >            }
> >> >> >        }
> >> >> >    }
> >> >> >
> >> >> > *Error: *
> >> >> >
> >> >> > java.io.IOException: Type mismatch in key from map: expected
> >> >> > org.apache.hadoop.hbase.io.ImmutableBytesWritable, recieved
> >> >> > org.apache.hadoop.io.Text
> >> >> >
> >> >>
> >> >
> >>
> >
>

Re: Type mismatch

Posted by Mark Kerzner <ma...@gmail.com>.
I took Sujee's example, which worked out of the box, and changed it to
fit my problem. I am going through it line by line now to see what he does
differently.

Mark

On Fri, Feb 4, 2011 at 2:01 PM, Stack <st...@duboce.net> wrote:

> (Thanks Sujee)
>
> What did you change in your src to get it going?
>
> St.Ack
>
> On Fri, Feb 4, 2011 at 10:56 AM, Mark Kerzner <ma...@gmail.com>
> wrote:
> > I found an example that works and uses the latest HBase API,
> > http://sujee.net/tech/articles/hbase-map-reduce-freq-counter/, you might
> > know about it, but for me it was very helpful.
> >
> > Mark
> >
> > On Fri, Feb 4, 2011 at 11:55 AM, Stack <st...@duboce.net> wrote:
> >
> >> Its just an issue of matching your outputs to TOF.   There are
> >> examples of Reducer usage in the mapreduce package.  They declare
> >> their types other than how you have it.  See PutSortReducer and
> >> ImportTsv which uses it (and configures it up).
> >>
> >> St.Ack
> >>
> >> On Fri, Feb 4, 2011 at 7:08 AM, Mark Kerzner <ma...@gmail.com>
> >> wrote:
> >> > I tried 0.90 - same error. I am going to try to build HBase from code
> and
> >> > include this code in my debugging session, to step through it. But I
> must
> >> be
> >> > doing something wrong.
> >> >
> >> > How does one write to HBase in the Reducer, is there any example!???
> >> >
> >> > Thank you!
> >> >
> >> > Mark
> >> >
> >> > On Fri, Feb 4, 2011 at 12:38 AM, Stack <st...@duboce.net> wrote:
> >> >
> >> >> I'm not sure whats up w/ your sample above.  Here's some observations
> >> >> that might help.
> >> >>
> >> >> Here is the code.  Our line numbers differ.  You are not on 0.90.0?
> >> >> Thats not important.  You are in this method it seems:
> >> >>
> >> >>
> >>
> http://hbase.apache.org/xref/org/apache/hadoop/hbase/mapreduce/TableOutputFormat.html#124
> >> >>  See the message on the end.  You should submit a patch where we add
> >> >> to the IOException message a toString on the value passed so we have
> a
> >> >> better clue as to where we are off here -- so you can see class of
> >> >> object submitted (debugging, I'd add this to the log message).
> >> >>
> >> >> Looking at how you declare TOF, it doesn't look right (This helps
> with
> >> >> that:
> >> >>
> >>
> http://hbase.apache.org/xref/org/apache/hadoop/hbase/mapreduce/TableReducer.html
> >> >> ).
> >> >>  It seems like the declaration should be <KEYIN, VALUEIN, KEYOUT>
>  but
> >> >> you are outputting a Text for KEYOUT, not the declared Put.  This is
> >> >> probably not your prob. though.
> >> >>
> >> >> Looking at IdentityTableReducer, it just passes Writables with the
> >> >> value a Delete or Put.
> >> >>
> >> >> St.Ack
> >> >>
> >> >>
> >> >>
> >> >>
> >> >>
> >> >> On Thu, Feb 3, 2011 at 10:00 PM, Mark Kerzner <markkerzner@gmail.com
> >
> >> >> wrote:
> >> >> > Thank you, St.Ack, it is very nice of you to keep helping me. Here
> is
> >> the
> >> >> > stack :) trace, but as you can see, it is the internal Hadoop code.
> I
> >> see
> >> >> > this code and I see the message - I am not passing it the right
> object
> >> -
> >> >> but
> >> >> > how DO I pass the right object?
> >> >> >
> >> >> > M
> >> >> >
> >> >> >
> >> >> >        at
> >> >> >
> >> >>
> >>
> org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:106)
> >> >> >        at
> >> >> >
> >> >>
> >>
> org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:65)
> >> >> >        at
> >> >> >
> >> >>
> >>
> org.apache.hadoop.mapred.ReduceTask$NewTrackingRecordWriter.write(ReduceTask.java:512)
> >> >> >        at
> >> >> >
> >> >>
> >>
> org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
> >> >> >        at
> org.apache.hadoop.mapreduce.Reducer.reduce(Reducer.java:156)
> >> >> >        at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:176)
> >> >> >        at
> >> >> >
> org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:570)
> >> >> >        at
> org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:412)
> >> >> >        at
> >> >> >
> >> org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:258)
> >> >> >
> >> >> > On Thu, Feb 3, 2011 at 11:52 PM, Stack <st...@duboce.net> wrote:
> >> >> >
> >> >> >> Look at the stack trace.  See where its being thrown.  Look at
> that
> >> >> >> src code at that line offset.  Should give you a clue.
> >> >> >> St.Ack
> >> >> >>
> >> >> >> On Thu, Feb 3, 2011 at 9:36 PM, Mark Kerzner <
> markkerzner@gmail.com>
> >> >> >> wrote:
> >> >> >> > Thank you, that helped, but now I get this error on trying to
> write
> >> >> back
> >> >> >> to
> >> >> >> > HBase:
> >> >> >> >
> >> >> >> > java.io.IOException: Pass a Delete or a Put
> >> >> >> >
> >> >> >> > Here is a fragment on my code. Again, thanks a bunch!
> >> >> >> >
> >> >> >> >    public static class RowCounterReducer
> >> >> >> >            extends TableReducer <Text, IntWritable, Put>
> >> >> >> >    {
> >> >> >> >        public void reduce(Text key,
> >> >> >> >                Iterable<IntWritable> values,
> >> >> >> >                Reducer.Context context)
> >> >> >> >                throws IOException,
> >> >> >> >                InterruptedException {
> >> >> >> >            Iterator <IntWritable> iterator = values.iterator();
> >> >> >> >            while (iterator.hasNext()) {
> >> >> >> >                IntWritable value = iterator.next();
> >> >> >> >                Put put = new Put();
> >> >> >> >                context.write(key, put);
> >> >> >> >            }
> >> >> >> >        }
> >> >> >> >    }
> >> >> >> >
> >> >> >> >
> >> >> >> > On Thu, Feb 3, 2011 at 2:50 PM, Stack <st...@duboce.net> wrote:
> >> >> >> >
> >> >> >> >> You are emitting a Text type.  Try just passing 'row' to the
> >> context,
> >> >> >> >> the one passed in to your map.
> >> >> >> >> St.Ack
> >> >> >> >>
> >> >> >> >> On Thu, Feb 3, 2011 at 12:23 PM, Mark Kerzner <
> >> markkerzner@gmail.com
> >> >> >
> >> >> >> >> wrote:
> >> >> >> >> > Hi,
> >> >> >> >> >
> >> >> >> >> > I have this code to read and write to HBase from MR, and it
> >> works
> >> >> fine
> >> >> >> >> with
> >> >> >> >> > 0 reducers, but it gives a type mismatch error when with 1
> >> reducer.
> >> >> >> What
> >> >> >> >> > should I look at? *Thank you!*
> >> >> >> >> >
> >> >> >> >> > *Code:*
> >> >> >> >> >
> >> >> >> >> >    static class RowCounterMapper
> >> >> >> >> >            extends TableMapper<Text, IntWritable> {
> >> >> >> >> >
> >> >> >> >> >        private static enum Counters {
> >> >> >> >> >
> >> >> >> >> >            ROWS
> >> >> >> >> >        }
> >> >> >> >> >
> >> >> >> >> >        @Override
> >> >> >> >> >        public void map(ImmutableBytesWritable row, Result
> >> values,
> >> >> >> Context
> >> >> >> >> > context)
> >> >> >> >> >                throws IOException, InterruptedException {
> >> >> >> >> >            for (KeyValue value : values.list()) {
> >> >> >> >> >                if (value.getValue().length > 0) {
> >> >> >> >> >                    Text key = new Text(value.getValue());
> >> >> >> >> >                    context.write(key, ONE);
> >> >> >> >> >                }
> >> >> >> >> >            }
> >> >> >> >> >        }
> >> >> >> >> >    }
> >> >> >> >> >
> >> >> >> >> > *Error: *
> >> >> >> >> >
> >> >> >> >> > java.io.IOException: Type mismatch in key from map: expected
> >> >> >> >> > org.apache.hadoop.hbase.io.ImmutableBytesWritable, recieved
> >> >> >> >> > org.apache.hadoop.io.Text
> >> >> >> >> >
> >> >> >> >>
> >> >> >> >
> >> >> >>
> >> >> >
> >> >>
> >> >
> >>
> >
>

Re: Type mismatch

Posted by Stack <st...@duboce.net>.
Thanks for writing back to the list, Mark (I should have spotted that -- sorry).
St.Ack

On Sun, Feb 6, 2011 at 8:11 PM, Mark Kerzner <ma...@gmail.com> wrote:
> And the correct answer is... instead of this signature
>
>    public static class RowCounterReducer
>            extends TableReducer <Text, IntWritable,
> ImmutableBytesWritable>
>    {
>        public void reduce(Text key,
>                Iterable<IntWritable> values,
>                Reducer.Context context)
>                throws IOException,
>                InterruptedException {
>
> (WRONG!)
>
> I used this signature
>
>     public static class RowCounterReducer extends
> TableReducer<ImmutableBytesWritable, IntWritable, ImmutableBytesWritable> {
>
>        @Override
>        public void reduce(ImmutableBytesWritable key, Iterable<IntWritable>
> values, Context context)
>                throws IOException, InterruptedException {
>
> RIGHT!
>
> Thank you, all.
>
> Mark
>
> On Fri, Feb 4, 2011 at 2:01 PM, Stack <st...@duboce.net> wrote:
>
>> (Thanks Sujee)
>>
>> What did you change in your src to get it going?
>>
>> St.Ack
>>
>> On Fri, Feb 4, 2011 at 10:56 AM, Mark Kerzner <ma...@gmail.com>
>> wrote:
>> > I found an example that works and uses the latest HBase API,
>> > http://sujee.net/tech/articles/hbase-map-reduce-freq-counter/, you might
>> > know about it, but for me it was very helpful.
>> >
>> > Mark
>> >
>> > On Fri, Feb 4, 2011 at 11:55 AM, Stack <st...@duboce.net> wrote:
>> >
>> >> Its just an issue of matching your outputs to TOF.   There are
>> >> examples of Reducer usage in the mapreduce package.  They declare
>> >> their types other than how you have it.  See PutSortReducer and
>> >> ImportTsv which uses it (and configures it up).
>> >>
>> >> St.Ack
>> >>
>> >> On Fri, Feb 4, 2011 at 7:08 AM, Mark Kerzner <ma...@gmail.com>
>> >> wrote:
>> >> > I tried 0.90 - same error. I am going to try to build HBase from code
>> and
>> >> > include this code in my debugging session, to step through it. But I
>> must
>> >> be
>> >> > doing something wrong.
>> >> >
>> >> > How does one write to HBase in the Reducer, is there any example!???
>> >> >
>> >> > Thank you!
>> >> >
>> >> > Mark
>> >> >
>> >> > On Fri, Feb 4, 2011 at 12:38 AM, Stack <st...@duboce.net> wrote:
>> >> >
>> >> >> I'm not sure whats up w/ your sample above.  Here's some observations
>> >> >> that might help.
>> >> >>
>> >> >> Here is the code.  Our line numbers differ.  You are not on 0.90.0?
>> >> >> Thats not important.  You are in this method it seems:
>> >> >>
>> >> >>
>> >>
>> http://hbase.apache.org/xref/org/apache/hadoop/hbase/mapreduce/TableOutputFormat.html#124
>> >> >>  See the message on the end.  You should submit a patch where we add
>> >> >> to the IOException message a toString on the value passed so we have
>> a
>> >> >> better clue as to where we are off here -- so you can see class of
>> >> >> object submitted (debugging, I'd add this to the log message).
>> >> >>
>> >> >> Looking at how you declare TOF, it doesn't look right (This helps
>> with
>> >> >> that:
>> >> >>
>> >>
>> http://hbase.apache.org/xref/org/apache/hadoop/hbase/mapreduce/TableReducer.html
>> >> >> ).
>> >> >>  It seems like the declaration should be <KEYIN, VALUEIN, KEYOUT>
>>  but
>> >> >> you are outputting a Text for KEYOUT, not the declared Put.  This is
>> >> >> probably not your prob. though.
>> >> >>
>> >> >> Looking at IdentityTableReducer, it just passes Writables with the
>> >> >> value a Delete or Put.
>> >> >>
>> >> >> St.Ack
>> >> >>
>> >> >>
>> >> >>
>> >> >>
>> >> >>
>> >> >> On Thu, Feb 3, 2011 at 10:00 PM, Mark Kerzner <markkerzner@gmail.com
>> >
>> >> >> wrote:
>> >> >> > Thank you, St.Ack, it is very nice of you to keep helping me. Here
>> is
>> >> the
>> >> >> > stack :) trace, but as you can see, it is the internal Hadoop code.
>> I
>> >> see
>> >> >> > this code and I see the message - I am not passing it the right
>> object
>> >> -
>> >> >> but
>> >> >> > how DO I pass the right object?
>> >> >> >
>> >> >> > M
>> >> >> >
>> >> >> >
>> >> >> >        at
>> >> >> >
>> >> >>
>> >>
>> org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:106)
>> >> >> >        at
>> >> >> >
>> >> >>
>> >>
>> org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:65)
>> >> >> >        at
>> >> >> >
>> >> >>
>> >>
>> org.apache.hadoop.mapred.ReduceTask$NewTrackingRecordWriter.write(ReduceTask.java:512)
>> >> >> >        at
>> >> >> >
>> >> >>
>> >>
>> org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
>> >> >> >        at
>> org.apache.hadoop.mapreduce.Reducer.reduce(Reducer.java:156)
>> >> >> >        at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:176)
>> >> >> >        at
>> >> >> >
>> org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:570)
>> >> >> >        at
>> org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:412)
>> >> >> >        at
>> >> >> >
>> >> org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:258)
>> >> >> >
>> >> >> > On Thu, Feb 3, 2011 at 11:52 PM, Stack <st...@duboce.net> wrote:
>> >> >> >
>> >> >> >> Look at the stack trace.  See where its being thrown.  Look at
>> that
>> >> >> >> src code at that line offset.  Should give you a clue.
>> >> >> >> St.Ack
>> >> >> >>
>> >> >> >> On Thu, Feb 3, 2011 at 9:36 PM, Mark Kerzner <
>> markkerzner@gmail.com>
>> >> >> >> wrote:
>> >> >> >> > Thank you, that helped, but now I get this error on trying to
>> write
>> >> >> back
>> >> >> >> to
>> >> >> >> > HBase:
>> >> >> >> >
>> >> >> >> > java.io.IOException: Pass a Delete or a Put
>> >> >> >> >
>> >> >> >> > Here is a fragment on my code. Again, thanks a bunch!
>> >> >> >> >
>> >> >> >> >    public static class RowCounterReducer
>> >> >> >> >            extends TableReducer <Text, IntWritable, Put>
>> >> >> >> >    {
>> >> >> >> >        public void reduce(Text key,
>> >> >> >> >                Iterable<IntWritable> values,
>> >> >> >> >                Reducer.Context context)
>> >> >> >> >                throws IOException,
>> >> >> >> >                InterruptedException {
>> >> >> >> >            Iterator <IntWritable> iterator = values.iterator();
>> >> >> >> >            while (iterator.hasNext()) {
>> >> >> >> >                IntWritable value = iterator.next();
>> >> >> >> >                Put put = new Put();
>> >> >> >> >                context.write(key, put);
>> >> >> >> >            }
>> >> >> >> >        }
>> >> >> >> >    }
>> >> >> >> >
>> >> >> >> >
>> >> >> >> > On Thu, Feb 3, 2011 at 2:50 PM, Stack <st...@duboce.net> wrote:
>> >> >> >> >
>> >> >> >> >> You are emitting a Text type.  Try just passing 'row' to the
>> >> context,
>> >> >> >> >> the one passed in to your map.
>> >> >> >> >> St.Ack
>> >> >> >> >>
>> >> >> >> >> On Thu, Feb 3, 2011 at 12:23 PM, Mark Kerzner <
>> >> markkerzner@gmail.com
>> >> >> >
>> >> >> >> >> wrote:
>> >> >> >> >> > Hi,
>> >> >> >> >> >
>> >> >> >> >> > I have this code to read and write to HBase from MR, and it
>> >> works
>> >> >> fine
>> >> >> >> >> with
>> >> >> >> >> > 0 reducers, but it gives a type mismatch error when with 1
>> >> reducer.
>> >> >> >> What
>> >> >> >> >> > should I look at? *Thank you!*
>> >> >> >> >> >
>> >> >> >> >> > *Code:*
>> >> >> >> >> >
>> >> >> >> >> >    static class RowCounterMapper
>> >> >> >> >> >            extends TableMapper<Text, IntWritable> {
>> >> >> >> >> >
>> >> >> >> >> >        private static enum Counters {
>> >> >> >> >> >
>> >> >> >> >> >            ROWS
>> >> >> >> >> >        }
>> >> >> >> >> >
>> >> >> >> >> >        @Override
>> >> >> >> >> >        public void map(ImmutableBytesWritable row, Result
>> >> values,
>> >> >> >> Context
>> >> >> >> >> > context)
>> >> >> >> >> >                throws IOException, InterruptedException {
>> >> >> >> >> >            for (KeyValue value : values.list()) {
>> >> >> >> >> >                if (value.getValue().length > 0) {
>> >> >> >> >> >                    Text key = new Text(value.getValue());
>> >> >> >> >> >                    context.write(key, ONE);
>> >> >> >> >> >                }
>> >> >> >> >> >            }
>> >> >> >> >> >        }
>> >> >> >> >> >    }
>> >> >> >> >> >
>> >> >> >> >> > *Error: *
>> >> >> >> >> >
>> >> >> >> >> > java.io.IOException: Type mismatch in key from map: expected
>> >> >> >> >> > org.apache.hadoop.hbase.io.ImmutableBytesWritable, recieved
>> >> >> >> >> > org.apache.hadoop.io.Text
>> >> >> >> >> >
>> >> >> >> >>
>> >> >> >> >
>> >> >> >>
>> >> >> >
>> >> >>
>> >> >
>> >>
>> >
>>
>

Re: Type mismatch

Posted by Mark Kerzner <ma...@gmail.com>.
And the correct answer is... instead of this signature

    public static class RowCounterReducer
            extends TableReducer <Text, IntWritable,
ImmutableBytesWritable>
    {
        public void reduce(Text key,
                Iterable<IntWritable> values,
                Reducer.Context context)
                throws IOException,
                InterruptedException {

(WRONG!)

I used this signature

     public static class RowCounterReducer extends
TableReducer<ImmutableBytesWritable, IntWritable, ImmutableBytesWritable> {

        @Override
        public void reduce(ImmutableBytesWritable key, Iterable<IntWritable>
values, Context context)
                throws IOException, InterruptedException {

RIGHT!

Thank you, all.

Mark
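
(A slightly fuller sketch of how the matched pair can look end to end, with the
reducer actually building the Put it writes; this is not from the thread, and
the column family and qualifier names are made up.)

    // Assumed imports: org.apache.hadoop.hbase.client.Put, org.apache.hadoop.hbase.io.ImmutableBytesWritable,
    // org.apache.hadoop.hbase.util.Bytes, org.apache.hadoop.io.IntWritable
    public static class RowCounterReducer
            extends TableReducer<ImmutableBytesWritable, IntWritable, ImmutableBytesWritable> {

        @Override
        public void reduce(ImmutableBytesWritable key, Iterable<IntWritable> values,
                Context context) throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable value : values) {
                sum += value.get();
            }
            // TableOutputFormat only accepts a Put or a Delete as the value.
            Put put = new Put(key.get());
            put.add(Bytes.toBytes("details"), Bytes.toBytes("count"), Bytes.toBytes(sum));
            context.write(key, put);
        }
    }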

On Fri, Feb 4, 2011 at 2:01 PM, Stack <st...@duboce.net> wrote:

> (Thanks Sujee)
>
> What did you change in your src to get it going?
>
> St.Ack
>
> On Fri, Feb 4, 2011 at 10:56 AM, Mark Kerzner <ma...@gmail.com>
> wrote:
> > I found an example that works and uses the latest HBase API,
> > http://sujee.net/tech/articles/hbase-map-reduce-freq-counter/, you might
> > know about it, but for me it was very helpful.
> >
> > Mark
> >
> > On Fri, Feb 4, 2011 at 11:55 AM, Stack <st...@duboce.net> wrote:
> >
> >> Its just an issue of matching your outputs to TOF.   There are
> >> examples of Reducer usage in the mapreduce package.  They declare
> >> their types other than how you have it.  See PutSortReducer and
> >> ImportTsv which uses it (and configures it up).
> >>
> >> St.Ack
> >>
> >> On Fri, Feb 4, 2011 at 7:08 AM, Mark Kerzner <ma...@gmail.com>
> >> wrote:
> >> > I tried 0.90 - same error. I am going to try to build HBase from code
> and
> >> > include this code in my debugging session, to step through it. But I
> must
> >> be
> >> > doing something wrong.
> >> >
> >> > How does one write to HBase in the Reducer, is there any example!???
> >> >
> >> > Thank you!
> >> >
> >> > Mark
> >> >
> >> > On Fri, Feb 4, 2011 at 12:38 AM, Stack <st...@duboce.net> wrote:
> >> >
> >> >> I'm not sure whats up w/ your sample above.  Here's some observations
> >> >> that might help.
> >> >>
> >> >> Here is the code.  Our line numbers differ.  You are not on 0.90.0?
> >> >> Thats not important.  You are in this method it seems:
> >> >>
> >> >>
> >>
> http://hbase.apache.org/xref/org/apache/hadoop/hbase/mapreduce/TableOutputFormat.html#124
> >> >>  See the message on the end.  You should submit a patch where we add
> >> >> to the IOException message a toString on the value passed so we have
> a
> >> >> better clue as to where we are off here -- so you can see class of
> >> >> object submitted (debugging, I'd add this to the log message).
> >> >>
> >> >> Looking at how you declare TOF, it doesn't look right (This helps
> with
> >> >> that:
> >> >>
> >>
> http://hbase.apache.org/xref/org/apache/hadoop/hbase/mapreduce/TableReducer.html
> >> >> ).
> >> >>  It seems like the declaration should be <KEYIN, VALUEIN, KEYOUT>
>  but
> >> >> you are outputting a Text for KEYOUT, not the declared Put.  This is
> >> >> probably not your prob. though.
> >> >>
> >> >> Looking at IdentityTableReducer, it just passes Writables with the
> >> >> value a Delete or Put.
> >> >>
> >> >> St.Ack
> >> >>
> >> >>
> >> >>
> >> >>
> >> >>
> >> >> On Thu, Feb 3, 2011 at 10:00 PM, Mark Kerzner <markkerzner@gmail.com
> >
> >> >> wrote:
> >> >> > Thank you, St.Ack, it is very nice of you to keep helping me. Here
> is
> >> the
> >> >> > stack :) trace, but as you can see, it is the internal Hadoop code.
> I
> >> see
> >> >> > this code and I see the message - I am not passing it the right
> object
> >> -
> >> >> but
> >> >> > how DO I pass the right object?
> >> >> >
> >> >> > M
> >> >> >
> >> >> >
> >> >> >        at
> >> >> >
> >> >>
> >>
> org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:106)
> >> >> >        at
> >> >> >
> >> >>
> >>
> org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:65)
> >> >> >        at
> >> >> >
> >> >>
> >>
> org.apache.hadoop.mapred.ReduceTask$NewTrackingRecordWriter.write(ReduceTask.java:512)
> >> >> >        at
> >> >> >
> >> >>
> >>
> org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
> >> >> >        at
> org.apache.hadoop.mapreduce.Reducer.reduce(Reducer.java:156)
> >> >> >        at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:176)
> >> >> >        at
> >> >> >
> org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:570)
> >> >> >        at
> org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:412)
> >> >> >        at
> >> >> >
> >> org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:258)
> >> >> >
> >> >> > On Thu, Feb 3, 2011 at 11:52 PM, Stack <st...@duboce.net> wrote:
> >> >> >
> >> >> >> Look at the stack trace.  See where its being thrown.  Look at
> that
> >> >> >> src code at that line offset.  Should give you a clue.
> >> >> >> St.Ack
> >> >> >>
> >> >> >> On Thu, Feb 3, 2011 at 9:36 PM, Mark Kerzner <
> markkerzner@gmail.com>
> >> >> >> wrote:
> >> >> >> > Thank you, that helped, but now I get this error on trying to
> write
> >> >> back
> >> >> >> to
> >> >> >> > HBase:
> >> >> >> >
> >> >> >> > java.io.IOException: Pass a Delete or a Put
> >> >> >> >
> >> >> >> > Here is a fragment on my code. Again, thanks a bunch!
> >> >> >> >
> >> >> >> >    public static class RowCounterReducer
> >> >> >> >            extends TableReducer <Text, IntWritable, Put>
> >> >> >> >    {
> >> >> >> >        public void reduce(Text key,
> >> >> >> >                Iterable<IntWritable> values,
> >> >> >> >                Reducer.Context context)
> >> >> >> >                throws IOException,
> >> >> >> >                InterruptedException {
> >> >> >> >            Iterator <IntWritable> iterator = values.iterator();
> >> >> >> >            while (iterator.hasNext()) {
> >> >> >> >                IntWritable value = iterator.next();
> >> >> >> >                Put put = new Put();
> >> >> >> >                context.write(key, put);
> >> >> >> >            }
> >> >> >> >        }
> >> >> >> >    }
> >> >> >> >
> >> >> >> >
> >> >> >> > On Thu, Feb 3, 2011 at 2:50 PM, Stack <st...@duboce.net> wrote:
> >> >> >> >
> >> >> >> >> You are emitting a Text type.  Try just passing 'row' to the
> >> context,
> >> >> >> >> the one passed in to your map.
> >> >> >> >> St.Ack
> >> >> >> >>
> >> >> >> >> On Thu, Feb 3, 2011 at 12:23 PM, Mark Kerzner <
> >> markkerzner@gmail.com
> >> >> >
> >> >> >> >> wrote:
> >> >> >> >> > Hi,
> >> >> >> >> >
> >> >> >> >> > I have this code to read and write to HBase from MR, and it
> >> works
> >> >> fine
> >> >> >> >> with
> >> >> >> >> > 0 reducers, but it gives a type mismatch error when with 1
> >> reducer.
> >> >> >> What
> >> >> >> >> > should I look at? *Thank you!*
> >> >> >> >> >
> >> >> >> >> > *Code:*
> >> >> >> >> >
> >> >> >> >> >    static class RowCounterMapper
> >> >> >> >> >            extends TableMapper<Text, IntWritable> {
> >> >> >> >> >
> >> >> >> >> >        private static enum Counters {
> >> >> >> >> >
> >> >> >> >> >            ROWS
> >> >> >> >> >        }
> >> >> >> >> >
> >> >> >> >> >        @Override
> >> >> >> >> >        public void map(ImmutableBytesWritable row, Result
> >> values,
> >> >> >> Context
> >> >> >> >> > context)
> >> >> >> >> >                throws IOException, InterruptedException {
> >> >> >> >> >            for (KeyValue value : values.list()) {
> >> >> >> >> >                if (value.getValue().length > 0) {
> >> >> >> >> >                    Text key = new Text(value.getValue());
> >> >> >> >> >                    context.write(key, ONE);
> >> >> >> >> >                }
> >> >> >> >> >            }
> >> >> >> >> >        }
> >> >> >> >> >    }
> >> >> >> >> >
> >> >> >> >> > *Error: *
> >> >> >> >> >
> >> >> >> >> > java.io.IOException: Type mismatch in key from map: expected
> >> >> >> >> > org.apache.hadoop.hbase.io.ImmutableBytesWritable, recieved
> >> >> >> >> > org.apache.hadoop.io.Text
> >> >> >> >> >
> >> >> >> >>
> >> >> >> >
> >> >> >>
> >> >> >
> >> >>
> >> >
> >>
> >
>

Re: Type mismatch

Posted by Stack <st...@duboce.net>.
(Thanks Sujee)

What did you change in your src to get it going?

St.Ack

On Fri, Feb 4, 2011 at 10:56 AM, Mark Kerzner <ma...@gmail.com> wrote:
> I found an example that works and uses the latest HBase API,
> http://sujee.net/tech/articles/hbase-map-reduce-freq-counter/, you might
> know about it, but for me it was very helpful.
>
> Mark
>
> On Fri, Feb 4, 2011 at 11:55 AM, Stack <st...@duboce.net> wrote:
>
>> Its just an issue of matching your outputs to TOF.   There are
>> examples of Reducer usage in the mapreduce package.  They declare
>> their types other than how you have it.  See PutSortReducer and
>> ImportTsv which uses it (and configures it up).
>>
>> St.Ack
>>
>> On Fri, Feb 4, 2011 at 7:08 AM, Mark Kerzner <ma...@gmail.com>
>> wrote:
>> > I tried 0.90 - same error. I am going to try to build HBase from code and
>> > include this code in my debugging session, to step through it. But I must
>> be
>> > doing something wrong.
>> >
>> > How does one write to HBase in the Reducer, is there any example!???
>> >
>> > Thank you!
>> >
>> > Mark
>> >
>> > On Fri, Feb 4, 2011 at 12:38 AM, Stack <st...@duboce.net> wrote:
>> >
>> >> I'm not sure whats up w/ your sample above.  Here's some observations
>> >> that might help.
>> >>
>> >> Here is the code.  Our line numbers differ.  You are not on 0.90.0?
>> >> Thats not important.  You are in this method it seems:
>> >>
>> >>
>> http://hbase.apache.org/xref/org/apache/hadoop/hbase/mapreduce/TableOutputFormat.html#124
>> >>  See the message on the end.  You should submit a patch where we add
>> >> to the IOException message a toString on the value passed so we have a
>> >> better clue as to where we are off here -- so you can see class of
>> >> object submitted (debugging, I'd add this to the log message).
>> >>
>> >> Looking at how you declare TOF, it doesn't look right (This helps with
>> >> that:
>> >>
>> http://hbase.apache.org/xref/org/apache/hadoop/hbase/mapreduce/TableReducer.html
>> >> ).
>> >>  It seems like the declaration should be <KEYIN, VALUEIN, KEYOUT>  but
>> >> you are outputting a Text for KEYOUT, not the declared Put.  This is
>> >> probably not your prob. though.
>> >>
>> >> Looking at IdentityTableReducer, it just passes Writables with the
>> >> value a Delete or Put.
>> >>
>> >> St.Ack
>> >>
>> >>
>> >>
>> >>
>> >>
>> >> On Thu, Feb 3, 2011 at 10:00 PM, Mark Kerzner <ma...@gmail.com>
>> >> wrote:
>> >> > Thank you, St.Ack, it is very nice of you to keep helping me. Here is
>> the
>> >> > stack :) trace, but as you can see, it is the internal Hadoop code. I
>> see
>> >> > this code and I see the message - I am not passing it the right object
>> -
>> >> but
>> >> > how DO I pass the right object?
>> >> >
>> >> > M
>> >> >
>> >> >
>> >> >        at
>> >> >
>> >>
>> org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:106)
>> >> >        at
>> >> >
>> >>
>> org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:65)
>> >> >        at
>> >> >
>> >>
>> org.apache.hadoop.mapred.ReduceTask$NewTrackingRecordWriter.write(ReduceTask.java:512)
>> >> >        at
>> >> >
>> >>
>> org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
>> >> >        at org.apache.hadoop.mapreduce.Reducer.reduce(Reducer.java:156)
>> >> >        at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:176)
>> >> >        at
>> >> > org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:570)
>> >> >        at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:412)
>> >> >        at
>> >> >
>> org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:258)
>> >> >
>> >> > On Thu, Feb 3, 2011 at 11:52 PM, Stack <st...@duboce.net> wrote:
>> >> >
>> >> >> Look at the stack trace.  See where its being thrown.  Look at that
>> >> >> src code at that line offset.  Should give you a clue.
>> >> >> St.Ack
>> >> >>
>> >> >> On Thu, Feb 3, 2011 at 9:36 PM, Mark Kerzner <ma...@gmail.com>
>> >> >> wrote:
>> >> >> > Thank you, that helped, but now I get this error on trying to write
>> >> back
>> >> >> to
>> >> >> > HBase:
>> >> >> >
>> >> >> > java.io.IOException: Pass a Delete or a Put
>> >> >> >
>> >> >> > Here is a fragment on my code. Again, thanks a bunch!
>> >> >> >
>> >> >> >    public static class RowCounterReducer
>> >> >> >            extends TableReducer <Text, IntWritable, Put>
>> >> >> >    {
>> >> >> >        public void reduce(Text key,
>> >> >> >                Iterable<IntWritable> values,
>> >> >> >                Reducer.Context context)
>> >> >> >                throws IOException,
>> >> >> >                InterruptedException {
>> >> >> >            Iterator <IntWritable> iterator = values.iterator();
>> >> >> >            while (iterator.hasNext()) {
>> >> >> >                IntWritable value = iterator.next();
>> >> >> >                Put put = new Put();
>> >> >> >                context.write(key, put);
>> >> >> >            }
>> >> >> >        }
>> >> >> >    }
>> >> >> >
>> >> >> >
>> >> >> > On Thu, Feb 3, 2011 at 2:50 PM, Stack <st...@duboce.net> wrote:
>> >> >> >
>> >> >> >> You are emitting a Text type.  Try just passing 'row' to the
>> context,
>> >> >> >> the one passed in to your map.
>> >> >> >> St.Ack
>> >> >> >>
>> >> >> >> On Thu, Feb 3, 2011 at 12:23 PM, Mark Kerzner <
>> markkerzner@gmail.com
>> >> >
>> >> >> >> wrote:
>> >> >> >> > Hi,
>> >> >> >> >
>> >> >> >> > I have this code to read and write to HBase from MR, and it
>> works
>> >> fine
>> >> >> >> with
>> >> >> >> > 0 reducers, but it gives a type mismatch error when with 1
>> reducer.
>> >> >> What
>> >> >> >> > should I look at? *Thank you!*
>> >> >> >> >
>> >> >> >> > *Code:*
>> >> >> >> >
>> >> >> >> >    static class RowCounterMapper
>> >> >> >> >            extends TableMapper<Text, IntWritable> {
>> >> >> >> >
>> >> >> >> >        private static enum Counters {
>> >> >> >> >
>> >> >> >> >            ROWS
>> >> >> >> >        }
>> >> >> >> >
>> >> >> >> >        @Override
>> >> >> >> >        public void map(ImmutableBytesWritable row, Result
>> values,
>> >> >> Context
>> >> >> >> > context)
>> >> >> >> >                throws IOException, InterruptedException {
>> >> >> >> >            for (KeyValue value : values.list()) {
>> >> >> >> >                if (value.getValue().length > 0) {
>> >> >> >> >                    Text key = new Text(value.getValue());
>> >> >> >> >                    context.write(key, ONE);
>> >> >> >> >                }
>> >> >> >> >            }
>> >> >> >> >        }
>> >> >> >> >    }
>> >> >> >> >
>> >> >> >> > *Error: *
>> >> >> >> >
>> >> >> >> > java.io.IOException: Type mismatch in key from map: expected
>> >> >> >> > org.apache.hadoop.hbase.io.ImmutableBytesWritable, recieved
>> >> >> >> > org.apache.hadoop.io.Text
>> >> >> >> >
>> >> >> >>
>> >> >> >
>> >> >>
>> >> >
>> >>
>> >
>>
>

Re: Type mismatch

Posted by Mark Kerzner <ma...@gmail.com>.
I found an example that works and uses the latest HBase API:
http://sujee.net/tech/articles/hbase-map-reduce-freq-counter/. You might
already know about it, but for me it was very helpful.

Mark

On Fri, Feb 4, 2011 at 11:55 AM, Stack <st...@duboce.net> wrote:

> Its just an issue of matching your outputs to TOF.   There are
> examples of Reducer usage in the mapreduce package.  They declare
> their types other than how you have it.  See PutSortReducer and
> ImportTsv which uses it (and configures it up).
>
> St.Ack
>
> On Fri, Feb 4, 2011 at 7:08 AM, Mark Kerzner <ma...@gmail.com>
> wrote:
> > I tried 0.90 - same error. I am going to try to build HBase from code and
> > include this code in my debugging session, to step through it. But I must
> be
> > doing something wrong.
> >
> > How does one write to HBase in the Reducer, is there any example!???
> >
> > Thank you!
> >
> > Mark
> >
> > On Fri, Feb 4, 2011 at 12:38 AM, Stack <st...@duboce.net> wrote:
> >
> >> I'm not sure whats up w/ your sample above.  Here's some observations
> >> that might help.
> >>
> >> Here is the code.  Our line numbers differ.  You are not on 0.90.0?
> >> Thats not important.  You are in this method it seems:
> >>
> >>
> http://hbase.apache.org/xref/org/apache/hadoop/hbase/mapreduce/TableOutputFormat.html#124
> >>  See the message on the end.  You should submit a patch where we add
> >> to the IOException message a toString on the value passed so we have a
> >> better clue as to where we are off here -- so you can see class of
> >> object submitted (debugging, I'd add this to the log message).
> >>
> >> Looking at how you declare TOF, it doesn't look right (This helps with
> >> that:
> >>
> http://hbase.apache.org/xref/org/apache/hadoop/hbase/mapreduce/TableReducer.html
> >> ).
> >>  It seems like the declaration should be <KEYIN, VALUEIN, KEYOUT>  but
> >> you are outputting a Text for KEYOUT, not the declared Put.  This is
> >> probably not your prob. though.
> >>
> >> Looking at IdentityTableReducer, it just passes Writables with the
> >> value a Delete or Put.
> >>
> >> St.Ack
> >>
> >>
> >>
> >>
> >>
> >> On Thu, Feb 3, 2011 at 10:00 PM, Mark Kerzner <ma...@gmail.com>
> >> wrote:
> >> > Thank you, St.Ack, it is very nice of you to keep helping me. Here is
> the
> >> > stack :) trace, but as you can see, it is the internal Hadoop code. I
> see
> >> > this code and I see the message - I am not passing it the right object
> -
> >> but
> >> > how DO I pass the right object?
> >> >
> >> > M
> >> >
> >> >
> >> >        at
> >> >
> >>
> org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:106)
> >> >        at
> >> >
> >>
> org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:65)
> >> >        at
> >> >
> >>
> org.apache.hadoop.mapred.ReduceTask$NewTrackingRecordWriter.write(ReduceTask.java:512)
> >> >        at
> >> >
> >>
> org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
> >> >        at org.apache.hadoop.mapreduce.Reducer.reduce(Reducer.java:156)
> >> >        at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:176)
> >> >        at
> >> > org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:570)
> >> >        at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:412)
> >> >        at
> >> >
> org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:258)
> >> >
> >> > On Thu, Feb 3, 2011 at 11:52 PM, Stack <st...@duboce.net> wrote:
> >> >
> >> >> Look at the stack trace.  See where its being thrown.  Look at that
> >> >> src code at that line offset.  Should give you a clue.
> >> >> St.Ack
> >> >>
> >> >> On Thu, Feb 3, 2011 at 9:36 PM, Mark Kerzner <ma...@gmail.com>
> >> >> wrote:
> >> >> > Thank you, that helped, but now I get this error on trying to write
> >> back
> >> >> to
> >> >> > HBase:
> >> >> >
> >> >> > java.io.IOException: Pass a Delete or a Put
> >> >> >
> >> >> > Here is a fragment on my code. Again, thanks a bunch!
> >> >> >
> >> >> >    public static class RowCounterReducer
> >> >> >            extends TableReducer <Text, IntWritable, Put>
> >> >> >    {
> >> >> >        public void reduce(Text key,
> >> >> >                Iterable<IntWritable> values,
> >> >> >                Reducer.Context context)
> >> >> >                throws IOException,
> >> >> >                InterruptedException {
> >> >> >            Iterator <IntWritable> iterator = values.iterator();
> >> >> >            while (iterator.hasNext()) {
> >> >> >                IntWritable value = iterator.next();
> >> >> >                Put put = new Put();
> >> >> >                context.write(key, put);
> >> >> >            }
> >> >> >        }
> >> >> >    }
> >> >> >
> >> >> >
> >> >> > On Thu, Feb 3, 2011 at 2:50 PM, Stack <st...@duboce.net> wrote:
> >> >> >
> >> >> >> You are emitting a Text type.  Try just passing 'row' to the
> context,
> >> >> >> the one passed in to your map.
> >> >> >> St.Ack
> >> >> >>
> >> >> >> On Thu, Feb 3, 2011 at 12:23 PM, Mark Kerzner <
> markkerzner@gmail.com
> >> >
> >> >> >> wrote:
> >> >> >> > Hi,
> >> >> >> >
> >> >> >> > I have this code to read and write to HBase from MR, and it
> works
> >> fine
> >> >> >> with
> >> >> >> > 0 reducers, but it gives a type mismatch error when with 1
> reducer.
> >> >> What
> >> >> >> > should I look at? *Thank you!*
> >> >> >> >
> >> >> >> > *Code:*
> >> >> >> >
> >> >> >> >    static class RowCounterMapper
> >> >> >> >            extends TableMapper<Text, IntWritable> {
> >> >> >> >
> >> >> >> >        private static enum Counters {
> >> >> >> >
> >> >> >> >            ROWS
> >> >> >> >        }
> >> >> >> >
> >> >> >> >        @Override
> >> >> >> >        public void map(ImmutableBytesWritable row, Result
> values,
> >> >> Context
> >> >> >> > context)
> >> >> >> >                throws IOException, InterruptedException {
> >> >> >> >            for (KeyValue value : values.list()) {
> >> >> >> >                if (value.getValue().length > 0) {
> >> >> >> >                    Text key = new Text(value.getValue());
> >> >> >> >                    context.write(key, ONE);
> >> >> >> >                }
> >> >> >> >            }
> >> >> >> >        }
> >> >> >> >    }
> >> >> >> >
> >> >> >> > *Error: *
> >> >> >> >
> >> >> >> > java.io.IOException: Type mismatch in key from map: expected
> >> >> >> > org.apache.hadoop.hbase.io.ImmutableBytesWritable, recieved
> >> >> >> > org.apache.hadoop.io.Text
> >> >> >> >
> >> >> >>
> >> >> >
> >> >>
> >> >
> >>
> >
>

Re: Type mismatch

Posted by Stack <st...@duboce.net>.
It's just an issue of matching your outputs to TOF (TableOutputFormat).  There are
examples of Reducer usage in the mapreduce package.  They declare
their types differently from how you have it.  See PutSortReducer and
ImportTsv, which uses it (and configures it up).

St.Ack

On Fri, Feb 4, 2011 at 7:08 AM, Mark Kerzner <ma...@gmail.com> wrote:
> I tried 0.90 - same error. I am going to try to build HBase from code and
> include this code in my debugging session, to step through it. But I must be
> doing something wrong.
>
> How does one write to HBase in the Reducer, is there any example!???
>
> Thank you!
>
> Mark
>
> On Fri, Feb 4, 2011 at 12:38 AM, Stack <st...@duboce.net> wrote:
>
>> I'm not sure whats up w/ your sample above.  Here's some observations
>> that might help.
>>
>> Here is the code.  Our line numbers differ.  You are not on 0.90.0?
>> Thats not important.  You are in this method it seems:
>>
>> http://hbase.apache.org/xref/org/apache/hadoop/hbase/mapreduce/TableOutputFormat.html#124
>>  See the message on the end.  You should submit a patch where we add
>> to the IOException message a toString on the value passed so we have a
>> better clue as to where we are off here -- so you can see class of
>> object submitted (debugging, I'd add this to the log message).
>>
>> Looking at how you declare TOF, it doesn't look right (This helps with
>> that:
>> http://hbase.apache.org/xref/org/apache/hadoop/hbase/mapreduce/TableReducer.html
>> ).
>>  It seems like the declaration should be <KEYIN, VALUEIN, KEYOUT>  but
>> you are outputting a Text for KEYOUT, not the declared Put.  This is
>> probably not your prob. though.
>>
>> Looking at IdentityTableReducer, it just passes Writables with the
>> value a Delete or Put.
>>
>> St.Ack
>>
>>
>>
>>
>>
>> On Thu, Feb 3, 2011 at 10:00 PM, Mark Kerzner <ma...@gmail.com>
>> wrote:
>> > Thank you, St.Ack, it is very nice of you to keep helping me. Here is the
>> > stack :) trace, but as you can see, it is the internal Hadoop code. I see
>> > this code and I see the message - I am not passing it the right object -
>> but
>> > how DO I pass the right object?
>> >
>> > M
>> >
>> >
>> >        at
>> >
>> org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:106)
>> >        at
>> >
>> org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:65)
>> >        at
>> >
>> org.apache.hadoop.mapred.ReduceTask$NewTrackingRecordWriter.write(ReduceTask.java:512)
>> >        at
>> >
>> org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
>> >        at org.apache.hadoop.mapreduce.Reducer.reduce(Reducer.java:156)
>> >        at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:176)
>> >        at
>> > org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:570)
>> >        at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:412)
>> >        at
>> > org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:258)
>> >
>> > On Thu, Feb 3, 2011 at 11:52 PM, Stack <st...@duboce.net> wrote:
>> >
>> >> Look at the stack trace.  See where its being thrown.  Look at that
>> >> src code at that line offset.  Should give you a clue.
>> >> St.Ack
>> >>
>> >> On Thu, Feb 3, 2011 at 9:36 PM, Mark Kerzner <ma...@gmail.com>
>> >> wrote:
>> >> > Thank you, that helped, but now I get this error on trying to write
>> back
>> >> to
>> >> > HBase:
>> >> >
>> >> > java.io.IOException: Pass a Delete or a Put
>> >> >
>> >> > Here is a fragment on my code. Again, thanks a bunch!
>> >> >
>> >> >    public static class RowCounterReducer
>> >> >            extends TableReducer <Text, IntWritable, Put>
>> >> >    {
>> >> >        public void reduce(Text key,
>> >> >                Iterable<IntWritable> values,
>> >> >                Reducer.Context context)
>> >> >                throws IOException,
>> >> >                InterruptedException {
>> >> >            Iterator <IntWritable> iterator = values.iterator();
>> >> >            while (iterator.hasNext()) {
>> >> >                IntWritable value = iterator.next();
>> >> >                Put put = new Put();
>> >> >                context.write(key, put);
>> >> >            }
>> >> >        }
>> >> >    }
>> >> >
>> >> >
>> >> > On Thu, Feb 3, 2011 at 2:50 PM, Stack <st...@duboce.net> wrote:
>> >> >
>> >> >> You are emitting a Text type.  Try just passing 'row' to the context,
>> >> >> the one passed in to your map.
>> >> >> St.Ack
>> >> >>
>> >> >> On Thu, Feb 3, 2011 at 12:23 PM, Mark Kerzner <markkerzner@gmail.com
>> >
>> >> >> wrote:
>> >> >> > Hi,
>> >> >> >
>> >> >> > I have this code to read and write to HBase from MR, and it works
>> fine
>> >> >> with
>> >> >> > 0 reducers, but it gives a type mismatch error when with 1 reducer.
>> >> What
>> >> >> > should I look at? *Thank you!*
>> >> >> >
>> >> >> > *Code:*
>> >> >> >
>> >> >> >    static class RowCounterMapper
>> >> >> >            extends TableMapper<Text, IntWritable> {
>> >> >> >
>> >> >> >        private static enum Counters {
>> >> >> >
>> >> >> >            ROWS
>> >> >> >        }
>> >> >> >
>> >> >> >        @Override
>> >> >> >        public void map(ImmutableBytesWritable row, Result values,
>> >> Context
>> >> >> > context)
>> >> >> >                throws IOException, InterruptedException {
>> >> >> >            for (KeyValue value : values.list()) {
>> >> >> >                if (value.getValue().length > 0) {
>> >> >> >                    Text key = new Text(value.getValue());
>> >> >> >                    context.write(key, ONE);
>> >> >> >                }
>> >> >> >            }
>> >> >> >        }
>> >> >> >    }
>> >> >> >
>> >> >> > *Error: *
>> >> >> >
>> >> >> > java.io.IOException: Type mismatch in key from map: expected
>> >> >> > org.apache.hadoop.hbase.io.ImmutableBytesWritable, recieved
>> >> >> > org.apache.hadoop.io.Text
>> >> >> >
>> >> >>
>> >> >
>> >>
>> >
>>
>

Re: Type mismatch

Posted by Mark Kerzner <ma...@gmail.com>.
I tried 0.90 - same error. I am going to try to build HBase from source and
include that code in my debugging session, so I can step through it. But I must be
doing something wrong.

How does one write to HBase in the Reducer? Is there any example?

Thank you!

Mark
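
(One answer to the question above, besides going through TableOutputFormat as
the rest of the thread does: a reducer can also open the table itself and call
put() directly. This is a rough sketch only, not from the thread -- the table
name and column names are made up, and it bypasses the output format entirely.)

    // Assumed imports: org.apache.hadoop.hbase.HBaseConfiguration, org.apache.hadoop.hbase.client.HTable,
    // org.apache.hadoop.hbase.client.Put, org.apache.hadoop.hbase.io.ImmutableBytesWritable,
    // org.apache.hadoop.hbase.util.Bytes, org.apache.hadoop.io.IntWritable,
    // org.apache.hadoop.io.NullWritable, org.apache.hadoop.mapreduce.Reducer
    public static class DirectWriteReducer
            extends Reducer<ImmutableBytesWritable, IntWritable, NullWritable, NullWritable> {

        private HTable table;

        @Override
        protected void setup(Context context) throws IOException {
            // Open the target table once per reduce task.
            table = new HTable(HBaseConfiguration.create(context.getConfiguration()), "mytable");
        }

        @Override
        protected void reduce(ImmutableBytesWritable key, Iterable<IntWritable> values,
                Context context) throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable value : values) {
                sum += value.get();
            }
            Put put = new Put(key.get());
            put.add(Bytes.toBytes("details"), Bytes.toBytes("count"), Bytes.toBytes(sum));
            table.put(put);   // write straight to HBase instead of context.write()
        }

        @Override
        protected void cleanup(Context context) throws IOException {
            table.close();
        }
    }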

On Fri, Feb 4, 2011 at 12:38 AM, Stack <st...@duboce.net> wrote:

> I'm not sure whats up w/ your sample above.  Here's some observations
> that might help.
>
> Here is the code.  Our line numbers differ.  You are not on 0.90.0?
> Thats not important.  You are in this method it seems:
>
> http://hbase.apache.org/xref/org/apache/hadoop/hbase/mapreduce/TableOutputFormat.html#124
>  See the message on the end.  You should submit a patch where we add
> to the IOException message a toString on the value passed so we have a
> better clue as to where we are off here -- so you can see class of
> object submitted (debugging, I'd add this to the log message).
>
> Looking at how you declare TOF, it doesn't look right (This helps with
> that:
> http://hbase.apache.org/xref/org/apache/hadoop/hbase/mapreduce/TableReducer.html
> ).
>  It seems like the declaration should be <KEYIN, VALUEIN, KEYOUT>  but
> you are outputting a Text for KEYOUT, not the declared Put.  This is
> probably not your prob. though.
>
> Looking at IdentityTableReducer, it just passes Writables with the
> value a Delete or Put.
>
> St.Ack
>
>
>
>
>
> On Thu, Feb 3, 2011 at 10:00 PM, Mark Kerzner <ma...@gmail.com>
> wrote:
> > Thank you, St.Ack, it is very nice of you to keep helping me. Here is the
> > stack :) trace, but as you can see, it is the internal Hadoop code. I see
> > this code and I see the message - I am not passing it the right object -
> but
> > how DO I pass the right object?
> >
> > M
> >
> >
> >        at
> >
> org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:106)
> >        at
> >
> org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:65)
> >        at
> >
> org.apache.hadoop.mapred.ReduceTask$NewTrackingRecordWriter.write(ReduceTask.java:512)
> >        at
> >
> org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
> >        at org.apache.hadoop.mapreduce.Reducer.reduce(Reducer.java:156)
> >        at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:176)
> >        at
> > org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:570)
> >        at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:412)
> >        at
> > org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:258)
> >
> > On Thu, Feb 3, 2011 at 11:52 PM, Stack <st...@duboce.net> wrote:
> >
> >> Look at the stack trace.  See where its being thrown.  Look at that
> >> src code at that line offset.  Should give you a clue.
> >> St.Ack
> >>
> >> On Thu, Feb 3, 2011 at 9:36 PM, Mark Kerzner <ma...@gmail.com>
> >> wrote:
> >> > Thank you, that helped, but now I get this error on trying to write
> back
> >> to
> >> > HBase:
> >> >
> >> > java.io.IOException: Pass a Delete or a Put
> >> >
> >> > Here is a fragment on my code. Again, thanks a bunch!
> >> >
> >> >    public static class RowCounterReducer
> >> >            extends TableReducer <Text, IntWritable, Put>
> >> >    {
> >> >        public void reduce(Text key,
> >> >                Iterable<IntWritable> values,
> >> >                Reducer.Context context)
> >> >                throws IOException,
> >> >                InterruptedException {
> >> >            Iterator <IntWritable> iterator = values.iterator();
> >> >            while (iterator.hasNext()) {
> >> >                IntWritable value = iterator.next();
> >> >                Put put = new Put();
> >> >                context.write(key, put);
> >> >            }
> >> >        }
> >> >    }
> >> >
> >> >
> >> > On Thu, Feb 3, 2011 at 2:50 PM, Stack <st...@duboce.net> wrote:
> >> >
> >> >> You are emitting a Text type.  Try just passing 'row' to the context,
> >> >> the one passed in to your map.
> >> >> St.Ack
> >> >>
> >> >> On Thu, Feb 3, 2011 at 12:23 PM, Mark Kerzner <markkerzner@gmail.com
> >
> >> >> wrote:
> >> >> > Hi,
> >> >> >
> >> >> > I have this code to read and write to HBase from MR, and it works
> fine
> >> >> with
> >> >> > 0 reducers, but it gives a type mismatch error when with 1 reducer.
> >> What
> >> >> > should I look at? *Thank you!*
> >> >> >
> >> >> > *Code:*
> >> >> >
> >> >> >    static class RowCounterMapper
> >> >> >            extends TableMapper<Text, IntWritable> {
> >> >> >
> >> >> >        private static enum Counters {
> >> >> >
> >> >> >            ROWS
> >> >> >        }
> >> >> >
> >> >> >        @Override
> >> >> >        public void map(ImmutableBytesWritable row, Result values,
> >> Context
> >> >> > context)
> >> >> >                throws IOException, InterruptedException {
> >> >> >            for (KeyValue value : values.list()) {
> >> >> >                if (value.getValue().length > 0) {
> >> >> >                    Text key = new Text(value.getValue());
> >> >> >                    context.write(key, ONE);
> >> >> >                }
> >> >> >            }
> >> >> >        }
> >> >> >    }
> >> >> >
> >> >> > *Error: *
> >> >> >
> >> >> > java.io.IOException: Type mismatch in key from map: expected
> >> >> > org.apache.hadoop.hbase.io.ImmutableBytesWritable, recieved
> >> >> > org.apache.hadoop.io.Text
> >> >> >
> >> >>
> >> >
> >>
> >
>

Re: Type mismatch

Posted by Stack <st...@duboce.net>.
I'm not sure what's up with your sample above.  Here are some observations
that might help.

Here is the code.  Our line numbers differ.  You are not on 0.90.0?
That's not important.  You are in this method, it seems:
http://hbase.apache.org/xref/org/apache/hadoop/hbase/mapreduce/TableOutputFormat.html#124
See the message at the end.  You should submit a patch where we add
a toString of the passed value to the IOException message so we have a
better clue as to where we are off here -- so you can see the class of
the object submitted (debugging, I'd add this to the log message).

Looking at how you declare TOF, it doesn't look right (this helps with
that: http://hbase.apache.org/xref/org/apache/hadoop/hbase/mapreduce/TableReducer.html).
It seems like the declaration should be <KEYIN, VALUEIN, KEYOUT>, but
you are outputting a Text for KEYOUT, not the declared Put.  This is
probably not your problem, though.

Looking at IdentityTableReducer, it just passes through Writables whose
value is a Delete or a Put.

St.Ack
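
For reference, a reducer along the lines described above -- typed as
<KEYIN, VALUEIN, KEYOUT> and handing TableOutputFormat a Put as the value --
might look roughly like the sketch below. It is only a sketch: the column
family "f" and qualifier "count" are invented for illustration and would have
to exist in the target table. One detail worth noting from the stack trace
further down: the frame org.apache.hadoop.mapreduce.Reducer.reduce suggests
the base-class identity reduce is what actually ran, i.e. a reduce method
declared with a raw Reducer.Context does not override the real one, so the
typed Context parameter and the @Override annotation matter here.

    // Needed imports (sketch): org.apache.hadoop.hbase.client.Put,
    // org.apache.hadoop.hbase.io.ImmutableBytesWritable,
    // org.apache.hadoop.hbase.mapreduce.TableReducer,
    // org.apache.hadoop.hbase.util.Bytes, org.apache.hadoop.io.IntWritable,
    // org.apache.hadoop.io.Text, java.io.IOException
    public static class RowCounterReducer
            extends TableReducer<Text, IntWritable, ImmutableBytesWritable> {

        @Override
        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            // Sum the counts emitted by the mapper for this key.
            int sum = 0;
            for (IntWritable value : values) {
                sum += value.get();
            }
            // A Put must name a row and carry at least one column;
            // an empty new Put() would be rejected by the client.
            Put put = new Put(Bytes.toBytes(key.toString()));
            put.add(Bytes.toBytes("f"), Bytes.toBytes("count"), Bytes.toBytes(sum));
            context.write(new ImmutableBytesWritable(put.getRow()), put);
        }
    }

Assuming the job is wired up with something like
TableMapReduceUtil.initTableReducerJob("your_table", RowCounterReducer.class, job),
the value reaching TableOutputFormat is then always a Put and the
"Pass a Delete or a Put" IOException should go away.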





On Thu, Feb 3, 2011 at 10:00 PM, Mark Kerzner <ma...@gmail.com> wrote:
> Thank you, St.Ack, it is very nice of you to keep helping me. Here is the
> stack :) trace, but as you can see, it is the internal Hadoop code. I see
> this code and I see the message - I am not passing it the right object - but
> how DO I pass the right object?
>
> M
>
>
>        at
> org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:106)
>        at
> org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:65)
>        at
> org.apache.hadoop.mapred.ReduceTask$NewTrackingRecordWriter.write(ReduceTask.java:512)
>        at
> org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
>        at org.apache.hadoop.mapreduce.Reducer.reduce(Reducer.java:156)
>        at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:176)
>        at
> org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:570)
>        at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:412)
>        at
> org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:258)
>
> On Thu, Feb 3, 2011 at 11:52 PM, Stack <st...@duboce.net> wrote:
>
>> Look at the stack trace.  See where its being thrown.  Look at that
>> src code at that line offset.  Should give you a clue.
>> St.Ack
>>
>> On Thu, Feb 3, 2011 at 9:36 PM, Mark Kerzner <ma...@gmail.com>
>> wrote:
>> > Thank you, that helped, but now I get this error on trying to write back
>> to
>> > HBase:
>> >
>> > java.io.IOException: Pass a Delete or a Put
>> >
>> > Here is a fragment on my code. Again, thanks a bunch!
>> >
>> >    public static class RowCounterReducer
>> >            extends TableReducer <Text, IntWritable, Put>
>> >    {
>> >        public void reduce(Text key,
>> >                Iterable<IntWritable> values,
>> >                Reducer.Context context)
>> >                throws IOException,
>> >                InterruptedException {
>> >            Iterator <IntWritable> iterator = values.iterator();
>> >            while (iterator.hasNext()) {
>> >                IntWritable value = iterator.next();
>> >                Put put = new Put();
>> >                context.write(key, put);
>> >            }
>> >        }
>> >    }
>> >
>> >
>> > On Thu, Feb 3, 2011 at 2:50 PM, Stack <st...@duboce.net> wrote:
>> >
>> >> You are emitting a Text type.  Try just passing 'row' to the context,
>> >> the one passed in to your map.
>> >> St.Ack
>> >>
>> >> On Thu, Feb 3, 2011 at 12:23 PM, Mark Kerzner <ma...@gmail.com>
>> >> wrote:
>> >> > Hi,
>> >> >
>> >> > I have this code to read and write to HBase from MR, and it works fine
>> >> with
>> >> > 0 reducers, but it gives a type mismatch error when with 1 reducer.
>> What
>> >> > should I look at? *Thank you!*
>> >> >
>> >> > *Code:*
>> >> >
>> >> >    static class RowCounterMapper
>> >> >            extends TableMapper<Text, IntWritable> {
>> >> >
>> >> >        private static enum Counters {
>> >> >
>> >> >            ROWS
>> >> >        }
>> >> >
>> >> >        @Override
>> >> >        public void map(ImmutableBytesWritable row, Result values,
>> Context
>> >> > context)
>> >> >                throws IOException, InterruptedException {
>> >> >            for (KeyValue value : values.list()) {
>> >> >                if (value.getValue().length > 0) {
>> >> >                    Text key = new Text(value.getValue());
>> >> >                    context.write(key, ONE);
>> >> >                }
>> >> >            }
>> >> >        }
>> >> >    }
>> >> >
>> >> > *Error: *
>> >> >
>> >> > java.io.IOException: Type mismatch in key from map: expected
>> >> > org.apache.hadoop.hbase.io.ImmutableBytesWritable, recieved
>> >> > org.apache.hadoop.io.Text
>> >> >
>> >>
>> >
>>
>

Re: Type mismatch

Posted by Mark Kerzner <ma...@gmail.com>.
Thank you, St.Ack, it is very nice of you to keep helping me. Here is the
stack :) trace, but as you can see, it is all internal Hadoop code. I can see
this code and I can see the message - I am not passing it the right object -
but how DO I pass the right object?

M


        at
org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:106)
        at
org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:65)
        at
org.apache.hadoop.mapred.ReduceTask$NewTrackingRecordWriter.write(ReduceTask.java:512)
        at
org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
        at org.apache.hadoop.mapreduce.Reducer.reduce(Reducer.java:156)
        at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:176)
        at
org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:570)
        at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:412)
        at
org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:258)

On Thu, Feb 3, 2011 at 11:52 PM, Stack <st...@duboce.net> wrote:

> Look at the stack trace.  See where its being thrown.  Look at that
> src code at that line offset.  Should give you a clue.
> St.Ack
>
> On Thu, Feb 3, 2011 at 9:36 PM, Mark Kerzner <ma...@gmail.com>
> wrote:
> > Thank you, that helped, but now I get this error on trying to write back
> to
> > HBase:
> >
> > java.io.IOException: Pass a Delete or a Put
> >
> > Here is a fragment on my code. Again, thanks a bunch!
> >
> >    public static class RowCounterReducer
> >            extends TableReducer <Text, IntWritable, Put>
> >    {
> >        public void reduce(Text key,
> >                Iterable<IntWritable> values,
> >                Reducer.Context context)
> >                throws IOException,
> >                InterruptedException {
> >            Iterator <IntWritable> iterator = values.iterator();
> >            while (iterator.hasNext()) {
> >                IntWritable value = iterator.next();
> >                Put put = new Put();
> >                context.write(key, put);
> >            }
> >        }
> >    }
> >
> >
> > On Thu, Feb 3, 2011 at 2:50 PM, Stack <st...@duboce.net> wrote:
> >
> >> You are emitting a Text type.  Try just passing 'row' to the context,
> >> the one passed in to your map.
> >> St.Ack
> >>
> >> On Thu, Feb 3, 2011 at 12:23 PM, Mark Kerzner <ma...@gmail.com>
> >> wrote:
> >> > Hi,
> >> >
> >> > I have this code to read and write to HBase from MR, and it works fine
> >> with
> >> > 0 reducers, but it gives a type mismatch error when with 1 reducer.
> What
> >> > should I look at? *Thank you!*
> >> >
> >> > *Code:*
> >> >
> >> >    static class RowCounterMapper
> >> >            extends TableMapper<Text, IntWritable> {
> >> >
> >> >        private static enum Counters {
> >> >
> >> >            ROWS
> >> >        }
> >> >
> >> >        @Override
> >> >        public void map(ImmutableBytesWritable row, Result values,
> Context
> >> > context)
> >> >                throws IOException, InterruptedException {
> >> >            for (KeyValue value : values.list()) {
> >> >                if (value.getValue().length > 0) {
> >> >                    Text key = new Text(value.getValue());
> >> >                    context.write(key, ONE);
> >> >                }
> >> >            }
> >> >        }
> >> >    }
> >> >
> >> > *Error: *
> >> >
> >> > java.io.IOException: Type mismatch in key from map: expected
> >> > org.apache.hadoop.hbase.io.ImmutableBytesWritable, recieved
> >> > org.apache.hadoop.io.Text
> >> >
> >>
> >
>

Re: Type mismatch

Posted by Stack <st...@duboce.net>.
Look at the stack trace.  See where it's being thrown.  Look at the
source code at that line offset.  That should give you a clue.
St.Ack
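
For context, the write method that the trace posted elsewhere in this thread
points at (TableOutputFormat.java:106 in the poster's build) is, in the
0.90-era source, roughly the check below -- quoted from memory here, so treat
it as an approximation rather than the exact code. Anything handed to it that
is not a Put or a Delete triggers the message seen in this thread.

    // Approximate shape of TableOutputFormat.TableRecordWriter.write()
    // in HBase 0.89/0.90; the key is ignored, only the value matters.
    public void write(KEY key, Writable value) throws IOException {
        if (value instanceof Put) {
            this.table.put(new Put((Put) value));
        } else if (value instanceof Delete) {
            this.table.delete(new Delete((Delete) value));
        } else {
            throw new IOException("Pass a Delete or a Put");
        }
    }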

On Thu, Feb 3, 2011 at 9:36 PM, Mark Kerzner <ma...@gmail.com> wrote:
> Thank you, that helped, but now I get this error on trying to write back to
> HBase:
>
> java.io.IOException: Pass a Delete or a Put
>
> Here is a fragment on my code. Again, thanks a bunch!
>
>    public static class RowCounterReducer
>            extends TableReducer <Text, IntWritable, Put>
>    {
>        public void reduce(Text key,
>                Iterable<IntWritable> values,
>                Reducer.Context context)
>                throws IOException,
>                InterruptedException {
>            Iterator <IntWritable> iterator = values.iterator();
>            while (iterator.hasNext()) {
>                IntWritable value = iterator.next();
>                Put put = new Put();
>                context.write(key, put);
>            }
>        }
>    }
>
>
> On Thu, Feb 3, 2011 at 2:50 PM, Stack <st...@duboce.net> wrote:
>
>> You are emitting a Text type.  Try just passing 'row' to the context,
>> the one passed in to your map.
>> St.Ack
>>
>> On Thu, Feb 3, 2011 at 12:23 PM, Mark Kerzner <ma...@gmail.com>
>> wrote:
>> > Hi,
>> >
>> > I have this code to read and write to HBase from MR, and it works fine
>> with
>> > 0 reducers, but it gives a type mismatch error when with 1 reducer. What
>> > should I look at? *Thank you!*
>> >
>> > *Code:*
>> >
>> >    static class RowCounterMapper
>> >            extends TableMapper<Text, IntWritable> {
>> >
>> >        private static enum Counters {
>> >
>> >            ROWS
>> >        }
>> >
>> >        @Override
>> >        public void map(ImmutableBytesWritable row, Result values, Context
>> > context)
>> >                throws IOException, InterruptedException {
>> >            for (KeyValue value : values.list()) {
>> >                if (value.getValue().length > 0) {
>> >                    Text key = new Text(value.getValue());
>> >                    context.write(key, ONE);
>> >                }
>> >            }
>> >        }
>> >    }
>> >
>> > *Error: *
>> >
>> > java.io.IOException: Type mismatch in key from map: expected
>> > org.apache.hadoop.hbase.io.ImmutableBytesWritable, recieved
>> > org.apache.hadoop.io.Text
>> >
>>
>

Re: Type mismatch

Posted by Mark Kerzner <ma...@gmail.com>.
Thank you, that helped, but now I get this error on trying to write back to
HBase:

java.io.IOException: Pass a Delete or a Put

Here is a fragment of my code. Again, thanks a bunch!

    public static class RowCounterReducer
            extends TableReducer <Text, IntWritable, Put>
    {
        public void reduce(Text key,
                Iterable<IntWritable> values,
                Reducer.Context context)
                throws IOException,
                InterruptedException {
            Iterator <IntWritable> iterator = values.iterator();
            while (iterator.hasNext()) {
                IntWritable value = iterator.next();
                Put put = new Put();
                context.write(key, put);
            }
        }
    }


On Thu, Feb 3, 2011 at 2:50 PM, Stack <st...@duboce.net> wrote:

> You are emitting a Text type.  Try just passing 'row' to the context,
> the one passed in to your map.
> St.Ack
>
> On Thu, Feb 3, 2011 at 12:23 PM, Mark Kerzner <ma...@gmail.com>
> wrote:
> > Hi,
> >
> > I have this code to read and write to HBase from MR, and it works fine
> with
> > 0 reducers, but it gives a type mismatch error when with 1 reducer. What
> > should I look at? *Thank you!*
> >
> > *Code:*
> >
> >    static class RowCounterMapper
> >            extends TableMapper<Text, IntWritable> {
> >
> >        private static enum Counters {
> >
> >            ROWS
> >        }
> >
> >        @Override
> >        public void map(ImmutableBytesWritable row, Result values, Context
> > context)
> >                throws IOException, InterruptedException {
> >            for (KeyValue value : values.list()) {
> >                if (value.getValue().length > 0) {
> >                    Text key = new Text(value.getValue());
> >                    context.write(key, ONE);
> >                }
> >            }
> >        }
> >    }
> >
> > *Error: *
> >
> > java.io.IOException: Type mismatch in key from map: expected
> > org.apache.hadoop.hbase.io.ImmutableBytesWritable, recieved
> > org.apache.hadoop.io.Text
> >
>

Re: Type mismatch

Posted by Stack <st...@duboce.net>.
You are emitting a Text type.  Try just passing 'row' to the context,
the one passed in to your map.
St.Ack
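
A sketch of the mapper with that change applied: the key handed to map() is
re-emitted as-is, so the map output key type becomes ImmutableBytesWritable,
matching what the error message says is expected (any reducer downstream would
then need ImmutableBytesWritable as its input key type). The unused Counters
enum is dropped, and ONE is assumed to be an IntWritable constant as in the
original fragment.

    // Imports (sketch): org.apache.hadoop.hbase.KeyValue,
    // org.apache.hadoop.hbase.client.Result,
    // org.apache.hadoop.hbase.io.ImmutableBytesWritable,
    // org.apache.hadoop.hbase.mapreduce.TableMapper,
    // org.apache.hadoop.io.IntWritable, java.io.IOException
    static class RowCounterMapper
            extends TableMapper<ImmutableBytesWritable, IntWritable> {

        private static final IntWritable ONE = new IntWritable(1);

        @Override
        public void map(ImmutableBytesWritable row, Result values, Context context)
                throws IOException, InterruptedException {
            for (KeyValue value : values.list()) {
                if (value.getValue().length > 0) {
                    // Emit the row key the framework passed in, not a Text
                    // built from the cell value.
                    context.write(row, ONE);
                }
            }
        }
    }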

On Thu, Feb 3, 2011 at 12:23 PM, Mark Kerzner <ma...@gmail.com> wrote:
> Hi,
>
> I have this code to read and write to HBase from MR, and it works fine with
> 0 reducers, but it gives a type mismatch error when with 1 reducer. What
> should I look at? *Thank you!*
>
> *Code:*
>
>    static class RowCounterMapper
>            extends TableMapper<Text, IntWritable> {
>
>        private static enum Counters {
>
>            ROWS
>        }
>
>        @Override
>        public void map(ImmutableBytesWritable row, Result values, Context
> context)
>                throws IOException, InterruptedException {
>            for (KeyValue value : values.list()) {
>                if (value.getValue().length > 0) {
>                    Text key = new Text(value.getValue());
>                    context.write(key, ONE);
>                }
>            }
>        }
>    }
>
> *Error: *
>
> java.io.IOException: Type mismatch in key from map: expected
> org.apache.hadoop.hbase.io.ImmutableBytesWritable, recieved
> org.apache.hadoop.io.Text
>