Posted to user@hive.apache.org by David Lerman <dl...@videoegg.com> on 2009/07/10 20:28:22 UTC

simple join failing with ClassCastException

Attempting to join three tables is consistently failing with a
ClassCastException using Hive trunk (r792966) and Hadoop 0.18.3.

The three tables are defined as follows:

create table foo (foo_id int, foo_name string, foo_a string, foo_b string,
foo_c string, foo_d string) row format delimited fields terminated by ','
stored as textfile;

create table bar (bar_id int, bar_0 int, foo_id int, bar_1 int, bar_name
string, bar_a string, bar_b string, bar_c string, bar_d string) row format
delimited fields terminated by ',' stored as textfile;

create table count (bar_id int, n int) row format delimited fields
terminated by ',' stored as textfile;

Each table has a single row as follows:

foo:
1,foo1,a,b,c,d

bar:
10,0,1,1,bar10,a,b,c,d

count:
10,2
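
The rows were loaded from local comma-delimited files with statements like
these (the file paths here are placeholders, not the actual ones used):

```sql
-- Illustrative load commands; the local paths are assumptions.
load data local inpath '/tmp/foo.txt' into table foo;
load data local inpath '/tmp/bar.txt' into table bar;
load data local inpath '/tmp/count.txt' into table count;
```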

The failing query is:

select foo.foo_name, bar.bar_name, n from foo join bar on foo.foo_id =
bar.foo_id join count on count.bar_id = bar.bar_id;

Interestingly, the query works if you reorder the joins (select
foo.foo_name, bar.bar_name, n from count join bar on count.bar_id =
bar.bar_id join foo on foo.foo_id = bar.foo_id), if you remove any of the
unused string columns from foo, or even if you just move the unused int
columns in bar to the end.
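
As a sketch of that last workaround, this is the kind of bar definition that
avoids the error, with the unused int columns bar_0 and bar_1 moved to the
end (the data file would of course need to be reordered to match):

```sql
-- Workaround sketch: same columns as bar, unused ints moved last.
-- Matching data row would be: 10,1,bar10,a,b,c,d,0,1
create table bar (bar_id int, foo_id int, bar_name string, bar_a string,
bar_b string, bar_c string, bar_d string, bar_0 int, bar_1 int)
row format delimited fields terminated by ',' stored as textfile;
```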

The exception is as follows:

java.lang.ClassCastException: org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableIntObjectInspector cannot be cast to org.apache.hadoop.hive.serde2.objectinspector.primitive.StringObjectInspector
 at org.apache.hadoop.hive.serde2.dynamic_type.DynamicSerDeTypeString.serialize(DynamicSerDeTypeString.java:63)
 at org.apache.hadoop.hive.serde2.dynamic_type.DynamicSerDeFieldList.serialize(DynamicSerDeFieldList.java:249)
 at org.apache.hadoop.hive.serde2.dynamic_type.DynamicSerDeStructBase.serialize(DynamicSerDeStructBase.java:81)
 at org.apache.hadoop.hive.serde2.dynamic_type.DynamicSerDe.serialize(DynamicSerDe.java:177)
 at org.apache.hadoop.hive.ql.exec.FileSinkOperator.process(FileSinkOperator.java:180)
 at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:492)
 at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.createForwardJoinObject(CommonJoinOperator.java:290)
 at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.genObject(CommonJoinOperator.java:533)
 at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.genObject(CommonJoinOperator.java:522)
 at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.genObject(CommonJoinOperator.java:522)
 at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.checkAndGenObject(CommonJoinOperator.java:563)
 at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.endGroup(CommonJoinOperator.java:545)
 at org.apache.hadoop.hive.ql.exec.ExecReducer.reduce(ExecReducer.java:159)
 at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:318)
 at org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:2198)
java.lang.ClassCastException: org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableIntObjectInspector cannot be cast to org.apache.hadoop.hive.serde2.objectinspector.primitive.StringObjectInspector
 at org.apache.hadoop.hive.serde2.dynamic_type.DynamicSerDeTypeString.serialize(DynamicSerDeTypeString.java:63)
 at org.apache.hadoop.hive.serde2.dynamic_type.DynamicSerDeFieldList.serialize(DynamicSerDeFieldList.java:249)
 at org.apache.hadoop.hive.serde2.dynamic_type.DynamicSerDeStructBase.serialize(DynamicSerDeStructBase.java:81)
 at org.apache.hadoop.hive.serde2.dynamic_type.DynamicSerDe.serialize(DynamicSerDe.java:177)
 at org.apache.hadoop.hive.ql.exec.FileSinkOperator.process(FileSinkOperator.java:180)
 at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:492)
 at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.createForwardJoinObject(CommonJoinOperator.java:290)
 at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.genObject(CommonJoinOperator.java:533)
 at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.genObject(CommonJoinOperator.java:522)
 at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.genObject(CommonJoinOperator.java:522)
 at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.checkAndGenObject(CommonJoinOperator.java:563)
 at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.endGroup(CommonJoinOperator.java:545)
 at org.apache.hadoop.hive.ql.exec.ExecReducer.close(ExecReducer.java:236)
 at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:329)
 at org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:2198)


Thanks for your help!


Re: simple join failing with ClassCastException

Posted by Zheng Shao <zs...@gmail.com>.
Hi David,

I did some analysis and the bug is filed here:
https://issues.apache.org/jira/browse/HIVE-626
I don't have a solution right now but will work on that immediately.

I also fixed HIVE-624, but I verified that its patch does not fix the
problem you mentioned.


Thanks for simplifying the test case when reporting the error. It
makes the debugging much easier.

Zheng

On Fri, Jul 10, 2009 at 1:57 PM, David Lerman<dl...@videoegg.com> wrote:
> Thanks Zheng,
>
> It actually appears not to build with the patch...
>
> $ svn status
> ?      HIVE-624.1.patch
> $ svn update
> At revision 793104.
> $ svn info
> Path: .
> URL: http://svn.apache.org/repos/asf/hadoop/hive/trunk
> Repository Root: http://svn.apache.org/repos/asf
> Repository UUID: 13f79535-47bb-0310-9956-ffa450edef68
> Revision: 793104
> Node Kind: directory
> Schedule: normal
> Last Changed Author: namit
> Last Changed Rev: 792772
> Last Changed Date: 2009-07-09 21:04:37 -0400 (Thu, 09 Jul 2009)
> $ patch -p0 < HIVE-624.1.patch
> ...
> $ ant clean
> ...
> $ ant -Dhadoop.version="0.18.3" package
> ...
> compile:
>     [echo] Compiling: hive
>    [javac] Compiling 200 source files to /Users/dlerman/Documents/VideoEgg/repo/vdat/ext_unused/hive_trunk/build/serde/classes
>    [javac] /Users/dlerman/Documents/VideoEgg/repo/vdat/ext_unused/hive_trunk/serde/src/java/org/apache/hadoop/hive/serde2/lazy/objectinspector/primitive/LazyByteObjectInspector.java:30: org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive.LazyByteObjectInspector is not abstract and does not override abstract method set(java.lang.Object,byte) in org.apache.hadoop.hive.serde2.objectinspector.primitive.ByteObjectInspector
>    [javac] public class LazyByteObjectInspector extends AbstractPrimitiveLazyObjectInspector<ByteWritable>
>    [javac]        ^
>    [javac] /Users/dlerman/Documents/VideoEgg/repo/vdat/ext_unused/hive_trunk/serde/src/java/org/apache/hadoop/hive/serde2/objectinspector/primitive/JavaByteObjectInspector.java:26: org.apache.hadoop.hive.serde2.objectinspector.primitive.JavaByteObjectInspector is not abstract and does not override abstract method set(java.lang.Object,byte) in org.apache.hadoop.hive.serde2.objectinspector.primitive.ByteObjectInspector
>    [javac] public class JavaByteObjectInspector extends AbstractPrimitiveJavaObjectInspector
>    [javac]        ^
>    [javac] Note: Some input files use unchecked or unsafe operations.
>    [javac] Note: Recompile with -Xlint:unchecked for details.
>    [javac] 2 errors
>
> On 7/10/09 4:42 PM, "Zheng Shao" <zs...@gmail.com> wrote:
>
>> Hi David,
>>
>> Please do "ant clean" before running "ant -Dhadoop.version=0.18.x package"
>>
>> Zheng
>>
>> On Fri, Jul 10, 2009 at 12:45 PM, David Lerman<dl...@videoegg.com> wrote:
>>> Thanks Zheng,
>>>
>>> Applying the 624 patch changes the Exception to:
>>>
>>> java.lang.ClassCastException: java.util.ArrayList cannot be cast to java.util.Vector
>>>  at org.apache.hadoop.hive.ql.exec.JoinOperator.process(JoinOperator.java:58)
>>>  at org.apache.hadoop.hive.ql.exec.ExecReducer.reduce(ExecReducer.java:189)
>>>  at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:318)
>>>  at org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:2198)
>>>
>>> Dave
>>>
>>>
>>> On 7/10/09 3:17 PM, "Zheng Shao" <zs...@gmail.com> wrote:
>>>
>>>> Hi David,
>>>>
>>>> Thanks for letting us know. I will take a look now.
>>>>
>>>> In the meanwhile, there is a fix related to types
>>>> https://issues.apache.org/jira/browse/HIVE-624 which might solve the
>>>> problem.
>>>> You might want to try it out.
>>>>
>>>> Zheng
>>>>
>>>> On Fri, Jul 10, 2009 at 11:28 AM, David Lerman<dl...@videoegg.com> wrote:
>>>>> Attempting to join three tables is consistently failing with a
>>>>> ClassCastException using Hive trunk (r792966) and Hadoop 0.18.3.
>>>>>
>>>>> The three tables are defined as follows:
>>>>>
>>>>> create table foo (foo_id int, foo_name string, foo_a string, foo_b string,
>>>>> foo_c string, foo_d string) row format delimited fields terminated by ','
>>>>> stored as textfile;
>>>>>
>>>>> create table bar (bar_id int, bar_0 int, foo_id int, bar_1 int, bar_name
>>>>> string, bar_a string, bar_b string, bar_c string, bar_d string) row format
>>>>> delimited fields terminated by ',' stored as textfile;
>>>>>
>>>>> create table count (bar_id int, n int) row format delimited fields
>>>>> terminated by ',' stored as textfile;
>>>>>
>>>>> Each table has a single row as follows:
>>>>>
>>>>> foo:
>>>>> 1,foo1,a,b,c,d
>>>>>
>>>>> bar:
>>>>> 10,0,1,1,bar10,a,b,c,d
>>>>>
>>>>> counts:
>>>>> 10,2
>>>>>
>>>>> The failing query is:
>>>>>
>>>>> select foo.foo_name, bar.bar_name, n from foo join bar on foo.foo_id =
>>>>> bar.foo_id join count on count.bar_id = bar.bar_id;
>>>>>
>>>>> Interestingly, the query works if you reorder the joins (select
>>>>> foo.foo_name, bar.bar_name, n from count join bar on count.bar_id =
>>>>> bar.bar_id join foo on foo.foo_id = bar.foo_id) or if you remove any of the
>>>>> unused string columns from foo or even just move the unused int columns in
>>>>> bar to the end.
>>>>>
>>>>> Thanks for your help!
>>>>>
>>>>>
>>>>
>>>>
>>>>
>>>> --
>>>> Yours,
>>>> Zheng
>>>
>>>
>>
>>
>>
>> --
>> Yours,
>> Zheng
>
>



-- 
Yours,
Zheng

Re: simple join failing with ClassCastException

Posted by David Lerman <dl...@videoegg.com>.
Thanks Zheng,

It actually appears not to build with the patch...

$ svn status
?      HIVE-624.1.patch
$ svn update
At revision 793104.
$ svn info
Path: .
URL: http://svn.apache.org/repos/asf/hadoop/hive/trunk
Repository Root: http://svn.apache.org/repos/asf
Repository UUID: 13f79535-47bb-0310-9956-ffa450edef68
Revision: 793104
Node Kind: directory
Schedule: normal
Last Changed Author: namit
Last Changed Rev: 792772
Last Changed Date: 2009-07-09 21:04:37 -0400 (Thu, 09 Jul 2009)
$ patch -p0 < HIVE-624.1.patch
...
$ ant clean
...
$ ant -Dhadoop.version="0.18.3" package
...
compile:
     [echo] Compiling: hive
    [javac] Compiling 200 source files to /Users/dlerman/Documents/VideoEgg/repo/vdat/ext_unused/hive_trunk/build/serde/classes
    [javac] /Users/dlerman/Documents/VideoEgg/repo/vdat/ext_unused/hive_trunk/serde/src/java/org/apache/hadoop/hive/serde2/lazy/objectinspector/primitive/LazyByteObjectInspector.java:30: org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive.LazyByteObjectInspector is not abstract and does not override abstract method set(java.lang.Object,byte) in org.apache.hadoop.hive.serde2.objectinspector.primitive.ByteObjectInspector
    [javac] public class LazyByteObjectInspector extends AbstractPrimitiveLazyObjectInspector<ByteWritable>
    [javac]        ^
    [javac] /Users/dlerman/Documents/VideoEgg/repo/vdat/ext_unused/hive_trunk/serde/src/java/org/apache/hadoop/hive/serde2/objectinspector/primitive/JavaByteObjectInspector.java:26: org.apache.hadoop.hive.serde2.objectinspector.primitive.JavaByteObjectInspector is not abstract and does not override abstract method set(java.lang.Object,byte) in org.apache.hadoop.hive.serde2.objectinspector.primitive.ByteObjectInspector
    [javac] public class JavaByteObjectInspector extends AbstractPrimitiveJavaObjectInspector
    [javac]        ^
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.
    [javac] 2 errors

On 7/10/09 4:42 PM, "Zheng Shao" <zs...@gmail.com> wrote:

> Hi David,
> 
> Please do "ant clean" before running "ant -Dhadoop.version=0.18.x package"
> 
> Zheng
> 
> On Fri, Jul 10, 2009 at 12:45 PM, David Lerman<dl...@videoegg.com> wrote:
>> Thanks Zheng,
>> 
>> Applying the 624 patch changes the Exception to:
>> 
>> java.lang.ClassCastException: java.util.ArrayList cannot be cast to java.util.Vector
>>  at org.apache.hadoop.hive.ql.exec.JoinOperator.process(JoinOperator.java:58)
>>  at org.apache.hadoop.hive.ql.exec.ExecReducer.reduce(ExecReducer.java:189)
>>  at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:318)
>>  at org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:2198)
>> 
>> Dave
>> 
>> 
>> On 7/10/09 3:17 PM, "Zheng Shao" <zs...@gmail.com> wrote:
>> 
>>> Hi David,
>>> 
>>> Thanks for letting us know. I will take a look now.
>>> 
>>> In the meanwhile, there is a fix related to types
>>> https://issues.apache.org/jira/browse/HIVE-624 which might solve the
>>> problem.
>>> You might want to try it out.
>>> 
>>> Zheng
>>> 
>>> On Fri, Jul 10, 2009 at 11:28 AM, David Lerman<dl...@videoegg.com> wrote:
>>>> Attempting to join three tables is consistently failing with a
>>>> ClassCastException using Hive trunk (r792966) and Hadoop 0.18.3.
>>>> 
>>>> The three tables are defined as follows:
>>>> 
>>>> create table foo (foo_id int, foo_name string, foo_a string, foo_b string,
>>>> foo_c string, foo_d string) row format delimited fields terminated by ','
>>>> stored as textfile;
>>>> 
>>>> create table bar (bar_id int, bar_0 int, foo_id int, bar_1 int, bar_name
>>>> string, bar_a string, bar_b string, bar_c string, bar_d string) row format
>>>> delimited fields terminated by ',' stored as textfile;
>>>> 
>>>> create table count (bar_id int, n int) row format delimited fields
>>>> terminated by ',' stored as textfile;
>>>> 
>>>> Each table has a single row as follows:
>>>> 
>>>> foo:
>>>> 1,foo1,a,b,c,d
>>>> 
>>>> bar:
>>>> 10,0,1,1,bar10,a,b,c,d
>>>> 
>>>> counts:
>>>> 10,2
>>>> 
>>>> The failing query is:
>>>> 
>>>> select foo.foo_name, bar.bar_name, n from foo join bar on foo.foo_id =
>>>> bar.foo_id join count on count.bar_id = bar.bar_id;
>>>> 
>>>> Interestingly, the query works if you reorder the joins (select
>>>> foo.foo_name, bar.bar_name, n from count join bar on count.bar_id =
>>>> bar.bar_id join foo on foo.foo_id = bar.foo_id) or if you remove any of the
>>>> unused string columns from foo or even just move the unused int columns in
>>>> bar to the end.
>>>> 
>>>> Thanks for your help!
>>>> 
>>>> 
>>> 
>>> 
>>> 
>>> --
>>> Yours,
>>> Zheng
>> 
>> 
> 
> 
> 
> --
> Yours,
> Zheng


Re: simple join failing with ClassCastException

Posted by Zheng Shao <zs...@gmail.com>.
Hi David,

Please do "ant clean" before running "ant -Dhadoop.version=0.18.x package"

Zheng

On Fri, Jul 10, 2009 at 12:45 PM, David Lerman<dl...@videoegg.com> wrote:
> Thanks Zheng,
>
> Applying the 624 patch changes the Exception to:
>
> java.lang.ClassCastException: java.util.ArrayList cannot be cast to java.util.Vector
>  at org.apache.hadoop.hive.ql.exec.JoinOperator.process(JoinOperator.java:58)
>  at org.apache.hadoop.hive.ql.exec.ExecReducer.reduce(ExecReducer.java:189)
>  at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:318)
>  at org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:2198)
>
> Dave
>
>
> On 7/10/09 3:17 PM, "Zheng Shao" <zs...@gmail.com> wrote:
>
>> Hi David,
>>
>> Thanks for letting us know. I will take a look now.
>>
>> In the meanwhile, there is a fix related to types
>> https://issues.apache.org/jira/browse/HIVE-624 which might solve the
>> problem.
>> You might want to try it out.
>>
>> Zheng
>>
>> On Fri, Jul 10, 2009 at 11:28 AM, David Lerman<dl...@videoegg.com> wrote:
>>> Attempting to join three tables is consistently failing with a
>>> ClassCastException using Hive trunk (r792966) and Hadoop 0.18.3.
>>>
>>> The three tables are defined as follows:
>>>
>>> create table foo (foo_id int, foo_name string, foo_a string, foo_b string,
>>> foo_c string, foo_d string) row format delimited fields terminated by ','
>>> stored as textfile;
>>>
>>> create table bar (bar_id int, bar_0 int, foo_id int, bar_1 int, bar_name
>>> string, bar_a string, bar_b string, bar_c string, bar_d string) row format
>>> delimited fields terminated by ',' stored as textfile;
>>>
>>> create table count (bar_id int, n int) row format delimited fields
>>> terminated by ',' stored as textfile;
>>>
>>> Each table has a single row as follows:
>>>
>>> foo:
>>> 1,foo1,a,b,c,d
>>>
>>> bar:
>>> 10,0,1,1,bar10,a,b,c,d
>>>
>>> counts:
>>> 10,2
>>>
>>> The failing query is:
>>>
>>> select foo.foo_name, bar.bar_name, n from foo join bar on foo.foo_id =
>>> bar.foo_id join count on count.bar_id = bar.bar_id;
>>>
>>> Interestingly, the query works if you reorder the joins (select
>>> foo.foo_name, bar.bar_name, n from count join bar on count.bar_id =
>>> bar.bar_id join foo on foo.foo_id = bar.foo_id) or if you remove any of the
>>> unused string columns from foo or even just move the unused int columns in
>>> bar to the end.
>>>
>>> Thanks for your help!
>>>
>>>
>>
>>
>>
>> --
>> Yours,
>> Zheng
>
>



-- 
Yours,
Zheng

Re: simple join failing with ClassCastException

Posted by David Lerman <dl...@videoegg.com>.
Thanks Zheng,

Applying the 624 patch changes the Exception to:

java.lang.ClassCastException: java.util.ArrayList cannot be cast to java.util.Vector
 at org.apache.hadoop.hive.ql.exec.JoinOperator.process(JoinOperator.java:58)
 at org.apache.hadoop.hive.ql.exec.ExecReducer.reduce(ExecReducer.java:189)
 at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:318)
 at org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:2198)

Dave


On 7/10/09 3:17 PM, "Zheng Shao" <zs...@gmail.com> wrote:

> Hi David,
> 
> Thanks for letting us know. I will take a look now.
> 
> In the meanwhile, there is a fix related to types
> https://issues.apache.org/jira/browse/HIVE-624 which might solve the
> problem.
> You might want to try it out.
> 
> Zheng
> 
> On Fri, Jul 10, 2009 at 11:28 AM, David Lerman<dl...@videoegg.com> wrote:
>> Attempting to join three tables is consistently failing with a
>> ClassCastException using Hive trunk (r792966) and Hadoop 0.18.3.
>> 
>> The three tables are defined as follows:
>> 
>> create table foo (foo_id int, foo_name string, foo_a string, foo_b string,
>> foo_c string, foo_d string) row format delimited fields terminated by ','
>> stored as textfile;
>> 
>> create table bar (bar_id int, bar_0 int, foo_id int, bar_1 int, bar_name
>> string, bar_a string, bar_b string, bar_c string, bar_d string) row format
>> delimited fields terminated by ',' stored as textfile;
>> 
>> create table count (bar_id int, n int) row format delimited fields
>> terminated by ',' stored as textfile;
>> 
>> Each table has a single row as follows:
>> 
>> foo:
>> 1,foo1,a,b,c,d
>> 
>> bar:
>> 10,0,1,1,bar10,a,b,c,d
>> 
>> counts:
>> 10,2
>> 
>> The failing query is:
>> 
>> select foo.foo_name, bar.bar_name, n from foo join bar on foo.foo_id =
>> bar.foo_id join count on count.bar_id = bar.bar_id;
>> 
>> Interestingly, the query works if you reorder the joins (select
>> foo.foo_name, bar.bar_name, n from count join bar on count.bar_id =
>> bar.bar_id join foo on foo.foo_id = bar.foo_id) or if you remove any of the
>> unused string columns from foo or even just move the unused int columns in
>> bar to the end.
>> 
>> or.java:522)
>>  at
>> org.apache.hadoop.hive.ql.exec.CommonJoinOperator.genObject(CommonJoinOperat
>> or.java:522)
>>  at
>> org.apache.hadoop.hive.ql.exec.CommonJoinOperator.checkAndGenObject(CommonJo
>> inOperator.java:563)
>>  at
>> org.apache.hadoop.hive.ql.exec.CommonJoinOperator.endGroup(CommonJoinOperato
>> r.java:545)
>>  at org.apache.hadoop.hive.ql.exec.ExecReducer.close(ExecReducer.java:236)
>>  at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:329)
>>  at org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:2198)
>> 
>> 
>> Thanks for your help!
>> 
>> 
> 
> 
> 
> --
> Yours,
> Zheng


Re: simple join failing with ClassCastException

Posted by Zheng Shao <zs...@gmail.com>.
Hi David,

Thanks for letting us know. I will take a look now.

In the meantime, there is a type-related fix at
https://issues.apache.org/jira/browse/HIVE-624 which might solve the
problem. You might want to try it out.
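
In case it helps in the meantime, here is the reordered form of the query
that you reported as working (same result columns, with the count/bar join
placed first), restated from your report:

select foo.foo_name, bar.bar_name, n from count join bar on count.bar_id =
bar.bar_id join foo on foo.foo_id = bar.foo_id;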

Zheng




-- 
Yours,
Zheng