Posted to user@hive.apache.org by Arvind Prabhakar <ar...@cloudera.com> on 2010/03/17 21:39:40 UTC

Re: WritableName can't load class in hive

[cross posting to hive-user]

Oded - how did you create the table in Hive? Did you specify any row format
SerDe for the table? If not, that may be the cause of this problem: the
default LazySimpleSerDe cannot deserialize the custom Writable key-value
pairs that you have used in your files.
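
For example, a custom SerDe can be wired in at table-creation time roughly
like this (a minimal sketch - the table name, columns, SerDe class, and
location are all made up, and you would need a SerDe implementation that
understands your Writable types):

```sql
-- Hypothetical example: names below are placeholders, not a working recipe.
CREATE EXTERNAL TABLE my_events (
  event_id STRING,
  payload  STRING
)
ROW FORMAT SERDE 'com.example.hive.MyCustomSerDe'
STORED AS SEQUENCEFILE
LOCATION '/user/example/events';
```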

-Arvind

On Tue, Mar 16, 2010 at 2:50 PM, Oded Rotem <od...@gmail.com> wrote:

> Actually, now I moved to this error:
>
> java.lang.RuntimeException: org.apache.hadoop.hive.serde2.SerDeException:
> class org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe: expects either
> BytesWritable or Text object!
>
> -----Original Message-----
> From: Alex Kozlov [mailto:alexvk@cloudera.com]
> Sent: Tuesday, March 16, 2010 8:02 PM
> To: common-user@hadoop.apache.org
> Subject: Re: WritableName can't load class in hive
>
> The Hive executable puts every jar in HIVE_LIB=${HIVE_HOME}/lib on the
> classpath.  Try putting your custom jar into the $HIVE_HOME/lib directory
> and restarting the CLI.
>
> On Tue, Mar 16, 2010 at 6:28 AM, Oded Rotem <od...@gmail.com>
> wrote:
>
> > Yes, I run the CLI from a folder containing the jar in question.
> >
> > -----Original Message-----
> > From: Sonal Goyal [mailto:sonalgoyal4@gmail.com]
> > Sent: Tuesday, March 16, 2010 1:14 PM
> > To: common-user@hadoop.apache.org
> > Subject: Re: WritableName can't load class in hive
> >
> > For some custom functions, I put the jar on a local path accessible to
> > the CLI. Have you tried that?
> >
> > Thanks and Regards,
> > Sonal
> >
> >
> > On Tue, Mar 16, 2010 at 3:49 PM, Oded Rotem <od...@gmail.com>
> > wrote:
> >
> > > We have a bunch of sequence files in an HDFS directory, containing
> > > keys & values of custom Writable classes that we wrote.
> > >
> > > We manage to view them using hadoop fs -text. For further ad-hoc
> > > analysis, we tried using Hive. We managed to load them as external
> > > tables in Hive; however, running a simple select count(*) against the
> > > table fails with "WritableName can't load class" in the job output log.
> > >
> > > Executing
> > >        add jar <path>
> > > does not solve it.
> > >
> > > Where do we need to place the jar containing the definitions of the
> > > Writable classes?
> > >
> > >
> >
> >
>
>

RE: WritableName can't load class in hive

Posted by Oded Rotem <od...@gmail.com>.
No, I didn't specify any SerDe. I'll read up on that and see if it works.

Thanks.
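
For anyone hitting this later: independent of the SerDe question, the custom
Writable classes also need to be on the classpath of both the Hive CLI and
the MapReduce tasks. The jar-placement options mentioned in this thread boil
down to roughly the following (paths are hypothetical, and the --auxpath
option depends on your Hive version supporting it):

```shell
# Option 1: per-session, from inside the Hive CLI (must be re-run each session)
#   hive> add jar /path/to/custom-writables.jar;

# Option 2: point Hive at an auxiliary jar when starting the CLI
hive --auxpath /path/to/custom-writables.jar

# Option 3: drop the jar into Hive's lib directory and restart the CLI
cp custom-writables.jar $HIVE_HOME/lib/
```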

-----Original Message-----
From: Arvind Prabhakar [mailto:arvind@cloudera.com] 
Sent: Wednesday, March 17, 2010 10:40 PM
To: common-user@hadoop.apache.org; hive-user@hadoop.apache.org
Subject: Re: WritableName can't load class in hive

[cross posting to hive-user]

Oded - how did you create the table in Hive? Did you specify any row format
SerDe for the table? If not, that may be the cause of this problem: the
default LazySimpleSerDe cannot deserialize the custom Writable key-value
pairs that you have used in your files.

-Arvind

On Tue, Mar 16, 2010 at 2:50 PM, Oded Rotem <od...@gmail.com> wrote:

> Actually, now I moved to this error:
>
> java.lang.RuntimeException: org.apache.hadoop.hive.serde2.SerDeException:
> class org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe: expects either
> BytesWritable or Text object!
>
> -----Original Message-----
> From: Alex Kozlov [mailto:alexvk@cloudera.com]
> Sent: Tuesday, March 16, 2010 8:02 PM
> To: common-user@hadoop.apache.org
> Subject: Re: WritableName can't load class in hive
>
> The Hive executable puts every jar in HIVE_LIB=${HIVE_HOME}/lib on the
> classpath.  Try putting your custom jar into the $HIVE_HOME/lib directory
> and restarting the CLI.
>
> On Tue, Mar 16, 2010 at 6:28 AM, Oded Rotem <od...@gmail.com>
> wrote:
>
> > Yes, I run the CLI from a folder containing the jar in question.
> >
> > -----Original Message-----
> > From: Sonal Goyal [mailto:sonalgoyal4@gmail.com]
> > Sent: Tuesday, March 16, 2010 1:14 PM
> > To: common-user@hadoop.apache.org
> > Subject: Re: WritableName can't load class in hive
> >
> > For some custom functions, I put the jar on a local path accessible to
> > the CLI. Have you tried that?
> >
> > Thanks and Regards,
> > Sonal
> >
> >
> > On Tue, Mar 16, 2010 at 3:49 PM, Oded Rotem <od...@gmail.com>
> > wrote:
> >
> > > We have a bunch of sequence files in an HDFS directory, containing
> > > keys & values of custom Writable classes that we wrote.
> > >
> > > We manage to view them using hadoop fs -text. For further ad-hoc
> > > analysis, we tried using Hive. We managed to load them as external
> > > tables in Hive; however, running a simple select count(*) against the
> > > table fails with "WritableName can't load class" in the job output log.
> > >
> > > Executing
> > >        add jar <path>
> > > does not solve it.
> > >
> > > Where do we need to place the jar containing the definitions of the
> > > Writable classes?
> > >
> > >
> >
> >
>
>

