Posted to common-user@hadoop.apache.org by Gang Luo <lg...@yahoo.com.cn> on 2010/03/16 05:28:23 UTC

TTL of distributed cache

Hi all,
What is the lifetime of distributed cache files? Will Hadoop redistribute the same file to the same node twice if it is being used by two jobs?

Thanks,
-Gang


      

RE: WritableName can't load class in hive

Posted by Oded Rotem <od...@gmail.com>.
It's there as well, and still, no luck.

-----Original Message-----
From: Alex Kozlov [mailto:alexvk@cloudera.com] 
Sent: Tuesday, March 16, 2010 8:02 PM
To: common-user@hadoop.apache.org
Subject: Re: WritableName can't load class in hive

The Hive executable puts all jars in HIVE_LIB=${HIVE_HOME}/lib on the
classpath. Try putting your custom jar into the $HIVE_HOME/lib directory
and restarting the CLI.

On Tue, Mar 16, 2010 at 6:28 AM, Oded Rotem <od...@gmail.com> wrote:

> Yes, I run the CLI from a folder containing the jar in question.
>
> -----Original Message-----
> From: Sonal Goyal [mailto:sonalgoyal4@gmail.com]
> Sent: Tuesday, March 16, 2010 1:14 PM
> To: common-user@hadoop.apache.org
> Subject: Re: WritableName can't load class in hive
>
> For some custom functions, I put the jar on the local path accessible to
> the
> CLI. Have you tried that?
>
> Thanks and Regards,
> Sonal
>
>
> On Tue, Mar 16, 2010 at 3:49 PM, Oded Rotem <od...@gmail.com>
> wrote:
>
> > We have a bunch of sequence files in an HDFS directory, containing
> > keys & values of custom Writable classes that we wrote.
> >
> > We manage to view them using hadoop fs -text. For further ad-hoc
> > analysis, we tried using Hive. We managed to load them as external
> > tables in Hive; however, running a simple select count() against the
> > table fails with "WritableName can't load class" in the job output log.
> >
> > Executing
> >        add jar <path>
> > does not solve it.
> >
> > Where do we need to place the jar containing the definition of the
> > writable classes?
> >
> >
>
>


RE: WritableName can't load class in hive

Posted by Oded Rotem <od...@gmail.com>.
No, I didn't specify any SerDe. I'll read up on that and see if it works.

Thanks.

-----Original Message-----
From: Arvind Prabhakar [mailto:arvind@cloudera.com] 
Sent: Wednesday, March 17, 2010 10:40 PM
To: common-user@hadoop.apache.org; hive-user@hadoop.apache.org
Subject: Re: WritableName can't load class in hive

[cross posting to hive-user]

Oded - how did you create the table in Hive? Did you specify any row format
SerDe for the table? If not, then that may be the cause of this problem
since the default LazySimpleSerDe is unable to deserialize the custom
Writable key-value pairs that you have used in your file.

-Arvind

On Tue, Mar 16, 2010 at 2:50 PM, Oded Rotem <od...@gmail.com> wrote:

> Actually, now I've moved on to this error:
>
> java.lang.RuntimeException: org.apache.hadoop.hive.serde2.SerDeException:
> class org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe: expects either
> BytesWritable or Text object!
>
> -----Original Message-----
> From: Alex Kozlov [mailto:alexvk@cloudera.com]
> Sent: Tuesday, March 16, 2010 8:02 PM
> To: common-user@hadoop.apache.org
> Subject: Re: WritableName can't load class in hive
>
> The Hive executable puts all jars in HIVE_LIB=${HIVE_HOME}/lib on the
> classpath. Try putting your custom jar into the $HIVE_HOME/lib directory
> and restarting the CLI.
>
> On Tue, Mar 16, 2010 at 6:28 AM, Oded Rotem <od...@gmail.com>
> wrote:
>
> > Yes, I run the CLI from a folder containing the jar in question.
> >
> > -----Original Message-----
> > From: Sonal Goyal [mailto:sonalgoyal4@gmail.com]
> > Sent: Tuesday, March 16, 2010 1:14 PM
> > To: common-user@hadoop.apache.org
> > Subject: Re: WritableName can't load class in hive
> >
> > For some custom functions, I put the jar on the local path accessible to
> > the
> > CLI. Have you tried that?
> >
> > Thanks and Regards,
> > Sonal
> >
> >
> > On Tue, Mar 16, 2010 at 3:49 PM, Oded Rotem <od...@gmail.com>
> > wrote:
> >
> > > We have a bunch of sequence files in an HDFS directory, containing
> > > keys & values of custom Writable classes that we wrote.
> > >
> > > We manage to view them using hadoop fs -text. For further ad-hoc
> > > analysis, we tried using Hive. We managed to load them as external
> > > tables in Hive; however, running a simple select count() against the
> > > table fails with "WritableName can't load class" in the job output log.
> > >
> > > Executing
> > >        add jar <path>
> > > does not solve it.
> > >
> > > Where do we need to place the jar containing the definition of the
> > > writable classes?
> > >
> > >
> >
> >
>
>


Re: WritableName can't load class in hive

Posted by Arvind Prabhakar <ar...@cloudera.com>.
[cross posting to hive-user]

Oded - how did you create the table in Hive? Did you specify any row format
SerDe for the table? If not, then that may be the cause of this problem
since the default LazySimpleSerDe is unable to deserialize the custom
Writable key-value pairs that you have used in your file.

-Arvind

On Tue, Mar 16, 2010 at 2:50 PM, Oded Rotem <od...@gmail.com> wrote:

> Actually, now I've moved on to this error:
>
> java.lang.RuntimeException: org.apache.hadoop.hive.serde2.SerDeException:
> class org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe: expects either
> BytesWritable or Text object!
>
> -----Original Message-----
> From: Alex Kozlov [mailto:alexvk@cloudera.com]
> Sent: Tuesday, March 16, 2010 8:02 PM
> To: common-user@hadoop.apache.org
> Subject: Re: WritableName can't load class in hive
>
> The Hive executable puts all jars in HIVE_LIB=${HIVE_HOME}/lib on the
> classpath. Try putting your custom jar into the $HIVE_HOME/lib directory
> and restarting the CLI.
>
> On Tue, Mar 16, 2010 at 6:28 AM, Oded Rotem <od...@gmail.com>
> wrote:
>
> > Yes, I run the CLI from a folder containing the jar in question.
> >
> > -----Original Message-----
> > From: Sonal Goyal [mailto:sonalgoyal4@gmail.com]
> > Sent: Tuesday, March 16, 2010 1:14 PM
> > To: common-user@hadoop.apache.org
> > Subject: Re: WritableName can't load class in hive
> >
> > For some custom functions, I put the jar on the local path accessible to
> > the
> > CLI. Have you tried that?
> >
> > Thanks and Regards,
> > Sonal
> >
> >
> > On Tue, Mar 16, 2010 at 3:49 PM, Oded Rotem <od...@gmail.com>
> > wrote:
> >
> > > We have a bunch of sequence files in an HDFS directory, containing
> > > keys & values of custom Writable classes that we wrote.
> > >
> > > We manage to view them using hadoop fs -text. For further ad-hoc
> > > analysis, we tried using Hive. We managed to load them as external
> > > tables in Hive; however, running a simple select count() against the
> > > table fails with "WritableName can't load class" in the job output log.
> > >
> > > Executing
> > >        add jar <path>
> > > does not solve it.
> > >
> > > Where do we need to place the jar containing the definition of the
> > > writable classes?
> > >
> > >
> >
> >
>
>
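
For reference, a minimal HiveQL sketch of the kind of table definition
Arvind is describing might look like the following. The table name,
columns, jar path, HDFS location, and SerDe class are all hypothetical
here; the SerDe itself would have to be implemented against the custom
Writable classes.

        -- hypothetical names throughout; com.example.hive.MyWritableSerDe
        -- stands in for a SerDe you would write yourself
        add jar /local/path/custom-serde.jar;

        CREATE EXTERNAL TABLE my_events (
          id      BIGINT,
          payload STRING
        )
        ROW FORMAT SERDE 'com.example.hive.MyWritableSerDe'
        STORED AS SEQUENCEFILE
        LOCATION '/user/oded/events';

        SELECT COUNT(1) FROM my_events;

One caveat worth noting: if memory serves, Hive reads only the value side
of a sequence file and ignores the keys, so any data stored in the custom
key Writables would also have to be carried in the values.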

RE: WritableName can't load class in hive

Posted by Oded Rotem <od...@gmail.com>.
Actually, now I've moved on to this error:

java.lang.RuntimeException: org.apache.hadoop.hive.serde2.SerDeException:
class org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe: expects either
BytesWritable or Text object!

-----Original Message-----
From: Alex Kozlov [mailto:alexvk@cloudera.com] 
Sent: Tuesday, March 16, 2010 8:02 PM
To: common-user@hadoop.apache.org
Subject: Re: WritableName can't load class in hive

The Hive executable puts all jars in HIVE_LIB=${HIVE_HOME}/lib on the
classpath. Try putting your custom jar into the $HIVE_HOME/lib directory
and restarting the CLI.

On Tue, Mar 16, 2010 at 6:28 AM, Oded Rotem <od...@gmail.com> wrote:

> Yes, I run the CLI from a folder containing the jar in question.
>
> -----Original Message-----
> From: Sonal Goyal [mailto:sonalgoyal4@gmail.com]
> Sent: Tuesday, March 16, 2010 1:14 PM
> To: common-user@hadoop.apache.org
> Subject: Re: WritableName can't load class in hive
>
> For some custom functions, I put the jar on the local path accessible to
> the
> CLI. Have you tried that?
>
> Thanks and Regards,
> Sonal
>
>
> On Tue, Mar 16, 2010 at 3:49 PM, Oded Rotem <od...@gmail.com>
> wrote:
>
> > We have a bunch of sequence files in an HDFS directory, containing
> > keys & values of custom Writable classes that we wrote.
> >
> > We manage to view them using hadoop fs -text. For further ad-hoc
> > analysis, we tried using Hive. We managed to load them as external
> > tables in Hive; however, running a simple select count() against the
> > table fails with "WritableName can't load class" in the job output log.
> >
> > Executing
> >        add jar <path>
> > does not solve it.
> >
> > Where do we need to place the jar containing the definition of the
> > writable classes?
> >
> >
>
>


Re: WritableName can't load class in hive

Posted by Alex Kozlov <al...@cloudera.com>.
The Hive executable puts all jars in HIVE_LIB=${HIVE_HOME}/lib on the
classpath. Try putting your custom jar into the $HIVE_HOME/lib directory
and restarting the CLI.

On Tue, Mar 16, 2010 at 6:28 AM, Oded Rotem <od...@gmail.com> wrote:

> Yes, I run the CLI from a folder containing the jar in question.
>
> -----Original Message-----
> From: Sonal Goyal [mailto:sonalgoyal4@gmail.com]
> Sent: Tuesday, March 16, 2010 1:14 PM
> To: common-user@hadoop.apache.org
> Subject: Re: WritableName can't load class in hive
>
> For some custom functions, I put the jar on the local path accessible to
> the
> CLI. Have you tried that?
>
> Thanks and Regards,
> Sonal
>
>
> On Tue, Mar 16, 2010 at 3:49 PM, Oded Rotem <od...@gmail.com>
> wrote:
>
> > We have a bunch of sequence files in an HDFS directory, containing
> > keys & values of custom Writable classes that we wrote.
> >
> > We manage to view them using hadoop fs -text. For further ad-hoc
> > analysis, we tried using Hive. We managed to load them as external
> > tables in Hive; however, running a simple select count() against the
> > table fails with "WritableName can't load class" in the job output log.
> >
> > Executing
> >        add jar <path>
> > does not solve it.
> >
> > Where do we need to place the jar containing the definition of the
> > writable classes?
> >
> >
>
>

RE: WritableName can't load class in hive

Posted by Oded Rotem <od...@gmail.com>.
Yes, I run the CLI from a folder containing the jar in question.

-----Original Message-----
From: Sonal Goyal [mailto:sonalgoyal4@gmail.com] 
Sent: Tuesday, March 16, 2010 1:14 PM
To: common-user@hadoop.apache.org
Subject: Re: WritableName can't load class in hive

For some custom functions, I put the jar on the local path accessible to the
CLI. Have you tried that?

Thanks and Regards,
Sonal


On Tue, Mar 16, 2010 at 3:49 PM, Oded Rotem <od...@gmail.com> wrote:

> We have a bunch of sequence files in an HDFS directory, containing keys &
> values of custom Writable classes that we wrote.
>
> We manage to view them using hadoop fs -text. For further ad-hoc analysis,
> we tried using Hive. We managed to load them as external tables in Hive;
> however, running a simple select count() against the table fails with
> "WritableName can't load class" in the job output log.
>
> Executing
>        add jar <path>
> does not solve it.
>
> Where do we need to place the jar containing the definition of the
> writable classes?
>
>


Re: WritableName can't load class in hive

Posted by Sonal Goyal <so...@gmail.com>.
For some custom functions, I put the jar on the local path accessible to the
CLI. Have you tried that?

Thanks and Regards,
Sonal


On Tue, Mar 16, 2010 at 3:49 PM, Oded Rotem <od...@gmail.com> wrote:

> We have a bunch of sequence files in an HDFS directory, containing keys &
> values of custom Writable classes that we wrote.
>
> We manage to view them using hadoop fs -text. For further ad-hoc analysis,
> we tried using Hive. We managed to load them as external tables in Hive;
> however, running a simple select count() against the table fails with
> "WritableName can't load class" in the job output log.
>
> Executing
>        add jar <path>
> does not solve it.
>
> Where do we need to place the jar containing the definition of the
> writable classes?
>
>

WritableName can't load class in hive

Posted by Oded Rotem <od...@gmail.com>.
We have a bunch of sequence files in an HDFS directory, containing keys & values of custom Writable classes that we wrote.

We manage to view them using hadoop fs -text. For further ad-hoc analysis, we tried using Hive. We managed to load them as external tables in Hive; however, running a simple select count() against the table fails with "WritableName can't load class" in the job output log.

Executing 
	add jar <path> 
does not solve it.

Where do we need to place the jar containing the definition of the writable classes?
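
A quick sanity check, assuming a Hive version that supports the resource
commands, is to confirm from the CLI that the session registered the jar at
all; the jar path below is a hypothetical placeholder:

        add jar /local/path/custom-writables.jar;
        list jars;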


Re: TTL of distributed cache

Posted by Gang Luo <lg...@yahoo.com.cn>.
That's smart. Thanks, Amareshwari.

-Gang


----- Original Message ----
From: Amareshwari Sri Ramadasu <am...@yahoo-inc.com>
To: "common-user@hadoop.apache.org" <co...@hadoop.apache.org>
Sent: 2010/3/16 (Tue) 11:08:31 PM
Subject: Re: TTL of distributed cache

Hi Gang,
Answers inline.

On 3/16/10 9:58 AM, "Gang Luo" <lg...@yahoo.com.cn> wrote:

Hi all,
What is the lifetime of distributed cache files?
A localized cache file will be removed if it is not used by any job and the localized disk space on the machine goes above the configured local.cache.size (10 GB by default).

Will Hadoop redistribute the same file to the same node twice if it is being used by two jobs?
No, it will be localized only once. Both jobs will use the same localized file. If the file is modified on DFS, it will be localized once again.

Thanks
Amareshwari


      

Re: TTL of distributed cache

Posted by Amareshwari Sri Ramadasu <am...@yahoo-inc.com>.
Hi Gang,
Answers inline.

On 3/16/10 9:58 AM, "Gang Luo" <lg...@yahoo.com.cn> wrote:

Hi all,
What is the lifetime of distributed cache files?
A localized cache file will be removed if it is not used by any job and the localized disk space on the machine goes above the configured local.cache.size (10 GB by default).

Will Hadoop redistribute the same file to the same node twice if it is being used by two jobs?
No, it will be localized only once. Both jobs will use the same localized file. If the file is modified on DFS, it will be localized once again.

Thanks
Amareshwari
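
For reference, the threshold Amareshwari mentions is the local.cache.size
property, which could be raised in mapred-site.xml (hadoop-site.xml on
older releases) if localized files are being evicted too aggressively. A
minimal sketch; the value is in bytes, and 10737418240 bytes is the 10 GB
default:

        <property>
          <!-- upper bound on the local distributed-cache directory per
               TaskTracker; localized files not in use by any running job
               are deleted once this size is exceeded -->
          <name>local.cache.size</name>
          <value>10737418240</value>
        </property>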