Posted to user@hbase.apache.org by Sambit Tripathy <sa...@gmail.com> on 2012/04/26 10:50:12 UTC

Exceptions with importtsv

Hi All,

Can anyone help me with this exception?

I have been trying to import data from csv files into HBase.

As per my understanding, the process is:

1. Import the data as HFiles using the importtsv tool provided by HBase.
2. Bulk load the data from those HFiles into HBase using the completebulkload tool.
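
For reference, a minimal sketch of those two steps (using the `hbase classpath` trick suggested later in this thread), assuming the input files are already on HDFS under /user/hadoop/raw and that testTable already exists; the paths, separator, and column mapping are illustrative:

# Step 1: parse the input and write HFiles to an HDFS output directory
HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath` hadoop jar \
  ${HBASE_HOME}/hbase-0.92.1.jar importtsv \
  -Dimporttsv.separator=, \
  -Dimporttsv.columns=HBASE_ROW_KEY,ns: \
  -Dimporttsv.bulk.output=/user/hadoop/input/bulk \
  testTable /user/hadoop/raw

# Step 2: load the generated HFiles into the live table
HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath` hadoop jar \
  ${HBASE_HOME}/hbase-0.92.1.jar completebulkload \
  /user/hadoop/input/bulk testTable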

However, when I issue the following command, I encounter an exception.

hadoop@srtidev001:/usr/local/hbase> hadoop jar hbase-0.92.1.jar importtsv
-Dimporttsv.bulk.output=/user/hadoop/input.bulk
-Dimporttsv.columns=HBASE_ROW_KEY,ns: -Dimporttsv.separator=, testTable
/opt/hadoop/raw
Exception in thread "main" java.lang.NoClassDefFoundError:
com/google/common/collect/Multimap
        at org.apache.hadoop.hbase.mapreduce.Driver.main(Driver.java:43)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: java.lang.ClassNotFoundException:
com.google.common.collect.Multimap
        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
        ... 6 more

Note: I removed the native libraries during the Hadoop installation. I am not
sure whether that is causing this exception, as it is looking for the "Google
Data Java client".

Thanks
Sambit.

Re: Exceptions with importtsv

Posted by slim tebourbi <sl...@gmail.com>.
Since the importer uses an HBase client, you need the ZooKeeper dependency,
so add it to the job classpath.
I think you should also add the HBase/ZooKeeper configuration files to your
classpath.

As for your question on Guava: it is used in the parser (the Guava Splitter).
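
A minimal sketch of one way to do that for the tarball layout used above; the exact jar names and versions under ${HBASE_HOME}/lib may differ on your install:

# Put the HBase/ZooKeeper configuration and the missing jars on the job classpath
export HADOOP_CLASSPATH="${HBASE_HOME}/conf:${HBASE_HOME}/lib/zookeeper-3.4.3.jar:${HBASE_HOME}/lib/guava-r09.jar:${HADOOP_CLASSPATH}"
hadoop jar ${HBASE_HOME}/hbase-0.92.1.jar importtsv ...

(The later replies in this thread show a simpler approach: HADOOP_CLASSPATH=`hbase classpath` pulls in every HBase dependency and the configuration directory in one go.)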

Slim.

On 26 April 2012 at 11:51, Peter Vandenabeele <pe...@vandenabeele.com> wrote:

> On Thu, Apr 26, 2012 at 10:40 AM, Sambit Tripathy <sa...@gmail.com>
> wrote:
> > Slim,
> >
> >
> > That exception is gone now after adding guava jar. (I wonder why do we
> need
> > a Google Data Java Client !!!)
> >
> > Well there is something more, I am getting the following exception now.
> >
> > Exception in thread "main" java.lang.reflect.InvocationTargetException
> >        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >        at
> >
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >        at
> >
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >        at java.lang.reflect.Method.invoke(Method.java:597)
> >        at org.apache.hadoop.hbase.mapreduce.Driver.main(Driver.java:51)
> >        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >        at
> >
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >        at
> >
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >        at java.lang.reflect.Method.invoke(Method.java:597)
> >        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> > Caused by: java.lang.NoClassDefFoundError:
> > org/apache/zookeeper/KeeperException
> >        at
> >
> org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:186)
> >        at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:194)
> >        at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:171)
> >        at
> >
> org.apache.hadoop.hbase.mapreduce.ImportTsv.createSubmittableJob(ImportTsv.java:220)
> >        at
> > org.apache.hadoop.hbase.mapreduce.ImportTsv.main(ImportTsv.java:312)
> >        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >        at
> >
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >        at
> >
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >        at java.lang.reflect.Method.invoke(Method.java:597)
> >        at
> >
> org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
> >        at
> > org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
> >        ... 10 more
> > Caused by: java.lang.ClassNotFoundException:
> > org.apache.zookeeper.KeeperException
> >        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
> >        at java.security.AccessController.doPrivileged(Native Method)
> >        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
> >        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> >        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> >        ... 21 more
> >
> >
> > Any idea? Looks like some issues with ZooKeeper, but I checked the logs
> and
> > zookeeper is just fine. This exception message gets printed in the
> console.
>
> I remember seeing that error when setting up hive. The solution there
> was to include /usr/lib/hive/lib/zookeeper-3.3.1.jar.
>
> In this case, you probably need to include /usr/lib/hbase/zookeeper.jar
>
> One primitive way I found to resolve these missing jar problems is to grep
> for the missing class name (browsing the source code for the class
> definition
> would be the proper way, I presume):
>
> peterv@e6500:/usr/lib/hbase/lib$ rgrep KeeperException *
> Binary file zookeeper.jar matches
>
> HTH,
>
> Peter
>

Re: Exceptions with importtsv

Posted by Peter Vandenabeele <pe...@vandenabeele.com>.
On Thu, Apr 26, 2012 at 10:40 AM, Sambit Tripathy <sa...@gmail.com> wrote:
> Slim,
>
>
> That exception is gone now after adding guava jar. (I wonder why do we need
> a Google Data Java Client !!!)
>
> Well there is something more, I am getting the following exception now.
>
> Exception in thread "main" java.lang.reflect.InvocationTargetException
>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>        at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>        at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>        at java.lang.reflect.Method.invoke(Method.java:597)
>        at org.apache.hadoop.hbase.mapreduce.Driver.main(Driver.java:51)
>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>        at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>        at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>        at java.lang.reflect.Method.invoke(Method.java:597)
>        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> Caused by: java.lang.NoClassDefFoundError:
> org/apache/zookeeper/KeeperException
>        at
> org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:186)
>        at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:194)
>        at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:171)
>        at
> org.apache.hadoop.hbase.mapreduce.ImportTsv.createSubmittableJob(ImportTsv.java:220)
>        at
> org.apache.hadoop.hbase.mapreduce.ImportTsv.main(ImportTsv.java:312)
>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>        at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>        at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>        at java.lang.reflect.Method.invoke(Method.java:597)
>        at
> org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
>        at
> org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
>        ... 10 more
> Caused by: java.lang.ClassNotFoundException:
> org.apache.zookeeper.KeeperException
>        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>        at java.security.AccessController.doPrivileged(Native Method)
>        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>        ... 21 more
>
>
> Any idea? Looks like some issues with ZooKeeper, but I checked the logs and
> zookeeper is just fine. This exception message gets printed in the console.

I remember seeing that error when setting up hive. The solution there
was to include /usr/lib/hive/lib/zookeeper-3.3.1.jar.

In this case, you probably need to include /usr/lib/hbase/zookeeper.jar

One primitive way I found to resolve these missing jar problems is to grep
for the missing class name (browsing the source code for the class definition
would be the proper way, I presume):

peterv@e6500:/usr/lib/hbase/lib$ rgrep KeeperException *
Binary file zookeeper.jar matches
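
A slightly more precise variant is a small loop (a sketch, assuming unzip is available) that lists each jar's entries and prints only the jars that actually contain the class file:

for j in /usr/lib/hbase/lib/*.jar; do
  unzip -l "$j" 2>/dev/null | grep -q 'KeeperException.class' && echo "$j"
done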

HTH,

Peter

Re: Exceptions with importtsv

Posted by Sambit Tripathy <sa...@gmail.com>.
Thanks, Yifeng. Well-thought-out input :) and it works.

On Sun, Apr 29, 2012 at 1:43 PM, Yifeng Jiang <up...@gmail.com> wrote:

> Hi Sambit,
>
> Are you specifying a local file system path on the command line?
> Before invoking importtsv, you will need to copy your tsv files to HDFS at
> first.
>
> -Yifeng
>
> On Apr 27, 2012, at 6:08 PM, Sambit Tripathy wrote:
>
> > I am able to run this command but it goes on forever. I don't see any
> data
> > uploaded.
> >
> > This is what I see on the console.
> >
> > http://pastebin.com/J2WApji1
> >
> >
> > Any idea on how to debug this?
> >
> >
> >
> >
> > On Fri, Apr 27, 2012 at 11:30 AM, Sambit Tripathy <sambit19@gmail.com
> >wrote:
> >
> >> Thanks all for the reply.
> >>
> >> I am able to run this.
> >>
> >> HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath` hadoop jar
> >> ${HBASE_HOME}/hbase-0.92.1.jar importtsv
> >> -Dimporttsv.bulk.output=/user/hadoop/input/bulk
> >> -Dimporttsv.columns=HBASE_ROW_KEY,ns: -Dimporttsv.separator=, testTable
> >> /opt/hadoop/raw
> >>
> >>
> >>
> >>
> >> -Sambit.
> >>
> >>
> >> On Thu, Apr 26, 2012 at 4:21 PM, Harsh J <ha...@cloudera.com> wrote:
> >>
> >>> Sambit,
> >>>
> >>> Just a tip:
> >>>
> >>> When using the "hadoop" executable to run HBase programs of any kind,
> >>> the right way is to do this:
> >>>
> >>> HADOOP_CLASSPATH=`hbase classpath` hadoop jar <args>
> >>>
> >>> This will ensure you run with all HBase dependencies loaded on the
> >>> classpath, for code to find its HBase-specific resources.
> >>>
> >>> On Thu, Apr 26, 2012 at 3:10 PM, Sambit Tripathy <sa...@gmail.com>
> >>> wrote:
> >>>> Slim,
> >>>>
> >>>>
> >>>> That exception is gone now after adding guava jar. (I wonder why do we
> >>> need
> >>>> a Google Data Java Client !!!)
> >>>>
> >>>> Well there is something more, I am getting the following exception
> now.
> >>>>
> >>>> Exception in thread "main" java.lang.reflect.InvocationTargetException
> >>>>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>>>       at
> >>>>
> >>>
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >>>>       at
> >>>>
> >>>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >>>>       at java.lang.reflect.Method.invoke(Method.java:597)
> >>>>       at org.apache.hadoop.hbase.mapreduce.Driver.main(Driver.java:51)
> >>>>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>>>       at
> >>>>
> >>>
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >>>>       at
> >>>>
> >>>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >>>>       at java.lang.reflect.Method.invoke(Method.java:597)
> >>>>       at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> >>>> Caused by: java.lang.NoClassDefFoundError:
> >>>> org/apache/zookeeper/KeeperException
> >>>>       at
> >>>>
> >>>
> org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:186)
> >>>>       at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:194)
> >>>>       at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:171)
> >>>>       at
> >>>>
> >>>
> org.apache.hadoop.hbase.mapreduce.ImportTsv.createSubmittableJob(ImportTsv.java:220)
> >>>>       at
> >>>> org.apache.hadoop.hbase.mapreduce.ImportTsv.main(ImportTsv.java:312)
> >>>>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>>>       at
> >>>>
> >>>
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >>>>       at
> >>>>
> >>>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >>>>       at java.lang.reflect.Method.invoke(Method.java:597)
> >>>>       at
> >>>>
> >>>
> org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
> >>>>       at
> >>>> org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
> >>>>       ... 10 more
> >>>> Caused by: java.lang.ClassNotFoundException:
> >>>> org.apache.zookeeper.KeeperException
> >>>>       at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
> >>>>       at java.security.AccessController.doPrivileged(Native Method)
> >>>>       at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
> >>>>       at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> >>>>       at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> >>>>       ... 21 more
> >>>>
> >>>>
> >>>> Any idea? Looks like some issues with ZooKeeper, but I checked the
> logs
> >>> and
> >>>> zookeeper is just fine. This exception message gets printed in the
> >>> console.
> >>>>
> >>>>
> >>>> Thanks
> >>>> Sambit.
> >>>>
> >>>>
> >>>> On Thu, Apr 26, 2012 at 2:25 PM, slim tebourbi <slimtbourbi@gmail.com
> >>>> wrote:
> >>>>
> >>>>> Hi Sambit,
> >>>>> I think that you should add google guava jar to your job classpath.
> >>>>>
> >>>>> Slim.
> >>>>>
> >>>>> On 26 April 2012 at 10:50, Sambit Tripathy <sa...@gmail.com> wrote:
> >>>>>
> >>>>>> Hi All,
> >>>>>>
> >>>>>> Can anyone help me with this exception?
> >>>>>>
> >>>>>> I have been trying to import data from csv files into HBase.
> >>>>>>
> >>>>>> As per my understanding the process is
> >>>>>>
> >>>>>> 1. Import  as  HFile using *importtsv *tool provided by HBase
> >>>>>> 2. Bulkupload the data from those HFiles into HBase using
> >>>>>> *completebulkupload
> >>>>>> *tool.
> >>>>>>
> >>>>>> However when I issue the following command, I encounter exception.
> >>>>>>
> >>>>>> hadoop@srtidev001:/usr/local/hbase> hadoop jar hbase-0.92.1.jar
> >>>>> importtsv
> >>>>>> -Dimporttsv.bulk.output=/user/hadoop/input.bulk
> >>>>>> -Dimporttsv.columns=HBASE_ROW_KEY,ns: -Dimporttsv.separator=,
> >>> testTable
> >>>>>> /opt/hadoop/raw
> >>>>>> Exception in thread "main" java.lang.NoClassDefFoundError:
> >>>>>> com/google/common/collect/Multimap
> >>>>>>       at
> >>> org.apache.hadoop.hbase.mapreduce.Driver.main(Driver.java:43)
> >>>>>>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>>>>>       at
> >>>>>>
> >>>>>>
> >>>>>
> >>>
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >>>>>>       at
> >>>>>>
> >>>>>>
> >>>>>
> >>>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >>>>>>       at java.lang.reflect.Method.invoke(Method.java:597)
> >>>>>>       at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> >>>>>> Caused by: java.lang.ClassNotFoundException:
> >>>>>> com.google.common.collect.Multimap
> >>>>>>       at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
> >>>>>>       at java.security.AccessController.doPrivileged(Native Method)
> >>>>>>       at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
> >>>>>>       at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> >>>>>>       at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> >>>>>>       ... 6 more
> >>>>>>
> >>>>>> *Note: I have removed the native libraries during hadoop
> >>> installation. I
> >>>>>> doubt if this is causing the exception as it is looking for the
> >>> "Google
> >>>>>> Data Java client".
> >>>>>>
> >>>>>>
> >>>>>>
> >>>>>> *Thanks
> >>>>>> Sambit.
> >>>>>>
> >>>>>
> >>>
> >>>
> >>>
> >>> --
> >>> Harsh J
> >>>
> >>
> >>
>
>

Re: Exceptions with importtsv

Posted by Yifeng Jiang <up...@gmail.com>.
Hi Sambit,

Are you specifying a local file system path on the command line?
Before invoking importtsv, you will need to copy your TSV files to HDFS first.
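
For example, a minimal sketch using the paths from earlier in this thread (adjust them to your setup):

# Create an HDFS directory and copy the local files into it
hadoop fs -mkdir /user/hadoop/raw
hadoop fs -put /opt/hadoop/raw/* /user/hadoop/raw/
hadoop fs -ls /user/hadoop/raw

The importtsv command should then be given the HDFS path (/user/hadoop/raw) as its input argument rather than the local /opt/hadoop/raw.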

-Yifeng

On Apr 27, 2012, at 6:08 PM, Sambit Tripathy wrote:

> I am able to run this command but it goes on forever. I don't see any data
> uploaded.
> 
> This is what I see on the console.
> 
> http://pastebin.com/J2WApji1
> 
> 
> Any idea on how to debug this?
> 
> 
> 
> 
> On Fri, Apr 27, 2012 at 11:30 AM, Sambit Tripathy <sa...@gmail.com>wrote:
> 
>> Thanks all for the reply.
>> 
>> I am able to run this.
>> 
>> HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath` hadoop jar
>> ${HBASE_HOME}/hbase-0.92.1.jar importtsv
>> -Dimporttsv.bulk.output=/user/hadoop/input/bulk
>> -Dimporttsv.columns=HBASE_ROW_KEY,ns: -Dimporttsv.separator=, testTable
>> /opt/hadoop/raw
>> 
>> 
>> 
>> 
>> -Sambit.
>> 
>> 
>> On Thu, Apr 26, 2012 at 4:21 PM, Harsh J <ha...@cloudera.com> wrote:
>> 
>>> Sambit,
>>> 
>>> Just a tip:
>>> 
>>> When using the "hadoop" executable to run HBase programs of any kind,
>>> the right way is to do this:
>>> 
>>> HADOOP_CLASSPATH=`hbase classpath` hadoop jar <args>
>>> 
>>> This will ensure you run with all HBase dependencies loaded on the
>>> classpath, for code to find its HBase-specific resources.
>>> 
>>> On Thu, Apr 26, 2012 at 3:10 PM, Sambit Tripathy <sa...@gmail.com>
>>> wrote:
>>>> Slim,
>>>> 
>>>> 
>>>> That exception is gone now after adding guava jar. (I wonder why do we
>>> need
>>>> a Google Data Java Client !!!)
>>>> 
>>>> Well there is something more, I am getting the following exception now.
>>>> 
>>>> Exception in thread "main" java.lang.reflect.InvocationTargetException
>>>>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>       at
>>>> 
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>       at
>>>> 
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>       at java.lang.reflect.Method.invoke(Method.java:597)
>>>>       at org.apache.hadoop.hbase.mapreduce.Driver.main(Driver.java:51)
>>>>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>       at
>>>> 
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>       at
>>>> 
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>       at java.lang.reflect.Method.invoke(Method.java:597)
>>>>       at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
>>>> Caused by: java.lang.NoClassDefFoundError:
>>>> org/apache/zookeeper/KeeperException
>>>>       at
>>>> 
>>> org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:186)
>>>>       at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:194)
>>>>       at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:171)
>>>>       at
>>>> 
>>> org.apache.hadoop.hbase.mapreduce.ImportTsv.createSubmittableJob(ImportTsv.java:220)
>>>>       at
>>>> org.apache.hadoop.hbase.mapreduce.ImportTsv.main(ImportTsv.java:312)
>>>>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>       at
>>>> 
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>       at
>>>> 
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>       at java.lang.reflect.Method.invoke(Method.java:597)
>>>>       at
>>>> 
>>> org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
>>>>       at
>>>> org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
>>>>       ... 10 more
>>>> Caused by: java.lang.ClassNotFoundException:
>>>> org.apache.zookeeper.KeeperException
>>>>       at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>>>>       at java.security.AccessController.doPrivileged(Native Method)
>>>>       at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>       at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>       at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>       ... 21 more
>>>> 
>>>> 
>>>> Any idea? Looks like some issues with ZooKeeper, but I checked the logs
>>> and
>>>> zookeeper is just fine. This exception message gets printed in the
>>> console.
>>>> 
>>>> 
>>>> Thanks
>>>> Sambit.
>>>> 
>>>> 
>>>> On Thu, Apr 26, 2012 at 2:25 PM, slim tebourbi <slimtbourbi@gmail.com
>>>> wrote:
>>>> 
>>>>> Hi Sambit,
>>>>> I think that you should add google guava jar to your job classpath.
>>>>> 
>>>>> Slim.
>>>>> 
>>>>> On 26 April 2012 at 10:50, Sambit Tripathy <sa...@gmail.com> wrote:
>>>>> 
>>>>>> Hi All,
>>>>>> 
>>>>>> Can anyone help me with this exception?
>>>>>> 
>>>>>> I have been trying to import data from csv files into HBase.
>>>>>> 
>>>>>> As per my understanding the process is
>>>>>> 
>>>>>> 1. Import  as  HFile using *importtsv *tool provided by HBase
>>>>>> 2. Bulkupload the data from those HFiles into HBase using
>>>>>> *completebulkupload
>>>>>> *tool.
>>>>>> 
>>>>>> However when I issue the following command, I encounter exception.
>>>>>> 
>>>>>> hadoop@srtidev001:/usr/local/hbase> hadoop jar hbase-0.92.1.jar
>>>>> importtsv
>>>>>> -Dimporttsv.bulk.output=/user/hadoop/input.bulk
>>>>>> -Dimporttsv.columns=HBASE_ROW_KEY,ns: -Dimporttsv.separator=,
>>> testTable
>>>>>> /opt/hadoop/raw
>>>>>> Exception in thread "main" java.lang.NoClassDefFoundError:
>>>>>> com/google/common/collect/Multimap
>>>>>>       at
>>> org.apache.hadoop.hbase.mapreduce.Driver.main(Driver.java:43)
>>>>>>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>       at
>>>>>> 
>>>>>> 
>>>>> 
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>>>       at
>>>>>> 
>>>>>> 
>>>>> 
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>>>       at java.lang.reflect.Method.invoke(Method.java:597)
>>>>>>       at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
>>>>>> Caused by: java.lang.ClassNotFoundException:
>>>>>> com.google.common.collect.Multimap
>>>>>>       at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>>>>>>       at java.security.AccessController.doPrivileged(Native Method)
>>>>>>       at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>>>       at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>>>       at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>>>       ... 6 more
>>>>>> 
>>>>>> *Note: I have removed the native libraries during hadoop
>>> installation. I
>>>>>> doubt if this is causing the exception as it is looking for the
>>> "Google
>>>>>> Data Java client".
>>>>>> 
>>>>>> 
>>>>>> 
>>>>>> *Thanks
>>>>>> Sambit.
>>>>>> 
>>>>> 
>>> 
>>> 
>>> 
>>> --
>>> Harsh J
>>> 
>> 
>> 


Re: Exceptions with importtsv

Posted by Sambit Tripathy <sa...@gmail.com>.
I am able to run this command but it goes on forever. I don't see any data
uploaded.

This is what I see on the console.

http://pastebin.com/J2WApji1


Any idea on how to debug this?




On Fri, Apr 27, 2012 at 11:30 AM, Sambit Tripathy <sa...@gmail.com> wrote:

> Thanks all for the reply.
>
> I am able to run this.
>
> HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath` hadoop jar
> ${HBASE_HOME}/hbase-0.92.1.jar importtsv
> -Dimporttsv.bulk.output=/user/hadoop/input/bulk
> -Dimporttsv.columns=HBASE_ROW_KEY,ns: -Dimporttsv.separator=, testTable
> /opt/hadoop/raw
>
>
>
>
> -Sambit.
>
>
> On Thu, Apr 26, 2012 at 4:21 PM, Harsh J <ha...@cloudera.com> wrote:
>
>> Sambit,
>>
>> Just a tip:
>>
>> When using the "hadoop" executable to run HBase programs of any kind,
>> the right way is to do this:
>>
>> HADOOP_CLASSPATH=`hbase classpath` hadoop jar <args>
>>
>> This will ensure you run with all HBase dependencies loaded on the
>> classpath, for code to find its HBase-specific resources.
>>
>> On Thu, Apr 26, 2012 at 3:10 PM, Sambit Tripathy <sa...@gmail.com>
>> wrote:
>> > Slim,
>> >
>> >
>> > That exception is gone now after adding guava jar. (I wonder why do we
>> need
>> > a Google Data Java Client !!!)
>> >
>> > Well there is something more, I am getting the following exception now.
>> >
>> > Exception in thread "main" java.lang.reflect.InvocationTargetException
>> >        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >        at
>> >
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>> >        at
>> >
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>> >        at java.lang.reflect.Method.invoke(Method.java:597)
>> >        at org.apache.hadoop.hbase.mapreduce.Driver.main(Driver.java:51)
>> >        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >        at
>> >
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>> >        at
>> >
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>> >        at java.lang.reflect.Method.invoke(Method.java:597)
>> >        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
>> > Caused by: java.lang.NoClassDefFoundError:
>> > org/apache/zookeeper/KeeperException
>> >        at
>> >
>> org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:186)
>> >        at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:194)
>> >        at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:171)
>> >        at
>> >
>> org.apache.hadoop.hbase.mapreduce.ImportTsv.createSubmittableJob(ImportTsv.java:220)
>> >        at
>> > org.apache.hadoop.hbase.mapreduce.ImportTsv.main(ImportTsv.java:312)
>> >        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >        at
>> >
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>> >        at
>> >
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>> >        at java.lang.reflect.Method.invoke(Method.java:597)
>> >        at
>> >
>> org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
>> >        at
>> > org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
>> >        ... 10 more
>> > Caused by: java.lang.ClassNotFoundException:
>> > org.apache.zookeeper.KeeperException
>> >        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>> >        at java.security.AccessController.doPrivileged(Native Method)
>> >        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>> >        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>> >        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>> >        ... 21 more
>> >
>> >
>> > Any idea? Looks like some issues with ZooKeeper, but I checked the logs
>> and
>> > zookeeper is just fine. This exception message gets printed in the
>> console.
>> >
>> >
>> > Thanks
>> > Sambit.
>> >
>> >
>> > On Thu, Apr 26, 2012 at 2:25 PM, slim tebourbi <slimtbourbi@gmail.com
>> >wrote:
>> >
>> >> Hi Sambit,
>> >> I think that you should add google guava jar to your job classpath.
>> >>
>> >> Slim.
>> >>
>> >> On 26 April 2012 at 10:50, Sambit Tripathy <sa...@gmail.com> wrote:
>> >>
>> >> > Hi All,
>> >> >
>> >> > Can anyone help me with this exception?
>> >> >
>> >> > I have been trying to import data from csv files into HBase.
>> >> >
>> >> > As per my understanding the process is
>> >> >
>> >> > 1. Import  as  HFile using *importtsv *tool provided by HBase
>> >> > 2. Bulkupload the data from those HFiles into HBase using
>> >> > *completebulkupload
>> >> > *tool.
>> >> >
>> >> > However when I issue the following command, I encounter exception.
>> >> >
>> >> > hadoop@srtidev001:/usr/local/hbase> hadoop jar hbase-0.92.1.jar
>> >> importtsv
>> >> > -Dimporttsv.bulk.output=/user/hadoop/input.bulk
>> >> > -Dimporttsv.columns=HBASE_ROW_KEY,ns: -Dimporttsv.separator=,
>> testTable
>> >> > /opt/hadoop/raw
>> >> > Exception in thread "main" java.lang.NoClassDefFoundError:
>> >> > com/google/common/collect/Multimap
>> >> >        at
>> org.apache.hadoop.hbase.mapreduce.Driver.main(Driver.java:43)
>> >> >        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >> >        at
>> >> >
>> >> >
>> >>
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>> >> >        at
>> >> >
>> >> >
>> >>
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>> >> >        at java.lang.reflect.Method.invoke(Method.java:597)
>> >> >        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
>> >> > Caused by: java.lang.ClassNotFoundException:
>> >> > com.google.common.collect.Multimap
>> >> >        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>> >> >        at java.security.AccessController.doPrivileged(Native Method)
>> >> >        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>> >> >        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>> >> >        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>> >> >        ... 6 more
>> >> >
>> >> > *Note: I have removed the native libraries during hadoop
>> installation. I
>> >> > doubt if this is causing the exception as it is looking for the
>> "Google
>> >> > Data Java client".
>> >> >
>> >> >
>> >> >
>> >> > *Thanks
>> >> > Sambit.
>> >> >
>> >>
>>
>>
>>
>> --
>> Harsh J
>>
>
>

Re: Exceptions with importtsv

Posted by Sambit Tripathy <sa...@gmail.com>.
Thanks all for the replies.

I am able to run this.

HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath` hadoop jar
${HBASE_HOME}/hbase-0.92.1.jar importtsv
-Dimporttsv.bulk.output=/user/hadoop/input/bulk
-Dimporttsv.columns=HBASE_ROW_KEY,ns: -Dimporttsv.separator=, testTable
/opt/hadoop/raw




-Sambit.

On Thu, Apr 26, 2012 at 4:21 PM, Harsh J <ha...@cloudera.com> wrote:

> Sambit,
>
> Just a tip:
>
> When using the "hadoop" executable to run HBase programs of any kind,
> the right way is to do this:
>
> HADOOP_CLASSPATH=`hbase classpath` hadoop jar <args>
>
> This will ensure you run with all HBase dependencies loaded on the
> classpath, for code to find its HBase-specific resources.
>
> On Thu, Apr 26, 2012 at 3:10 PM, Sambit Tripathy <sa...@gmail.com>
> wrote:
> > Slim,
> >
> >
> > That exception is gone now after adding guava jar. (I wonder why do we
> need
> > a Google Data Java Client !!!)
> >
> > Well there is something more, I am getting the following exception now.
> >
> > Exception in thread "main" java.lang.reflect.InvocationTargetException
> >        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >        at
> >
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >        at
> >
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >        at java.lang.reflect.Method.invoke(Method.java:597)
> >        at org.apache.hadoop.hbase.mapreduce.Driver.main(Driver.java:51)
> >        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >        at
> >
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >        at
> >
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >        at java.lang.reflect.Method.invoke(Method.java:597)
> >        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> > Caused by: java.lang.NoClassDefFoundError:
> > org/apache/zookeeper/KeeperException
> >        at
> >
> org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:186)
> >        at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:194)
> >        at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:171)
> >        at
> >
> org.apache.hadoop.hbase.mapreduce.ImportTsv.createSubmittableJob(ImportTsv.java:220)
> >        at
> > org.apache.hadoop.hbase.mapreduce.ImportTsv.main(ImportTsv.java:312)
> >        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >        at
> >
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >        at
> >
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >        at java.lang.reflect.Method.invoke(Method.java:597)
> >        at
> >
> org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
> >        at
> > org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
> >        ... 10 more
> > Caused by: java.lang.ClassNotFoundException:
> > org.apache.zookeeper.KeeperException
> >        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
> >        at java.security.AccessController.doPrivileged(Native Method)
> >        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
> >        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> >        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> >        ... 21 more
> >
> >
> > Any idea? Looks like some issues with ZooKeeper, but I checked the logs
> and
> > zookeeper is just fine. This exception message gets printed in the
> console.
> >
> >
> > Thanks
> > Sambit.
> >
> >
> > On Thu, Apr 26, 2012 at 2:25 PM, slim tebourbi <slimtbourbi@gmail.com
> >wrote:
> >
> >> Hi Sambit,
> >> I think that you should add google guava jar to your job classpath.
> >>
> >> Slim.
> >>
> >> On 26 April 2012 at 10:50, Sambit Tripathy <sa...@gmail.com> wrote:
> >>
> >> > Hi All,
> >> >
> >> > Can anyone help me with this exception?
> >> >
> >> > I have been trying to import data from csv files into HBase.
> >> >
> >> > As per my understanding the process is
> >> >
> >> > 1. Import  as  HFile using *importtsv *tool provided by HBase
> >> > 2. Bulkupload the data from those HFiles into HBase using
> >> > *completebulkupload
> >> > *tool.
> >> >
> >> > However when I issue the following command, I encounter exception.
> >> >
> >> > hadoop@srtidev001:/usr/local/hbase> hadoop jar hbase-0.92.1.jar
> >> importtsv
> >> > -Dimporttsv.bulk.output=/user/hadoop/input.bulk
> >> > -Dimporttsv.columns=HBASE_ROW_KEY,ns: -Dimporttsv.separator=,
> testTable
> >> > /opt/hadoop/raw
> >> > Exception in thread "main" java.lang.NoClassDefFoundError:
> >> > com/google/common/collect/Multimap
> >> >        at
> org.apache.hadoop.hbase.mapreduce.Driver.main(Driver.java:43)
> >> >        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >> >        at
> >> >
> >> >
> >>
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >> >        at
> >> >
> >> >
> >>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >> >        at java.lang.reflect.Method.invoke(Method.java:597)
> >> >        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> >> > Caused by: java.lang.ClassNotFoundException:
> >> > com.google.common.collect.Multimap
> >> >        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
> >> >        at java.security.AccessController.doPrivileged(Native Method)
> >> >        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
> >> >        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> >> >        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> >> >        ... 6 more
> >> >
> >> > *Note: I have removed the native libraries during hadoop
> installation. I
> >> > doubt if this is causing the exception as it is looking for the
> "Google
> >> > Data Java client".
> >> >
> >> >
> >> >
> >> > *Thanks
> >> > Sambit.
> >> >
> >>
>
>
>
> --
> Harsh J
>

Re: Exceptions with importtsv

Posted by Harsh J <ha...@cloudera.com>.
Sambit,

Just a tip:

When using the "hadoop" executable to run HBase programs of any kind,
the right way is to do this:

HADOOP_CLASSPATH=`hbase classpath` hadoop jar <args>

This ensures you run with all HBase dependencies on the classpath, so the
code can find its HBase-specific resources.
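
A quick way to confirm that the previously missing jars are now included (a sketch, assuming the hbase script is on your PATH):

hbase classpath | tr ':' '\n' | grep -E 'zookeeper|guava'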

On Thu, Apr 26, 2012 at 3:10 PM, Sambit Tripathy <sa...@gmail.com> wrote:
> Slim,
>
>
> That exception is gone now after adding guava jar. (I wonder why do we need
> a Google Data Java Client !!!)
>
> Well there is something more, I am getting the following exception now.
>
> Exception in thread "main" java.lang.reflect.InvocationTargetException
>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>        at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>        at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>        at java.lang.reflect.Method.invoke(Method.java:597)
>        at org.apache.hadoop.hbase.mapreduce.Driver.main(Driver.java:51)
>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>        at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>        at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>        at java.lang.reflect.Method.invoke(Method.java:597)
>        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> Caused by: java.lang.NoClassDefFoundError:
> org/apache/zookeeper/KeeperException
>        at
> org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:186)
>        at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:194)
>        at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:171)
>        at
> org.apache.hadoop.hbase.mapreduce.ImportTsv.createSubmittableJob(ImportTsv.java:220)
>        at
> org.apache.hadoop.hbase.mapreduce.ImportTsv.main(ImportTsv.java:312)
>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>        at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>        at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>        at java.lang.reflect.Method.invoke(Method.java:597)
>        at
> org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
>        at
> org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
>        ... 10 more
> Caused by: java.lang.ClassNotFoundException:
> org.apache.zookeeper.KeeperException
>        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>        at java.security.AccessController.doPrivileged(Native Method)
>        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>        ... 21 more
>
>
> Any idea? Looks like some issues with ZooKeeper, but I checked the logs and
> zookeeper is just fine. This exception message gets printed in the console.
>
>
> Thanks
> Sambit.
>
>
> On Thu, Apr 26, 2012 at 2:25 PM, slim tebourbi <sl...@gmail.com>wrote:
>
>> Hi Sambit,
>> I think that you should add google guava jar to your job classpath.
>>
>> Slim.
>>
>> On 26 April 2012 at 10:50, Sambit Tripathy <sa...@gmail.com> wrote:
>>
>> > Hi All,
>> >
>> > Can anyone help me with this exception?
>> >
>> > I have been trying to import data from csv files into HBase.
>> >
>> > As per my understanding the process is
>> >
>> > 1. Import  as  HFile using *importtsv *tool provided by HBase
>> > 2. Bulkupload the data from those HFiles into HBase using
>> > *completebulkupload
>> > *tool.
>> >
>> > However when I issue the following command, I encounter exception.
>> >
>> > hadoop@srtidev001:/usr/local/hbase> hadoop jar hbase-0.92.1.jar
>> importtsv
>> > -Dimporttsv.bulk.output=/user/hadoop/input.bulk
>> > -Dimporttsv.columns=HBASE_ROW_KEY,ns: -Dimporttsv.separator=, testTable
>> > /opt/hadoop/raw
>> > Exception in thread "main" java.lang.NoClassDefFoundError:
>> > com/google/common/collect/Multimap
>> >        at org.apache.hadoop.hbase.mapreduce.Driver.main(Driver.java:43)
>> >        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >        at
>> >
>> >
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>> >        at
>> >
>> >
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>> >        at java.lang.reflect.Method.invoke(Method.java:597)
>> >        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
>> > Caused by: java.lang.ClassNotFoundException:
>> > com.google.common.collect.Multimap
>> >        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>> >        at java.security.AccessController.doPrivileged(Native Method)
>> >        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>> >        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>> >        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>> >        ... 6 more
>> >
>> > *Note: I have removed the native libraries during hadoop installation. I
>> > doubt if this is causing the exception as it is looking for the "Google
>> > Data Java client".
>> >
>> >
>> >
>> > *Thanks
>> > Sambit.
>> >
>>



-- 
Harsh J

Re: Exceptions with importtsv

Posted by Sambit Tripathy <sa...@gmail.com>.
Slim,


That exception is gone now after adding the Guava jar. (I wonder why we need
a Google Data Java Client!)

Well, there is something more: I am getting the following exception now.

Exception in thread "main" java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.hbase.mapreduce.Driver.main(Driver.java:51)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: java.lang.NoClassDefFoundError:
org/apache/zookeeper/KeeperException
        at
org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:186)
        at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:194)
        at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:171)
        at
org.apache.hadoop.hbase.mapreduce.ImportTsv.createSubmittableJob(ImportTsv.java:220)
        at
org.apache.hadoop.hbase.mapreduce.ImportTsv.main(ImportTsv.java:312)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at
org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
        at
org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
        ... 10 more
Caused by: java.lang.ClassNotFoundException:
org.apache.zookeeper.KeeperException
        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
        ... 21 more


Any idea? It looks like an issue with ZooKeeper, but I checked the logs and
ZooKeeper is just fine. This exception message gets printed on the console.


Thanks
Sambit.


On Thu, Apr 26, 2012 at 2:25 PM, slim tebourbi <sl...@gmail.com> wrote:

> Hi Sambit,
> I think that you should add google guava jar to your job classpath.
>
> Slim.
>
> On 26 April 2012 at 10:50, Sambit Tripathy <sa...@gmail.com> wrote:
>
> > Hi All,
> >
> > Can anyone help me with this exception?
> >
> > I have been trying to import data from csv files into HBase.
> >
> > As per my understanding the process is
> >
> > 1. Import  as  HFile using *importtsv *tool provided by HBase
> > 2. Bulkupload the data from those HFiles into HBase using
> > *completebulkupload
> > *tool.
> >
> > However when I issue the following command, I encounter exception.
> >
> > hadoop@srtidev001:/usr/local/hbase> hadoop jar hbase-0.92.1.jar
> importtsv
> > -Dimporttsv.bulk.output=/user/hadoop/input.bulk
> > -Dimporttsv.columns=HBASE_ROW_KEY,ns: -Dimporttsv.separator=, testTable
> > /opt/hadoop/raw
> > Exception in thread "main" java.lang.NoClassDefFoundError:
> > com/google/common/collect/Multimap
> >        at org.apache.hadoop.hbase.mapreduce.Driver.main(Driver.java:43)
> >        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >        at
> >
> >
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >        at
> >
> >
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >        at java.lang.reflect.Method.invoke(Method.java:597)
> >        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> > Caused by: java.lang.ClassNotFoundException:
> > com.google.common.collect.Multimap
> >        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
> >        at java.security.AccessController.doPrivileged(Native Method)
> >        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
> >        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> >        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> >        ... 6 more
> >
> > *Note: I have removed the native libraries during hadoop installation. I
> > doubt if this is causing the exception as it is looking for the "Google
> > Data Java client".
> >
> >
> >
> > *Thanks
> > Sambit.
> >
>

Re: Exceptions with importtsv

Posted by slim tebourbi <sl...@gmail.com>.
Hi Sambit,
I think you should add the Google Guava jar to your job classpath.
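
A minimal sketch of one way to do that, assuming an HBase tarball install under ${HBASE_HOME} (the exact Guava jar name under ${HBASE_HOME}/lib may differ):

ls ${HBASE_HOME}/lib/guava-*.jar
export HADOOP_CLASSPATH="${HBASE_HOME}/lib/guava-r09.jar:${HADOOP_CLASSPATH}"

after which the importtsv command can be rerun as before.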

Slim.

On 26 April 2012 at 10:50, Sambit Tripathy <sa...@gmail.com> wrote:

> Hi All,
>
> Can anyone help me with this exception?
>
> I have been trying to import data from csv files into HBase.
>
> As per my understanding the process is
>
> 1. Import  as  HFile using *importtsv *tool provided by HBase
> 2. Bulkupload the data from those HFiles into HBase using
> *completebulkupload
> *tool.
>
> However when I issue the following command, I encounter exception.
>
> hadoop@srtidev001:/usr/local/hbase> hadoop jar hbase-0.92.1.jar importtsv
> -Dimporttsv.bulk.output=/user/hadoop/input.bulk
> -Dimporttsv.columns=HBASE_ROW_KEY,ns: -Dimporttsv.separator=, testTable
> /opt/hadoop/raw
> Exception in thread "main" java.lang.NoClassDefFoundError:
> com/google/common/collect/Multimap
>        at org.apache.hadoop.hbase.mapreduce.Driver.main(Driver.java:43)
>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>        at
>
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>        at
>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>        at java.lang.reflect.Method.invoke(Method.java:597)
>        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> Caused by: java.lang.ClassNotFoundException:
> com.google.common.collect.Multimap
>        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>        at java.security.AccessController.doPrivileged(Native Method)
>        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>        ... 6 more
>
> *Note: I have removed the native libraries during hadoop installation. I
> doubt if this is causing the exception as it is looking for the "Google
> Data Java client".
>
>
>
> *Thanks
> Sambit.
>