Posted to user@accumulo.apache.org by Chris Retford <ch...@gmail.com> on 2013/06/29 00:29:40 UTC

can't initialize on hadoop 2.0

I have performed the following steps:

1) Downloaded the just-released Hortonworks 2 sandbox VM (I have also tried the
CDH4 VM and get the same error). Verified HDFS and ZK are up and usable.
2) Downloaded the Accumulo 1.5.0 binary tgz and copied the 512 MB example
conf.
3) Modified accumulo-env.sh with
     HADOOP_HOME=/usr/lib/hadoop,
     HADOOP_CONF_DIR=/etc/hadoop/conf,
     ZOOKEEPER_HOME=/usr/lib/zookeeper
4) ./bin/accumulo init

Got the following error:
Thread "init" died null
java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.accumulo.start.Main$1.run(Main.java:101)
        at java.lang.Thread.run(Thread.java:662)
Caused by: java.lang.RuntimeException: java.io.IOException: No FileSystem for scheme: hdfs
        at org.apache.accumulo.server.util.Initialize.main(Initialize.java:498)
        ... 6 more
Caused by: java.io.IOException: No FileSystem for scheme: hdfs
        at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2345)
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2352)
        at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:87)
        at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2391)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2373)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:352)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:164)
        at org.apache.accumulo.core.file.FileUtil.getFileSystem(FileUtil.java:550)
        at org.apache.accumulo.server.util.Initialize.main(Initialize.java:485)
        ... 6 more

Re: can't initialize on hadoop 2.0

Posted by Jonathan Hsieh <jo...@cloudera.com>.
I've been working on this against CDH4 (with MR1).

Here are the updated accumulo-site.xml values I've been using against a
CM-deployed Hadoop 2:

  <property>
    <name>general.classpaths</name>
    <value>
      $ACCUMULO_HOME/src/server/target/classes/,
      $ACCUMULO_HOME/src/core/target/classes/,
      $ACCUMULO_HOME/src/start/target/classes/,
      $ACCUMULO_HOME/src/examples/target/classes/,
      $ACCUMULO_HOME/lib/[^.].$ACCUMULO_VERSION.jar,
      $ACCUMULO_HOME/lib/[^.].*.jar,
      $ZOOKEEPER_HOME/zookeeper[^.].*.jar,
      $HADOOP_HOME/conf,
      $HADOOP_HOME/[^.].*.jar,
      $HADOOP_HOME/lib/[^.].*.jar,
      $HADOOP_CONF_DIR,
      /etc/hadoop/conf,
      /opt/cloudera/parcels/CDH/lib/hadoop-0.20-mapreduce/[^.].*.jar,
      /opt/cloudera/parcels/CDH/lib/hadoop-0.20-mapreduce/lib/[^.].*.jar,
      /opt/cloudera/parcels/CDH/lib/hadoop/client-0.20/[^.].*.jar,
      /opt/cloudera/parcels/CDH/lib/hadoop/client-0.20/lib/[^.].*.jar,
    </value>
  </property>

Hope it helps.

Jon.




-- 
// Jonathan Hsieh (shay)
// Software Engineer, Cloudera
// jon@cloudera.com

Re: can't initialize on hadoop 2.0

Posted by Chris Retford <ch...@gmail.com>.
Yeah, thanks, that was the ticket. In the general.classpaths section of
accumulo-site.xml, I added /usr/lib/hadoop-hdfs/.*.jar and
/usr/lib/hadoop-hdfs/lib/.*.jar. I assume I'll have to add the hadoop-yarn
and -mapreduce folders as well if I want to run jobs.
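
For reference, here is a minimal sketch of what those additions could look like
in accumulo-site.xml. The hadoop-hdfs lines are the ones named above; the
hadoop-yarn and hadoop-mapreduce lines are an assumption about the usual
package layout and would only matter for running MapReduce jobs:

  <property>
    <name>general.classpaths</name>
    <value>
      ...existing entries from the example config...,
      /usr/lib/hadoop-hdfs/.*.jar,
      /usr/lib/hadoop-hdfs/lib/.*.jar,
      /usr/lib/hadoop-yarn/.*.jar,
      /usr/lib/hadoop-yarn/lib/.*.jar,
      /usr/lib/hadoop-mapreduce/.*.jar,
      /usr/lib/hadoop-mapreduce/lib/.*.jar,
    </value>
  </property>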



Re: can't initialize on hadoop 2.0

Posted by John Vines <vi...@apache.org>.
IIRC, there's a bit of craziness with the packaging for some of the Hadoop 2
releases. I believe they ship a separate hadoop-hdfs directory under /usr/lib.
Add that, with the appropriate expression, to the general.classpaths option in
the accumulo-site.xml file. I don't have the VMs in front of me, so I apologize
for not being able to give clearer instructions.
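
As a quick sanity check (a sketch, assuming the stock 1.5.0 launcher, which
accepts a "classpath" keyword), you can confirm the hadoop-hdfs jars actually
made it onto Accumulo's classpath after editing the site file:

  # print the classpath Accumulo builds from general.classpaths
  # and look for the hdfs jars
  ./bin/accumulo classpath | grep hadoop-hdfs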

