Posted to dev@accumulo.apache.org by "Park, Jee [USA]" <Pa...@bah.com> on 2012/07/02 19:21:32 UTC

RE: [External] Re: Need help getting Accumulo running.

Thanks everyone for the responses!

So, I got Hadoop to run and installed Accumulo following Miguel's email. The
problem now is that when I do

$ bin/accumulo init

it tries to connect a few times and then times out. Here is what it prints
out. Just to let you know, I did not change anything in the accumulo-site.xml
file.

Thanks,
Jee

hduser@ubuntu:~/accumulo$ bin/accumulo init 
02 10:10:07,567 [ipc.Client] INFO : Retrying connect to server: localhost/127.0.0.1:54310. Already tried 0 time(s).
02 10:10:08,573 [ipc.Client] INFO : Retrying connect to server: localhost/127.0.0.1:54310. Already tried 1 time(s).
02 10:10:09,574 [ipc.Client] INFO : Retrying connect to server: localhost/127.0.0.1:54310. Already tried 2 time(s).
02 10:10:10,576 [ipc.Client] INFO : Retrying connect to server: localhost/127.0.0.1:54310. Already tried 3 time(s).
02 10:10:11,578 [ipc.Client] INFO : Retrying connect to server: localhost/127.0.0.1:54310. Already tried 4 time(s).
02 10:10:12,580 [ipc.Client] INFO : Retrying connect to server: localhost/127.0.0.1:54310. Already tried 5 time(s).
02 10:10:13,581 [ipc.Client] INFO : Retrying connect to server: localhost/127.0.0.1:54310. Already tried 6 time(s).
02 10:10:14,583 [ipc.Client] INFO : Retrying connect to server: localhost/127.0.0.1:54310. Already tried 7 time(s).
02 10:10:15,585 [ipc.Client] INFO : Retrying connect to server: localhost/127.0.0.1:54310. Already tried 8 time(s).
02 10:10:16,587 [ipc.Client] INFO : Retrying connect to server: localhost/127.0.0.1:54310. Already tried 9 time(s).
02 10:10:16,592 [util.Initialize] FATAL: java.net.ConnectException: Call to localhost/127.0.0.1:54310 failed on connection exception: java.net.ConnectException: Connection refused
java.net.ConnectException: Call to localhost/127.0.0.1:54310 failed on connection exception: java.net.ConnectException: Connection refused
    at org.apache.hadoop.ipc.Client.wrapException(Client.java:767)
    at org.apache.hadoop.ipc.Client.call(Client.java:743)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
    at $Proxy0.getProtocolVersion(Unknown Source)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
    at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:82)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:95)
    at org.apache.accumulo.core.file.FileUtil.getFileSystem(FileUtil.java:554)
    at org.apache.accumulo.server.util.Initialize.main(Initialize.java:426)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at org.apache.accumulo.start.Main$1.run(Main.java:89)
    at java.lang.Thread.run(Thread.java:722)
Caused by: java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:701)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:404)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:304)
    at org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:176)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:860)
    at org.apache.hadoop.ipc.Client.call(Client.java:720)
    ... 20 more
Thread "init" died null
java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at org.apache.accumulo.start.Main$1.run(Main.java:89)
    at java.lang.Thread.run(Thread.java:722)
Caused by: java.lang.RuntimeException: java.net.ConnectException: Call to localhost/127.0.0.1:54310 failed on connection exception: java.net.ConnectException: Connection refused
    at org.apache.accumulo.server.util.Initialize.main(Initialize.java:436)
    ... 6 more
Caused by: java.net.ConnectException: Call to localhost/127.0.0.1:54310 failed on connection exception: java.net.ConnectException: Connection refused
    at org.apache.hadoop.ipc.Client.wrapException(Client.java:767)
    at org.apache.hadoop.ipc.Client.call(Client.java:743)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
    at $Proxy0.getProtocolVersion(Unknown Source)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
    at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:82)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:95)
    at org.apache.accumulo.core.file.FileUtil.getFileSystem(FileUtil.java:554)
    at org.apache.accumulo.server.util.Initialize.main(Initialize.java:426)
    ... 6 more
Caused by: java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:701)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:404)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:304)
    at org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:176)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:860)
    at org.apache.hadoop.ipc.Client.call(Client.java:720)
    ... 20 more
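
The connection-refused trace above means nothing is answering on hdfs://localhost:54310,
the fs.default.name address used by the single-node Hadoop guide referenced later in this
thread; in other words, the HDFS NameNode is not actually running when "bin/accumulo init"
is invoked. A minimal sketch of checks to run first (the $HADOOP_HOME paths and the 54310
port are assumptions taken from that guide):

$ jps | grep -E 'NameNode|DataNode'   # the HDFS daemons should be listed
$ $HADOOP_HOME/bin/start-dfs.sh       # start HDFS if the NameNode is missing
$ $HADOOP_HOME/bin/hadoop fs -ls /    # succeeds only once the NameNode answers on 54310
$ bin/accumulo init                   # retry after HDFS responds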

-----Original Message-----
From: Miguel Pereira [mailto:miguelapereira1@gmail.com] 
Sent: Friday, June 29, 2012 2:59 PM
To: dev@accumulo.apache.org
Subject: [External] Re: Need help getting Accumulo running.

Hi Jee,

I used that same guide to install Accumulo, but I used this guide to install
Hadoop:

http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/

Furthermore, here are the steps I took to install Accumulo, where I used
version 1.4.0 and the standalone conf. Please note you also need to install a
Java JDK and set your JAVA_HOME; I used JDK 1.7.

Setting up Accumulo


   - git clone git://github.com/apache/accumulo.git
   - cd accumulo
   - git checkout tags/1.4.0 -b 1.4.0
   - mvn package && mvn assembly:single -N    # this can take a while
   - cp conf/examples/512MB/standalone/* conf
   - vi accumulo-env.sh


test -z "$JAVA_HOME" && export JAVA_HOME=/home/hduser/pkg/jdk1.7.0_04
test -z "$HADOOP_HOME" && export
HADOOP_HOME=/home/hduser/developer/workspace/hadoop
test -z "$ZOOKEEPER_HOME" && export
ZOOKEEPER_HOME=/home/hduser/developer/workspace/zookeeper-3.3.5

   - vi accumulo-site.xml

     modify user, password, secret, memory

   - bin/accumulo init
   - bin/start-all.sh
   - bin/accumulo shell -u root

If you get the shell up, you know you're good.
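
For reference, a rough sketch of what the last three steps look like once HDFS and
ZooKeeper are both running (the prompts are paraphrased and the values are placeholders,
not the exact wording of the 1.4.0 tools):

$ bin/accumulo init            # prompts for an instance name and an initial root password
$ bin/start-all.sh             # starts the Accumulo processes (master, tablet servers, monitor, ...)
$ bin/accumulo shell -u root   # enter the root password; reaching a shell prompt means the install works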


On Fri, Jun 29, 2012 at 2:49 PM, John Vines <jo...@ugov.gov> wrote:

> We currently don't really support running on Windows. I'm sure there
> are ways to get it running with Cygwin, but our efforts are better
> spent in other directions for now.
>
> As for getting it going in Ubuntu, I haven't seen that guide before. 
> Can you let me know where it broke?
>
> For the record, when I was developing ACCUMULO-404, I was working in
> Ubuntu VMs and I used Apache-BigTop and our debians to facilitate
> installation. They don't do everything for you, but I think if you use
> 1.4.1 (not sure if I got the debs into 1.4.0), it should diminish the
> installation work you must do to some minor configuration.
>
> John
>
> On Fri, Jun 29, 2012 at 2:28 PM, Park, Jee [USA] <Pa...@bah.com> wrote:
>
> > Hi,
> >
> > I had trouble getting Accumulo to work on a VM instance of Ubuntu
> > (11.04) using this guide: https://gist.github.com/1535657.
> >
> > Does anyone have a step-by-step guide to get it running on either
> > Ubuntu or Windows 7?
> >
> > Thanks!
> >
>

Re: [External] Re: Need help getting Accumulo running.

Posted by Jim Klucar <kl...@gmail.com>.
ZooKeeper is probably not in /bin. You should have $ZOOKEEPER_HOME
set to wherever you installed it:

sudo $ZOOKEEPER_HOME/bin/zkServer.sh start
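
If the start script seems to succeed but Accumulo still cannot connect, a quick check that
ZooKeeper really came up (2181 is ZooKeeper's default client port; adjust it if zoo.cfg
says otherwise):

$ $ZOOKEEPER_HOME/bin/zkServer.sh status   # should report a running standalone server
$ echo ruok | nc localhost 2181            # a healthy server answers "imok"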

On Mon, Jul 2, 2012 at 1:46 PM, Park, Jee [USA] <Pa...@bah.com> wrote:
> Haha sudo gave me
>
> sudo: /bin/zkServer.sh: command not found

RE: [External] Re: Need help getting Accumulo running.

Posted by "Park, Jee [USA]" <Pa...@bah.com>.
Haha sudo gave me 

sudo: /bin/zkServer.sh: command not found

-----Original Message-----
From: Miguel Pereira [mailto:miguelapereira1@gmail.com] 
Sent: Monday, July 02, 2012 1:46 PM
To: dev@accumulo.apache.org
Subject: Re: [External] Re: Need help getting Accumulo running.

sudo :)


Re: [External] Re: Need help getting Accumulo running.

Posted by Miguel Pereira <mi...@gmail.com>.
sudo :)

On Mon, Jul 2, 2012 at 1:42 PM, Park, Jee [USA] <Pa...@bah.com> wrote:

> Ah, so I realized I wasn't running Hadoop or ZooKeeper. I am now running
> Hadoop, but I cannot get ZooKeeper to run. Here is what I did:
>
> $ $ZOOKEEPER_HOME/bin/zkServer.sh start
> JMX enabled by default
> Using config: /usr/lib/zookeeper/bin/../conf/zoo.cfg
> Starting zookeeper ... /usr/lib/zookeeper/bin/zkServer.sh: 110:
> /usr/lib/zookeeper/bin/zkServer.sh: Cannot create
> /var/zookeeper/zookeeper_server.pid: Permission denied
> FAILED TO WRITE PID

Re: [External] Re: Need help getting Accumulo running.

Posted by Jim Klucar <kl...@gmail.com>.
chmod a-x removes execute permission for everyone, which hides the contents of
that directory. Just do sudo chmod 777 -R /var/zookeeper to open up the
permissions.

Sent from my iPhone
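
chmod 777 opens the directory to everyone; a narrower sketch is to hand /var/zookeeper to
the account that starts ZooKeeper (hduser, going by the shell prompts earlier in this
thread; the username is an assumption):

$ sudo mkdir -p /var/zookeeper
$ sudo chown -R hduser:hduser /var/zookeeper
$ $ZOOKEEPER_HOME/bin/zkServer.sh start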

On Jul 2, 2012, at 3:28 PM, "Park, Jee [USA]" <Pa...@bah.com> wrote:

> Hi, I used sudo chmod a-x /var/zookeeper, and I am still getting permission
> denied.
> How do I make sure /var/zookeeper is writable?
>
> -----Original Message-----
> From: William Slacum [mailto:wilhelm.von.cloud@accumulo.net]
> Sent: Monday, July 02, 2012 1:45 PM
> To: dev@accumulo.apache.org
> Subject: Re: [External] Re: Need help getting Accumulo running.
>
> Make sure that /var/zookeeper is writable by the user you're launching
> Zookeeper as. Alternatively, you can reconfigure zookeeper's zoo.cfg file to
> change the directory to somewhere that is writable.

RE: [External] Re: Need help getting Accumulo running.

Posted by "Park, Jee [USA]" <Pa...@bah.com>.
Hi, I used sudo chmod a-x /var/zookeeper, and I am still getting permission
denied.
How do I make sure /var/zookeeper is writable?
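
Note that chmod a-x removes the execute (search) bit for everyone, which makes the
directory harder to use rather than writable. A short sketch of how one might inspect and
fix it instead:

$ ls -ld /var/zookeeper                 # shows the owner and mode bits blocking the PID file
$ sudo chmod -R a+rwX /var/zookeeper    # or chown it to the user that runs zkServer.sh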

-----Original Message-----
From: William Slacum [mailto:wilhelm.von.cloud@accumulo.net] 
Sent: Monday, July 02, 2012 1:45 PM
To: dev@accumulo.apache.org
Subject: Re: [External] Re: Need help getting Accumulo running.

Make sure that /var/zookeeper is writable by the user you're launching
Zookeeper as. Alternatively, you can reconfigure zookeeper's zoo.cfg file to
change the directory to somewhere that is writable.
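
For the zoo.cfg route, the setting to change is dataDir; a minimal sketch, assuming a
directory under the hduser home (the path is an assumption):

$ mkdir -p /home/hduser/zookeeper-data
$ vi $ZOOKEEPER_HOME/conf/zoo.cfg          # set: dataDir=/home/hduser/zookeeper-data
$ $ZOOKEEPER_HOME/bin/zkServer.sh start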

> > ss
> > orImpl
> > .java:43)
> > at java.lang.reflect.Method.invoke(Method.java:601)
> > at org.apache.accumulo.start.Main$1.run(Main.java:89)
> > at java.lang.Thread.run(Thread.java:722)
> > Caused by: java.lang.RuntimeException: java.net.ConnectException: 
> > Call to
> > localhost/127.0.0.1:54310 failed on connection exception:
> > java.net.ConnectException: Connection refused at
> > org.apache.accumulo.server.util.Initialize.main(Initialize.java:436)
> > ... 6 more
> > Caused by: java.net.ConnectException: Call to
> > localhost/127.0.0.1:54310 failed on connection exception:
> > java.net.ConnectException: Connection refused at
> > org.apache.hadoop.ipc.Client.wrapException(Client.java:767)
> > at org.apache.hadoop.ipc.Client.call(Client.java:743)
> > at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
> > at $Proxy0.getProtocolVersion(Unknown Source) at
> > org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
> > at
> > org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:10
> > 6) at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
> > at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
> > at
> > org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedF
> > il
> > eSyste
> > m.java:82)
> > at
> > org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:137
> > 8) at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
> > at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
> > at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
> > at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:95)
> > at
> > org.apache.accumulo.core.file.FileUtil.getFileSystem(FileUtil.java:5
> > 54
> > ) at
> > org.apache.accumulo.server.util.Initialize.main(Initialize.java:426)
> > ... 6 more
> > Caused by: java.net.ConnectException: Connection refused at 
> > sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at
> > sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:70
> > 1)
> > at
> > org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.
> > java:2
> > 06)
> > at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:404)
> > at
> > org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:3
> > 04
> > ) at
> > org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:176)
> > at org.apache.hadoop.ipc.Client.getConnection(Client.java:860)
> > at org.apache.hadoop.ipc.Client.call(Client.java:720)
> > ... 20 more
> >
> > -----Original Message-----
> > From: Miguel Pereira [mailto:miguelapereira1@gmail.com]
> > Sent: Friday, June 29, 2012 2:59 PM
> > To: dev@accumulo.apache.org
> > Subject: [External] Re: Need help getting Accumulo running.
> >
> > Hi Jee,
> >
> > I used that same guide to install Accumulo, but I used this guide to 
> > install hadoop.
> >
> > http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux
> > -s
> > ingle-
> > node-cluster/
> >
> > furthermore here are the steps I took to install accumulo were I 
> > used version 1.4.0 and standalone conf.
> > please note you also need to install java jdk, and set your 
> > JAVA_HOME i used jdk 1.7
> >
> > Setting up Accumulo
> >
> >
> >    - git clone     git://github.com/apache/accumulo.git
> >    - cd accumulo
> >    - git checkout     tags/1.4.0 -b 1.4.0
> >    - mvn package && mvn assembly:single -N.             // this can
> > take a
> >    while
> >    - cp conf/examples/512MB/standalone/* conf
> >    - vi accumulo-env.sh
> >
> >
> > test -z "$JAVA_HOME" && export 
> > JAVA_HOME=/home/hduser/pkg/jdk1.7.0_04
> > test -z "$HADOOP_HOME" && export
> > HADOOP_HOME=/home/hduser/developer/workspace/hadoop
> > test -z "$ZOOKEEPER_HOME" && export
> > ZOOKEEPER_HOME=/home/hduser/developer/workspace/zookeeper-3.3.5
> >
> >    - vi     accumulo-site.xml
> >
> >
> >     modify user, password, secret, memory
> >
> >
> >    - bin/accumulo     init
> >    - bin/start-all.sh
> >    - bin/accumulo     shell -u root
> >
> > if you get the shell up you know your good.
> >
> >
> > On Fri, Jun 29, 2012 at 2:49 PM, John Vines <jo...@ugov.gov>
> wrote:
> >
> >> We currently don't really support running on Windows. I'm sure 
> >> there are ways to get it running with Cygwin, but our efforts are 
> >> better spend in other directions for now.
> >>
> >> As for getting it going in Ubuntu, I haven't seen that guide before.
> >> Can you let me know where it broke?
> >>
> >> For the record, when I was developing ACCUMULO-404, I was working 
> >> in Ubuntu VMs and I used Apache-BigTop and our debians to 
> >> facilitate
> > installation.
> >> They don't do everything for you, but I think if you use 1.4.1 (not 
> >> sure if I got the debs into 1..4.0), it should diminish the 
> >> installation work you must do to some minor configuration.
> >>
> >> John
> >>
> >> On Fri, Jun 29, 2012 at 2:28 PM, Park, Jee [USA] <Pa...@bah.com>
> wrote:
> >>
> >> > Hi, ****
> >> >
> >> > ** **
> >> >
> >> > I had trouble getting Accumulo to work on a VM instance of Ubuntu
> >> > (11.04) using this guide: https://gist.github.com/1535657.****
> >> >
> >> > Does anyone have a step-by-step guide to get it running on either 
> >> > Ubuntu or Windows 7?****
> >> >
> >> > ** **
> >> >
> >> > Thanks!****
> >> >
> >>
>

Re: [External] Re: Need help getting Accumulo running.

Posted by William Slacum <wi...@accumulo.net>.
Make sure that /var/zookeeper is writable by the user you're launching
Zookeeper as. Alternatively, you can reconfigure zookeeper's zoo.cfg file
to change the directory to somewhere that is writable.
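
For the zoo.cfg route, a minimal sketch (it assumes the config path ZooKeeper
reported, /usr/lib/zookeeper/conf/zoo.cfg, and hduser as the launching user;
substitute your own paths):

$ mkdir -p /home/hduser/zookeeper-data
$ sudo sed -i 's|^dataDir=.*|dataDir=/home/hduser/zookeeper-data|' /usr/lib/zookeeper/conf/zoo.cfg
$ $ZOOKEEPER_HOME/bin/zkServer.sh start    # pid file and snapshots should now go to the new dataDir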

On Mon, Jul 2, 2012 at 1:42 PM, Park, Jee [USA] <Pa...@bah.com> wrote:

> Ah, so I realized I wasn’t running hadoop or zookeeper, and so I am running
> hadoop, but cannot get zookeeper to run
> Here is what I did:
>
> $ $ZOOKEEPER_HOME/bin/zkServer.sh start
> JMX enabled by default
> Using config: /usr/lib/zookeeper/bin/../conf/zoo.cfg
> Starting zookeeper ... /usr/lib/zookeeper/bin/zkServer.sh: 110:
> /usr/lib/zookeeper/bin/zkServer.sh: Cannot create
> /var/zookeeper/zookeeper_server.pid: Permission denied
> FAILED TO WRITE PID
>
>
> -----Original Message-----
> From: Jim Klucar [mailto:klucar@gmail.com]
> Sent: Monday, July 02, 2012 1:25 PM
> To: dev@accumulo.apache.org
> Subject: Re: [External] Re: Need help getting Accumulo running.
>
> Did you verify that zookeeper is running?
>
> On Mon, Jul 2, 2012 at 1:21 PM, Park, Jee [USA] <Pa...@bah.com> wrote:
> > Thanks everyone for the responses!
> >
> > So, I got hadoop to run and installed accumulo following Miguel's
> > email, the problem now is that when I do
> >
> > $ bin/accumulo init
> >
> > It tries to connect a few times and then times out. Here is what it
> > prints out.
> > Just to let you know I did not change anything in the
> > accumulo-site.xml file
> >
> > Thanks,
> > Jee
> >
> > hduser@ubuntu:~/accumulo$ bin/accumulo init
> > 02 10:10:07,567 [ipc.Client] INFO : Retrying connect to server:
> > localhost/127.0.0.1:54310. Already tried 0 time(s).
> > 02 10:10:08,573 [ipc.Client] INFO : Retrying connect to server:
> > localhost/127.0.0.1:54310. Already tried 1 time(s).
> > 02 10:10:09,574 [ipc.Client] INFO : Retrying connect to server:
> > localhost/127.0.0.1:54310. Already tried 2 time(s).
> > 02 10:10:10,576 [ipc.Client] INFO : Retrying connect to server:
> > localhost/127.0.0.1:54310. Already tried 3 time(s).
> > 02 10:10:11,578 [ipc.Client] INFO : Retrying connect to server:
> > localhost/127.0.0.1:54310. Already tried 4 time(s).
> > 02 10:10:12,580 [ipc.Client] INFO : Retrying connect to server:
> > localhost/127.0.0.1:54310. Already tried 5 time(s).
> > 02 10:10:13,581 [ipc.Client] INFO : Retrying connect to server:
> > localhost/127.0.0.1:54310. Already tried 6 time(s).
> > 02 10:10:14,583 [ipc.Client] INFO : Retrying connect to server:
> > localhost/127.0.0.1:54310. Already tried 7 time(s).
> > 02 10:10:15,585 [ipc.Client] INFO : Retrying connect to server:
> > localhost/127.0.0.1:54310. Already tried 8 time(s).
> > 02 10:10:16,587 [ipc.Client] INFO : Retrying connect to server:
> > localhost/127.0.0.1:54310. Already tried 9 time(s).
> > 02 10:10:16,592 [util.Initialize] FATAL: java.net.ConnectException:
> > Call to
> > localhost/127.0.0.1:54310 failed on connection exception:
> > java.net.ConnectException: Connection refused
> > java.net.ConnectException: Call to localhost/127.0.0.1:54310 failed on
> > connection exception: java.net.ConnectException: Connection refused at
> > org.apache.hadoop.ipc.Client.wrapException(Client.java:767)
> > at org.apache.hadoop.ipc.Client.call(Client.java:743)
> > at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
> > at $Proxy0.getProtocolVersion(Unknown Source) at
> > org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
> > at
> > org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
> > at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
> > at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
> > at
> > org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFil
> > eSyste
> > m.java:82)
> > at
> > org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
> > at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
> > at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
> > at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
> > at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:95)
> > at
> > org.apache.accumulo.core.file.FileUtil.getFileSystem(FileUtil.java:554
> > ) at
> > org.apache.accumulo.server.util.Initialize.main(Initialize.java:426)
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at
> > sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.j
> > ava:57
> > )
> > at
> > sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccess
> > orImpl
> > .java:43)
> > at java.lang.reflect.Method.invoke(Method.java:601)
> > at org.apache.accumulo.start.Main$1.run(Main.java:89)
> > at java.lang.Thread.run(Thread.java:722)
> > Caused by: java.net.ConnectException: Connection refused at
> > sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at
> > sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:701)
> > at
> > org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.
> > java:2
> > 06)
> > at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:404)
> > at
> > org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:304
> > ) at
> > org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:176)
> > at org.apache.hadoop.ipc.Client.getConnection(Client.java:860)
> > at org.apache.hadoop.ipc.Client.call(Client.java:720)
> > ... 20 more
> > Thread "init" died null
> > java.lang.reflect.InvocationTargetException
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at
> > sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.j
> > ava:57
> > )
> > at
> > sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccess
> > orImpl
> > .java:43)
> > at java.lang.reflect.Method.invoke(Method.java:601)
> > at org.apache.accumulo.start.Main$1.run(Main.java:89)
> > at java.lang.Thread.run(Thread.java:722)
> > Caused by: java.lang.RuntimeException: java.net.ConnectException: Call
> > to
> > localhost/127.0.0.1:54310 failed on connection exception:
> > java.net.ConnectException: Connection refused at
> > org.apache.accumulo.server.util.Initialize.main(Initialize.java:436)
> > ... 6 more
> > Caused by: java.net.ConnectException: Call to
> > localhost/127.0.0.1:54310 failed on connection exception:
> > java.net.ConnectException: Connection refused at
> > org.apache.hadoop.ipc.Client.wrapException(Client.java:767)
> > at org.apache.hadoop.ipc.Client.call(Client.java:743)
> > at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
> > at $Proxy0.getProtocolVersion(Unknown Source) at
> > org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
> > at
> > org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
> > at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
> > at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
> > at
> > org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFil
> > eSyste
> > m.java:82)
> > at
> > org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
> > at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
> > at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
> > at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
> > at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:95)
> > at
> > org.apache.accumulo.core.file.FileUtil.getFileSystem(FileUtil.java:554
> > ) at
> > org.apache.accumulo.server.util.Initialize.main(Initialize.java:426)
> > ... 6 more
> > Caused by: java.net.ConnectException: Connection refused at
> > sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at
> > sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:701)
> > at
> > org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.
> > java:2
> > 06)
> > at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:404)
> > at
> > org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:304
> > ) at
> > org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:176)
> > at org.apache.hadoop.ipc.Client.getConnection(Client.java:860)
> > at org.apache.hadoop.ipc.Client.call(Client.java:720)
> > ... 20 more
> >
> > -----Original Message-----
> > From: Miguel Pereira [mailto:miguelapereira1@gmail.com]
> > Sent: Friday, June 29, 2012 2:59 PM
> > To: dev@accumulo.apache.org
> > Subject: [External] Re: Need help getting Accumulo running.
> >
> > Hi Jee,
> >
> > I used that same guide to install Accumulo, but I used this guide to
> > install hadoop.
> >
> > http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-s
> > ingle-
> > node-cluster/
> >
> > furthermore here are the steps I took to install accumulo were I used
> > version 1.4.0 and standalone conf.
> > please note you also need to install java jdk, and set your JAVA_HOME
> > i used jdk 1.7
> >
> > Setting up Accumulo
> >
> >
> >    - git clone     git://github.com/apache/accumulo.git
> >    - cd accumulo
> >    - git checkout     tags/1.4.0 -b 1.4.0
> >    - mvn package && mvn assembly:single -N.             // this can
> > take a
> >    while
> >    - cp conf/examples/512MB/standalone/* conf
> >    - vi accumulo-env.sh
> >
> >
> > test -z "$JAVA_HOME" && export JAVA_HOME=/home/hduser/pkg/jdk1.7.0_04
> > test -z "$HADOOP_HOME" && export
> > HADOOP_HOME=/home/hduser/developer/workspace/hadoop
> > test -z "$ZOOKEEPER_HOME" && export
> > ZOOKEEPER_HOME=/home/hduser/developer/workspace/zookeeper-3.3.5
> >
> >    - vi     accumulo-site.xml
> >
> >
> >     modify user, password, secret, memory
> >
> >
> >    - bin/accumulo     init
> >    - bin/start-all.sh
> >    - bin/accumulo     shell -u root
> >
> > if you get the shell up you know your good.
> >
> >
> > On Fri, Jun 29, 2012 at 2:49 PM, John Vines <jo...@ugov.gov>
> wrote:
> >
> >> We currently don't really support running on Windows. I'm sure there
> >> are ways to get it running with Cygwin, but our efforts are better
> >> spend in other directions for now.
> >>
> >> As for getting it going in Ubuntu, I haven't seen that guide before.
> >> Can you let me know where it broke?
> >>
> >> For the record, when I was developing ACCUMULO-404, I was working in
> >> Ubuntu VMs and I used Apache-BigTop and our debians to facilitate
> > installation.
> >> They don't do everything for you, but I think if you use 1.4.1 (not
> >> sure if I got the debs into 1..4.0), it should diminish the
> >> installation work you must do to some minor configuration.
> >>
> >> John
> >>
> >> On Fri, Jun 29, 2012 at 2:28 PM, Park, Jee [USA] <Pa...@bah.com>
> wrote:
> >>
> >> > Hi, ****
> >> >
> >> > ** **
> >> >
> >> > I had trouble getting Accumulo to work on a VM instance of Ubuntu
> >> > (11.04) using this guide: https://gist.github.com/1535657.****
> >> >
> >> > Does anyone have a step-by-step guide to get it running on either
> >> > Ubuntu or Windows 7?****
> >> >
> >> > ** **
> >> >
> >> > Thanks!****
> >> >
> >>
>

RE: [External] Re: Need help getting Accumulo running.

Posted by "Park, Jee [USA]" <Pa...@bah.com>.
Ah, so I realized I wasn’t running hadoop or zookeeper, so I am now running
hadoop, but I cannot get zookeeper to run.
Here is what I did:

$ $ZOOKEEPER_HOME/bin/zkServer.sh start  
JMX enabled by default
Using config: /usr/lib/zookeeper/bin/../conf/zoo.cfg
Starting zookeeper ... /usr/lib/zookeeper/bin/zkServer.sh: 110:
/usr/lib/zookeeper/bin/zkServer.sh: Cannot create
/var/zookeeper/zookeeper_server.pid: Permission denied
FAILED TO WRITE PID


-----Original Message-----
From: Jim Klucar [mailto:klucar@gmail.com] 
Sent: Monday, July 02, 2012 1:25 PM
To: dev@accumulo.apache.org
Subject: Re: [External] Re: Need help getting Accumulo running.

Did you verify that zookeeper is running?

On Mon, Jul 2, 2012 at 1:21 PM, Park, Jee [USA] <Pa...@bah.com> wrote:
> Thanks everyone for the responses!
>
> So, I got hadoop to run and installed accumulo following Miguel's 
> email, the problem now is that when I do
>
> $ bin/accumulo init
>
> It tries to connect a few times and then times out. Here is what it 
> prints out.
> Just to let you know I did not change anything in the 
> accumulo-site.xml file
>
> Thanks,
> Jee
>
> hduser@ubuntu:~/accumulo$ bin/accumulo init
> 02 10:10:07,567 [ipc.Client] INFO : Retrying connect to server:
> localhost/127.0.0.1:54310. Already tried 0 time(s).
> 02 10:10:08,573 [ipc.Client] INFO : Retrying connect to server:
> localhost/127.0.0.1:54310. Already tried 1 time(s).
> 02 10:10:09,574 [ipc.Client] INFO : Retrying connect to server:
> localhost/127.0.0.1:54310. Already tried 2 time(s).
> 02 10:10:10,576 [ipc.Client] INFO : Retrying connect to server:
> localhost/127.0.0.1:54310. Already tried 3 time(s).
> 02 10:10:11,578 [ipc.Client] INFO : Retrying connect to server:
> localhost/127.0.0.1:54310. Already tried 4 time(s).
> 02 10:10:12,580 [ipc.Client] INFO : Retrying connect to server:
> localhost/127.0.0.1:54310. Already tried 5 time(s).
> 02 10:10:13,581 [ipc.Client] INFO : Retrying connect to server:
> localhost/127.0.0.1:54310. Already tried 6 time(s).
> 02 10:10:14,583 [ipc.Client] INFO : Retrying connect to server:
> localhost/127.0.0.1:54310. Already tried 7 time(s).
> 02 10:10:15,585 [ipc.Client] INFO : Retrying connect to server:
> localhost/127.0.0.1:54310. Already tried 8 time(s).
> 02 10:10:16,587 [ipc.Client] INFO : Retrying connect to server:
> localhost/127.0.0.1:54310. Already tried 9 time(s).
> 02 10:10:16,592 [util.Initialize] FATAL: java.net.ConnectException: 
> Call to
> localhost/127.0.0.1:54310 failed on connection exception:
> java.net.ConnectException: Connection refused
> java.net.ConnectException: Call to localhost/127.0.0.1:54310 failed on 
> connection exception: java.net.ConnectException: Connection refused at 
> org.apache.hadoop.ipc.Client.wrapException(Client.java:767)
> at org.apache.hadoop.ipc.Client.call(Client.java:743)
> at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
> at $Proxy0.getProtocolVersion(Unknown Source) at 
> org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
> at 
> org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
> at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
> at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
> at
> org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFil
> eSyste
> m.java:82)
> at 
> org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
> at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
> at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
> at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
> at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:95)
> at 
> org.apache.accumulo.core.file.FileUtil.getFileSystem(FileUtil.java:554
> ) at 
> org.apache.accumulo.server.util.Initialize.main(Initialize.java:426)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.j
> ava:57
> )
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccess
> orImpl
> .java:43)
> at java.lang.reflect.Method.invoke(Method.java:601)
> at org.apache.accumulo.start.Main$1.run(Main.java:89)
> at java.lang.Thread.run(Thread.java:722)
> Caused by: java.net.ConnectException: Connection refused at 
> sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at 
> sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:701)
> at
> org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.
> java:2
> 06)
> at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:404)
> at 
> org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:304
> ) at 
> org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:176)
> at org.apache.hadoop.ipc.Client.getConnection(Client.java:860)
> at org.apache.hadoop.ipc.Client.call(Client.java:720)
> ... 20 more
> Thread "init" died null
> java.lang.reflect.InvocationTargetException
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.j
> ava:57
> )
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccess
> orImpl
> .java:43)
> at java.lang.reflect.Method.invoke(Method.java:601)
> at org.apache.accumulo.start.Main$1.run(Main.java:89)
> at java.lang.Thread.run(Thread.java:722)
> Caused by: java.lang.RuntimeException: java.net.ConnectException: Call 
> to
> localhost/127.0.0.1:54310 failed on connection exception:
> java.net.ConnectException: Connection refused at 
> org.apache.accumulo.server.util.Initialize.main(Initialize.java:436)
> ... 6 more
> Caused by: java.net.ConnectException: Call to 
> localhost/127.0.0.1:54310 failed on connection exception: 
> java.net.ConnectException: Connection refused at 
> org.apache.hadoop.ipc.Client.wrapException(Client.java:767)
> at org.apache.hadoop.ipc.Client.call(Client.java:743)
> at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
> at $Proxy0.getProtocolVersion(Unknown Source) at 
> org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
> at 
> org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
> at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
> at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
> at
> org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFil
> eSyste
> m.java:82)
> at 
> org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
> at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
> at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
> at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
> at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:95)
> at 
> org.apache.accumulo.core.file.FileUtil.getFileSystem(FileUtil.java:554
> ) at 
> org.apache.accumulo.server.util.Initialize.main(Initialize.java:426)
> ... 6 more
> Caused by: java.net.ConnectException: Connection refused at 
> sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at 
> sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:701)
> at
> org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.
> java:2
> 06)
> at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:404)
> at 
> org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:304
> ) at 
> org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:176)
> at org.apache.hadoop.ipc.Client.getConnection(Client.java:860)
> at org.apache.hadoop.ipc.Client.call(Client.java:720)
> ... 20 more
>
> -----Original Message-----
> From: Miguel Pereira [mailto:miguelapereira1@gmail.com]
> Sent: Friday, June 29, 2012 2:59 PM
> To: dev@accumulo.apache.org
> Subject: [External] Re: Need help getting Accumulo running.
>
> Hi Jee,
>
> I used that same guide to install Accumulo, but I used this guide to 
> install hadoop.
>
> http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-s
> ingle-
> node-cluster/
>
> furthermore here are the steps I took to install accumulo were I used 
> version 1.4.0 and standalone conf.
> please note you also need to install java jdk, and set your JAVA_HOME 
> i used jdk 1.7
>
> Setting up Accumulo
>
>
>    - git clone     git://github.com/apache/accumulo.git
>    - cd accumulo
>    - git checkout     tags/1.4.0 -b 1.4.0
>    - mvn package && mvn assembly:single -N.             // this can 
> take a
>    while
>    - cp conf/examples/512MB/standalone/* conf
>    - vi accumulo-env.sh
>
>
> test -z "$JAVA_HOME" && export JAVA_HOME=/home/hduser/pkg/jdk1.7.0_04
> test -z "$HADOOP_HOME" && export
> HADOOP_HOME=/home/hduser/developer/workspace/hadoop
> test -z "$ZOOKEEPER_HOME" && export
> ZOOKEEPER_HOME=/home/hduser/developer/workspace/zookeeper-3.3.5
>
>    - vi     accumulo-site.xml
>
>
>     modify user, password, secret, memory
>
>
>    - bin/accumulo     init
>    - bin/start-all.sh
>    - bin/accumulo     shell -u root
>
> if you get the shell up you know your good.
>
>
> On Fri, Jun 29, 2012 at 2:49 PM, John Vines <jo...@ugov.gov> wrote:
>
>> We currently don't really support running on Windows. I'm sure there 
>> are ways to get it running with Cygwin, but our efforts are better 
>> spend in other directions for now.
>>
>> As for getting it going in Ubuntu, I haven't seen that guide before.
>> Can you let me know where it broke?
>>
>> For the record, when I was developing ACCUMULO-404, I was working in 
>> Ubuntu VMs and I used Apache-BigTop and our debians to facilitate
> installation.
>> They don't do everything for you, but I think if you use 1.4.1 (not 
>> sure if I got the debs into 1..4.0), it should diminish the 
>> installation work you must do to some minor configuration.
>>
>> John
>>
>> On Fri, Jun 29, 2012 at 2:28 PM, Park, Jee [USA] <Pa...@bah.com>
wrote:
>>
>> > Hi, ****
>> >
>> > ** **
>> >
>> > I had trouble getting Accumulo to work on a VM instance of Ubuntu
>> > (11.04) using this guide: https://gist.github.com/1535657.****
>> >
>> > Does anyone have a step-by-step guide to get it running on either 
>> > Ubuntu or Windows 7?****
>> >
>> > ** **
>> >
>> > Thanks!****
>> >
>>

Re: [External] Re: Need help getting Accumulo running.

Posted by Eric Newton <er...@gmail.com>.
The call stack indicates that Accumulo cannot talk to the Hadoop NameNode.

Verify hadoop is up and running. Oh, and don't put any of your
Hadoop/ZooKeeper data in /tmp, which is cleaned upon a reboot.
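
A quick way to check both points, as a sketch (it assumes the single-node
layout from the tutorial quoted below, with configs under $HADOOP_HOME/conf):

$ jps                                   # should list NameNode and DataNode
$ $HADOOP_HOME/bin/start-dfs.sh         # start HDFS if those daemons are missing
$ $HADOOP_HOME/bin/hadoop fs -ls /      # succeeds only if the NameNode on port 54310 is reachable
# make sure none of the storage directories point into /tmp
$ grep -A1 'hadoop.tmp.dir\|dfs.name.dir\|dfs.data.dir' $HADOOP_HOME/conf/*-site.xml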

-Eric

On Mon, Jul 2, 2012 at 1:24 PM, Jim Klucar <kl...@gmail.com> wrote:
> Did you verify that zookeeper is running?
>
> On Mon, Jul 2, 2012 at 1:21 PM, Park, Jee [USA] <Pa...@bah.com> wrote:
>> Thanks everyone for the responses!
>>
>> So, I got hadoop to run and installed accumulo following Miguel's email, the
>> problem now is that when I do
>>
>> $ bin/accumulo init
>>
>> It tries to connect a few times and then times out. Here is what it prints
>> out.
>> Just to let you know I did not change anything in the accumulo-site.xml file
>>
>> Thanks,
>> Jee
>>
>> hduser@ubuntu:~/accumulo$ bin/accumulo init
>> 02 10:10:07,567 [ipc.Client] INFO : Retrying connect to server:
>> localhost/127.0.0.1:54310. Already tried 0 time(s).
>> 02 10:10:08,573 [ipc.Client] INFO : Retrying connect to server:
>> localhost/127.0.0.1:54310. Already tried 1 time(s).
>> 02 10:10:09,574 [ipc.Client] INFO : Retrying connect to server:
>> localhost/127.0.0.1:54310. Already tried 2 time(s).
>> 02 10:10:10,576 [ipc.Client] INFO : Retrying connect to server:
>> localhost/127.0.0.1:54310. Already tried 3 time(s).
>> 02 10:10:11,578 [ipc.Client] INFO : Retrying connect to server:
>> localhost/127.0.0.1:54310. Already tried 4 time(s).
>> 02 10:10:12,580 [ipc.Client] INFO : Retrying connect to server:
>> localhost/127.0.0.1:54310. Already tried 5 time(s).
>> 02 10:10:13,581 [ipc.Client] INFO : Retrying connect to server:
>> localhost/127.0.0.1:54310. Already tried 6 time(s).
>> 02 10:10:14,583 [ipc.Client] INFO : Retrying connect to server:
>> localhost/127.0.0.1:54310. Already tried 7 time(s).
>> 02 10:10:15,585 [ipc.Client] INFO : Retrying connect to server:
>> localhost/127.0.0.1:54310. Already tried 8 time(s).
>> 02 10:10:16,587 [ipc.Client] INFO : Retrying connect to server:
>> localhost/127.0.0.1:54310. Already tried 9 time(s).
>> 02 10:10:16,592 [util.Initialize] FATAL: java.net.ConnectException: Call to
>> localhost/127.0.0.1:54310 failed on connection exception:
>> java.net.ConnectException: Connection refused
>> java.net.ConnectException: Call to localhost/127.0.0.1:54310 failed on
>> connection exception: java.net.ConnectException: Connection refused
>> at org.apache.hadoop.ipc.Client.wrapException(Client.java:767)
>> at org.apache.hadoop.ipc.Client.call(Client.java:743)
>> at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
>> at $Proxy0.getProtocolVersion(Unknown Source)
>> at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
>> at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
>> at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
>> at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
>> at
>> org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSyste
>> m.java:82)
>> at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
>> at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
>> at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
>> at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
>> at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:95)
>> at org.apache.accumulo.core.file.FileUtil.getFileSystem(FileUtil.java:554)
>> at org.apache.accumulo.server.util.Initialize.main(Initialize.java:426)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57
>> )
>> at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl
>> .java:43)
>> at java.lang.reflect.Method.invoke(Method.java:601)
>> at org.apache.accumulo.start.Main$1.run(Main.java:89)
>> at java.lang.Thread.run(Thread.java:722)
>> Caused by: java.net.ConnectException: Connection refused
>> at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
>> at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:701)
>> at
>> org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:2
>> 06)
>> at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:404)
>> at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:304)
>> at org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:176)
>> at org.apache.hadoop.ipc.Client.getConnection(Client.java:860)
>> at org.apache.hadoop.ipc.Client.call(Client.java:720)
>> ... 20 more
>> Thread "init" died null
>> java.lang.reflect.InvocationTargetException
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57
>> )
>> at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl
>> .java:43)
>> at java.lang.reflect.Method.invoke(Method.java:601)
>> at org.apache.accumulo.start.Main$1.run(Main.java:89)
>> at java.lang.Thread.run(Thread.java:722)
>> Caused by: java.lang.RuntimeException: java.net.ConnectException: Call to
>> localhost/127.0.0.1:54310 failed on connection exception:
>> java.net.ConnectException: Connection refused
>> at org.apache.accumulo.server.util.Initialize.main(Initialize.java:436)
>> ... 6 more
>> Caused by: java.net.ConnectException: Call to localhost/127.0.0.1:54310
>> failed on connection exception: java.net.ConnectException: Connection
>> refused
>> at org.apache.hadoop.ipc.Client.wrapException(Client.java:767)
>> at org.apache.hadoop.ipc.Client.call(Client.java:743)
>> at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
>> at $Proxy0.getProtocolVersion(Unknown Source)
>> at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
>> at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
>> at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
>> at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
>> at
>> org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSyste
>> m.java:82)
>> at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
>> at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
>> at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
>> at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
>> at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:95)
>> at org.apache.accumulo.core.file.FileUtil.getFileSystem(FileUtil.java:554)
>> at org.apache.accumulo.server.util.Initialize.main(Initialize.java:426)
>> ... 6 more
>> Caused by: java.net.ConnectException: Connection refused
>> at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
>> at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:701)
>> at
>> org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:2
>> 06)
>> at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:404)
>> at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:304)
>> at org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:176)
>> at org.apache.hadoop.ipc.Client.getConnection(Client.java:860)
>> at org.apache.hadoop.ipc.Client.call(Client.java:720)
>> ... 20 more
>>
>> -----Original Message-----
>> From: Miguel Pereira [mailto:miguelapereira1@gmail.com]
>> Sent: Friday, June 29, 2012 2:59 PM
>> To: dev@accumulo.apache.org
>> Subject: [External] Re: Need help getting Accumulo running.
>>
>> Hi Jee,
>>
>> I used that same guide to install Accumulo, but I used this guide to install
>> hadoop.
>>
>> http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-
>> node-cluster/
>>
>> furthermore here are the steps I took to install accumulo were I used
>> version 1.4.0 and standalone conf.
>> please note you also need to install java jdk, and set your JAVA_HOME i used
>> jdk 1.7
>>
>> Setting up Accumulo
>>
>>
>>    - git clone     git://github.com/apache/accumulo.git
>>    - cd accumulo
>>    - git checkout     tags/1.4.0 -b 1.4.0
>>    - mvn package && mvn assembly:single -N.             // this can take a
>>    while
>>    - cp conf/examples/512MB/standalone/* conf
>>    - vi accumulo-env.sh
>>
>>
>> test -z "$JAVA_HOME" && export JAVA_HOME=/home/hduser/pkg/jdk1.7.0_04
>> test -z "$HADOOP_HOME" && export
>> HADOOP_HOME=/home/hduser/developer/workspace/hadoop
>> test -z "$ZOOKEEPER_HOME" && export
>> ZOOKEEPER_HOME=/home/hduser/developer/workspace/zookeeper-3.3.5
>>
>>    - vi     accumulo-site.xml
>>
>>
>>     modify user, password, secret, memory
>>
>>
>>    - bin/accumulo     init
>>    - bin/start-all.sh
>>    - bin/accumulo     shell -u root
>>
>> if you get the shell up you know your good.
>>
>>
>> On Fri, Jun 29, 2012 at 2:49 PM, John Vines <jo...@ugov.gov> wrote:
>>
>>> We currently don't really support running on Windows. I'm sure there
>>> are ways to get it running with Cygwin, but our efforts are better
>>> spend in other directions for now.
>>>
>>> As for getting it going in Ubuntu, I haven't seen that guide before.
>>> Can you let me know where it broke?
>>>
>>> For the record, when I was developing ACCUMULO-404, I was working in
>>> Ubuntu VMs and I used Apache-BigTop and our debians to facilitate
>> installation.
>>> They don't do everything for you, but I think if you use 1.4.1 (not
>>> sure if I got the debs into 1..4.0), it should diminish the
>>> installation work you must do to some minor configuration.
>>>
>>> John
>>>
>>> On Fri, Jun 29, 2012 at 2:28 PM, Park, Jee [USA] <Pa...@bah.com> wrote:
>>>
>>> > Hi, ****
>>> >
>>> > ** **
>>> >
>>> > I had trouble getting Accumulo to work on a VM instance of Ubuntu
>>> > (11.04) using this guide: https://gist.github.com/1535657.****
>>> >
>>> > Does anyone have a step-by-step guide to get it running on either
>>> > Ubuntu or Windows 7?****
>>> >
>>> > ** **
>>> >
>>> > Thanks!****
>>> >
>>>

Re: [External] Re: Need help getting Accumulo running.

Posted by Miguel Pereira <mi...@gmail.com>.
Jee, I think I got that error before. A few things:

1) what Jim said
2) make sure that your environment variables are set up properly; in my
example the HOME directories might not match up with yours
3) make sure your hadoop file hdfs-site.xml in the hadoop/conf dir has the
name and data directories set up. This screwed me up before:

<property>
  <name>dfs.name.dir</name>
  <value>/home/blue/dfs/name</value>
  <description>
  </description>
</property>

<property>
  <name>dfs.data.dir</name>
  <value>/home/blue/dfs/data</value>
  <description>
  </description>
</property>

The directories might not match your environment; you can basically use
whatever you want.
Then reformat your name node and restart accumulo.
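
In command form, that last step might look like the following sketch (note
that formatting wipes anything already stored in HDFS, so only do it on a
scratch install; paths assume the tutorial setup quoted below and the
~/accumulo checkout from the earlier logs):

$ $HADOOP_HOME/bin/stop-all.sh
$ $HADOOP_HOME/bin/hadoop namenode -format
$ $HADOOP_HOME/bin/start-all.sh
$ cd ~/accumulo
$ bin/accumulo init          # rerun init now that the NameNode is reachable
$ bin/start-all.sh           # Accumulo's start script, not Hadoop's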

hope that helps
- Miguel

On Mon, Jul 2, 2012 at 1:24 PM, Jim Klucar <kl...@gmail.com> wrote:

> Did you verify that zookeeper is running?
>
> On Mon, Jul 2, 2012 at 1:21 PM, Park, Jee [USA] <Pa...@bah.com> wrote:
> > Thanks everyone for the responses!
> >
> > So, I got hadoop to run and installed accumulo following Miguel's email,
> the
> > problem now is that when I do
> >
> > $ bin/accumulo init
> >
> > It tries to connect a few times and then times out. Here is what it
> prints
> > out.
> > Just to let you know I did not change anything in the accumulo-site.xml
> file
> >
> > Thanks,
> > Jee
> >
> > hduser@ubuntu:~/accumulo$ bin/accumulo init
> > 02 10:10:07,567 [ipc.Client] INFO : Retrying connect to server:
> > localhost/127.0.0.1:54310. Already tried 0 time(s).
> > 02 10:10:08,573 [ipc.Client] INFO : Retrying connect to server:
> > localhost/127.0.0.1:54310. Already tried 1 time(s).
> > 02 10:10:09,574 [ipc.Client] INFO : Retrying connect to server:
> > localhost/127.0.0.1:54310. Already tried 2 time(s).
> > 02 10:10:10,576 [ipc.Client] INFO : Retrying connect to server:
> > localhost/127.0.0.1:54310. Already tried 3 time(s).
> > 02 10:10:11,578 [ipc.Client] INFO : Retrying connect to server:
> > localhost/127.0.0.1:54310. Already tried 4 time(s).
> > 02 10:10:12,580 [ipc.Client] INFO : Retrying connect to server:
> > localhost/127.0.0.1:54310. Already tried 5 time(s).
> > 02 10:10:13,581 [ipc.Client] INFO : Retrying connect to server:
> > localhost/127.0.0.1:54310. Already tried 6 time(s).
> > 02 10:10:14,583 [ipc.Client] INFO : Retrying connect to server:
> > localhost/127.0.0.1:54310. Already tried 7 time(s).
> > 02 10:10:15,585 [ipc.Client] INFO : Retrying connect to server:
> > localhost/127.0.0.1:54310. Already tried 8 time(s).
> > 02 10:10:16,587 [ipc.Client] INFO : Retrying connect to server:
> > localhost/127.0.0.1:54310. Already tried 9 time(s).
> > 02 10:10:16,592 [util.Initialize] FATAL: java.net.ConnectException: Call
> to
> > localhost/127.0.0.1:54310 failed on connection exception:
> > java.net.ConnectException: Connection refused
> > java.net.ConnectException: Call to localhost/127.0.0.1:54310 failed on
> > connection exception: java.net.ConnectException: Connection refused
> > at org.apache.hadoop.ipc.Client.wrapException(Client.java:767)
> > at org.apache.hadoop.ipc.Client.call(Client.java:743)
> > at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
> > at $Proxy0.getProtocolVersion(Unknown Source)
> > at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
> > at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
> > at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
> > at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
> > at
> >
> org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSyste
> > m.java:82)
> > at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
> > at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
> > at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
> > at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
> > at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:95)
> > at
> org.apache.accumulo.core.file.FileUtil.getFileSystem(FileUtil.java:554)
> > at org.apache.accumulo.server.util.Initialize.main(Initialize.java:426)
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > at
> >
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57
> > )
> > at
> >
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl
> > .java:43)
> > at java.lang.reflect.Method.invoke(Method.java:601)
> > at org.apache.accumulo.start.Main$1.run(Main.java:89)
> > at java.lang.Thread.run(Thread.java:722)
> > Caused by: java.net.ConnectException: Connection refused
> > at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
> > at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:701)
> > at
> >
> org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:2
> > 06)
> > at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:404)
> > at
> org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:304)
> > at org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:176)
> > at org.apache.hadoop.ipc.Client.getConnection(Client.java:860)
> > at org.apache.hadoop.ipc.Client.call(Client.java:720)
> > ... 20 more
> > Thread "init" died null
> > java.lang.reflect.InvocationTargetException
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > at
> >
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57
> > )
> > at
> >
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl
> > .java:43)
> > at java.lang.reflect.Method.invoke(Method.java:601)
> > at org.apache.accumulo.start.Main$1.run(Main.java:89)
> > at java.lang.Thread.run(Thread.java:722)
> > Caused by: java.lang.RuntimeException: java.net.ConnectException: Call to
> > localhost/127.0.0.1:54310 failed on connection exception:
> > java.net.ConnectException: Connection refused
> > at org.apache.accumulo.server.util.Initialize.main(Initialize.java:436)
> > ... 6 more
> > Caused by: java.net.ConnectException: Call to localhost/127.0.0.1:54310
> > failed on connection exception: java.net.ConnectException: Connection
> > refused
> > at org.apache.hadoop.ipc.Client.wrapException(Client.java:767)
> > at org.apache.hadoop.ipc.Client.call(Client.java:743)
> > at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
> > at $Proxy0.getProtocolVersion(Unknown Source)
> > at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
> > at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
> > at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
> > at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
> > at
> >
> org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSyste
> > m.java:82)
> > at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
> > at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
> > at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
> > at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
> > at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:95)
> > at
> org.apache.accumulo.core.file.FileUtil.getFileSystem(FileUtil.java:554)
> > at org.apache.accumulo.server.util.Initialize.main(Initialize.java:426)
> > ... 6 more
> > Caused by: java.net.ConnectException: Connection refused
> > at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
> > at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:701)
> > at
> >
> org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:2
> > 06)
> > at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:404)
> > at
> org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:304)
> > at org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:176)
> > at org.apache.hadoop.ipc.Client.getConnection(Client.java:860)
> > at org.apache.hadoop.ipc.Client.call(Client.java:720)
> > ... 20 more
> >
> > -----Original Message-----
> > From: Miguel Pereira [mailto:miguelapereira1@gmail.com]
> > Sent: Friday, June 29, 2012 2:59 PM
> > To: dev@accumulo.apache.org
> > Subject: [External] Re: Need help getting Accumulo running.
> >
> > Hi Jee,
> >
> > I used that same guide to install Accumulo, but I used this guide to
> install
> > hadoop.
> >
> >
> http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-
> > node-cluster/
> >
> > furthermore here are the steps I took to install accumulo were I used
> > version 1.4.0 and standalone conf.
> > please note you also need to install java jdk, and set your JAVA_HOME i
> used
> > jdk 1.7
> >
> > Setting up Accumulo
> >
> >
> >    - git clone     git://github.com/apache/accumulo.git
> >    - cd accumulo
> >    - git checkout     tags/1.4.0 -b 1.4.0
> >    - mvn package && mvn assembly:single -N.             // this can take
> a
> >    while
> >    - cp conf/examples/512MB/standalone/* conf
> >    - vi accumulo-env.sh
> >
> >
> > test -z "$JAVA_HOME" && export JAVA_HOME=/home/hduser/pkg/jdk1.7.0_04
> > test -z "$HADOOP_HOME" && export
> > HADOOP_HOME=/home/hduser/developer/workspace/hadoop
> > test -z "$ZOOKEEPER_HOME" && export
> > ZOOKEEPER_HOME=/home/hduser/developer/workspace/zookeeper-3.3.5
> >
> >    - vi     accumulo-site.xml
> >
> >
> >     modify user, password, secret, memory
> >
> >
> >    - bin/accumulo     init
> >    - bin/start-all.sh
> >    - bin/accumulo     shell -u root
> >
> > if you get the shell up you know your good.
> >
> >
> > On Fri, Jun 29, 2012 at 2:49 PM, John Vines <jo...@ugov.gov>
> wrote:
> >
> >> We currently don't really support running on Windows. I'm sure there
> >> are ways to get it running with Cygwin, but our efforts are better
> >> spend in other directions for now.
> >>
> >> As for getting it going in Ubuntu, I haven't seen that guide before.
> >> Can you let me know where it broke?
> >>
> >> For the record, when I was developing ACCUMULO-404, I was working in
> >> Ubuntu VMs and I used Apache-BigTop and our debians to facilitate
> > installation.
> >> They don't do everything for you, but I think if you use 1.4.1 (not
> >> sure if I got the debs into 1..4.0), it should diminish the
> >> installation work you must do to some minor configuration.
> >>
> >> John
> >>
> >> On Fri, Jun 29, 2012 at 2:28 PM, Park, Jee [USA] <Pa...@bah.com>
> wrote:
> >>
> >> > Hi, ****
> >> >
> >> > ** **
> >> >
> >> > I had trouble getting Accumulo to work on a VM instance of Ubuntu
> >> > (11.04) using this guide: https://gist.github.com/1535657.****
> >> >
> >> > Does anyone have a step-by-step guide to get it running on either
> >> > Ubuntu or Windows 7?****
> >> >
> >> > ** **
> >> >
> >> > Thanks!****
> >> >
> >>
>

Re: [External] Re: Need help getting Accumulo running.

Posted by Jim Klucar <kl...@gmail.com>.
Did you verify that zookeeper is running?
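
A couple of quick checks, as a sketch (assuming ZooKeeper's default client
port 2181):

$ jps | grep QuorumPeerMain               # the ZooKeeper server process
$ echo ruok | nc localhost 2181           # a healthy server answers "imok"
$ $ZOOKEEPER_HOME/bin/zkServer.sh status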

On Mon, Jul 2, 2012 at 1:21 PM, Park, Jee [USA] <Pa...@bah.com> wrote:
> Thanks everyone for the responses!
>
> So, I got hadoop to run and installed accumulo following Miguel's email, the
> problem now is that when I do
>
> $ bin/accumulo init
>
> It tries to connect a few times and then times out. Here is what it prints
> out.
> Just to let you know I did not change anything in the accumulo-site.xml file
>
> Thanks,
> Jee
>
> hduser@ubuntu:~/accumulo$ bin/accumulo init
> 02 10:10:07,567 [ipc.Client] INFO : Retrying connect to server:
> localhost/127.0.0.1:54310. Already tried 0 time(s).
> 02 10:10:08,573 [ipc.Client] INFO : Retrying connect to server:
> localhost/127.0.0.1:54310. Already tried 1 time(s).
> 02 10:10:09,574 [ipc.Client] INFO : Retrying connect to server:
> localhost/127.0.0.1:54310. Already tried 2 time(s).
> 02 10:10:10,576 [ipc.Client] INFO : Retrying connect to server:
> localhost/127.0.0.1:54310. Already tried 3 time(s).
> 02 10:10:11,578 [ipc.Client] INFO : Retrying connect to server:
> localhost/127.0.0.1:54310. Already tried 4 time(s).
> 02 10:10:12,580 [ipc.Client] INFO : Retrying connect to server:
> localhost/127.0.0.1:54310. Already tried 5 time(s).
> 02 10:10:13,581 [ipc.Client] INFO : Retrying connect to server:
> localhost/127.0.0.1:54310. Already tried 6 time(s).
> 02 10:10:14,583 [ipc.Client] INFO : Retrying connect to server:
> localhost/127.0.0.1:54310. Already tried 7 time(s).
> 02 10:10:15,585 [ipc.Client] INFO : Retrying connect to server:
> localhost/127.0.0.1:54310. Already tried 8 time(s).
> 02 10:10:16,587 [ipc.Client] INFO : Retrying connect to server:
> localhost/127.0.0.1:54310. Already tried 9 time(s).
> 02 10:10:16,592 [util.Initialize] FATAL: java.net.ConnectException: Call to
> localhost/127.0.0.1:54310 failed on connection exception:
> java.net.ConnectException: Connection refused
> java.net.ConnectException: Call to localhost/127.0.0.1:54310 failed on
> connection exception: java.net.ConnectException: Connection refused
> at org.apache.hadoop.ipc.Client.wrapException(Client.java:767)
> at org.apache.hadoop.ipc.Client.call(Client.java:743)
> at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
> at $Proxy0.getProtocolVersion(Unknown Source)
> at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
> at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
> at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
> at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
> at
> org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSyste
> m.java:82)
> at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
> at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
> at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
> at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
> at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:95)
> at org.apache.accumulo.core.file.FileUtil.getFileSystem(FileUtil.java:554)
> at org.apache.accumulo.server.util.Initialize.main(Initialize.java:426)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57
> )
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl
> .java:43)
> at java.lang.reflect.Method.invoke(Method.java:601)
> at org.apache.accumulo.start.Main$1.run(Main.java:89)
> at java.lang.Thread.run(Thread.java:722)
> Caused by: java.net.ConnectException: Connection refused
> at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
> at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:701)
> at
> org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:2
> 06)
> at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:404)
> at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:304)
> at org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:176)
> at org.apache.hadoop.ipc.Client.getConnection(Client.java:860)
> at org.apache.hadoop.ipc.Client.call(Client.java:720)
> ... 20 more
> Thread "init" died null
> java.lang.reflect.InvocationTargetException
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57
> )
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl
> .java:43)
> at java.lang.reflect.Method.invoke(Method.java:601)
> at org.apache.accumulo.start.Main$1.run(Main.java:89)
> at java.lang.Thread.run(Thread.java:722)
> Caused by: java.lang.RuntimeException: java.net.ConnectException: Call to
> localhost/127.0.0.1:54310 failed on connection exception:
> java.net.ConnectException: Connection refused
> at org.apache.accumulo.server.util.Initialize.main(Initialize.java:436)
> ... 6 more
> Caused by: java.net.ConnectException: Call to localhost/127.0.0.1:54310
> failed on connection exception: java.net.ConnectException: Connection
> refused
> at org.apache.hadoop.ipc.Client.wrapException(Client.java:767)
> at org.apache.hadoop.ipc.Client.call(Client.java:743)
> at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
> at $Proxy0.getProtocolVersion(Unknown Source)
> at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
> at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
> at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
> at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
> at
> org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSyste
> m.java:82)
> at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
> at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
> at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
> at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
> at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:95)
> at org.apache.accumulo.core.file.FileUtil.getFileSystem(FileUtil.java:554)
> at org.apache.accumulo.server.util.Initialize.main(Initialize.java:426)
> ... 6 more
> Caused by: java.net.ConnectException: Connection refused
> at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
> at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:701)
> at
> org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:2
> 06)
> at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:404)
> at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:304)
> at org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:176)
> at org.apache.hadoop.ipc.Client.getConnection(Client.java:860)
> at org.apache.hadoop.ipc.Client.call(Client.java:720)
> ... 20 more
>
> -----Original Message-----
> From: Miguel Pereira [mailto:miguelapereira1@gmail.com]
> Sent: Friday, June 29, 2012 2:59 PM
> To: dev@accumulo.apache.org
> Subject: [External] Re: Need help getting Accumulo running.
>
> Hi Jee,
>
> I used that same guide to install Accumulo, but I used this guide to install
> hadoop.
>
> http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-
> node-cluster/
>
> furthermore here are the steps I took to install accumulo were I used
> version 1.4.0 and standalone conf.
> please note you also need to install java jdk, and set your JAVA_HOME i used
> jdk 1.7
>
> Setting up Accumulo
>
>
>    - git clone     git://github.com/apache/accumulo.git
>    - cd accumulo
>    - git checkout     tags/1.4.0 -b 1.4.0
>    - mvn package && mvn assembly:single -N.             // this can take a
>    while
>    - cp conf/examples/512MB/standalone/* conf
>    - vi accumulo-env.sh
>
>
> test -z "$JAVA_HOME" && export JAVA_HOME=/home/hduser/pkg/jdk1.7.0_04
> test -z "$HADOOP_HOME" && export
> HADOOP_HOME=/home/hduser/developer/workspace/hadoop
> test -z "$ZOOKEEPER_HOME" && export
> ZOOKEEPER_HOME=/home/hduser/developer/workspace/zookeeper-3.3.5
>
>    - vi     accumulo-site.xml
>
>
>     modify user, password, secret, memory
>
>
>    - bin/accumulo     init
>    - bin/start-all.sh
>    - bin/accumulo     shell -u root
>
> if you get the shell up you know your good.
>
>
> On Fri, Jun 29, 2012 at 2:49 PM, John Vines <jo...@ugov.gov> wrote:
>
>> We currently don't really support running on Windows. I'm sure there
>> are ways to get it running with Cygwin, but our efforts are better
>> spend in other directions for now.
>>
>> As for getting it going in Ubuntu, I haven't seen that guide before.
>> Can you let me know where it broke?
>>
>> For the record, when I was developing ACCUMULO-404, I was working in
>> Ubuntu VMs and I used Apache-BigTop and our debians to facilitate
> installation.
>> They don't do everything for you, but I think if you use 1.4.1 (not
>> sure if I got the debs into 1..4.0), it should diminish the
>> installation work you must do to some minor configuration.
>>
>> John
>>
>> On Fri, Jun 29, 2012 at 2:28 PM, Park, Jee [USA] <Pa...@bah.com> wrote:
>>
>> > Hi, ****
>> >
>> > ** **
>> >
>> > I had trouble getting Accumulo to work on a VM instance of Ubuntu
>> > (11.04) using this guide: https://gist.github.com/1535657.****
>> >
>> > Does anyone have a step-by-step guide to get it running on either
>> > Ubuntu or Windows 7?****
>> >
>> > ** **
>> >
>> > Thanks!****
>> >
>>