Posted to mapreduce-user@hadoop.apache.org by Vicky Kak <vi...@gmail.com> on 2014/03/24 11:02:45 UTC

Fwd: Setting Hadoop on LinuxContainers Fails.

Hi All,

I am using linuxcontainers (http://linuxcontainers.org/) to configure a Hadoop
cluster for testing.
I have created two Linux application containers, called hadoop1 and hadoop2.
The IP associated with hadoop1 is 10.0.3.200 and with hadoop2 is 10.0.3.201.

I am able to start the NameNode on 10.0.3.200, but when I try to start the
DataNode on 10.0.3.201 I see the following error on 10.0.3.201:

****************************************************************************************
$ hdfs datanode
14/03/24 09:30:57 INFO datanode.DataNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG:   host = Hadoop2/10.0.3.148
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 2.2.0
STARTUP_MSG:   classpath =
/home/ubuntu/Installed/hadoop-2.2.0/etc/hadoop:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/servlet-api-2.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-el-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-logging-1.1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/mockito-all-1.8.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/log4j-1.2.17.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jersey-server-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jsr305-1.3.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jackson-mapper-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jackson-jaxrs-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/guava-11.0.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-collections-3.2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-codec-1.4.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/paranamer-2.3.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/zookeeper-3.4.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jersey-core-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jersey-json-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jettison-1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/activation-1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jets3t-0.6.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/avro-1.7.4.jar:/home
/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-httpclient-3.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/xz-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-math-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jetty-util-6.1.26.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-lang-2.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-configuration-1.6.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/netty-3.6.2.Final.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/asm-3.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/junit-4.8.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-cli-1.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jsch-0.1.42.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jackson-xc-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-io-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jsp-api-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jetty-6.1.26.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/hadoop-auth-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-compress-1.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/hadoop-annotations-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-digester-1.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share
/hadoop/common/lib/jackson-core-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-net-3.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/stax-api-1.0.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/xmlenc-0.52.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/hadoop-nfs-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/hadoop-common-2.2.0-tests.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/hadoop-common-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-el-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-logging-1.1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jackson-mapper-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/guava-11.0.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-lang-2.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jasper-runtime-5.5.23.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/asm-3.
2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-io-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jackson-core-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/hadoop-hdfs-nfs-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/hadoop-hdfs-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/hadoop-hdfs-2.2.0-tests.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/javax.inject-1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/log4j-1.2.17.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jersey-server-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/hamcrest-core-1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/junit-4.10.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jackson-mapper-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/paranamer-2.3.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jersey-core-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/avro-1.7.4.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/aopalliance-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/xz-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/guice-3.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/
lib/asm-3.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/commons-io-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/snappy-java-1.0.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/hadoop-annotations-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jackson-core-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-client-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-common-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-tests-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-api-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-site-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-common-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/javax.inject-1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/hamcrest-core-1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/junit-4.10.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/
jackson-mapper-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/xz-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/guice-3.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/asm-3.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/commons-io-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/hadoop-annotations-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jackson-core-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2
.0/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.2.0-tests.jar:/home/ubuntu/Installed/hadoop-2.2.0/contrib/capacity-scheduler/*.jar
STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common -r
1529768; compiled by 'hortonmu' on 2013-10-07T06:28Z
STARTUP_MSG:   java = 1.7.0
************************************************************/
14/03/24 09:30:57 INFO datanode.DataNode: registered UNIX signal handlers
for [TERM, HUP, INT]
14/03/24 09:30:57 WARN common.Util: Path
/home/ubuntu/dallaybatta-data/hdfs/datanode should be specified as a URI in
configuration files. Please update hdfs configuration.
14/03/24 09:30:58 INFO impl.MetricsConfig: loaded properties from
hadoop-metrics2.properties
14/03/24 09:30:58 INFO impl.MetricsSystemImpl: Scheduled snapshot period at
10 second(s).
14/03/24 09:30:58 INFO impl.MetricsSystemImpl: DataNode metrics system
started
14/03/24 09:30:58 INFO datanode.DataNode: Configured hostname is Hadoop2
14/03/24 09:30:58 INFO datanode.DataNode: Opened streaming server at /
0.0.0.0:50010
14/03/24 09:30:58 INFO datanode.DataNode: Balancing bandwith is 1048576
bytes/s
14/03/24 09:30:58 INFO mortbay.log: Logging to
org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via
org.mortbay.log.Slf4jLog
14/03/24 09:30:58 INFO http.HttpServer: Added global filter 'safety'
(class=org.apache.hadoop.http.HttpServer$QuotingInputFilter)
14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
(class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
context datanode
14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
(class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
context logs
14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
(class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
context static
14/03/24 09:30:58 INFO datanode.DataNode: Opened info server at
localhost:50075
14/03/24 09:30:58 INFO datanode.DataNode: dfs.webhdfs.enabled = false
14/03/24 09:30:58 INFO http.HttpServer: Jetty bound to port 50075
14/03/24 09:30:58 INFO mortbay.log: jetty-6.1.26
14/03/24 09:30:59 INFO mortbay.log: Started SelectChannelConnector@localhost
:50075
14/03/24 09:30:59 INFO ipc.Server: Starting Socket Reader #1 for port 50020
14/03/24 09:30:59 INFO datanode.DataNode: Opened IPC server at /
0.0.0.0:50020
14/03/24 09:30:59 INFO datanode.DataNode: Refresh request received for
nameservices: null
14/03/24 09:30:59 INFO datanode.DataNode: Starting BPOfferServices for
nameservices: <default>
14/03/24 09:30:59 WARN common.Util: Path
/home/ubuntu/dallaybatta-data/hdfs/datanode should be specified as a URI in
configuration files. Please update hdfs configuration.
14/03/24 09:30:59 INFO datanode.DataNode: Block pool <registering> (storage
id unknown) service to /10.0.3.200:9000 starting to offer service
14/03/24 09:30:59 INFO ipc.Server: IPC Server Responder: starting
14/03/24 09:30:59 INFO ipc.Server: IPC Server listener on 50020: starting
14/03/24 09:30:59 INFO common.Storage: Lock on
/home/ubuntu/dallaybatta-data/hdfs/datanode/in_use.lock acquired by
nodename 2618@Hadoop2
14/03/24 09:31:00 INFO common.Storage: Locking is disabled
14/03/24 09:31:00 INFO datanode.DataNode: Setting up storage:
nsid=1367523242;bpid=BP-1489452897-10.0.3.253-1395650301038;lv=-47;nsInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0;bpid=BP-1489452897-10.0.3.253-1395650301038
14/03/24 09:31:00 INFO impl.FsDatasetImpl: Added volume -
/home/ubuntu/dallaybatta-data/hdfs/datanode/current
14/03/24 09:31:00 INFO impl.FsDatasetImpl: Registered FSDatasetState MBean
14/03/24 09:31:00 INFO datanode.DirectoryScanner: Periodic Directory Tree
Verification scan starting at 1395674259100 with interval 21600000
14/03/24 09:31:00 INFO impl.FsDatasetImpl: Adding block pool
BP-1489452897-10.0.3.253-1395650301038
14/03/24 09:31:00 INFO impl.FsDatasetImpl: Scanning block pool
BP-1489452897-10.0.3.253-1395650301038 on volume
/home/ubuntu/dallaybatta-data/hdfs/datanode/current...
14/03/24 09:31:00 INFO impl.FsDatasetImpl: Time taken to scan block pool
BP-1489452897-10.0.3.253-1395650301038 on
/home/ubuntu/dallaybatta-data/hdfs/datanode/current: 11ms
14/03/24 09:31:00 INFO impl.FsDatasetImpl: Total time to scan all replicas
for block pool BP-1489452897-10.0.3.253-1395650301038: 13ms
14/03/24 09:31:00 INFO impl.FsDatasetImpl: Adding replicas to map for block
pool BP-1489452897-10.0.3.253-1395650301038 on volume
/home/ubuntu/dallaybatta-data/hdfs/datanode/current...
14/03/24 09:31:00 INFO impl.FsDatasetImpl: Time to add replicas to map for
block pool BP-1489452897-10.0.3.253-1395650301038 on volume
/home/ubuntu/dallaybatta-data/hdfs/datanode/current: 0ms
14/03/24 09:31:00 INFO impl.FsDatasetImpl: Total time to add all replicas
to map: 1ms
14/03/24 09:31:00 INFO datanode.DataNode: Block pool
BP-1489452897-10.0.3.253-1395650301038 (storage id
DS-1380795562-10.0.3.201-50010-1395650455122) service to
/10.0.3.200:9000 beginning handshake with NN
14/03/24 09:31:00 FATAL datanode.DataNode: Initialization failed for block
pool Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id
DS-1380795562-10.0.3.201-50010-1395650455122) service to /10.0.3.200:9000
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException):
Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
ipcPort=50020,
storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
    at
org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
    at
org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3929)
    at
org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
    at
org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
    at
org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
    at
org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)

    at org.apache.hadoop.ipc.Client.call(Client.java:1347)
    at org.apache.hadoop.ipc.Client.call(Client.java:1300)
    at
org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
    at $Proxy9.registerDatanode(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at
org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
    at
org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at $Proxy9.registerDatanode(Unknown Source)
    at
org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.registerDatanode(DatanodeProtocolClientSideTranslatorPB.java:146)
    at
org.apache.hadoop.hdfs.server.datanode.BPServiceActor.register(BPServiceActor.java:623)
    at
org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:225)
    at
org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:664)
    at java.lang.Thread.run(Thread.java:722)
14/03/24 09:31:00 WARN datanode.DataNode: Ending block pool service for:
Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id
DS-1380795562-10.0.3.201-50010-1395650455122) service to /10.0.3.200:9000
14/03/24 09:31:00 INFO datanode.DataNode: Removed Block pool
BP-1489452897-10.0.3.253-1395650301038 (storage id
DS-1380795562-10.0.3.201-50010-1395650455122)
14/03/24 09:31:00 INFO datanode.DataBlockScanner: Removed
bpid=BP-1489452897-10.0.3.253-1395650301038 from blockPoolScannerMap
14/03/24 09:31:00 INFO impl.FsDatasetImpl: Removing block pool
BP-1489452897-10.0.3.253-1395650301038
14/03/24 09:31:02 WARN datanode.DataNode: Exiting Datanode
14/03/24 09:31:02 INFO util.ExitUtil: Exiting with status 0
14/03/24 09:31:02 INFO datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at Hadoop2/10.0.3.148
************************************************************/

****************************************************************************************


And here is the corresponding error at the NameNode (10.0.3.200):

****************************************************************************************
14/03/24 09:31:00 WARN blockmanagement.DatanodeManager: Unresolved datanode
registration from 10.0.3.201
14/03/24 09:31:00 ERROR security.UserGroupInformation:
PriviledgedActionException as:ubuntu (auth:SIMPLE)
cause:org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
ipcPort=50020,
storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
14/03/24 09:31:00 INFO ipc.Server: IPC Server handler 3 on 9000, call
org.apache.hadoop.hdfs.server.protocol.DatanodeProtocol.registerDatanode
from 10.0.3.201:60951 Call#1 Retry#0: error:
org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
ipcPort=50020,
storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
ipcPort=50020,
storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
    at
org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
    at
org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3929)
    at
org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
    at
org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
    at
org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
    at
org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
****************************************************************************************

I don't know where the 10.0.3.148 IP is coming from yet; it could be due to
some lxc configuration. What can be interpreted from the Hadoop error
information?
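For reference, the NameNode's "Unresolved datanode registration" warning in the log above generally means it could not consistently resolve the DataNode's address and hostname. A quick generic sanity check of forward and reverse resolution (this sketch is not from the thread; the hostname argument is a placeholder — run it on each container with its own hostname):

```python
import socket

def check_resolution(hostname):
    """Forward-resolve hostname, then reverse-resolve the resulting IP.

    A mismatch (e.g. Hadoop2 resolving to 10.0.3.148 instead of the
    expected 10.0.3.201) is the kind of inconsistency that makes the
    NameNode reject a DataNode registration."""
    ip = socket.gethostbyname(hostname)
    try:
        reverse_name, _, _ = socket.gethostbyaddr(ip)
    except socket.herror:
        reverse_name = None  # no reverse (PTR) mapping exists
    return ip, reverse_name

# e.g. check_resolution("Hadoop2") on the datanode container
print(check_resolution("localhost"))
```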

Let me know if you need more info about my environment to provide some
insights.

Regards,
Vicky

Re: Setting Hadoop on LinuxContainers Fails.

Posted by Vicky Kak <vi...@gmail.com>.
Hi Mingjiang,


I partially resolved the issue after spending a day on it.

The issue was caused by multiple IPs getting assigned to the linux container
when cloning the existing container. I manually made changes in the container
configuration file; check this:

sudo cat /var/lib/lxc/Hadoop2/config
lxc.network.type = veth
lxc.network.link = lxcbr0
lxc.network.flags = up
lxc.network.hwaddr = 00:16:3e:cd:ca:09
lxc.devttydir = lxc
lxc.tty = 4
lxc.pts = 1024
lxc.arch = i686
lxc.cap.drop = sys_module mac_admin
lxc.pivotdir = lxc_putold
lxc.cgroup.memory.limit_in_bytes = 3990M
lxc.network.ipv4 = 10.0.3.148  (it was 10.0.3.201)
lxc.utsname = Hadoop2
lxc.mount = /var/lib/lxc/Hadoop2/fstab
lxc.rootfs = /var/lib/lxc/Hadoop2/rootfs

I modified lxc.network.ipv4 to the same IP that I was seeing in the
SHUTDOWN_MSG:

14/03/24 09:31:02 INFO datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at Hadoop2/10.0.3.148
************************************************************/

Needless to say, I also had to update the Hadoop config files
(core-site.xml/hdfs-site.xml) appropriately.
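For anyone following along, the DataNode log's warning ("Path ... should be specified as a URI") also goes away when the data directory is given with a file:// scheme. A minimal sketch of the relevant entries, using the NameNode address and path that appear in this thread (exact values depend on your setup):

```xml
<!-- core-site.xml (both containers): point clients at the NameNode -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://10.0.3.200:9000</value>
  </property>
</configuration>

<!-- hdfs-site.xml (on the DataNode): data dir as a URI, not a bare path -->
<configuration>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>file:///home/ubuntu/dallaybatta-data/hdfs/datanode</value>
  </property>
</configuration>
```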

Regards,
Vicky


On Tue, Mar 25, 2014 at 8:07 AM, Mingjiang Shi <ms...@gopivotal.com> wrote:

> Hi Vicky,
> Do you use DHCP or assign IP addresses statically to the containers?
> I suggest you assign static IP addresses to the containers instead of
> using DHCP.
>
>
>
> On Mon, Mar 24, 2014 at 11:19 PM, Vicky Kak <vi...@gmail.com> wrote:
>
>> Yep, they can see each other and the outside world.
>> My issue seems to be coming from the cached IP in
>>
>> /var/lib/misc/dnsmasq.lxcbr0.leases
>>
>>
>>
>>
>>
>>
>> On Mon, Mar 24, 2014 at 7:50 PM, Jay Vyas <ja...@gmail.com> wrote:
>>
>>> Are your Linux containers networked properly (i.e. can they see each
>>> other, and the outside world, etc.)?
>>> www.linux.org/threads/linux-containers-part-4-getting-to-the-universe-ping-google-com.4428/
>>>
>>>
>>> On Mon, Mar 24, 2014 at 6:02 AM, Vicky Kak <vi...@gmail.com> wrote:
>>>
>>>> Hi All,
>>>>
>>>> I am using linuxcontainers (http://linuxcontainers.org/) to configure a
>>>> Hadoop cluster for testing.
>>>> I have created two Linux application containers, called hadoop1 and
>>>> hadoop2. The IP associated with hadoop1 is 10.0.3.200 and with hadoop2
>>>> is 10.0.3.201.
>>>>
>>>> I am able to start the NameNode on 10.0.3.200, but when I try to start
>>>> the DataNode on 10.0.3.201 I see the following error at 10.0.3.201:
>>>>
>>>>
>>>> ****************************************************************************************
>>>> $ hdfs datanode
>>>> 14/03/24 09:30:57 INFO datanode.DataNode: STARTUP_MSG:
>>>> /************************************************************
>>>> STARTUP_MSG: Starting DataNode
>>>> STARTUP_MSG:   host = Hadoop2/10.0.3.148
>>>> STARTUP_MSG:   args = []
>>>> STARTUP_MSG:   version = 2.2.0
>>>> STARTUP_MSG:   classpath =
>>>> /home/ubuntu/Installed/hadoop-2.2.0/etc/hadoop:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/servlet-api-2.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-el-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-logging-1.1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/mockito-all-1.8.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/log4j-1.2.17.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jersey-server-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jsr305-1.3.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jackson-mapper-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jackson-jaxrs-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/guava-11.0.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-collections-3.2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-codec-1.4.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/paranamer-2.3.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/zookeeper-3.4.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jersey-core-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jersey-json-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jettison-1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/activation-1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jets3t-0.6.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/avro-1.7.4.jar:
/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-httpclient-3.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/xz-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-math-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jetty-util-6.1.26.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-lang-2.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-configuration-1.6.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/netty-3.6.2.Final.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/asm-3.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/junit-4.8.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-cli-1.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jsch-0.1.42.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jackson-xc-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-io-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jsp-api-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jetty-6.1.26.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/hadoop-auth-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-compress-1.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/hadoop-annotations-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-digester-1.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/
share/hadoop/common/lib/jackson-core-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-net-3.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/stax-api-1.0.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/xmlenc-0.52.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/hadoop-nfs-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/hadoop-common-2.2.0-tests.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/hadoop-common-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-el-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-logging-1.1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jackson-mapper-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/guava-11.0.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-lang-2.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jasper-runtime-5.5.23.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/a
sm-3.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-io-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jackson-core-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/hadoop-hdfs-nfs-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/hadoop-hdfs-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/hadoop-hdfs-2.2.0-tests.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/javax.inject-1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/log4j-1.2.17.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jersey-server-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/hamcrest-core-1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/junit-4.10.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jackson-mapper-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/paranamer-2.3.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jersey-core-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/avro-1.7.4.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/aopalliance-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/xz-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/guice-3.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/
yarn/lib/asm-3.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/commons-io-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/snappy-java-1.0.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/hadoop-annotations-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jackson-core-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-client-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-common-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-tests-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-api-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-site-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-common-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/javax.inject-1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/hamcrest-core-1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/junit-4.10.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce
/lib/jackson-mapper-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/xz-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/guice-3.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/asm-3.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/commons-io-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/hadoop-annotations-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jackson-core-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.2.0.jar:/home/ubuntu/Installed/hadoo
p-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.2.0-tests.jar:/home/ubuntu/Installed/hadoop-2.2.0/contrib/capacity-scheduler/*.jar
>>>> STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common-r 1529768; compiled by 'hortonmu' on 2013-10-07T06:28Z
>>>> STARTUP_MSG:   java = 1.7.0
>>>> ************************************************************/
>>>> 14/03/24 09:30:57 INFO datanode.DataNode: registered UNIX signal
>>>> handlers for [TERM, HUP, INT]
>>>> 14/03/24 09:30:57 WARN common.Util: Path
>>>> /home/ubuntu/dallaybatta-data/hdfs/datanode should be specified as a URI in
>>>> configuration files. Please update hdfs configuration.
>>>> 14/03/24 09:30:58 INFO impl.MetricsConfig: loaded properties from
>>>> hadoop-metrics2.properties
>>>> 14/03/24 09:30:58 INFO impl.MetricsSystemImpl: Scheduled snapshot
>>>> period at 10 second(s).
>>>> 14/03/24 09:30:58 INFO impl.MetricsSystemImpl: DataNode metrics system
>>>> started
>>>> 14/03/24 09:30:58 INFO datanode.DataNode: Configured hostname is Hadoop2
>>>> 14/03/24 09:30:58 INFO datanode.DataNode: Opened streaming server at /
>>>> 0.0.0.0:50010
>>>> 14/03/24 09:30:58 INFO datanode.DataNode: Balancing bandwith is 1048576
>>>> bytes/s
>>>> 14/03/24 09:30:58 INFO mortbay.log: Logging to
>>>> org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via
>>>> org.mortbay.log.Slf4jLog
>>>> 14/03/24 09:30:58 INFO http.HttpServer: Added global filter 'safety'
>>>> (class=org.apache.hadoop.http.HttpServer$QuotingInputFilter)
>>>> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
>>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>>> context datanode
>>>> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
>>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>>> context logs
>>>> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
>>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>>> context static
>>>> 14/03/24 09:30:58 INFO datanode.DataNode: Opened info server at
>>>> localhost:50075
>>>> 14/03/24 09:30:58 INFO datanode.DataNode: dfs.webhdfs.enabled = false
>>>> 14/03/24 09:30:58 INFO http.HttpServer: Jetty bound to port 50075
>>>> 14/03/24 09:30:58 INFO mortbay.log: jetty-6.1.26
>>>> 14/03/24 09:30:59 INFO mortbay.log: Started
>>>> SelectChannelConnector@localhost:50075
>>>> 14/03/24 09:30:59 INFO ipc.Server: Starting Socket Reader #1 for port
>>>> 50020
>>>> 14/03/24 09:30:59 INFO datanode.DataNode: Opened IPC server at /
>>>> 0.0.0.0:50020
>>>> 14/03/24 09:30:59 INFO datanode.DataNode: Refresh request received for
>>>> nameservices: null
>>>> 14/03/24 09:30:59 INFO datanode.DataNode: Starting BPOfferServices for
>>>> nameservices: <default>
>>>> 14/03/24 09:30:59 WARN common.Util: Path
>>>> /home/ubuntu/dallaybatta-data/hdfs/datanode should be specified as a URI in
>>>> configuration files. Please update hdfs configuration.
>>>> 14/03/24 09:30:59 INFO datanode.DataNode: Block pool <registering>
>>>> (storage id unknown) service to /10.0.3.200:9000 starting to offer
>>>> service
>>>> 14/03/24 09:30:59 INFO ipc.Server: IPC Server Responder: starting
>>>> 14/03/24 09:30:59 INFO ipc.Server: IPC Server listener on 50020:
>>>> starting
>>>> 14/03/24 09:30:59 INFO common.Storage: Lock on
>>>> /home/ubuntu/dallaybatta-data/hdfs/datanode/in_use.lock acquired by
>>>> nodename 2618@Hadoop2
>>>> 14/03/24 09:31:00 INFO common.Storage: Locking is disabled
>>>> 14/03/24 09:31:00 INFO datanode.DataNode: Setting up storage:
>>>> nsid=1367523242;bpid=BP-1489452897-10.0.3.253-1395650301038;lv=-47;nsInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0;bpid=BP-1489452897-10.0.3.253-1395650301038
>>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Added volume -
>>>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current
>>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Registered FSDatasetState
>>>> MBean
>>>> 14/03/24 09:31:00 INFO datanode.DirectoryScanner: Periodic Directory
>>>> Tree Verification scan starting at 1395674259100 with interval 21600000
>>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Adding block pool
>>>> BP-1489452897-10.0.3.253-1395650301038
>>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Scanning block pool
>>>> BP-1489452897-10.0.3.253-1395650301038 on volume
>>>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current...
>>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Time taken to scan block
>>>> pool BP-1489452897-10.0.3.253-1395650301038 on
>>>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current: 11ms
>>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Total time to scan all
>>>> replicas for block pool BP-1489452897-10.0.3.253-1395650301038: 13ms
>>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Adding replicas to map for
>>>> block pool BP-1489452897-10.0.3.253-1395650301038 on volume
>>>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current...
>>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Time to add replicas to map
>>>> for block pool BP-1489452897-10.0.3.253-1395650301038 on volume
>>>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current: 0ms
>>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Total time to add all
>>>> replicas to map: 1ms
>>>> 14/03/24 09:31:00 INFO datanode.DataNode: Block pool
>>>> BP-1489452897-10.0.3.253-1395650301038 (storage id
>>>> DS-1380795562-10.0.3.201-50010-1395650455122) service to /
>>>> 10.0.3.200:9000 beginning handshake with NN
>>>> 14/03/24 09:31:00 FATAL datanode.DataNode: Initialization failed for
>>>> block pool Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id
>>>> DS-1380795562-10.0.3.201-50010-1395650455122) service to /
>>>> 10.0.3.200:9000
>>>> org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException):
>>>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>>>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>>>> ipcPort=50020,
>>>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>>>>     at
>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
>>>>     at
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3929)
>>>>     at
>>>> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
>>>>     at
>>>> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
>>>>     at
>>>> org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
>>>>     at
>>>> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
>>>>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
>>>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
>>>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>>>     at
>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>>>>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
>>>>
>>>>     at org.apache.hadoop.ipc.Client.call(Client.java:1347)
>>>>     at org.apache.hadoop.ipc.Client.call(Client.java:1300)
>>>>     at
>>>> org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
>>>>     at $Proxy9.registerDatanode(Unknown Source)
>>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>     at
>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>     at
>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>     at java.lang.reflect.Method.invoke(Method.java:601)
>>>>     at
>>>> org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
>>>>     at
>>>> org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
>>>>     at $Proxy9.registerDatanode(Unknown Source)
>>>>     at
>>>> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.registerDatanode(DatanodeProtocolClientSideTranslatorPB.java:146)
>>>>     at
>>>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.register(BPServiceActor.java:623)
>>>>     at
>>>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:225)
>>>>     at
>>>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:664)
>>>>     at java.lang.Thread.run(Thread.java:722)
>>>> 14/03/24 09:31:00 WARN datanode.DataNode: Ending block pool service
>>>> for: Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id
>>>> DS-1380795562-10.0.3.201-50010-1395650455122) service to /
>>>> 10.0.3.200:9000
>>>> 14/03/24 09:31:00 INFO datanode.DataNode: Removed Block pool
>>>> BP-1489452897-10.0.3.253-1395650301038 (storage id
>>>> DS-1380795562-10.0.3.201-50010-1395650455122)
>>>> 14/03/24 09:31:00 INFO datanode.DataBlockScanner: Removed
>>>> bpid=BP-1489452897-10.0.3.253-1395650301038 from blockPoolScannerMap
>>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Removing block pool
>>>> BP-1489452897-10.0.3.253-1395650301038
>>>> 14/03/24 09:31:02 WARN datanode.DataNode: Exiting Datanode
>>>> 14/03/24 09:31:02 INFO util.ExitUtil: Exiting with status 0
>>>> 14/03/24 09:31:02 INFO datanode.DataNode: SHUTDOWN_MSG:
>>>> /************************************************************
>>>> SHUTDOWN_MSG: Shutting down DataNode at Hadoop2/10.0.3.148
>>>> ************************************************************/
>>>>
>>>>
>>>> ****************************************************************************************
>>>>
>>>>
>>>> And here is the corresponding error on the NameNode (10.0.3.200):
>>>>
>>>>
>>>> ****************************************************************************************
>>>> 14/03/24 09:31:00 WARN blockmanagement.DatanodeManager: Unresolved
>>>> datanode registration from 10.0.3.201
>>>> 14/03/24 09:31:00 ERROR security.UserGroupInformation:
>>>> PriviledgedActionException as:ubuntu (auth:SIMPLE)
>>>> cause:org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
>>>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>>>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>>>> ipcPort=50020,
>>>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>>>> 14/03/24 09:31:00 INFO ipc.Server: IPC Server handler 3 on 9000, call
>>>> org.apache.hadoop.hdfs.server.protocol.DatanodeProtocol.registerDatanode
>>>> from 10.0.3.201:60951 Call#1 Retry#0: error:
>>>> org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
>>>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>>>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>>>> ipcPort=50020,
>>>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>>>> org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
>>>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>>>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>>>> ipcPort=50020,
>>>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>>>>     at
>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
>>>>     at
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3929)
>>>>     at
>>>> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
>>>>     at
>>>> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
>>>>     at
>>>> org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
>>>>     at
>>>> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
>>>>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
>>>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
>>>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>>>     at
>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>>>>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
>>>>
>>>> ****************************************************************************************
>>>>
>>>> I don't know yet where the 10.0.3.148 IP is coming from; it could be
>>>> due to some lxc configuration. What can be interpreted from the Hadoop
>>>> error information?
>>>>
>>>> Let me know if you need more info about my environment to provide some
>>>> insights.
>>>>
>>>> Regards,
>>>> Vicky
>>>>
>>>>
>>>>
>>>>
>>>>
>>>
>>>
>>> --
>>> Jay Vyas
>>> http://jayunit100.blogspot.com
>>>
>>
>>
>
>
> --
> Cheers
> -MJ
>

Re: Setting Hadoop on LinuxContainers Fails.

Posted by Vicky Kak <vi...@gmail.com>.
Hi Mingjiang,


I partially resolved the issue after spending a day on it.

The issue was caused by multiple IPs being assigned to the linux container
when cloning the existing container. I manually made changes in the
container configuration file; check this:

sudo cat /var/lib/lxc/Hadoop2/config
lxc.network.type = veth
lxc.network.link = lxcbr0
lxc.network.flags = up
lxc.network.hwaddr = 00:16:3e:cd:ca:09
lxc.devttydir = lxc
lxc.tty = 4
lxc.pts = 1024
lxc.arch = i686
lxc.cap.drop = sys_module mac_admin
lxc.pivotdir = lxc_putold
lxc.cgroup.memory.limit_in_bytes = 3990M
lxc.network.ipv4 = 10.0.3.148   (it was 10.0.3.201)
lxc.utsname = Hadoop2
lxc.mount = /var/lib/lxc/Hadoop2/fstab
lxc.rootfs = /var/lib/lxc/Hadoop2/rootfs

I changed lxc.network.ipv4 to the same IP that I was seeing in the
SHUTDOWN_MSG:

14/03/24 09:31:02 INFO datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at Hadoop2/10.0.3.148
************************************************************/
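For anyone hitting the same thing: the stale 10.0.3.148 address was coming
from dnsmasq's lease cache (the /var/lib/misc/dnsmasq.lxcbr0.leases file
mentioned further down the thread). A minimal sketch of pruning a
container's lease, shown here against a throwaway copy of the file — on a
real host you would stop the container and lxc-net first and edit the real
file (the lease lines below are illustrative, not copied from my system):

```shell
# Work on a throwaway copy; the real file is /var/lib/misc/dnsmasq.lxcbr0.leases
leases=./dnsmasq.lxcbr0.leases
printf '%s\n' \
  '1395650455 00:16:3e:cd:ca:09 10.0.3.148 Hadoop2 *' \
  '1395650300 00:16:3e:11:22:33 10.0.3.200 Hadoop1 *' > "$leases"

# Drop every lease held by Hadoop2 so dnsmasq hands out a fresh address
sed -i '/ Hadoop2 /d' "$leases"
cat "$leases"
```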

Needless to say, I also had to update the Hadoop config files
(core-site.xml, hdfs-site.xml) accordingly.
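For reference, a rough sketch of the entries involved (values are
illustrative for this two-container setup, not a drop-in config):
fs.defaultFS must point at the NameNode address, and giving the DataNode
directory as a file:// URI also silences the "should be specified as a URI"
warning seen in the log above.

```xml
<!-- core-site.xml (sketch; adjust to your cluster) -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://10.0.3.200:9000</value>
  </property>
</configuration>

<!-- hdfs-site.xml (sketch) -->
<configuration>
  <property>
    <name>dfs.datanode.data.dir</name>
    <!-- file:// URI avoids the "should be specified as a URI" warning -->
    <value>file:///home/ubuntu/dallaybatta-data/hdfs/datanode</value>
  </property>
</configuration>
```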

Regards,
Vicky


On Tue, Mar 25, 2014 at 8:07 AM, Mingjiang Shi <ms...@gopivotal.com> wrote:

> Hi Vicky,
> Do you use dhcp or assign ip address statically to the containers?
> Suggest you assign static ip address to the container instead of using
> dhcp.
>
>
>
> On Mon, Mar 24, 2014 at 11:19 PM, Vicky Kak <vi...@gmail.com> wrote:
>
>> Yep, they can see each other and the outside world.
>> My issue seems to be coming from the cached IP in
>>
>> /var/lib/misc/dnsmasq.lxcbr0.leases
>>
>>
>>
>>
>>
>>
>> On Mon, Mar 24, 2014 at 7:50 PM, Jay Vyas <ja...@gmail.com> wrote:
>>
>>> Are your linux containers networked properly (i.e., can they see each
>>> other and the outside world, etc.)?
>>> www.linux.org/threads/linux-containers-part-4-getting-to-the-universe-ping-google-com.4428/
>>>
>>>
>>> On Mon, Mar 24, 2014 at 6:02 AM, Vicky Kak <vi...@gmail.com> wrote:
>>>
>>>> Hi All,
>>>>
>>>> I am using linuxcontainer(http://linuxcontainers.org/) for configuring
>>>> the hadoop cluster for the testing.
>>>> I have created two linux application containers, called
>>>> hadoop1/hadoop2. The IP associated with hadoop1 is 10.0.3.200 and
>>>> with hadoop2 is 10.0.3.201.
>>>>
>>>> I am able to start the NameNode on 10.0.3.200, but when I try to start
>>>> the DataNode on 10.0.3.201 I see the following error at 10.0.3.201:
>>>>
>>>>
>>>> ****************************************************************************************
>>>> $ hdfs datanode
>>>> 14/03/24 09:30:57 INFO datanode.DataNode: STARTUP_MSG:
>>>> /************************************************************
>>>> STARTUP_MSG: Starting DataNode
>>>> STARTUP_MSG:   host = Hadoop2/10.0.3.148
>>>> STARTUP_MSG:   args = []
>>>> STARTUP_MSG:   version = 2.2.0
>>>> STARTUP_MSG:   classpath = [snipped — identical to the classpath listing above]
/lib/jackson-mapper-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/xz-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/guice-3.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/asm-3.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/commons-io-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/hadoop-annotations-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jackson-core-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.2.0.jar:/home/ubuntu/Installed/hadoo
p-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.2.0-tests.jar:/home/ubuntu/Installed/hadoop-2.2.0/contrib/capacity-scheduler/*.jar
>>>> STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common-r 1529768; compiled by 'hortonmu' on 2013-10-07T06:28Z
>>>> STARTUP_MSG:   java = 1.7.0
>>>> ************************************************************/
>>>> 14/03/24 09:30:57 INFO datanode.DataNode: registered UNIX signal
>>>> handlers for [TERM, HUP, INT]
>>>> 14/03/24 09:30:57 WARN common.Util: Path
>>>> /home/ubuntu/dallaybatta-data/hdfs/datanode should be specified as a URI in
>>>> configuration files. Please update hdfs configuration.
>>>> 14/03/24 09:30:58 INFO impl.MetricsConfig: loaded properties from
>>>> hadoop-metrics2.properties
>>>> 14/03/24 09:30:58 INFO impl.MetricsSystemImpl: Scheduled snapshot
>>>> period at 10 second(s).
>>>> 14/03/24 09:30:58 INFO impl.MetricsSystemImpl: DataNode metrics system
>>>> started
>>>> 14/03/24 09:30:58 INFO datanode.DataNode: Configured hostname is Hadoop2
>>>> 14/03/24 09:30:58 INFO datanode.DataNode: Opened streaming server at /
>>>> 0.0.0.0:50010
>>>> 14/03/24 09:30:58 INFO datanode.DataNode: Balancing bandwith is 1048576
>>>> bytes/s
>>>> 14/03/24 09:30:58 INFO mortbay.log: Logging to
>>>> org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via
>>>> org.mortbay.log.Slf4jLog
>>>> 14/03/24 09:30:58 INFO http.HttpServer: Added global filter 'safety'
>>>> (class=org.apache.hadoop.http.HttpServer$QuotingInputFilter)
>>>> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
>>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>>> context datanode
>>>> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
>>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>>> context logs
>>>> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
>>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>>> context static
>>>> 14/03/24 09:30:58 INFO datanode.DataNode: Opened info server at
>>>> localhost:50075
>>>> 14/03/24 09:30:58 INFO datanode.DataNode: dfs.webhdfs.enabled = false
>>>> 14/03/24 09:30:58 INFO http.HttpServer: Jetty bound to port 50075
>>>> 14/03/24 09:30:58 INFO mortbay.log: jetty-6.1.26
>>>> 14/03/24 09:30:59 INFO mortbay.log: Started
>>>> SelectChannelConnector@localhost:50075
>>>> 14/03/24 09:30:59 INFO ipc.Server: Starting Socket Reader #1 for port
>>>> 50020
>>>> 14/03/24 09:30:59 INFO datanode.DataNode: Opened IPC server at /
>>>> 0.0.0.0:50020
>>>> 14/03/24 09:30:59 INFO datanode.DataNode: Refresh request received for
>>>> nameservices: null
>>>> 14/03/24 09:30:59 INFO datanode.DataNode: Starting BPOfferServices for
>>>> nameservices: <default>
>>>> 14/03/24 09:30:59 WARN common.Util: Path
>>>> /home/ubuntu/dallaybatta-data/hdfs/datanode should be specified as a URI in
>>>> configuration files. Please update hdfs configuration.
>>>> 14/03/24 09:30:59 INFO datanode.DataNode: Block pool <registering>
>>>> (storage id unknown) service to /10.0.3.200:9000 starting to offer
>>>> service
>>>> 14/03/24 09:30:59 INFO ipc.Server: IPC Server Responder: starting
>>>> 14/03/24 09:30:59 INFO ipc.Server: IPC Server listener on 50020:
>>>> starting
>>>> 14/03/24 09:30:59 INFO common.Storage: Lock on
>>>> /home/ubuntu/dallaybatta-data/hdfs/datanode/in_use.lock acquired by
>>>> nodename 2618@Hadoop2
>>>> 14/03/24 09:31:00 INFO common.Storage: Locking is disabled
>>>> 14/03/24 09:31:00 INFO datanode.DataNode: Setting up storage:
>>>> nsid=1367523242;bpid=BP-1489452897-10.0.3.253-1395650301038;lv=-47;nsInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0;bpid=BP-1489452897-10.0.3.253-1395650301038
>>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Added volume -
>>>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current
>>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Registered FSDatasetState
>>>> MBean
>>>> 14/03/24 09:31:00 INFO datanode.DirectoryScanner: Periodic Directory
>>>> Tree Verification scan starting at 1395674259100 with interval 21600000
>>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Adding block pool
>>>> BP-1489452897-10.0.3.253-1395650301038
>>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Scanning block pool
>>>> BP-1489452897-10.0.3.253-1395650301038 on volume
>>>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current...
>>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Time taken to scan block
>>>> pool BP-1489452897-10.0.3.253-1395650301038 on
>>>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current: 11ms
>>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Total time to scan all
>>>> replicas for block pool BP-1489452897-10.0.3.253-1395650301038: 13ms
>>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Adding replicas to map for
>>>> block pool BP-1489452897-10.0.3.253-1395650301038 on volume
>>>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current...
>>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Time to add replicas to map
>>>> for block pool BP-1489452897-10.0.3.253-1395650301038 on volume
>>>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current: 0ms
>>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Total time to add all
>>>> replicas to map: 1ms
>>>> 14/03/24 09:31:00 INFO datanode.DataNode: Block pool
>>>> BP-1489452897-10.0.3.253-1395650301038 (storage id
>>>> DS-1380795562-10.0.3.201-50010-1395650455122) service to /
>>>> 10.0.3.200:9000 beginning handshake with NN
>>>> 14/03/24 09:31:00 FATAL datanode.DataNode: Initialization failed for
>>>> block pool Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id
>>>> DS-1380795562-10.0.3.201-50010-1395650455122) service to /
>>>> 10.0.3.200:9000
>>>> org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException):
>>>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>>>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>>>> ipcPort=50020,
>>>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>>>>     at
>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
>>>>     at
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3929)
>>>>     at
>>>> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
>>>>     at
>>>> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
>>>>     at
>>>> org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
>>>>     at
>>>> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
>>>>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
>>>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
>>>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>>>     at
>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>>>>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
>>>>
>>>>     at org.apache.hadoop.ipc.Client.call(Client.java:1347)
>>>>     at org.apache.hadoop.ipc.Client.call(Client.java:1300)
>>>>     at
>>>> org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
>>>>     at $Proxy9.registerDatanode(Unknown Source)
>>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>     at
>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>     at
>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>     at java.lang.reflect.Method.invoke(Method.java:601)
>>>>     at
>>>> org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
>>>>     at
>>>> org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
>>>>     at $Proxy9.registerDatanode(Unknown Source)
>>>>     at
>>>> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.registerDatanode(DatanodeProtocolClientSideTranslatorPB.java:146)
>>>>     at
>>>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.register(BPServiceActor.java:623)
>>>>     at
>>>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:225)
>>>>     at
>>>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:664)
>>>>     at java.lang.Thread.run(Thread.java:722)
>>>> 14/03/24 09:31:00 WARN datanode.DataNode: Ending block pool service
>>>> for: Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id
>>>> DS-1380795562-10.0.3.201-50010-1395650455122) service to /
>>>> 10.0.3.200:9000
>>>> 14/03/24 09:31:00 INFO datanode.DataNode: Removed Block pool
>>>> BP-1489452897-10.0.3.253-1395650301038 (storage id
>>>> DS-1380795562-10.0.3.201-50010-1395650455122)
>>>> 14/03/24 09:31:00 INFO datanode.DataBlockScanner: Removed
>>>> bpid=BP-1489452897-10.0.3.253-1395650301038 from blockPoolScannerMap
>>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Removing block pool
>>>> BP-1489452897-10.0.3.253-1395650301038
>>>> 14/03/24 09:31:02 WARN datanode.DataNode: Exiting Datanode
>>>> 14/03/24 09:31:02 INFO util.ExitUtil: Exiting with status 0
>>>> 14/03/24 09:31:02 INFO datanode.DataNode: SHUTDOWN_MSG:
>>>> /************************************************************
>>>> SHUTDOWN_MSG: Shutting down DataNode at Hadoop2/10.0.3.148
>>>> ************************************************************/
>>>>
>>>>
>>>> ****************************************************************************************
>>>>
>>>>
>>>> And here is the corresponding error at the NameNode (10.0.3.200):
>>>>
>>>>
>>>> ****************************************************************************************
>>>> 14/03/24 09:31:00 WARN blockmanagement.DatanodeManager: Unresolved
>>>> datanode registration from 10.0.3.201
>>>> 14/03/24 09:31:00 ERROR security.UserGroupInformation:
>>>> PriviledgedActionException as:ubuntu (auth:SIMPLE)
>>>> cause:org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
>>>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>>>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>>>> ipcPort=50020,
>>>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>>>> 14/03/24 09:31:00 INFO ipc.Server: IPC Server handler 3 on 9000, call
>>>> org.apache.hadoop.hdfs.server.protocol.DatanodeProtocol.registerDatanode
>>>> from 10.0.3.201:60951 Call#1 Retry#0: error:
>>>> org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
>>>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>>>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>>>> ipcPort=50020,
>>>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>>>> org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
>>>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>>>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>>>> ipcPort=50020,
>>>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>>>>     at
>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
>>>>     at
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3929)
>>>>     at
>>>> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
>>>>     at
>>>> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
>>>>     at
>>>> org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
>>>>     at
>>>> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
>>>>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
>>>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
>>>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>>>     at
>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>>>>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
>>>>
>>>> ****************************************************************************************
>>>>
>>>> I don't know where the 10.0.3.148 IP is coming from yet; it could be
>>>> due to some lxc configuration. What can be interpreted from the Hadoop
>>>> error information?
>>>>
>>>> Let me know if you need more info about my environment to provide some
>>>> insights.
>>>>
>>>> Regards,
>>>> Vicky
>>>>
>>>
>>>
>>> --
>>> Jay Vyas
>>> http://jayunit100.blogspot.com
>>>
>>
>>
>
>
> --
> Cheers
> -MJ
>

Re: Setting Hadoop on LinuxContainers Fails.

Posted by Vicky Kak <vi...@gmail.com>.
Hi Mingjiang,


I partially resolved the issue after spending a day on it.

The issue was caused by multiple IPs being assigned to the linux container
when it was cloned from an existing container. I manually changed the
container configuration file; see:

sudo cat /var/lib/lxc/Hadoop2/config
lxc.network.type = veth
lxc.network.link = lxcbr0
lxc.network.flags = up
lxc.network.hwaddr = 00:16:3e:cd:ca:09
lxc.devttydir = lxc
lxc.tty = 4
lxc.pts = 1024
lxc.arch = i686
lxc.cap.drop = sys_module mac_admin
lxc.pivotdir = lxc_putold
lxc.cgroup.memory.limit_in_bytes = 3990M
lxc.network.ipv4 = 10.0.3.148 (it was 10.0.3.201)
lxc.utsname = Hadoop2
lxc.mount = /var/lib/lxc/Hadoop2/fstab
lxc.rootfs = /var/lib/lxc/Hadoop2/rootfs

I set lxc.network.ipv4 to the same IP that I was seeing in the
SHUTDOWN_MSG:

14/03/24 09:31:02 INFO datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at Hadoop2/10.0.3.148
************************************************************/

Needless to say, I had to update the hadoop config files
(core-site.xml/hdfs-site.xml) appropriately.
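For anyone hitting the same thing, the relevant entries might look roughly like this. This is only a sketch based on the paths and addresses mentioned in this thread, not a verified config; note the file:// URI form, which also addresses the common.Util "should be specified as a URI" warning in the datanode startup log:

```xml
<!-- hdfs-site.xml on the DataNode (sketch): data dir given as a
     file:// URI rather than a bare path, which is what triggers
     the common.Util warning during startup -->
<configuration>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>file:///home/ubuntu/dallaybatta-data/hdfs/datanode</value>
  </property>
</configuration>
```

And in core-site.xml on every node, fs.defaultFS would point at the NameNode's static address, e.g. hdfs://10.0.3.200:9000, so the datanodes register against a stable IP.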

Regards,
Vicky


On Tue, Mar 25, 2014 at 8:07 AM, Mingjiang Shi <ms...@gopivotal.com> wrote:

> Hi Vicky,
> Do you use DHCP or assign IP addresses statically to the containers?
> I suggest you assign a static IP address to the container instead of
> using DHCP.
>
>
>
> On Mon, Mar 24, 2014 at 11:19 PM, Vicky Kak <vi...@gmail.com> wrote:
>
>> Yep, they can see each other and the outside world.
>> My issue seems to be coming from the cached IP in
>>
>> /var/lib/misc/dnsmasq.lxcbr0.leases
>>
>> On Mon, Mar 24, 2014 at 7:50 PM, Jay Vyas <ja...@gmail.com> wrote:
>>
>>> Are your linux containers networked properly (i.e. can they see each
>>> other, and the outside world, etc.)? See:
>>> www.linux.org/threads/linux-containers-part-4-getting-to-the-universe-ping-google-com.4428/
>>>
>>>
>>> On Mon, Mar 24, 2014 at 6:02 AM, Vicky Kak <vi...@gmail.com> wrote:
>>>
>>>> Hi All,
>>>>
>>>> I am using linuxcontainer(http://linuxcontainers.org/) for configuring
>>>> the hadoop cluster for the testing.
>>>> I have create two linux application containers which are called
>>>> hadoop1/hadoop2. The IP's associated with the hadoop1 is 10.0.3.200 and
>>>> with hadoop2 is 10.0.3.201.
>>>>
>>>> I am able to start the Namenode on 10.0.3.200 but when i try to start
>>>> the DataNode on 10.0.3.201 I see the following error at 10.0.3.201
>>>>
>>>>
>>>> ****************************************************************************************
>>>> $ hdfs datanode
>>>> 14/03/24 09:30:57 INFO datanode.DataNode: STARTUP_MSG:
>>>> /************************************************************
>>>> STARTUP_MSG: Starting DataNode
>>>> STARTUP_MSG:   host = Hadoop2/10.0.3.148
>>>> STARTUP_MSG:   args = []
>>>> STARTUP_MSG:   version = 2.2.0
>>>> STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common-r 1529768; compiled by 'hortonmu' on 2013-10-07T06:28Z
>>>> STARTUP_MSG:   java = 1.7.0
>>>> ************************************************************/
>>>> 14/03/24 09:30:57 INFO datanode.DataNode: registered UNIX signal
>>>> handlers for [TERM, HUP, INT]
>>>> 14/03/24 09:30:57 WARN common.Util: Path
>>>> /home/ubuntu/dallaybatta-data/hdfs/datanode should be specified as a URI in
>>>> configuration files. Please update hdfs configuration.
>>>> 14/03/24 09:30:58 INFO impl.MetricsConfig: loaded properties from
>>>> hadoop-metrics2.properties
>>>> 14/03/24 09:30:58 INFO impl.MetricsSystemImpl: Scheduled snapshot
>>>> period at 10 second(s).
>>>> 14/03/24 09:30:58 INFO impl.MetricsSystemImpl: DataNode metrics system
>>>> started
>>>> 14/03/24 09:30:58 INFO datanode.DataNode: Configured hostname is Hadoop2
>>>> 14/03/24 09:30:58 INFO datanode.DataNode: Opened streaming server at /
>>>> 0.0.0.0:50010
>>>> 14/03/24 09:30:58 INFO datanode.DataNode: Balancing bandwith is 1048576
>>>> bytes/s
>>>> 14/03/24 09:30:58 INFO mortbay.log: Logging to
>>>> org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via
>>>> org.mortbay.log.Slf4jLog
>>>> 14/03/24 09:30:58 INFO http.HttpServer: Added global filter 'safety'
>>>> (class=org.apache.hadoop.http.HttpServer$QuotingInputFilter)
>>>> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
>>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>>> context datanode
>>>> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
>>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>>> context logs
>>>> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
>>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>>> context static
>>>> 14/03/24 09:30:58 INFO datanode.DataNode: Opened info server at
>>>> localhost:50075
>>>> 14/03/24 09:30:58 INFO datanode.DataNode: dfs.webhdfs.enabled = false
>>>> 14/03/24 09:30:58 INFO http.HttpServer: Jetty bound to port 50075
>>>> 14/03/24 09:30:58 INFO mortbay.log: jetty-6.1.26
>>>> 14/03/24 09:30:59 INFO mortbay.log: Started
>>>> SelectChannelConnector@localhost:50075
>>>> 14/03/24 09:30:59 INFO ipc.Server: Starting Socket Reader #1 for port
>>>> 50020
>>>> 14/03/24 09:30:59 INFO datanode.DataNode: Opened IPC server at /
>>>> 0.0.0.0:50020
>>>> 14/03/24 09:30:59 INFO datanode.DataNode: Refresh request received for
>>>> nameservices: null
>>>> 14/03/24 09:30:59 INFO datanode.DataNode: Starting BPOfferServices for
>>>> nameservices: <default>
>>>> 14/03/24 09:30:59 WARN common.Util: Path
>>>> /home/ubuntu/dallaybatta-data/hdfs/datanode should be specified as a URI in
>>>> configuration files. Please update hdfs configuration.
>>>> 14/03/24 09:30:59 INFO datanode.DataNode: Block pool <registering>
>>>> (storage id unknown) service to /10.0.3.200:9000 starting to offer
>>>> service
>>>> 14/03/24 09:30:59 INFO ipc.Server: IPC Server Responder: starting
>>>> 14/03/24 09:30:59 INFO ipc.Server: IPC Server listener on 50020:
>>>> starting
>>>> 14/03/24 09:30:59 INFO common.Storage: Lock on
>>>> /home/ubuntu/dallaybatta-data/hdfs/datanode/in_use.lock acquired by
>>>> nodename 2618@Hadoop2
>>>> 14/03/24 09:31:00 INFO common.Storage: Locking is disabled
>>>> 14/03/24 09:31:00 INFO datanode.DataNode: Setting up storage:
>>>> nsid=1367523242;bpid=BP-1489452897-10.0.3.253-1395650301038;lv=-47;nsInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0;bpid=BP-1489452897-10.0.3.253-1395650301038
>>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Added volume -
>>>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current
>>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Registered FSDatasetState
>>>> MBean
>>>> 14/03/24 09:31:00 INFO datanode.DirectoryScanner: Periodic Directory
>>>> Tree Verification scan starting at 1395674259100 with interval 21600000
>>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Adding block pool
>>>> BP-1489452897-10.0.3.253-1395650301038
>>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Scanning block pool
>>>> BP-1489452897-10.0.3.253-1395650301038 on volume
>>>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current...
>>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Time taken to scan block
>>>> pool BP-1489452897-10.0.3.253-1395650301038 on
>>>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current: 11ms
>>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Total time to scan all
>>>> replicas for block pool BP-1489452897-10.0.3.253-1395650301038: 13ms
>>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Adding replicas to map for
>>>> block pool BP-1489452897-10.0.3.253-1395650301038 on volume
>>>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current...
>>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Time to add replicas to map
>>>> for block pool BP-1489452897-10.0.3.253-1395650301038 on volume
>>>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current: 0ms
>>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Total time to add all
>>>> replicas to map: 1ms
>>>> 14/03/24 09:31:00 INFO datanode.DataNode: Block pool
>>>> BP-1489452897-10.0.3.253-1395650301038 (storage id
>>>> DS-1380795562-10.0.3.201-50010-1395650455122) service to /
>>>> 10.0.3.200:9000 beginning handshake with NN
>>>> 14/03/24 09:31:00 FATAL datanode.DataNode: Initialization failed for
>>>> block pool Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id
>>>> DS-1380795562-10.0.3.201-50010-1395650455122) service to /
>>>> 10.0.3.200:9000
>>>> org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException):
>>>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>>>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>>>> ipcPort=50020,
>>>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>>>>     at
>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
>>>>     at
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3929)
>>>>     at
>>>> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
>>>>     at
>>>> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
>>>>     at
>>>> org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
>>>>     at
>>>> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
>>>>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
>>>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
>>>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>>>     at
>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>>>>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
>>>>
>>>>     at org.apache.hadoop.ipc.Client.call(Client.java:1347)
>>>>     at org.apache.hadoop.ipc.Client.call(Client.java:1300)
>>>>     at
>>>> org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
>>>>     at $Proxy9.registerDatanode(Unknown Source)
>>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>     at
>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>     at
>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>     at java.lang.reflect.Method.invoke(Method.java:601)
>>>>     at
>>>> org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
>>>>     at
>>>> org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
>>>>     at $Proxy9.registerDatanode(Unknown Source)
>>>>     at
>>>> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.registerDatanode(DatanodeProtocolClientSideTranslatorPB.java:146)
>>>>     at
>>>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.register(BPServiceActor.java:623)
>>>>     at
>>>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:225)
>>>>     at
>>>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:664)
>>>>     at java.lang.Thread.run(Thread.java:722)
>>>> 14/03/24 09:31:00 WARN datanode.DataNode: Ending block pool service
>>>> for: Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id
>>>> DS-1380795562-10.0.3.201-50010-1395650455122) service to /
>>>> 10.0.3.200:9000
>>>> 14/03/24 09:31:00 INFO datanode.DataNode: Removed Block pool
>>>> BP-1489452897-10.0.3.253-1395650301038 (storage id
>>>> DS-1380795562-10.0.3.201-50010-1395650455122)
>>>> 14/03/24 09:31:00 INFO datanode.DataBlockScanner: Removed
>>>> bpid=BP-1489452897-10.0.3.253-1395650301038 from blockPoolScannerMap
>>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Removing block pool
>>>> BP-1489452897-10.0.3.253-1395650301038
>>>> 14/03/24 09:31:02 WARN datanode.DataNode: Exiting Datanode
>>>> 14/03/24 09:31:02 INFO util.ExitUtil: Exiting with status 0
>>>> 14/03/24 09:31:02 INFO datanode.DataNode: SHUTDOWN_MSG:
>>>> /************************************************************
>>>> SHUTDOWN_MSG: Shutting down DataNode at Hadoop2/10.0.3.148
>>>> ************************************************************/
>>>>
>>>>
>>>> ****************************************************************************************
>>>>
>>>>
>>>> And here is the corresponding error at the NameNode (10.0.3.200):
>>>>
>>>>
>>>> ****************************************************************************************
>>>> 14/03/24 09:31:00 WARN blockmanagement.DatanodeManager: Unresolved
>>>> datanode registration from 10.0.3.201
>>>> 14/03/24 09:31:00 ERROR security.UserGroupInformation:
>>>> PriviledgedActionException as:ubuntu (auth:SIMPLE)
>>>> cause:org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
>>>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>>>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>>>> ipcPort=50020,
>>>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>>>> 14/03/24 09:31:00 INFO ipc.Server: IPC Server handler 3 on 9000, call
>>>> org.apache.hadoop.hdfs.server.protocol.DatanodeProtocol.registerDatanode
>>>> from 10.0.3.201:60951 Call#1 Retry#0: error:
>>>> org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
>>>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>>>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>>>> ipcPort=50020,
>>>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>>>> org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
>>>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>>>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>>>> ipcPort=50020,
>>>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>>>>     at
>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
>>>>     at
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3929)
>>>>     at
>>>> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
>>>>     at
>>>> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
>>>>     at
>>>> org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
>>>>     at
>>>> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
>>>>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
>>>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
>>>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>>>     at
>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>>>>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
>>>>
>>>> ****************************************************************************************
>>>>
>>>> I don't know where the 10.0.3.148 IP is coming from yet; it could be
>>>> due to some lxc configuration. What can be interpreted from the Hadoop
>>>> error information?
>>>>
>>>> Let me know if you need more info about my environment to provide some
>>>> insights.
>>>>
>>>> Regards,
>>>> Vicky
>>>>
>>>>
>>>>
>>>>
>>>>
>>>
>>>
>>> --
>>> Jay Vyas
>>> http://jayunit100.blogspot.com
>>>
>>
>>
>
>
> --
> Cheers
> -MJ
>

Re: Setting Hadoop on LinuxContainers Fails.

Posted by Vicky Kak <vi...@gmail.com>.
Hi Mingjiang,


I partially resolved the issue after spending a day on it.

The issue was caused by multiple IPs being assigned to the Linux container
when cloning an existing container. I manually made changes in the
container configuration file; check this:

sudo cat /var/lib/lxc/Hadoop2/config
lxc.network.type = veth
lxc.network.link = lxcbr0
lxc.network.flags = up
lxc.network.hwaddr = 00:16:3e:cd:ca:09
lxc.devttydir = lxc
lxc.tty = 4
lxc.pts = 1024
lxc.arch = i686
lxc.cap.drop = sys_module mac_admin
lxc.pivotdir = lxc_putold
lxc.cgroup.memory.limit_in_bytes = 3990M
lxc.network.ipv4 = 10.0.3.148   (it was 10.0.3.201)
lxc.utsname = Hadoop2
lxc.mount = /var/lib/lxc/Hadoop2/fstab
lxc.rootfs = /var/lib/lxc/Hadoop2/rootfs
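
The stale address also hides in dnsmasq's lease cache
(/var/lib/misc/dnsmasq.lxcbr0.leases), so a cloned container can keep
getting the old IP back. A rough sketch of pruning a duplicate entry,
using made-up lease lines and a scratch copy in /tmp rather than the live
file (stop the container and dnsmasq before touching the real one):

```shell
# Hypothetical lease entries (format: expiry hwaddr ip hostname client-id).
cat > /tmp/dnsmasq.lxcbr0.leases <<'EOF'
1395650000 00:16:3e:cd:ca:09 10.0.3.148 Hadoop2 *
1395650000 00:16:3e:cd:ca:09 10.0.3.201 Hadoop2 *
1395650000 00:16:3e:aa:bb:cc 10.0.3.200 Hadoop1 *
EOF

# Keep only the entry matching the IP pinned in the lxc config (10.0.3.148),
# dropping the stale duplicate lease for Hadoop2.
grep -v '10\.0\.3\.201' /tmp/dnsmasq.lxcbr0.leases > /tmp/leases.clean
cat /tmp/leases.clean
```

After replacing the live lease file with the cleaned copy, restart the
container so it picks up a single, consistent address.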

I modified lxc.network.ipv4 to the same IP that I was seeing in the
SHUTDOWN_MSG:

14/03/24 09:31:02 INFO datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at Hadoop2/10.0.3.148
************************************************************/
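
A related gotcha: the NameNode log shows "Unresolved datanode registration
from 10.0.3.201" right before the DisallowedDatanodeException, which
suggests the NameNode could not reverse-resolve the datanode's address.
One way to keep name resolution consistent (a sketch using the names/IPs
from this thread; with the config above, Hadoop2 ends up at 10.0.3.148) is
to pin the entries in /etc/hosts on both containers:

```
10.0.3.200   Hadoop1
10.0.3.148   Hadoop2
```

Any stale 127.0.1.1 entry for the container's own hostname is worth
removing as well.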

Needless to say, I had to update the Hadoop config files
(core-site.xml/hdfs-site.xml) accordingly.
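
One small cleanup on the Hadoop side: the startup log warns that
/home/ubuntu/dallaybatta-data/hdfs/datanode "should be specified as a URI".
A minimal hdfs-site.xml fragment that silences it (standard property name;
the path is the one from this thread):

```
<property>
  <name>dfs.datanode.data.dir</name>
  <value>file:///home/ubuntu/dallaybatta-data/hdfs/datanode</value>
</property>
```

And fs.defaultFS in core-site.xml should point at the NameNode container's
address (hdfs://10.0.3.200:9000 here) rather than localhost.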

Regards,
Vicky


On Tue, Mar 25, 2014 at 8:07 AM, Mingjiang Shi <ms...@gopivotal.com> wrote:

> Hi Vicky,
> Do you use DHCP or assign IP addresses statically to the containers?
> I suggest you assign static IP addresses to the containers instead of
> using DHCP.
>
>
>
> On Mon, Mar 24, 2014 at 11:19 PM, Vicky Kak <vi...@gmail.com> wrote:
>
>> Yep, they can see each other and the outside world.
>> My issue seems to be coming from the cached IP in
>>
>> /var/lib/misc/dnsmasq.lxcbr0.leases
>>
>>
>>
>>
>>
>>
>> On Mon, Mar 24, 2014 at 7:50 PM, Jay Vyas <ja...@gmail.com> wrote:
>>
>>> Are your Linux containers networked properly (i.e. can they see each
>>> other, the outside world, etc.)?
>>> www.linux.org/threads/linux-containers-part-4-getting-to-the-universe-ping-google-com.4428/
>>>
>>>
>>> On Mon, Mar 24, 2014 at 6:02 AM, Vicky Kak <vi...@gmail.com> wrote:
>>>
>>>> Hi All,
>>>>
>>>> I am using linuxcontainer(http://linuxcontainers.org/) for configuring
>>>> the hadoop cluster for the testing.
>>>> I have create two linux application containers which are called
>>>> hadoop1/hadoop2. The IP's associated with the hadoop1 is 10.0.3.200 and
>>>> with hadoop2 is 10.0.3.201.
>>>>
>>>> I am able to start the Namenode on 10.0.3.200 but when i try to start
>>>> the DataNode on 10.0.3.201 I see the following error at 10.0.3.201
>>>>
>>>>
>>>> ****************************************************************************************
>>>> $ hdfs datanode
>>>> [startup log snipped; identical to the log quoted earlier in the thread]
>>>> BP-1489452897-10.0.3.253-1395650301038 (storage id
>>>> DS-1380795562-10.0.3.201-50010-1395650455122) service to /
>>>> 10.0.3.200:9000 beginning handshake with NN
>>>> 14/03/24 09:31:00 FATAL datanode.DataNode: Initialization failed for
>>>> block pool Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id
>>>> DS-1380795562-10.0.3.201-50010-1395650455122) service to /
>>>> 10.0.3.200:9000
>>>> org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException):
>>>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>>>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>>>> ipcPort=50020,
>>>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>>>>     at
>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
>>>>     at
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3929)
>>>>     at
>>>> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
>>>>     at
>>>> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
>>>>     at
>>>> org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
>>>>     at
>>>> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
>>>>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
>>>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
>>>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>>>     at
>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>>>>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
>>>>
>>>>     at org.apache.hadoop.ipc.Client.call(Client.java:1347)
>>>>     at org.apache.hadoop.ipc.Client.call(Client.java:1300)
>>>>     at
>>>> org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
>>>>     at $Proxy9.registerDatanode(Unknown Source)
>>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>     at
>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>     at
>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>     at java.lang.reflect.Method.invoke(Method.java:601)
>>>>     at
>>>> org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
>>>>     at
>>>> org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
>>>>     at $Proxy9.registerDatanode(Unknown Source)
>>>>     at
>>>> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.registerDatanode(DatanodeProtocolClientSideTranslatorPB.java:146)
>>>>     at
>>>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.register(BPServiceActor.java:623)
>>>>     at
>>>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:225)
>>>>     at
>>>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:664)
>>>>     at java.lang.Thread.run(Thread.java:722)
>>>> 14/03/24 09:31:00 WARN datanode.DataNode: Ending block pool service
>>>> for: Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id
>>>> DS-1380795562-10.0.3.201-50010-1395650455122) service to /
>>>> 10.0.3.200:9000
>>>> 14/03/24 09:31:00 INFO datanode.DataNode: Removed Block pool
>>>> BP-1489452897-10.0.3.253-1395650301038 (storage id
>>>> DS-1380795562-10.0.3.201-50010-1395650455122)
>>>> 14/03/24 09:31:00 INFO datanode.DataBlockScanner: Removed
>>>> bpid=BP-1489452897-10.0.3.253-1395650301038 from blockPoolScannerMap
>>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Removing block pool
>>>> BP-1489452897-10.0.3.253-1395650301038
>>>> 14/03/24 09:31:02 WARN datanode.DataNode: Exiting Datanode
>>>> 14/03/24 09:31:02 INFO util.ExitUtil: Exiting with status 0
>>>> 14/03/24 09:31:02 INFO datanode.DataNode: SHUTDOWN_MSG:
>>>> /************************************************************
>>>> SHUTDOWN_MSG: Shutting down DataNode at Hadoop2/10.0.3.148
>>>> ************************************************************/
>>>>
>>>>
>>>> ****************************************************************************************
>>>>
>>>>
>>>> And here is the corresponding error at the NameNode (10.0.3.200):
>>>>
>>>>
>>>> ****************************************************************************************
>>>> 14/03/24 09:31:00 WARN blockmanagement.DatanodeManager: Unresolved
>>>> datanode registration from 10.0.3.201
>>>> 14/03/24 09:31:00 ERROR security.UserGroupInformation:
>>>> PriviledgedActionException as:ubuntu (auth:SIMPLE)
>>>> cause:org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
>>>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>>>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>>>> ipcPort=50020,
>>>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>>>> 14/03/24 09:31:00 INFO ipc.Server: IPC Server handler 3 on 9000, call
>>>> org.apache.hadoop.hdfs.server.protocol.DatanodeProtocol.registerDatanode
>>>> from 10.0.3.201:60951 Call#1 Retry#0: error:
>>>> org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
>>>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>>>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>>>> ipcPort=50020,
>>>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>>>> org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
>>>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>>>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>>>> ipcPort=50020,
>>>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>>>>     at
>>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
>>>>     at
>>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3929)
>>>>     at
>>>> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
>>>>     at
>>>> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
>>>>     at
>>>> org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
>>>>     at
>>>> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
>>>>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
>>>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
>>>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>>>     at
>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>>>>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
>>>>
>>>> ****************************************************************************************
>>>>
>>>> I don't know yet where the *10.0.3.148* IP is coming from; it could be
>>>> due to some LXC configuration. What can be interpreted from the Hadoop
>>>> error output?
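For context, the usual cause of this "Datanode denied communication with namenode" / "Unresolved datanode registration" pair is that the NameNode cannot map the DataNode's source IP back to a known host. A minimal sketch of one fix, assuming the static addresses and hostnames mentioned in the thread (adjust names to your containers):

```
# /etc/hosts on both containers -- pin the names to the intended addresses
10.0.3.200  Hadoop1
10.0.3.201  Hadoop2
```

Alternatively, HDFS 2.x has a `dfs.namenode.datanode.registration.ip-hostname-check` property that can be set to `false` in hdfs-site.xml on the NameNode as a workaround, though fixing name resolution is preferable.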
>>>>
>>>> Let me know if you need more info about my environment to provide some
>>>> insights.
>>>>
>>>> Regards,
>>>> Vicky
>>>>
>>>>
>>>>
>>>>
>>>>
>>>
>>>
>>> --
>>> Jay Vyas
>>> http://jayunit100.blogspot.com
>>>
>>
>>
>
>
> --
> Cheers
> -MJ
>

Re: Setting Hadoop on LinuxContainers Fails.

Posted by Mingjiang Shi <ms...@gopivotal.com>.
Hi Vicky,
Do you use DHCP, or do you assign IP addresses to the containers
statically? I suggest assigning a static IP address to each container
instead of using DHCP.
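A minimal sketch of static addressing using the classic LXC configuration keys; the container name, bridge, and addresses here are assumptions based on the thread, so adjust them to your environment:

```
# /var/lib/lxc/hadoop2/config  (legacy LXC 0.x/1.x key names)
lxc.network.type = veth
lxc.network.link = lxcbr0
lxc.network.ipv4 = 10.0.3.201/24
lxc.network.ipv4.gateway = 10.0.3.1
```

With a static address in place, any stale DHCP lease for the container should also be removed so the old address is not handed out again.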



On Mon, Mar 24, 2014 at 11:19 PM, Vicky Kak <vi...@gmail.com> wrote:

> Yep, they can see each other and the outside world.
> My issue seems to be coming from the cached IP in
>
> /var/lib/misc/dnsmasq.lxcbr0.leases
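A possible cleanup sequence for such a stale lease (a sketch only; service and bridge names vary by distribution, and all of these commands need root):

```
# stop the container, drop the cached lease, restart the bridge's dnsmasq
lxc-stop -n hadoop2
rm /var/lib/misc/dnsmasq.lxcbr0.leases
service lxc-net restart
lxc-start -n hadoop2 -d
```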
>
>
>
>
>
>
> On Mon, Mar 24, 2014 at 7:50 PM, Jay Vyas <ja...@gmail.com> wrote:
>
>> Are your Linux containers networked properly? (i.e. can they see each
>> other, the outside world, etc.)
>> www.linux.org/threads/linux-containers-part-4-getting-to-the-universe-ping-google-com.4428/
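The sort of checks meant here, run from inside each container, can be sketched as follows (10.0.3.200 is the NameNode address from the thread; `nc` assumes netcat is installed):

```
ping -c 3 10.0.3.200        # container-to-container reachability
nc -zv 10.0.3.200 9000      # is the NameNode RPC port reachable?
hostname -i                 # does this host resolve to the IP you expect?
```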
>>
>>
>> On Mon, Mar 24, 2014 at 6:02 AM, Vicky Kak <vi...@gmail.com> wrote:
>>
>>> Hi All,
>>>
>>> I am using linuxcontainer (http://linuxcontainers.org/) to configure
>>> a Hadoop cluster for testing.
>>> I have created two Linux application containers, called
>>> hadoop1 and hadoop2. The IP associated with hadoop1 is 10.0.3.200 and
>>> with hadoop2 it is 10.0.3.201.
>>>
>>> I am able to start the NameNode on 10.0.3.200, but when I try to start
>>> the DataNode on 10.0.3.201 I see the following error at 10.0.3.201:
>>>
>>>
>>> ****************************************************************************************
>>> $ hdfs datanode
>>> 14/03/24 09:30:57 INFO datanode.DataNode: STARTUP_MSG:
>>> /************************************************************
>>> STARTUP_MSG: Starting DataNode
>>> STARTUP_MSG:   host = Hadoop2/10.0.3.148
>>> STARTUP_MSG:   args = []
>>> STARTUP_MSG:   version = 2.2.0
>>> STARTUP_MSG:   classpath =
>>> /home/ubuntu/Installed/hadoop-2.2.0/etc/hadoop:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/servlet-api-2.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-el-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-logging-1.1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/mockito-all-1.8.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/log4j-1.2.17.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jersey-server-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jsr305-1.3.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jackson-mapper-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jackson-jaxrs-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/guava-11.0.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-collections-3.2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-codec-1.4.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/paranamer-2.3.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/zookeeper-3.4.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jersey-core-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jersey-json-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jettison-1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/activation-1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jets3t-0.6.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/avro-1.7.4.jar:/
home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-httpclient-3.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/xz-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-math-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jetty-util-6.1.26.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-lang-2.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-configuration-1.6.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/netty-3.6.2.Final.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/asm-3.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/junit-4.8.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-cli-1.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jsch-0.1.42.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jackson-xc-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-io-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jsp-api-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jetty-6.1.26.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/hadoop-auth-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-compress-1.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/hadoop-annotations-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-digester-1.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/s
hare/hadoop/common/lib/jackson-core-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-net-3.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/stax-api-1.0.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/xmlenc-0.52.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/hadoop-nfs-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/hadoop-common-2.2.0-tests.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/hadoop-common-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-el-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-logging-1.1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jackson-mapper-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/guava-11.0.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-lang-2.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jasper-runtime-5.5.23.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/as
m-3.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-io-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jackson-core-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/hadoop-hdfs-nfs-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/hadoop-hdfs-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/hadoop-hdfs-2.2.0-tests.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/javax.inject-1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/log4j-1.2.17.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jersey-server-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/hamcrest-core-1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/junit-4.10.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jackson-mapper-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/paranamer-2.3.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jersey-core-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/avro-1.7.4.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/aopalliance-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/xz-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/guice-3.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/y
arn/lib/asm-3.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/commons-io-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/snappy-java-1.0.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/hadoop-annotations-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jackson-core-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-client-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-common-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-tests-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-api-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-site-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-common-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/javax.inject-1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/hamcrest-core-1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/junit-4.10.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/
lib/jackson-mapper-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/xz-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/guice-3.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/asm-3.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/commons-io-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/hadoop-annotations-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jackson-core-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.2.0.jar:/home/ubuntu/Installed/hadoop
-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.2.0-tests.jar:/home/ubuntu/Installed/hadoop-2.2.0/contrib/capacity-scheduler/*.jar
>>> STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common-r 1529768; compiled by 'hortonmu' on 2013-10-07T06:28Z
>>> STARTUP_MSG:   java = 1.7.0
>>> ************************************************************/
>>> 14/03/24 09:30:57 INFO datanode.DataNode: registered UNIX signal
>>> handlers for [TERM, HUP, INT]
>>> 14/03/24 09:30:57 WARN common.Util: Path
>>> /home/ubuntu/dallaybatta-data/hdfs/datanode should be specified as a URI in
>>> configuration files. Please update hdfs configuration.
>>> 14/03/24 09:30:58 INFO impl.MetricsConfig: loaded properties from
>>> hadoop-metrics2.properties
>>> 14/03/24 09:30:58 INFO impl.MetricsSystemImpl: Scheduled snapshot period
>>> at 10 second(s).
>>> 14/03/24 09:30:58 INFO impl.MetricsSystemImpl: DataNode metrics system
>>> started
>>> 14/03/24 09:30:58 INFO datanode.DataNode: Configured hostname is Hadoop2
>>> 14/03/24 09:30:58 INFO datanode.DataNode: Opened streaming server at /
>>> 0.0.0.0:50010
>>> 14/03/24 09:30:58 INFO datanode.DataNode: Balancing bandwith is 1048576
>>> bytes/s
>>> 14/03/24 09:30:58 INFO mortbay.log: Logging to
>>> org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via
>>> org.mortbay.log.Slf4jLog
>>> 14/03/24 09:30:58 INFO http.HttpServer: Added global filter 'safety'
>>> (class=org.apache.hadoop.http.HttpServer$QuotingInputFilter)
>>> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>> context datanode
>>> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>> context logs
>>> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>> context static
>>> 14/03/24 09:30:58 INFO datanode.DataNode: Opened info server at
>>> localhost:50075
>>> 14/03/24 09:30:58 INFO datanode.DataNode: dfs.webhdfs.enabled = false
>>> 14/03/24 09:30:58 INFO http.HttpServer: Jetty bound to port 50075
>>> 14/03/24 09:30:58 INFO mortbay.log: jetty-6.1.26
>>> 14/03/24 09:30:59 INFO mortbay.log: Started
>>> SelectChannelConnector@localhost:50075
>>> 14/03/24 09:30:59 INFO ipc.Server: Starting Socket Reader #1 for port
>>> 50020
>>> 14/03/24 09:30:59 INFO datanode.DataNode: Opened IPC server at /
>>> 0.0.0.0:50020
>>> 14/03/24 09:30:59 INFO datanode.DataNode: Refresh request received for
>>> nameservices: null
>>> 14/03/24 09:30:59 INFO datanode.DataNode: Starting BPOfferServices for
>>> nameservices: <default>
>>> 14/03/24 09:30:59 WARN common.Util: Path
>>> /home/ubuntu/dallaybatta-data/hdfs/datanode should be specified as a URI in
>>> configuration files. Please update hdfs configuration.
>>> 14/03/24 09:30:59 INFO datanode.DataNode: Block pool <registering>
>>> (storage id unknown) service to /10.0.3.200:9000 starting to offer
>>> service
>>> 14/03/24 09:30:59 INFO ipc.Server: IPC Server Responder: starting
>>> 14/03/24 09:30:59 INFO ipc.Server: IPC Server listener on 50020: starting
>>> 14/03/24 09:30:59 INFO common.Storage: Lock on
>>> /home/ubuntu/dallaybatta-data/hdfs/datanode/in_use.lock acquired by
>>> nodename 2618@Hadoop2
>>> 14/03/24 09:31:00 INFO common.Storage: Locking is disabled
>>> 14/03/24 09:31:00 INFO datanode.DataNode: Setting up storage:
>>> nsid=1367523242;bpid=BP-1489452897-10.0.3.253-1395650301038;lv=-47;nsInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0;bpid=BP-1489452897-10.0.3.253-1395650301038
>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Added volume -
>>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current
>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Registered FSDatasetState
>>> MBean
>>> 14/03/24 09:31:00 INFO datanode.DirectoryScanner: Periodic Directory
>>> Tree Verification scan starting at 1395674259100 with interval 21600000
>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Adding block pool
>>> BP-1489452897-10.0.3.253-1395650301038
>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Scanning block pool
>>> BP-1489452897-10.0.3.253-1395650301038 on volume
>>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current...
>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Time taken to scan block pool
>>> BP-1489452897-10.0.3.253-1395650301038 on
>>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current: 11ms
>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Total time to scan all
>>> replicas for block pool BP-1489452897-10.0.3.253-1395650301038: 13ms
>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Adding replicas to map for
>>> block pool BP-1489452897-10.0.3.253-1395650301038 on volume
>>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current...
>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Time to add replicas to map
>>> for block pool BP-1489452897-10.0.3.253-1395650301038 on volume
>>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current: 0ms
>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Total time to add all
>>> replicas to map: 1ms
>>> 14/03/24 09:31:00 INFO datanode.DataNode: Block pool
>>> BP-1489452897-10.0.3.253-1395650301038 (storage id
>>> DS-1380795562-10.0.3.201-50010-1395650455122) service to /
>>> 10.0.3.200:9000 beginning handshake with NN
>>> 14/03/24 09:31:00 FATAL datanode.DataNode: Initialization failed for
>>> block pool Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id
>>> DS-1380795562-10.0.3.201-50010-1395650455122) service to /
>>> 10.0.3.200:9000
>>> org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException):
>>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>>> ipcPort=50020,
>>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>>>     at
>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
>>>     at
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3929)
>>>     at
>>> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
>>>     at
>>> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
>>>     at
>>> org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
>>>     at
>>> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
>>>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
>>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
>>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>>     at
>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>>>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
>>>
>>>     at org.apache.hadoop.ipc.Client.call(Client.java:1347)
>>>     at org.apache.hadoop.ipc.Client.call(Client.java:1300)
>>>     at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
>>>     at $Proxy9.registerDatanode(Unknown Source)
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>     at java.lang.reflect.Method.invoke(Method.java:601)
>>>     at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
>>>     at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
>>>     at $Proxy9.registerDatanode(Unknown Source)
>>>     at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.registerDatanode(DatanodeProtocolClientSideTranslatorPB.java:146)
>>>     at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.register(BPServiceActor.java:623)
>>>     at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:225)
>>>     at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:664)
>>>     at java.lang.Thread.run(Thread.java:722)
>>> 14/03/24 09:31:00 WARN datanode.DataNode: Ending block pool service for: Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id DS-1380795562-10.0.3.201-50010-1395650455122) service to /10.0.3.200:9000
>>> 14/03/24 09:31:00 INFO datanode.DataNode: Removed Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id DS-1380795562-10.0.3.201-50010-1395650455122)
>>> 14/03/24 09:31:00 INFO datanode.DataBlockScanner: Removed bpid=BP-1489452897-10.0.3.253-1395650301038 from blockPoolScannerMap
>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Removing block pool BP-1489452897-10.0.3.253-1395650301038
>>> 14/03/24 09:31:02 WARN datanode.DataNode: Exiting Datanode
>>> 14/03/24 09:31:02 INFO util.ExitUtil: Exiting with status 0
>>> 14/03/24 09:31:02 INFO datanode.DataNode: SHUTDOWN_MSG:
>>> /************************************************************
>>> SHUTDOWN_MSG: Shutting down DataNode at Hadoop2/10.0.3.148
>>> ************************************************************/
>>>
>>>
>>> ****************************************************************************************
>>>
>>>
>>> And here is the corresponding error at the NameNode (10.0.3.200):
>>>
>>>
>>> ****************************************************************************************
>>> 14/03/24 09:31:00 WARN blockmanagement.DatanodeManager: Unresolved datanode registration from 10.0.3.201
>>> 14/03/24 09:31:00 ERROR security.UserGroupInformation: PriviledgedActionException as:ubuntu (auth:SIMPLE) cause:org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException: Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0, storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075, ipcPort=50020, storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>>> 14/03/24 09:31:00 INFO ipc.Server: IPC Server handler 3 on 9000, call org.apache.hadoop.hdfs.server.protocol.DatanodeProtocol.registerDatanode from 10.0.3.201:60951 Call#1 Retry#0: error: org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException: Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0, storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075, ipcPort=50020, storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>>> org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException: Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0, storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075, ipcPort=50020, storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>>>     at org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
>>>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3929)
>>>     at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
>>>     at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
>>>     at org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
>>>     at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
>>>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
>>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
>>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>>>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
>>>
>>> ****************************************************************************************
>>>
>>> I don't know where the 10.0.3.148 IP is coming from yet; it could be due
>>> to some lxc configuration. What can be interpreted from the Hadoop error
>>> information?
>>>
>>> Let me know if you need more info about my environment to provide some
>>> insights.
>>>
>>> Regards,
>>> Vicky
>>>
>>>
>>>
>>>
>>>
>>
>>
>> --
>> Jay Vyas
>> http://jayunit100.blogspot.com
>>
>
>


-- 
Cheers
-MJ
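The NameNode-side warning "Unresolved datanode registration from 10.0.3.201" means the NameNode could not map the DataNode's source IP back to a hostname, which fits the stale dnsmasq lease mentioned earlier in the thread. A minimal sketch of that reverse-lookup check, using only Python's standard library (the addresses are the ones from this thread and are assumptions about your network):

```python
# Sketch of the reverse lookup the NameNode performs when a DataNode
# registers: if the DataNode's source IP cannot be mapped back to a
# hostname, registration is refused with DisallowedDatanodeException.
import socket

def reverse_resolves(ip):
    """Return the hostname `ip` maps back to, or None if unresolved."""
    try:
        return socket.gethostbyaddr(ip)[0]
    except OSError:  # covers socket.herror and socket.gaierror
        return None

if __name__ == "__main__":
    # On the NameNode host, check the DataNode's address the same way,
    # e.g. reverse_resolves("10.0.3.201") for the setup in this thread.
    print("127.0.0.1 ->", reverse_resolves("127.0.0.1"))
```

If this returns None for the DataNode's IP when run on the NameNode host, the registration failure above is expected until name resolution (or the dnsmasq lease file) is fixed.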

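If static addressing alone does not help, a commonly used workaround is to pin the hostnames on every node in /etc/hosts (e.g. lines `10.0.3.200 hadoop1` and `10.0.3.201 hadoop2`), or to relax the NameNode's registration check. The fragment below is a sketch under the assumption of a Hadoop 2.x `hdfs-site.xml`; the property disables a safety check, so fixing name resolution is preferable.

```xml
<!-- hdfs-site.xml on the NameNode: relax the reverse-DNS check that
     rejects DataNodes whose IP cannot be resolved to a hostname.
     (Assumption: Hadoop 2.x; prefer fixing /etc/hosts or DNS.) -->
<property>
  <name>dfs.namenode.datanode.registration.ip-hostname-check</name>
  <value>false</value>
</property>
```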
Re: Setting Hadoop on LinuxContainers Fails.

Posted by Mingjiang Shi <ms...@gopivotal.com>.
Hi Vicky,
Do you use dhcp or assign ip address statically to the containers?  Suggest
you assign static ip address to the container instead of using dhcp.



On Mon, Mar 24, 2014 at 11:19 PM, Vicky Kak <vi...@gmail.com> wrote:

> Yep, they can see each other and the outside world.
> My issue seems to appearing from the cached IP in
>
> /var/lib/misc/dnsmasq.lxcbr0.leases
>
>
>
>
>
>
> On Mon, Mar 24, 2014 at 7:50 PM, Jay Vyas <ja...@gmail.com> wrote:
>
>> are your linux containers networked properly (i.e. can they see each
>> other, and the outside world, etc...)
>> www.linux.org/threads/linux-containers-part-4-getting-to-the-universe-ping-google-com.4428/
>>
>>
>> On Mon, Mar 24, 2014 at 6:02 AM, Vicky Kak <vi...@gmail.com> wrote:
>>
>>> Hi All,
>>>
>>> I am using linuxcontainer(http://linuxcontainers.org/) for configuring
>>> the hadoop cluster for the testing.
>>> I have create two linux application containers which are called
>>> hadoop1/hadoop2. The IP's associated with the hadoop1 is 10.0.3.200 and
>>> with hadoop2 is 10.0.3.201.
>>>
>>> I am able to start the Namenode on 10.0.3.200 but when i try to start
>>> the DataNode on 10.0.3.201 I see the following error at 10.0.3.201
>>>
>>>
>>> ****************************************************************************************
>>> $ hdfs datanode
>>> 14/03/24 09:30:57 INFO datanode.DataNode: STARTUP_MSG:
>>> /************************************************************
>>> STARTUP_MSG: Starting DataNode
>>> STARTUP_MSG:   host = Hadoop2/10.0.3.148
>>> STARTUP_MSG:   args = []
>>> STARTUP_MSG:   version = 2.2.0
>>> STARTUP_MSG:   classpath =
>>> /home/ubuntu/Installed/hadoop-2.2.0/etc/hadoop:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/servlet-api-2.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-el-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-logging-1.1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/mockito-all-1.8.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/log4j-1.2.17.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jersey-server-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jsr305-1.3.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jackson-mapper-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jackson-jaxrs-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/guava-11.0.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-collections-3.2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-codec-1.4.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/paranamer-2.3.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/zookeeper-3.4.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jersey-core-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jersey-json-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jettison-1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/activation-1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jets3t-0.6.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/avro-1.7.4.jar:/
home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-httpclient-3.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/xz-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-math-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jetty-util-6.1.26.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-lang-2.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-configuration-1.6.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/netty-3.6.2.Final.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/asm-3.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/junit-4.8.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-cli-1.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jsch-0.1.42.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jackson-xc-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-io-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jsp-api-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jetty-6.1.26.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/hadoop-auth-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-compress-1.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/hadoop-annotations-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-digester-1.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/s
hare/hadoop/common/lib/jackson-core-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-net-3.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/stax-api-1.0.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/xmlenc-0.52.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/hadoop-nfs-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/hadoop-common-2.2.0-tests.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/hadoop-common-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-el-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-logging-1.1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jackson-mapper-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/guava-11.0.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-lang-2.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jasper-runtime-5.5.23.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/as
m-3.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-io-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jackson-core-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/hadoop-hdfs-nfs-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/hadoop-hdfs-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/hadoop-hdfs-2.2.0-tests.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/javax.inject-1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/log4j-1.2.17.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jersey-server-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/hamcrest-core-1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/junit-4.10.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jackson-mapper-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/paranamer-2.3.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jersey-core-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/avro-1.7.4.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/aopalliance-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/xz-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/guice-3.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/y
arn/lib/asm-3.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/commons-io-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/snappy-java-1.0.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/hadoop-annotations-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jackson-core-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-client-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-common-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-tests-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-api-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-site-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-common-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/javax.inject-1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/hamcrest-core-1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/junit-4.10.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/
lib/jackson-mapper-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/xz-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/guice-3.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/asm-3.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/commons-io-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/hadoop-annotations-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jackson-core-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.2.0.jar:/home/ubuntu/Installed/hadoop
-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.2.0-tests.jar:/home/ubuntu/Installed/hadoop-2.2.0/contrib/capacity-scheduler/*.jar
>>> STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common-r 1529768; compiled by 'hortonmu' on 2013-10-07T06:28Z
>>> STARTUP_MSG:   java = 1.7.0
>>> ************************************************************/
>>> 14/03/24 09:30:57 INFO datanode.DataNode: registered UNIX signal
>>> handlers for [TERM, HUP, INT]
>>> 14/03/24 09:30:57 WARN common.Util: Path
>>> /home/ubuntu/dallaybatta-data/hdfs/datanode should be specified as a URI in
>>> configuration files. Please update hdfs configuration.
>>> 14/03/24 09:30:58 INFO impl.MetricsConfig: loaded properties from
>>> hadoop-metrics2.properties
>>> 14/03/24 09:30:58 INFO impl.MetricsSystemImpl: Scheduled snapshot period
>>> at 10 second(s).
>>> 14/03/24 09:30:58 INFO impl.MetricsSystemImpl: DataNode metrics system
>>> started
>>> 14/03/24 09:30:58 INFO datanode.DataNode: Configured hostname is Hadoop2
>>> 14/03/24 09:30:58 INFO datanode.DataNode: Opened streaming server at /
>>> 0.0.0.0:50010
>>> 14/03/24 09:30:58 INFO datanode.DataNode: Balancing bandwith is 1048576
>>> bytes/s
>>> 14/03/24 09:30:58 INFO mortbay.log: Logging to
>>> org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via
>>> org.mortbay.log.Slf4jLog
>>> 14/03/24 09:30:58 INFO http.HttpServer: Added global filter 'safety'
>>> (class=org.apache.hadoop.http.HttpServer$QuotingInputFilter)
>>> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>> context datanode
>>> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>> context logs
>>> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>> context static
>>> 14/03/24 09:30:58 INFO datanode.DataNode: Opened info server at
>>> localhost:50075
>>> 14/03/24 09:30:58 INFO datanode.DataNode: dfs.webhdfs.enabled = false
>>> 14/03/24 09:30:58 INFO http.HttpServer: Jetty bound to port 50075
>>> 14/03/24 09:30:58 INFO mortbay.log: jetty-6.1.26
>>> 14/03/24 09:30:59 INFO mortbay.log: Started
>>> SelectChannelConnector@localhost:50075
>>> 14/03/24 09:30:59 INFO ipc.Server: Starting Socket Reader #1 for port
>>> 50020
>>> 14/03/24 09:30:59 INFO datanode.DataNode: Opened IPC server at /
>>> 0.0.0.0:50020
>>> 14/03/24 09:30:59 INFO datanode.DataNode: Refresh request received for
>>> nameservices: null
>>> 14/03/24 09:30:59 INFO datanode.DataNode: Starting BPOfferServices for
>>> nameservices: <default>
>>> 14/03/24 09:30:59 WARN common.Util: Path
>>> /home/ubuntu/dallaybatta-data/hdfs/datanode should be specified as a URI in
>>> configuration files. Please update hdfs configuration.
>>> 14/03/24 09:30:59 INFO datanode.DataNode: Block pool <registering>
>>> (storage id unknown) service to /10.0.3.200:9000 starting to offer
>>> service
>>> 14/03/24 09:30:59 INFO ipc.Server: IPC Server Responder: starting
>>> 14/03/24 09:30:59 INFO ipc.Server: IPC Server listener on 50020: starting
>>> 14/03/24 09:30:59 INFO common.Storage: Lock on
>>> /home/ubuntu/dallaybatta-data/hdfs/datanode/in_use.lock acquired by
>>> nodename 2618@Hadoop2
>>> 14/03/24 09:31:00 INFO common.Storage: Locking is disabled
>>> 14/03/24 09:31:00 INFO datanode.DataNode: Setting up storage:
>>> nsid=1367523242;bpid=BP-1489452897-10.0.3.253-1395650301038;lv=-47;nsInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0;bpid=BP-1489452897-10.0.3.253-1395650301038
>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Added volume -
>>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current
>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Registered FSDatasetState
>>> MBean
>>> 14/03/24 09:31:00 INFO datanode.DirectoryScanner: Periodic Directory
>>> Tree Verification scan starting at 1395674259100 with interval 21600000
>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Adding block pool
>>> BP-1489452897-10.0.3.253-1395650301038
>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Scanning block pool
>>> BP-1489452897-10.0.3.253-1395650301038 on volume
>>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current...
>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Time taken to scan block pool
>>> BP-1489452897-10.0.3.253-1395650301038 on
>>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current: 11ms
>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Total time to scan all
>>> replicas for block pool BP-1489452897-10.0.3.253-1395650301038: 13ms
>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Adding replicas to map for
>>> block pool BP-1489452897-10.0.3.253-1395650301038 on volume
>>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current...
>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Time to add replicas to map
>>> for block pool BP-1489452897-10.0.3.253-1395650301038 on volume
>>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current: 0ms
>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Total time to add all
>>> replicas to map: 1ms
>>> 14/03/24 09:31:00 INFO datanode.DataNode: Block pool
>>> BP-1489452897-10.0.3.253-1395650301038 (storage id
>>> DS-1380795562-10.0.3.201-50010-1395650455122) service to /
>>> 10.0.3.200:9000 beginning handshake with NN
>>> 14/03/24 09:31:00 FATAL datanode.DataNode: Initialization failed for
>>> block pool Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id
>>> DS-1380795562-10.0.3.201-50010-1395650455122) service to /
>>> 10.0.3.200:9000
>>> org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException):
>>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>>> ipcPort=50020,
>>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>>>     at
>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
>>>     at
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3929)
>>>     at
>>> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
>>>     at
>>> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
>>>     at
>>> org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
>>>     at
>>> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
>>>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
>>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
>>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>>     at
>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>>>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
>>>
>>>     at org.apache.hadoop.ipc.Client.call(Client.java:1347)
>>>     at org.apache.hadoop.ipc.Client.call(Client.java:1300)
>>>     at
>>> org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
>>>     at $Proxy9.registerDatanode(Unknown Source)
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>     at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>     at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>     at java.lang.reflect.Method.invoke(Method.java:601)
>>>     at
>>> org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
>>>     at
>>> org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
>>>     at $Proxy9.registerDatanode(Unknown Source)
>>>     at
>>> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.registerDatanode(DatanodeProtocolClientSideTranslatorPB.java:146)
>>>     at
>>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.register(BPServiceActor.java:623)
>>>     at
>>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:225)
>>>     at
>>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:664)
>>>     at java.lang.Thread.run(Thread.java:722)
>>> 14/03/24 09:31:00 WARN datanode.DataNode: Ending block pool service for:
>>> Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id
>>> DS-1380795562-10.0.3.201-50010-1395650455122) service to /
>>> 10.0.3.200:9000
>>> 14/03/24 09:31:00 INFO datanode.DataNode: Removed Block pool
>>> BP-1489452897-10.0.3.253-1395650301038 (storage id
>>> DS-1380795562-10.0.3.201-50010-1395650455122)
>>> 14/03/24 09:31:00 INFO datanode.DataBlockScanner: Removed
>>> bpid=BP-1489452897-10.0.3.253-1395650301038 from blockPoolScannerMap
>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Removing block pool
>>> BP-1489452897-10.0.3.253-1395650301038
>>> 14/03/24 09:31:02 WARN datanode.DataNode: Exiting Datanode
>>> 14/03/24 09:31:02 INFO util.ExitUtil: Exiting with status 0
>>> 14/03/24 09:31:02 INFO datanode.DataNode: SHUTDOWN_MSG:
>>> /************************************************************
>>> *SHUTDOWN_MSG: Shutting down DataNode at Hadoop2/10.0.3.148
>>> <http://10.0.3.148>*
>>> ************************************************************/
>>>
>>>
>>> ****************************************************************************************
>>>
>>>
>>> And here is the corresponding error coming at NameNode( 10.0.3.201)
>>>
>>>
>>> ****************************************************************************************
>>> 14/03/24 09:31:00 WARN blockmanagement.DatanodeManager: Unresolved
>>> datanode registration from 10.0.3.201
>>> 14/03/24 09:31:00 ERROR security.UserGroupInformation:
>>> PriviledgedActionException as:ubuntu (auth:SIMPLE)
>>> cause:org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
>>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>>> ipcPort=50020,
>>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>>> 14/03/24 09:31:00 INFO ipc.Server: IPC Server handler 3 on 9000, call
>>> org.apache.hadoop.hdfs.server.protocol.DatanodeProtocol.registerDatanode
>>> from 10.0.3.201:60951 Call#1 Retry#0: error:
>>> org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
>>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>>> ipcPort=50020,
>>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>>> org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
>>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>>> ipcPort=50020,
>>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>>>     at
>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
>>>     at
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3929)
>>>     at
>>> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
>>>     at
>>> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
>>>     at
>>> org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
>>>     at
>>> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
>>>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
>>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
>>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>>     at
>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>>>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
>>>
>>> ****************************************************************************************
>>>
>>> I don't know from where *10.0.3.148 *
>>>  ip is coming yet, could be due to some lxc configurations. What can be
>>> interpreted from the hadoop error information?
>>>
>>> Let me know if you need more info about my environment to provide some
>>> insights.
>>>
>>> Regards,
>>> Vicky
>>>
>>>
>>>
>>>
>>>
>>
>>
>> --
>> Jay Vyas
>> http://jayunit100.blogspot.com
>>
>
>


-- 
Cheers
-MJ
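The DisallowedDatanodeException quoted above is raised by the NameNode when it cannot resolve the registering DataNode's address to an allowed hostname (note the DatanodeRegistration(0.0.0.0, ...) in the trace). A common workaround on small test clusters — assuming the container names and addresses used in this thread — is to give both containers a static hosts mapping so forward and reverse lookups agree:

```
# /etc/hosts on both containers (names and addresses taken from this
# thread; adjust to your environment)
10.0.3.200  hadoop1
10.0.3.201  hadoop2
```

After updating /etc/hosts (and the NameNode's dfs.hosts include file, if you maintain one), restart the DataNode so it re-registers.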


Re: Setting Hadoop on LinuxContainers Fails.

Posted by Mingjiang Shi <ms...@gopivotal.com>.
Hi Vicky,
Do you use DHCP, or do you assign IP addresses to the containers statically?
I suggest you assign static IP addresses to the containers instead of using DHCP.
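For LXC 1.x, a static address can be set in the container's own config instead of relying on the dnsmasq DHCP server on lxcbr0 — a sketch, assuming the bridge and addresses from this thread (newer LXC releases spell these keys lxc.net.0.*):

```
# /var/lib/lxc/hadoop1/config (similarly for hadoop2 with 10.0.3.201)
lxc.network.type = veth
lxc.network.link = lxcbr0
lxc.network.ipv4 = 10.0.3.200/24
lxc.network.ipv4.gateway = 10.0.3.1
```

With a fixed address, the DataNode's registration IP no longer changes between container restarts.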



On Mon, Mar 24, 2014 at 11:19 PM, Vicky Kak <vi...@gmail.com> wrote:

> Yep, they can see each other and the outside world.
> My issue seems to be coming from the cached IP in
>
> /var/lib/misc/dnsmasq.lxcbr0.leases
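If a stale lease in that file keeps handing a container its old address, the entry can be pruned (with the container stopped, restarting dnsmasq afterwards). As a sketch of the file format — expiry, MAC, IP, hostname, client-id per line — here is a hypothetical helper that drops a host's lease lines; the function name and behaviour are illustrative, not part of any tool:

```python
def drop_stale_leases(lease_text: str, hostname: str) -> str:
    """Return lease-file content without the lines recorded for `hostname`.

    dnsmasq lease lines look like:
      1395650455 00:16:3e:aa:bb:cc 10.0.3.148 Hadoop2 *
    i.e. expiry, MAC, IP, hostname, client-id separated by spaces.
    """
    kept = []
    for line in lease_text.splitlines():
        fields = line.split()
        # Field index 3 is the hostname dnsmasq recorded for the lease.
        if len(fields) >= 4 and fields[3] == hostname:
            continue  # drop the stale entry
        kept.append(line)
    return "\n".join(kept)
```

Applied to /var/lib/misc/dnsmasq.lxcbr0.leases while lxc-net/dnsmasq is down, this lets the container pick up a fresh (or static) address on restart.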
>
>
>
>
>
>
> On Mon, Mar 24, 2014 at 7:50 PM, Jay Vyas <ja...@gmail.com> wrote:
>
>> Are your linux containers networked properly (i.e. can they see each
>> other, the outside world, etc.)? See:
>> www.linux.org/threads/linux-containers-part-4-getting-to-the-universe-ping-google-com.4428/
>>
>>
>> On Mon, Mar 24, 2014 at 6:02 AM, Vicky Kak <vi...@gmail.com> wrote:
>>
>>> Hi All,
>>>
>>> I am using linuxcontainers (http://linuxcontainers.org/) to configure
>>> a hadoop cluster for testing.
>>> I have created two linux application containers, called
>>> hadoop1/hadoop2. The IP associated with hadoop1 is 10.0.3.200, and
>>> with hadoop2 it is 10.0.3.201.
>>>
>>> I am able to start the NameNode on 10.0.3.200, but when I try to start
>>> the DataNode on 10.0.3.201 I see the following error on 10.0.3.201:
>>>
>>>
>>> ****************************************************************************************
>>> $ hdfs datanode
>>> 14/03/24 09:30:57 INFO datanode.DataNode: STARTUP_MSG:
>>> /************************************************************
>>> STARTUP_MSG: Starting DataNode
>>> STARTUP_MSG:   host = Hadoop2/10.0.3.148
>>> STARTUP_MSG:   args = []
>>> STARTUP_MSG:   version = 2.2.0
>>> STARTUP_MSG:   classpath =
>>> /home/ubuntu/Installed/hadoop-2.2.0/etc/hadoop:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/servlet-api-2.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-el-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-logging-1.1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/mockito-all-1.8.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/log4j-1.2.17.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jersey-server-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jsr305-1.3.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jackson-mapper-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jackson-jaxrs-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/guava-11.0.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-collections-3.2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-codec-1.4.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/paranamer-2.3.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/zookeeper-3.4.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jersey-core-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jersey-json-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jettison-1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/activation-1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jets3t-0.6.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/avro-1.7.4.jar:/
home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-httpclient-3.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/xz-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-math-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jetty-util-6.1.26.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-lang-2.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-configuration-1.6.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/netty-3.6.2.Final.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/asm-3.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/junit-4.8.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-cli-1.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jsch-0.1.42.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jackson-xc-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-io-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jsp-api-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jetty-6.1.26.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/hadoop-auth-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-compress-1.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/hadoop-annotations-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-digester-1.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/s
hare/hadoop/common/lib/jackson-core-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-net-3.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/stax-api-1.0.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/xmlenc-0.52.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/hadoop-nfs-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/hadoop-common-2.2.0-tests.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/hadoop-common-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-el-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-logging-1.1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jackson-mapper-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/guava-11.0.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-lang-2.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jasper-runtime-5.5.23.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/as
m-3.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-io-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jackson-core-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/hadoop-hdfs-nfs-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/hadoop-hdfs-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/hadoop-hdfs-2.2.0-tests.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/javax.inject-1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/log4j-1.2.17.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jersey-server-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/hamcrest-core-1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/junit-4.10.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jackson-mapper-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/paranamer-2.3.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jersey-core-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/avro-1.7.4.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/aopalliance-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/xz-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/guice-3.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/y
arn/lib/asm-3.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/commons-io-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/snappy-java-1.0.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/hadoop-annotations-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jackson-core-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-client-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-common-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-tests-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-api-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-site-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-common-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/javax.inject-1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/hamcrest-core-1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/junit-4.10.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/
lib/jackson-mapper-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/xz-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/guice-3.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/asm-3.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/commons-io-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/hadoop-annotations-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jackson-core-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.2.0.jar:/home/ubuntu/Installed/hadoop
-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.2.0-tests.jar:/home/ubuntu/Installed/hadoop-2.2.0/contrib/capacity-scheduler/*.jar
>>> STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common-r 1529768; compiled by 'hortonmu' on 2013-10-07T06:28Z
>>> STARTUP_MSG:   java = 1.7.0
>>> ************************************************************/
>>> 14/03/24 09:30:57 INFO datanode.DataNode: registered UNIX signal
>>> handlers for [TERM, HUP, INT]
>>> 14/03/24 09:30:57 WARN common.Util: Path
>>> /home/ubuntu/dallaybatta-data/hdfs/datanode should be specified as a URI in
>>> configuration files. Please update hdfs configuration.
>>> 14/03/24 09:30:58 INFO impl.MetricsConfig: loaded properties from
>>> hadoop-metrics2.properties
>>> 14/03/24 09:30:58 INFO impl.MetricsSystemImpl: Scheduled snapshot period
>>> at 10 second(s).
>>> 14/03/24 09:30:58 INFO impl.MetricsSystemImpl: DataNode metrics system
>>> started
>>> 14/03/24 09:30:58 INFO datanode.DataNode: Configured hostname is Hadoop2
>>> 14/03/24 09:30:58 INFO datanode.DataNode: Opened streaming server at /
>>> 0.0.0.0:50010
>>> 14/03/24 09:30:58 INFO datanode.DataNode: Balancing bandwith is 1048576
>>> bytes/s
>>> 14/03/24 09:30:58 INFO mortbay.log: Logging to
>>> org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via
>>> org.mortbay.log.Slf4jLog
>>> 14/03/24 09:30:58 INFO http.HttpServer: Added global filter 'safety'
>>> (class=org.apache.hadoop.http.HttpServer$QuotingInputFilter)
>>> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>> context datanode
>>> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>> context logs
>>> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>>> context static
>>> 14/03/24 09:30:58 INFO datanode.DataNode: Opened info server at
>>> localhost:50075
>>> 14/03/24 09:30:58 INFO datanode.DataNode: dfs.webhdfs.enabled = false
>>> 14/03/24 09:30:58 INFO http.HttpServer: Jetty bound to port 50075
>>> 14/03/24 09:30:58 INFO mortbay.log: jetty-6.1.26
>>> 14/03/24 09:30:59 INFO mortbay.log: Started
>>> SelectChannelConnector@localhost:50075
>>> 14/03/24 09:30:59 INFO ipc.Server: Starting Socket Reader #1 for port
>>> 50020
>>> 14/03/24 09:30:59 INFO datanode.DataNode: Opened IPC server at /
>>> 0.0.0.0:50020
>>> 14/03/24 09:30:59 INFO datanode.DataNode: Refresh request received for
>>> nameservices: null
>>> 14/03/24 09:30:59 INFO datanode.DataNode: Starting BPOfferServices for
>>> nameservices: <default>
>>> 14/03/24 09:30:59 WARN common.Util: Path
>>> /home/ubuntu/dallaybatta-data/hdfs/datanode should be specified as a URI in
>>> configuration files. Please update hdfs configuration.
>>> 14/03/24 09:30:59 INFO datanode.DataNode: Block pool <registering>
>>> (storage id unknown) service to /10.0.3.200:9000 starting to offer
>>> service
>>> 14/03/24 09:30:59 INFO ipc.Server: IPC Server Responder: starting
>>> 14/03/24 09:30:59 INFO ipc.Server: IPC Server listener on 50020: starting
>>> 14/03/24 09:30:59 INFO common.Storage: Lock on
>>> /home/ubuntu/dallaybatta-data/hdfs/datanode/in_use.lock acquired by
>>> nodename 2618@Hadoop2
>>> 14/03/24 09:31:00 INFO common.Storage: Locking is disabled
>>> 14/03/24 09:31:00 INFO datanode.DataNode: Setting up storage:
>>> nsid=1367523242;bpid=BP-1489452897-10.0.3.253-1395650301038;lv=-47;nsInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0;bpid=BP-1489452897-10.0.3.253-1395650301038
>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Added volume -
>>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current
>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Registered FSDatasetState
>>> MBean
>>> 14/03/24 09:31:00 INFO datanode.DirectoryScanner: Periodic Directory
>>> Tree Verification scan starting at 1395674259100 with interval 21600000
>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Adding block pool
>>> BP-1489452897-10.0.3.253-1395650301038
>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Scanning block pool
>>> BP-1489452897-10.0.3.253-1395650301038 on volume
>>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current...
>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Time taken to scan block pool
>>> BP-1489452897-10.0.3.253-1395650301038 on
>>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current: 11ms
>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Total time to scan all
>>> replicas for block pool BP-1489452897-10.0.3.253-1395650301038: 13ms
>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Adding replicas to map for
>>> block pool BP-1489452897-10.0.3.253-1395650301038 on volume
>>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current...
>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Time to add replicas to map
>>> for block pool BP-1489452897-10.0.3.253-1395650301038 on volume
>>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current: 0ms
>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Total time to add all
>>> replicas to map: 1ms
>>> 14/03/24 09:31:00 INFO datanode.DataNode: Block pool
>>> BP-1489452897-10.0.3.253-1395650301038 (storage id
>>> DS-1380795562-10.0.3.201-50010-1395650455122) service to /
>>> 10.0.3.200:9000 beginning handshake with NN
>>> 14/03/24 09:31:00 FATAL datanode.DataNode: Initialization failed for
>>> block pool Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id
>>> DS-1380795562-10.0.3.201-50010-1395650455122) service to /
>>> 10.0.3.200:9000
>>> org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException):
>>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>>> ipcPort=50020,
>>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>>>     at
>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
>>>     at
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3929)
>>>     at
>>> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
>>>     at
>>> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
>>>     at
>>> org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
>>>     at
>>> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
>>>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
>>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
>>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>>     at
>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>>>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
>>>
>>>     at org.apache.hadoop.ipc.Client.call(Client.java:1347)
>>>     at org.apache.hadoop.ipc.Client.call(Client.java:1300)
>>>     at
>>> org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
>>>     at $Proxy9.registerDatanode(Unknown Source)
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>     at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>     at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>     at java.lang.reflect.Method.invoke(Method.java:601)
>>>     at
>>> org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
>>>     at
>>> org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
>>>     at $Proxy9.registerDatanode(Unknown Source)
>>>     at
>>> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.registerDatanode(DatanodeProtocolClientSideTranslatorPB.java:146)
>>>     at
>>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.register(BPServiceActor.java:623)
>>>     at
>>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:225)
>>>     at
>>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:664)
>>>     at java.lang.Thread.run(Thread.java:722)
>>> 14/03/24 09:31:00 WARN datanode.DataNode: Ending block pool service for:
>>> Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id
>>> DS-1380795562-10.0.3.201-50010-1395650455122) service to /
>>> 10.0.3.200:9000
>>> 14/03/24 09:31:00 INFO datanode.DataNode: Removed Block pool
>>> BP-1489452897-10.0.3.253-1395650301038 (storage id
>>> DS-1380795562-10.0.3.201-50010-1395650455122)
>>> 14/03/24 09:31:00 INFO datanode.DataBlockScanner: Removed
>>> bpid=BP-1489452897-10.0.3.253-1395650301038 from blockPoolScannerMap
>>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Removing block pool
>>> BP-1489452897-10.0.3.253-1395650301038
>>> 14/03/24 09:31:02 WARN datanode.DataNode: Exiting Datanode
>>> 14/03/24 09:31:02 INFO util.ExitUtil: Exiting with status 0
>>> 14/03/24 09:31:02 INFO datanode.DataNode: SHUTDOWN_MSG:
>>> /************************************************************
>>> SHUTDOWN_MSG: Shutting down DataNode at Hadoop2/10.0.3.148
>>> ************************************************************/
>>>
>>>
>>> ****************************************************************************************
>>>
>>>
>>> And here is the corresponding error at the NameNode (10.0.3.200):
>>>
>>>
>>> ****************************************************************************************
>>> 14/03/24 09:31:00 WARN blockmanagement.DatanodeManager: Unresolved
>>> datanode registration from 10.0.3.201
>>> 14/03/24 09:31:00 ERROR security.UserGroupInformation:
>>> PriviledgedActionException as:ubuntu (auth:SIMPLE)
>>> cause:org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
>>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>>> ipcPort=50020,
>>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>>> 14/03/24 09:31:00 INFO ipc.Server: IPC Server handler 3 on 9000, call
>>> org.apache.hadoop.hdfs.server.protocol.DatanodeProtocol.registerDatanode
>>> from 10.0.3.201:60951 Call#1 Retry#0: error:
>>> org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
>>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>>> ipcPort=50020,
>>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>>> org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
>>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>>> ipcPort=50020,
>>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>>>     at
>>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
>>>     at
>>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3929)
>>>     at
>>> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
>>>     at
>>> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
>>>     at
>>> org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
>>>     at
>>> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
>>>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
>>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
>>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>>     at
>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>>>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
>>>
>>> ****************************************************************************************
>>>
>>> I don't yet know where the 10.0.3.148 IP is coming from; it could be
>>> due to some lxc configuration. What can be interpreted from the Hadoop
>>> error information?
>>>
>>> Let me know if you need more info about my environment to provide some
>>> insights.
>>>
>>> Regards,
>>> Vicky
>>>
>>
>> --
>> Jay Vyas
>> http://jayunit100.blogspot.com
>>
>
>


-- 
Cheers
-MJ
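The DisallowedDatanodeException quoted above, together with the NameNode's
"Unresolved datanode registration" warning, typically means the NameNode
could not reverse-resolve the DataNode's IP address to a hostname. A
minimal sketch of a workaround for a test cluster, assuming the container
names and addresses from this thread (note the DataNode reports its
hostname as Hadoop2, so match the case your containers actually use), is
to give both containers consistent /etc/hosts entries:

```
10.0.3.200  hadoop1
10.0.3.201  hadoop2
```

Alternatively, Hadoop 2.x can be told to skip the reverse-DNS check via
hdfs-site.xml on the NameNode; this is reasonable for a throwaway test
setup but not for production:

```xml
<property>
  <name>dfs.namenode.datanode.registration.ip-hostname-check</name>
  <value>false</value>
</property>
```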

Re: Setting Hadoop on LinuxContainers Fails.

Posted by Vicky Kak <vi...@gmail.com>.
Yep, they can see each other and the outside world.
My issue seems to be coming from a stale IP address cached in

/var/lib/misc/dnsmasq.lxcbr0.leases






On Mon, Mar 24, 2014 at 7:50 PM, Jay Vyas <ja...@gmail.com> wrote:

> Are your Linux containers networked properly (i.e., can they see each
> other and the outside world)?
> www.linux.org/threads/linux-containers-part-4-getting-to-the-universe-ping-google-com.4428/
>
>
> On Mon, Mar 24, 2014 at 6:02 AM, Vicky Kak <vi...@gmail.com> wrote:
>
>> Hi All,
>>
>> I am using linuxcontainer(http://linuxcontainers.org/) for configuring
>> the hadoop cluster for the testing.
>> I have create two linux application containers which are called
>> hadoop1/hadoop2. The IP's associated with the hadoop1 is 10.0.3.200 and
>> with hadoop2 is 10.0.3.201.
>>
>> I am able to start the Namenode on 10.0.3.200 but when i try to start the
>> DataNode on 10.0.3.201 I see the following error at 10.0.3.201
>>
>>
>> [DataNode startup log snipped; it is quoted in full earlier in the
>> thread]
2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.2.0-tests.jar:/home/ubuntu/Installed/hadoop-2.2.0/contrib/capacity-scheduler/*.jar
>> STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common -r
>> 1529768; compiled by 'hortonmu' on 2013-10-07T06:28Z
>> STARTUP_MSG:   java = 1.7.0
>> ************************************************************/
>> 14/03/24 09:30:57 INFO datanode.DataNode: registered UNIX signal handlers
>> for [TERM, HUP, INT]
>> 14/03/24 09:30:57 WARN common.Util: Path
>> /home/ubuntu/dallaybatta-data/hdfs/datanode should be specified as a URI in
>> configuration files. Please update hdfs configuration.
>> 14/03/24 09:30:58 INFO impl.MetricsConfig: loaded properties from
>> hadoop-metrics2.properties
>> 14/03/24 09:30:58 INFO impl.MetricsSystemImpl: Scheduled snapshot period
>> at 10 second(s).
>> 14/03/24 09:30:58 INFO impl.MetricsSystemImpl: DataNode metrics system
>> started
>> 14/03/24 09:30:58 INFO datanode.DataNode: Configured hostname is Hadoop2
>> 14/03/24 09:30:58 INFO datanode.DataNode: Opened streaming server at /
>> 0.0.0.0:50010
>> 14/03/24 09:30:58 INFO datanode.DataNode: Balancing bandwith is 1048576
>> bytes/s
>> 14/03/24 09:30:58 INFO mortbay.log: Logging to
>> org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via
>> org.mortbay.log.Slf4jLog
>> 14/03/24 09:30:58 INFO http.HttpServer: Added global filter 'safety'
>> (class=org.apache.hadoop.http.HttpServer$QuotingInputFilter)
>> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>> context datanode
>> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>> context logs
>> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>> context static
>> 14/03/24 09:30:58 INFO datanode.DataNode: Opened info server at
>> localhost:50075
>> 14/03/24 09:30:58 INFO datanode.DataNode: dfs.webhdfs.enabled = false
>> 14/03/24 09:30:58 INFO http.HttpServer: Jetty bound to port 50075
>> 14/03/24 09:30:58 INFO mortbay.log: jetty-6.1.26
>> 14/03/24 09:30:59 INFO mortbay.log: Started
>> SelectChannelConnector@localhost:50075
>> 14/03/24 09:30:59 INFO ipc.Server: Starting Socket Reader #1 for port
>> 50020
>> 14/03/24 09:30:59 INFO datanode.DataNode: Opened IPC server at /
>> 0.0.0.0:50020
>> 14/03/24 09:30:59 INFO datanode.DataNode: Refresh request received for
>> nameservices: null
>> 14/03/24 09:30:59 INFO datanode.DataNode: Starting BPOfferServices for
>> nameservices: <default>
>> 14/03/24 09:30:59 WARN common.Util: Path
>> /home/ubuntu/dallaybatta-data/hdfs/datanode should be specified as a URI in
>> configuration files. Please update hdfs configuration.
>> 14/03/24 09:30:59 INFO datanode.DataNode: Block pool <registering>
>> (storage id unknown) service to /10.0.3.200:9000 starting to offer
>> service
>> 14/03/24 09:30:59 INFO ipc.Server: IPC Server Responder: starting
>> 14/03/24 09:30:59 INFO ipc.Server: IPC Server listener on 50020: starting
>> 14/03/24 09:30:59 INFO common.Storage: Lock on
>> /home/ubuntu/dallaybatta-data/hdfs/datanode/in_use.lock acquired by
>> nodename 2618@Hadoop2
>> 14/03/24 09:31:00 INFO common.Storage: Locking is disabled
>> 14/03/24 09:31:00 INFO datanode.DataNode: Setting up storage:
>> nsid=1367523242;bpid=BP-1489452897-10.0.3.253-1395650301038;lv=-47;nsInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0;bpid=BP-1489452897-10.0.3.253-1395650301038
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Added volume -
>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Registered FSDatasetState MBean
>> 14/03/24 09:31:00 INFO datanode.DirectoryScanner: Periodic Directory Tree
>> Verification scan starting at 1395674259100 with interval 21600000
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Adding block pool
>> BP-1489452897-10.0.3.253-1395650301038
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Scanning block pool
>> BP-1489452897-10.0.3.253-1395650301038 on volume
>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current...
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Time taken to scan block pool
>> BP-1489452897-10.0.3.253-1395650301038 on
>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current: 11ms
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Total time to scan all
>> replicas for block pool BP-1489452897-10.0.3.253-1395650301038: 13ms
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Adding replicas to map for
>> block pool BP-1489452897-10.0.3.253-1395650301038 on volume
>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current...
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Time to add replicas to map
>> for block pool BP-1489452897-10.0.3.253-1395650301038 on volume
>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current: 0ms
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Total time to add all replicas
>> to map: 1ms
>> 14/03/24 09:31:00 INFO datanode.DataNode: Block pool
>> BP-1489452897-10.0.3.253-1395650301038 (storage id
>> DS-1380795562-10.0.3.201-50010-1395650455122) service to /10.0.3.200:9000 beginning handshake with NN
>> 14/03/24 09:31:00 FATAL datanode.DataNode: Initialization failed for
>> block pool Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id
>> DS-1380795562-10.0.3.201-50010-1395650455122) service to /10.0.3.200:9000
>> org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException):
>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>> ipcPort=50020,
>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>>     at
>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3929)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
>>     at
>> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
>>     at
>> org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
>>     at
>> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
>>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
>>     at java.security.AccessController.doPrivileged(Native Method)
>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>     at
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
>>
>>     at org.apache.hadoop.ipc.Client.call(Client.java:1347)
>>     at org.apache.hadoop.ipc.Client.call(Client.java:1300)
>>     at
>> org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
>>     at $Proxy9.registerDatanode(Unknown Source)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>     at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>     at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>     at java.lang.reflect.Method.invoke(Method.java:601)
>>     at
>> org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
>>     at
>> org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
>>     at $Proxy9.registerDatanode(Unknown Source)
>>     at
>> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.registerDatanode(DatanodeProtocolClientSideTranslatorPB.java:146)
>>     at
>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.register(BPServiceActor.java:623)
>>     at
>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:225)
>>     at
>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:664)
>>     at java.lang.Thread.run(Thread.java:722)
>> 14/03/24 09:31:00 WARN datanode.DataNode: Ending block pool service for:
>> Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id
>> DS-1380795562-10.0.3.201-50010-1395650455122) service to /10.0.3.200:9000
>> 14/03/24 09:31:00 INFO datanode.DataNode: Removed Block pool
>> BP-1489452897-10.0.3.253-1395650301038 (storage id
>> DS-1380795562-10.0.3.201-50010-1395650455122)
>> 14/03/24 09:31:00 INFO datanode.DataBlockScanner: Removed
>> bpid=BP-1489452897-10.0.3.253-1395650301038 from blockPoolScannerMap
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Removing block pool
>> BP-1489452897-10.0.3.253-1395650301038
>> 14/03/24 09:31:02 WARN datanode.DataNode: Exiting Datanode
>> 14/03/24 09:31:02 INFO util.ExitUtil: Exiting with status 0
>> 14/03/24 09:31:02 INFO datanode.DataNode: SHUTDOWN_MSG:
>> /************************************************************
>> SHUTDOWN_MSG: Shutting down DataNode at Hadoop2/10.0.3.148
>> ************************************************************/
>>
>>
>> ****************************************************************************************
>>
>>
>> And here is the corresponding error on the NameNode (10.0.3.200):
>>
>>
>> ****************************************************************************************
>> 14/03/24 09:31:00 WARN blockmanagement.DatanodeManager: Unresolved
>> datanode registration from 10.0.3.201
>> 14/03/24 09:31:00 ERROR security.UserGroupInformation:
>> PriviledgedActionException as:ubuntu (auth:SIMPLE)
>> cause:org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>> ipcPort=50020,
>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>> 14/03/24 09:31:00 INFO ipc.Server: IPC Server handler 3 on 9000, call
>> org.apache.hadoop.hdfs.server.protocol.DatanodeProtocol.registerDatanode
>> from 10.0.3.201:60951 Call#1 Retry#0: error:
>> org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>> ipcPort=50020,
>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>> org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>> ipcPort=50020,
>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>>     at
>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3929)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
>>     at
>> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
>>     at
>> org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
>>     at
>> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
>>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
>>     at java.security.AccessController.doPrivileged(Native Method)
>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>     at
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
>>
>> ****************************************************************************************
>>
>> I don't know where the 10.0.3.148 IP is coming from yet; it could be due
>> to some lxc configuration. What can be interpreted from the hadoop error
>> output?
>>
>> Let me know if you need more info about my environment to provide some
>> insights.
>>
>> Regards,
>> Vicky
>>
>>
>>
>>
>>
>
>
> --
> Jay Vyas
> http://jayunit100.blogspot.com
>

Re: Setting Hadoop on LinuxContainers Fails.

Posted by Vicky Kak <vi...@gmail.com>.
Yep, they can see each other and the outside world.
My issue seems to stem from the stale cached IP in

/var/lib/misc/dnsmasq.lxcbr0.leases
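For anyone hitting the same symptom: a stale lease shows up as two entries for the same hostname in the dnsmasq lease file (columns are expiry, MAC, IP, hostname, client-id). A minimal sketch of the check; the lease lines below are hypothetical, made up to match the IPs in this thread, and on a real host you would read /var/lib/misc/dnsmasq.lxcbr0.leases itself instead of the inline sample:

```shell
# Hypothetical sample of a dnsmasq lease file with a stale entry:
# the Hadoop2 container once held 10.0.3.148 and now holds 10.0.3.201.
LEASES='1395650000 00:16:3e:aa:bb:cc 10.0.3.148 Hadoop2 *
1395650100 00:16:3e:dd:ee:ff 10.0.3.201 Hadoop2 *'

# List every address the hostname has held; more than one line for the
# same hostname means name resolution may still return the old address.
printf '%s\n' "$LEASES" | awk '$4 == "Hadoop2" {print $3}'

# First (oldest) address on record for Hadoop2 -> 10.0.3.148
STALE=$(printf '%s\n' "$LEASES" | awk '$4 == "Hadoop2" {print $3}' | head -n1)
echo "oldest lease for Hadoop2: $STALE"
```

Removing the stale line from the real lease file and restarting the lxc dnsmasq instance makes the hostname resolve to the current container IP again.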





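A note on the NameNode side of this failure: "Unresolved datanode registration" is logged when the NameNode cannot reverse-resolve the DataNode's source IP to a hostname, which is exactly what a stale dnsmasq lease causes. Besides fixing the lease, a common workaround is to relax the check in hdfs-site.xml. This is a sketch; the property below appears in hdfs-default.xml for Hadoop 2.x, but verify it is honored by your exact version before relying on it, and prefer correct DNS or static /etc/hosts entries on the NameNode in anything beyond a test cluster:

```xml
<!-- hdfs-site.xml on the NameNode: skip the reverse-DNS check that
     rejects datanodes whose IP cannot be resolved to a hostname.
     Test-cluster workaround only; the default (true) is safer. -->
<property>
  <name>dfs.namenode.datanode.registration.ip-hostname-check</name>
  <value>false</value>
</property>
```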

On Mon, Mar 24, 2014 at 7:50 PM, Jay Vyas <ja...@gmail.com> wrote:

> are your linux containers networked properly (i.e. can they see each
> other, and the outside world, etc...)
> www.linux.org/threads/linux-containers-part-4-getting-to-the-universe-ping-google-com.4428/
>
>
> On Mon, Mar 24, 2014 at 6:02 AM, Vicky Kak <vi...@gmail.com> wrote:
>
>> Hi All,
>>
>> I am using linuxcontainer(http://linuxcontainers.org/) for configuring
>> the hadoop cluster for the testing.
>> I have create two linux application containers which are called
>> hadoop1/hadoop2. The IP's associated with the hadoop1 is 10.0.3.200 and
>> with hadoop2 is 10.0.3.201.
>>
>> I am able to start the Namenode on 10.0.3.200 but when i try to start the
>> DataNode on 10.0.3.201 I see the following error at 10.0.3.201
>>
>>
>> ****************************************************************************************
>> $ hdfs datanode
>> 14/03/24 09:30:57 INFO datanode.DataNode: STARTUP_MSG:
>> /************************************************************
>> STARTUP_MSG: Starting DataNode
>> STARTUP_MSG:   host = Hadoop2/10.0.3.148
>> STARTUP_MSG:   args = []
>> STARTUP_MSG:   version = 2.2.0
>> STARTUP_MSG:   classpath =
>> /home/ubuntu/Installed/hadoop-2.2.0/etc/hadoop:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/servlet-api-2.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-el-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-logging-1.1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/mockito-all-1.8.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/log4j-1.2.17.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jersey-server-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jsr305-1.3.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jackson-mapper-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jackson-jaxrs-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/guava-11.0.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-collections-3.2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-codec-1.4.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/paranamer-2.3.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/zookeeper-3.4.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jersey-core-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jersey-json-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jettison-1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/activation-1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jets3t-0.6.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/avro-1.7.4.jar:/h
ome/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-httpclient-3.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/xz-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-math-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jetty-util-6.1.26.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-lang-2.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-configuration-1.6.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/netty-3.6.2.Final.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/asm-3.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/junit-4.8.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-cli-1.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jsch-0.1.42.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jackson-xc-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-io-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jsp-api-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jetty-6.1.26.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/hadoop-auth-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-compress-1.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/hadoop-annotations-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-digester-1.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/sh
are/hadoop/common/lib/jackson-core-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-net-3.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/stax-api-1.0.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/xmlenc-0.52.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/hadoop-nfs-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/hadoop-common-2.2.0-tests.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/hadoop-common-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-el-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-logging-1.1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jackson-mapper-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/guava-11.0.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-lang-2.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jasper-runtime-5.5.23.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/asm
-3.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-io-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jackson-core-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/hadoop-hdfs-nfs-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/hadoop-hdfs-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/hadoop-hdfs-2.2.0-tests.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/javax.inject-1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/log4j-1.2.17.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jersey-server-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/hamcrest-core-1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/junit-4.10.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jackson-mapper-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/paranamer-2.3.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jersey-core-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/avro-1.7.4.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/aopalliance-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/xz-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/guice-3.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/ya
rn/lib/asm-3.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/commons-io-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/snappy-java-1.0.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/hadoop-annotations-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jackson-core-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-client-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-common-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-tests-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-api-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-site-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-common-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/javax.inject-1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/hamcrest-core-1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/junit-4.10.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/l
ib/jackson-mapper-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/xz-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/guice-3.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/asm-3.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/commons-io-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/hadoop-annotations-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jackson-core-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.2.0.jar:/home/ubuntu/Installed/hadoop-
2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.2.0-tests.jar:/home/ubuntu/Installed/hadoop-2.2.0/contrib/capacity-scheduler/*.jar
>> STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common -r
>> 1529768; compiled by 'hortonmu' on 2013-10-07T06:28Z
>> STARTUP_MSG:   java = 1.7.0
>> ************************************************************/
>> 14/03/24 09:30:57 INFO datanode.DataNode: registered UNIX signal handlers
>> for [TERM, HUP, INT]
>> 14/03/24 09:30:57 WARN common.Util: Path
>> /home/ubuntu/dallaybatta-data/hdfs/datanode should be specified as a URI in
>> configuration files. Please update hdfs configuration.
>> 14/03/24 09:30:58 INFO impl.MetricsConfig: loaded properties from
>> hadoop-metrics2.properties
>> 14/03/24 09:30:58 INFO impl.MetricsSystemImpl: Scheduled snapshot period
>> at 10 second(s).
>> 14/03/24 09:30:58 INFO impl.MetricsSystemImpl: DataNode metrics system
>> started
>> 14/03/24 09:30:58 INFO datanode.DataNode: Configured hostname is Hadoop2
>> 14/03/24 09:30:58 INFO datanode.DataNode: Opened streaming server at /
>> 0.0.0.0:50010
>> 14/03/24 09:30:58 INFO datanode.DataNode: Balancing bandwith is 1048576
>> bytes/s
>> 14/03/24 09:30:58 INFO mortbay.log: Logging to
>> org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via
>> org.mortbay.log.Slf4jLog
>> 14/03/24 09:30:58 INFO http.HttpServer: Added global filter 'safety'
>> (class=org.apache.hadoop.http.HttpServer$QuotingInputFilter)
>> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>> context datanode
>> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>> context logs
>> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>> context static
>> 14/03/24 09:30:58 INFO datanode.DataNode: Opened info server at
>> localhost:50075
>> 14/03/24 09:30:58 INFO datanode.DataNode: dfs.webhdfs.enabled = false
>> 14/03/24 09:30:58 INFO http.HttpServer: Jetty bound to port 50075
>> 14/03/24 09:30:58 INFO mortbay.log: jetty-6.1.26
>> 14/03/24 09:30:59 INFO mortbay.log: Started
>> SelectChannelConnector@localhost:50075
>> 14/03/24 09:30:59 INFO ipc.Server: Starting Socket Reader #1 for port
>> 50020
>> 14/03/24 09:30:59 INFO datanode.DataNode: Opened IPC server at /
>> 0.0.0.0:50020
>> 14/03/24 09:30:59 INFO datanode.DataNode: Refresh request received for
>> nameservices: null
>> 14/03/24 09:30:59 INFO datanode.DataNode: Starting BPOfferServices for
>> nameservices: <default>
>> 14/03/24 09:30:59 WARN common.Util: Path
>> /home/ubuntu/dallaybatta-data/hdfs/datanode should be specified as a URI in
>> configuration files. Please update hdfs configuration.
>> 14/03/24 09:30:59 INFO datanode.DataNode: Block pool <registering>
>> (storage id unknown) service to /10.0.3.200:9000 starting to offer
>> service
>> 14/03/24 09:30:59 INFO ipc.Server: IPC Server Responder: starting
>> 14/03/24 09:30:59 INFO ipc.Server: IPC Server listener on 50020: starting
>> 14/03/24 09:30:59 INFO common.Storage: Lock on
>> /home/ubuntu/dallaybatta-data/hdfs/datanode/in_use.lock acquired by
>> nodename 2618@Hadoop2
>> 14/03/24 09:31:00 INFO common.Storage: Locking is disabled
>> 14/03/24 09:31:00 INFO datanode.DataNode: Setting up storage:
>> nsid=1367523242;bpid=BP-1489452897-10.0.3.253-1395650301038;lv=-47;nsInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0;bpid=BP-1489452897-10.0.3.253-1395650301038
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Added volume -
>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Registered FSDatasetState MBean
>> 14/03/24 09:31:00 INFO datanode.DirectoryScanner: Periodic Directory Tree
>> Verification scan starting at 1395674259100 with interval 21600000
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Adding block pool
>> BP-1489452897-10.0.3.253-1395650301038
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Scanning block pool
>> BP-1489452897-10.0.3.253-1395650301038 on volume
>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current...
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Time taken to scan block pool
>> BP-1489452897-10.0.3.253-1395650301038 on
>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current: 11ms
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Total time to scan all
>> replicas for block pool BP-1489452897-10.0.3.253-1395650301038: 13ms
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Adding replicas to map for
>> block pool BP-1489452897-10.0.3.253-1395650301038 on volume
>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current...
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Time to add replicas to map
>> for block pool BP-1489452897-10.0.3.253-1395650301038 on volume
>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current: 0ms
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Total time to add all replicas
>> to map: 1ms
>> 14/03/24 09:31:00 INFO datanode.DataNode: Block pool
>> BP-1489452897-10.0.3.253-1395650301038 (storage id
>> DS-1380795562-10.0.3.201-50010-1395650455122) service to /10.0.3.200:9000 beginning handshake with NN
>> 14/03/24 09:31:00 FATAL datanode.DataNode: Initialization failed for
>> block pool Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id
>> DS-1380795562-10.0.3.201-50010-1395650455122) service to /10.0.3.200:9000
>> org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException):
>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>> ipcPort=50020,
>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>>     at
>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3929)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
>>     at
>> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
>>     at
>> org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
>>     at
>> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
>>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
>>     at java.security.AccessController.doPrivileged(Native Method)
>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>     at
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
>>
>>     at org.apache.hadoop.ipc.Client.call(Client.java:1347)
>>     at org.apache.hadoop.ipc.Client.call(Client.java:1300)
>>     at
>> org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
>>     at $Proxy9.registerDatanode(Unknown Source)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>     at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>     at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>     at java.lang.reflect.Method.invoke(Method.java:601)
>>     at
>> org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
>>     at
>> org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
>>     at $Proxy9.registerDatanode(Unknown Source)
>>     at
>> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.registerDatanode(DatanodeProtocolClientSideTranslatorPB.java:146)
>>     at
>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.register(BPServiceActor.java:623)
>>     at
>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:225)
>>     at
>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:664)
>>     at java.lang.Thread.run(Thread.java:722)
>> 14/03/24 09:31:00 WARN datanode.DataNode: Ending block pool service for:
>> Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id
>> DS-1380795562-10.0.3.201-50010-1395650455122) service to /10.0.3.200:9000
>> 14/03/24 09:31:00 INFO datanode.DataNode: Removed Block pool
>> BP-1489452897-10.0.3.253-1395650301038 (storage id
>> DS-1380795562-10.0.3.201-50010-1395650455122)
>> 14/03/24 09:31:00 INFO datanode.DataBlockScanner: Removed
>> bpid=BP-1489452897-10.0.3.253-1395650301038 from blockPoolScannerMap
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Removing block pool
>> BP-1489452897-10.0.3.253-1395650301038
>> 14/03/24 09:31:02 WARN datanode.DataNode: Exiting Datanode
>> 14/03/24 09:31:02 INFO util.ExitUtil: Exiting with status 0
>> 14/03/24 09:31:02 INFO datanode.DataNode: SHUTDOWN_MSG:
>> /************************************************************
>> SHUTDOWN_MSG: Shutting down DataNode at Hadoop2/10.0.3.148
>> ************************************************************/
>>
>>
>> ****************************************************************************************
>>
>>
>> And here is the corresponding error coming at the NameNode (10.0.3.200):
>>
>>
>> ****************************************************************************************
>> 14/03/24 09:31:00 WARN blockmanagement.DatanodeManager: Unresolved
>> datanode registration from 10.0.3.201
>> 14/03/24 09:31:00 ERROR security.UserGroupInformation:
>> PriviledgedActionException as:ubuntu (auth:SIMPLE)
>> cause:org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>> ipcPort=50020,
>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>> 14/03/24 09:31:00 INFO ipc.Server: IPC Server handler 3 on 9000, call
>> org.apache.hadoop.hdfs.server.protocol.DatanodeProtocol.registerDatanode
>> from 10.0.3.201:60951 Call#1 Retry#0: error:
>> org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>> ipcPort=50020,
>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>> org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>> ipcPort=50020,
>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>>     at
>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3929)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
>>     at
>> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
>>     at
>> org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
>>     at
>> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
>>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
>>     at java.security.AccessController.doPrivileged(Native Method)
>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>     at
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
>>
>> ****************************************************************************************
>>
>> I don't know where the 10.0.3.148 IP is coming from yet; it could be
>> due to some lxc configuration. What can be interpreted from the Hadoop
>> error information?
>>
>> Let me know if you need more info about my environment to provide some
>> insights.
>>
>> Regards,
>> Vicky
>>
>>
>>
>>
>>
>
>
> --
> Jay Vyas
> http://jayunit100.blogspot.com
>
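A note on the DisallowedDatanodeException quoted above: together with the NameNode's "Unresolved datanode registration from 10.0.3.201" warning and the DatanodeRegistration(0.0.0.0, ...) address, it usually indicates that the NameNode could not resolve the registering DataNode's address back to a known host. A minimal check, assuming the container names and addresses from this thread (adjust to your setup):

```shell
# Run on the NameNode host (10.0.3.200). If the DataNode's IP has no
# name mapping, registration can be refused with
# "Unresolved datanode registration".
getent hosts 10.0.3.201 || echo "no mapping for 10.0.3.201"

# A common workaround is pinning both containers in /etc/hosts on
# every node (hostnames here are the ones used in this thread):
#   10.0.3.200  hadoop1
#   10.0.3.201  hadoop2
```

This is a diagnostic sketch, not a confirmed fix for this exact setup; with LXC the mapping is often supposed to come from the bridge's dnsmasq instance.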

Re: Setting Hadoop on LinuxContainers Fails.

Posted by Vicky Kak <vi...@gmail.com>.
Yep, they can see each other and the outside world.
My issue seems to be coming from the cached IP in

/var/lib/misc/dnsmasq.lxcbr0.leases
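For anyone hitting the same thing: the lease file holds one lease per line in the form `<expiry> <MAC> <IP> <hostname> <client-id>`, so a stale hostname-to-IP mapping is easy to spot. A minimal sketch (the lease line below is hypothetical; the commented remediation commands assume the stock Ubuntu lxc setup with the lxc-net service):

```shell
# Each line of /var/lib/misc/dnsmasq.lxcbr0.leases looks like:
#   <expiry> <MAC> <IP> <hostname> <client-id>
# Parse a sample (hypothetical) lease line to see which IP dnsmasq
# still remembers for a container hostname:
lease='1395650000 00:16:3e:aa:bb:cc 10.0.3.148 Hadoop2 *'
set -- $lease
echo "hostname=$4 cached_ip=$3"   # prints: hostname=Hadoop2 cached_ip=10.0.3.148

# If the cached IP is stale, stopping the containers and the lxc
# dnsmasq instance, deleting the lease file, and restarting forces
# fresh leases (Ubuntu lxc defaults assumed):
#   sudo service lxc-net stop
#   sudo rm /var/lib/misc/dnsmasq.lxcbr0.leases
#   sudo service lxc-net start
```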






On Mon, Mar 24, 2014 at 7:50 PM, Jay Vyas <ja...@gmail.com> wrote:

> are your linux containers networked properly (i.e. can they see each
> other, and the outside world, etc...)
> www.linux.org/threads/linux-containers-part-4-getting-to-the-universe-ping-google-com.4428/
>
>
> On Mon, Mar 24, 2014 at 6:02 AM, Vicky Kak <vi...@gmail.com> wrote:
>
>> Hi All,
>>
>> I am using linuxcontainer(http://linuxcontainers.org/) for configuring
>> the hadoop cluster for the testing.
>> I have create two linux application containers which are called
>> hadoop1/hadoop2. The IP's associated with the hadoop1 is 10.0.3.200 and
>> with hadoop2 is 10.0.3.201.
>>
>> I am able to start the Namenode on 10.0.3.200 but when i try to start the
>> DataNode on 10.0.3.201 I see the following error at 10.0.3.201
>>
>>
>> ****************************************************************************************
>> $ hdfs datanode
>> 14/03/24 09:30:57 INFO datanode.DataNode: STARTUP_MSG:
>> /************************************************************
>> STARTUP_MSG: Starting DataNode
>> STARTUP_MSG:   host = Hadoop2/10.0.3.148
>> STARTUP_MSG:   args = []
>> STARTUP_MSG:   version = 2.2.0
>> STARTUP_MSG:   classpath = [...]
>> STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common -r
>> 1529768; compiled by 'hortonmu' on 2013-10-07T06:28Z
>> STARTUP_MSG:   java = 1.7.0
>> ************************************************************/
>> 14/03/24 09:30:57 INFO datanode.DataNode: registered UNIX signal handlers
>> for [TERM, HUP, INT]
>> 14/03/24 09:30:57 WARN common.Util: Path
>> /home/ubuntu/dallaybatta-data/hdfs/datanode should be specified as a URI in
>> configuration files. Please update hdfs configuration.
>> 14/03/24 09:30:58 INFO impl.MetricsConfig: loaded properties from
>> hadoop-metrics2.properties
>> 14/03/24 09:30:58 INFO impl.MetricsSystemImpl: Scheduled snapshot period
>> at 10 second(s).
>> 14/03/24 09:30:58 INFO impl.MetricsSystemImpl: DataNode metrics system
>> started
>> 14/03/24 09:30:58 INFO datanode.DataNode: Configured hostname is Hadoop2
>> 14/03/24 09:30:58 INFO datanode.DataNode: Opened streaming server at /
>> 0.0.0.0:50010
>> 14/03/24 09:30:58 INFO datanode.DataNode: Balancing bandwith is 1048576
>> bytes/s
>> 14/03/24 09:30:58 INFO mortbay.log: Logging to
>> org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via
>> org.mortbay.log.Slf4jLog
>> 14/03/24 09:30:58 INFO http.HttpServer: Added global filter 'safety'
>> (class=org.apache.hadoop.http.HttpServer$QuotingInputFilter)
>> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>> context datanode
>> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>> context logs
>> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>> context static
>> 14/03/24 09:30:58 INFO datanode.DataNode: Opened info server at
>> localhost:50075
>> 14/03/24 09:30:58 INFO datanode.DataNode: dfs.webhdfs.enabled = false
>> 14/03/24 09:30:58 INFO http.HttpServer: Jetty bound to port 50075
>> 14/03/24 09:30:58 INFO mortbay.log: jetty-6.1.26
>> 14/03/24 09:30:59 INFO mortbay.log: Started
>> SelectChannelConnector@localhost:50075
>> 14/03/24 09:30:59 INFO ipc.Server: Starting Socket Reader #1 for port
>> 50020
>> 14/03/24 09:30:59 INFO datanode.DataNode: Opened IPC server at /
>> 0.0.0.0:50020
>> 14/03/24 09:30:59 INFO datanode.DataNode: Refresh request received for
>> nameservices: null
>> 14/03/24 09:30:59 INFO datanode.DataNode: Starting BPOfferServices for
>> nameservices: <default>
>> 14/03/24 09:30:59 WARN common.Util: Path
>> /home/ubuntu/dallaybatta-data/hdfs/datanode should be specified as a URI in
>> configuration files. Please update hdfs configuration.
>> 14/03/24 09:30:59 INFO datanode.DataNode: Block pool <registering>
>> (storage id unknown) service to /10.0.3.200:9000 starting to offer
>> service
>> 14/03/24 09:30:59 INFO ipc.Server: IPC Server Responder: starting
>> 14/03/24 09:30:59 INFO ipc.Server: IPC Server listener on 50020: starting
>> 14/03/24 09:30:59 INFO common.Storage: Lock on
>> /home/ubuntu/dallaybatta-data/hdfs/datanode/in_use.lock acquired by
>> nodename 2618@Hadoop2
>> 14/03/24 09:31:00 INFO common.Storage: Locking is disabled
>> 14/03/24 09:31:00 INFO datanode.DataNode: Setting up storage:
>> nsid=1367523242;bpid=BP-1489452897-10.0.3.253-1395650301038;lv=-47;nsInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0;bpid=BP-1489452897-10.0.3.253-1395650301038
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Added volume -
>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Registered FSDatasetState MBean
>> 14/03/24 09:31:00 INFO datanode.DirectoryScanner: Periodic Directory Tree
>> Verification scan starting at 1395674259100 with interval 21600000
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Adding block pool
>> BP-1489452897-10.0.3.253-1395650301038
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Scanning block pool
>> BP-1489452897-10.0.3.253-1395650301038 on volume
>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current...
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Time taken to scan block pool
>> BP-1489452897-10.0.3.253-1395650301038 on
>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current: 11ms
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Total time to scan all
>> replicas for block pool BP-1489452897-10.0.3.253-1395650301038: 13ms
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Adding replicas to map for
>> block pool BP-1489452897-10.0.3.253-1395650301038 on volume
>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current...
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Time to add replicas to map
>> for block pool BP-1489452897-10.0.3.253-1395650301038 on volume
>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current: 0ms
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Total time to add all replicas
>> to map: 1ms
>> 14/03/24 09:31:00 INFO datanode.DataNode: Block pool
>> BP-1489452897-10.0.3.253-1395650301038 (storage id
>> DS-1380795562-10.0.3.201-50010-1395650455122) service to /10.0.3.200:9000beginning handshake with NN
>> 14/03/24 09:31:00 FATAL datanode.DataNode: Initialization failed for
>> block pool Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id
>> DS-1380795562-10.0.3.201-50010-1395650455122) service to /10.0.3.200:9000
>> org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException):
>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>> ipcPort=50020,
>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>>     at
>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3929)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
>>     at
>> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
>>     at
>> org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
>>     at
>> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
>>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
>>     at java.security.AccessController.doPrivileged(Native Method)
>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>     at
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
>>
>>     at org.apache.hadoop.ipc.Client.call(Client.java:1347)
>>     at org.apache.hadoop.ipc.Client.call(Client.java:1300)
>>     at
>> org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
>>     at $Proxy9.registerDatanode(Unknown Source)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>     at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>     at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>     at java.lang.reflect.Method.invoke(Method.java:601)
>>     at
>> org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
>>     at
>> org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
>>     at $Proxy9.registerDatanode(Unknown Source)
>>     at
>> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.registerDatanode(DatanodeProtocolClientSideTranslatorPB.java:146)
>>     at
>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.register(BPServiceActor.java:623)
>>     at
>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:225)
>>     at
>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:664)
>>     at java.lang.Thread.run(Thread.java:722)
>> 14/03/24 09:31:00 WARN datanode.DataNode: Ending block pool service for:
>> Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id
>> DS-1380795562-10.0.3.201-50010-1395650455122) service to /10.0.3.200:9000
>> 14/03/24 09:31:00 INFO datanode.DataNode: Removed Block pool
>> BP-1489452897-10.0.3.253-1395650301038 (storage id
>> DS-1380795562-10.0.3.201-50010-1395650455122)
>> 14/03/24 09:31:00 INFO datanode.DataBlockScanner: Removed
>> bpid=BP-1489452897-10.0.3.253-1395650301038 from blockPoolScannerMap
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Removing block pool
>> BP-1489452897-10.0.3.253-1395650301038
>> 14/03/24 09:31:02 WARN datanode.DataNode: Exiting Datanode
>> 14/03/24 09:31:02 INFO util.ExitUtil: Exiting with status 0
>> 14/03/24 09:31:02 INFO datanode.DataNode: SHUTDOWN_MSG:
>> /************************************************************
>> SHUTDOWN_MSG: Shutting down DataNode at Hadoop2/10.0.3.148
>> ************************************************************/
>>
>>
>> ****************************************************************************************
>>
>>
>> And here is the corresponding error at the NameNode (10.0.3.200):
>>
>>
>> ****************************************************************************************
>> 14/03/24 09:31:00 WARN blockmanagement.DatanodeManager: Unresolved
>> datanode registration from 10.0.3.201
>> 14/03/24 09:31:00 ERROR security.UserGroupInformation:
>> PriviledgedActionException as:ubuntu (auth:SIMPLE)
>> cause:org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>> ipcPort=50020,
>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>> 14/03/24 09:31:00 INFO ipc.Server: IPC Server handler 3 on 9000, call
>> org.apache.hadoop.hdfs.server.protocol.DatanodeProtocol.registerDatanode
>> from 10.0.3.201:60951 Call#1 Retry#0: error:
>> org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>> ipcPort=50020,
>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>> org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>> ipcPort=50020,
>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>>     at
>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3929)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
>>     at
>> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
>>     at
>> org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
>>     at
>> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
>>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
>>     at java.security.AccessController.doPrivileged(Native Method)
>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>     at
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
>>
>> ****************************************************************************************
>>
>> I don't know yet where the 10.0.3.148 IP is coming from; it could be due
>> to some LXC configuration. What can be interpreted from the Hadoop error
>> output?
>>
>> Let me know if you need more info about my environment to provide some
>> insights.
>>
>> Regards,
>> Vicky
>>
>>
>>
>>
>>
>
>
> --
> Jay Vyas
> http://jayunit100.blogspot.com
>

Re: Setting Hadoop on LinuxContainers Fails.

Posted by Vicky Kak <vi...@gmail.com>.
Yep, they can see each other and the outside world.
My issue seems to stem from a stale cached IP in

/var/lib/misc/dnsmasq.lxcbr0.leases
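
For anyone hitting the same thing, here is a minimal sketch of inspecting that lease cache for a stale hostname-to-IP mapping. The leases path is the one above; the lease-line layout is standard dnsmasq, but the service name in the trailing comment varies by distro and is an assumption:

```shell
#!/bin/sh
# Sketch: list hostname -> IP pairs from the dnsmasq lease cache that
# lxcbr0 uses, to spot a stale mapping such as Hadoop2 -> 10.0.3.148.
# dnsmasq lease lines look like:
#   <expiry-epoch> <mac> <ip> <hostname> <client-id>
LEASES="${LEASES:-/var/lib/misc/dnsmasq.lxcbr0.leases}"

show_leases() {
    # Guard against a missing/unreadable file so the sketch is safe to run.
    [ -r "$LEASES" ] && awk '{print $4, $3}' "$LEASES"
}

show_leases || echo "no readable leases file at $LEASES"

# To drop a stale entry you would typically stop the containers, stop the
# lxc-net/dnsmasq service, delete the leases file, then restart, e.g.
# (the service name is an assumption -- check your distro):
#   sudo service lxc-net stop && sudo rm "$LEASES" && sudo service lxc-net start
```

After clearing the stale lease and restarting the containers, the DataNode should register with its current IP instead of the cached one.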






On Mon, Mar 24, 2014 at 7:50 PM, Jay Vyas <ja...@gmail.com> wrote:

> Are your Linux containers networked properly (i.e. can they see each
> other, and the outside world, etc.)?
> www.linux.org/threads/linux-containers-part-4-getting-to-the-universe-ping-google-com.4428/
>
>
> On Mon, Mar 24, 2014 at 6:02 AM, Vicky Kak <vi...@gmail.com> wrote:
>
>> Hi All,
>>
>> I am using Linux containers (http://linuxcontainers.org/) to configure a
>> Hadoop cluster for testing.
>> I have created two Linux application containers, called hadoop1 and
>> hadoop2. The IP associated with hadoop1 is 10.0.3.200 and with hadoop2 is
>> 10.0.3.201.
>>
>> I am able to start the NameNode on 10.0.3.200, but when I try to start the
>> DataNode on 10.0.3.201, I see the following error on 10.0.3.201:
>>
>>
>> ****************************************************************************************
>> $ hdfs datanode
>> 14/03/24 09:30:57 INFO datanode.DataNode: STARTUP_MSG:
>> /************************************************************
>> STARTUP_MSG: Starting DataNode
>> STARTUP_MSG:   host = Hadoop2/10.0.3.148
>> STARTUP_MSG:   args = []
>> STARTUP_MSG:   version = 2.2.0
>> STARTUP_MSG:   classpath =
>> [long Hadoop 2.2.0 classpath omitted; identical to the listing in the first message]
>> STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common -r
>> 1529768; compiled by 'hortonmu' on 2013-10-07T06:28Z
>> STARTUP_MSG:   java = 1.7.0
>> ************************************************************/
>> 14/03/24 09:30:57 INFO datanode.DataNode: registered UNIX signal handlers
>> for [TERM, HUP, INT]
>> 14/03/24 09:30:57 WARN common.Util: Path
>> /home/ubuntu/dallaybatta-data/hdfs/datanode should be specified as a URI in
>> configuration files. Please update hdfs configuration.
>> 14/03/24 09:30:58 INFO impl.MetricsConfig: loaded properties from
>> hadoop-metrics2.properties
>> 14/03/24 09:30:58 INFO impl.MetricsSystemImpl: Scheduled snapshot period
>> at 10 second(s).
>> 14/03/24 09:30:58 INFO impl.MetricsSystemImpl: DataNode metrics system
>> started
>> 14/03/24 09:30:58 INFO datanode.DataNode: Configured hostname is Hadoop2
>> 14/03/24 09:30:58 INFO datanode.DataNode: Opened streaming server at /
>> 0.0.0.0:50010
>> 14/03/24 09:30:58 INFO datanode.DataNode: Balancing bandwith is 1048576
>> bytes/s
>> 14/03/24 09:30:58 INFO mortbay.log: Logging to
>> org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via
>> org.mortbay.log.Slf4jLog
>> 14/03/24 09:30:58 INFO http.HttpServer: Added global filter 'safety'
>> (class=org.apache.hadoop.http.HttpServer$QuotingInputFilter)
>> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>> context datanode
>> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>> context logs
>> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
>> context static
>> 14/03/24 09:30:58 INFO datanode.DataNode: Opened info server at
>> localhost:50075
>> 14/03/24 09:30:58 INFO datanode.DataNode: dfs.webhdfs.enabled = false
>> 14/03/24 09:30:58 INFO http.HttpServer: Jetty bound to port 50075
>> 14/03/24 09:30:58 INFO mortbay.log: jetty-6.1.26
>> 14/03/24 09:30:59 INFO mortbay.log: Started
>> SelectChannelConnector@localhost:50075
>> 14/03/24 09:30:59 INFO ipc.Server: Starting Socket Reader #1 for port
>> 50020
>> 14/03/24 09:30:59 INFO datanode.DataNode: Opened IPC server at /
>> 0.0.0.0:50020
>> 14/03/24 09:30:59 INFO datanode.DataNode: Refresh request received for
>> nameservices: null
>> 14/03/24 09:30:59 INFO datanode.DataNode: Starting BPOfferServices for
>> nameservices: <default>
>> 14/03/24 09:30:59 WARN common.Util: Path
>> /home/ubuntu/dallaybatta-data/hdfs/datanode should be specified as a URI in
>> configuration files. Please update hdfs configuration.
>> 14/03/24 09:30:59 INFO datanode.DataNode: Block pool <registering>
>> (storage id unknown) service to /10.0.3.200:9000 starting to offer
>> service
>> 14/03/24 09:30:59 INFO ipc.Server: IPC Server Responder: starting
>> 14/03/24 09:30:59 INFO ipc.Server: IPC Server listener on 50020: starting
>> 14/03/24 09:30:59 INFO common.Storage: Lock on
>> /home/ubuntu/dallaybatta-data/hdfs/datanode/in_use.lock acquired by
>> nodename 2618@Hadoop2
>> 14/03/24 09:31:00 INFO common.Storage: Locking is disabled
>> 14/03/24 09:31:00 INFO datanode.DataNode: Setting up storage:
>> nsid=1367523242;bpid=BP-1489452897-10.0.3.253-1395650301038;lv=-47;nsInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0;bpid=BP-1489452897-10.0.3.253-1395650301038
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Added volume -
>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Registered FSDatasetState MBean
>> 14/03/24 09:31:00 INFO datanode.DirectoryScanner: Periodic Directory Tree
>> Verification scan starting at 1395674259100 with interval 21600000
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Adding block pool
>> BP-1489452897-10.0.3.253-1395650301038
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Scanning block pool
>> BP-1489452897-10.0.3.253-1395650301038 on volume
>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current...
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Time taken to scan block pool
>> BP-1489452897-10.0.3.253-1395650301038 on
>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current: 11ms
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Total time to scan all
>> replicas for block pool BP-1489452897-10.0.3.253-1395650301038: 13ms
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Adding replicas to map for
>> block pool BP-1489452897-10.0.3.253-1395650301038 on volume
>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current...
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Time to add replicas to map
>> for block pool BP-1489452897-10.0.3.253-1395650301038 on volume
>> /home/ubuntu/dallaybatta-data/hdfs/datanode/current: 0ms
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Total time to add all replicas
>> to map: 1ms
>> 14/03/24 09:31:00 INFO datanode.DataNode: Block pool
>> BP-1489452897-10.0.3.253-1395650301038 (storage id
>> DS-1380795562-10.0.3.201-50010-1395650455122) service to /10.0.3.200:9000
>> beginning handshake with NN
>> 14/03/24 09:31:00 FATAL datanode.DataNode: Initialization failed for
>> block pool Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id
>> DS-1380795562-10.0.3.201-50010-1395650455122) service to /10.0.3.200:9000
>> org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException):
>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>> ipcPort=50020,
>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>>     at
>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3929)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
>>     at
>> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
>>     at
>> org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
>>     at
>> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
>>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
>>     at java.security.AccessController.doPrivileged(Native Method)
>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>     at
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
>>
>>     at org.apache.hadoop.ipc.Client.call(Client.java:1347)
>>     at org.apache.hadoop.ipc.Client.call(Client.java:1300)
>>     at
>> org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
>>     at $Proxy9.registerDatanode(Unknown Source)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>     at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>     at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>     at java.lang.reflect.Method.invoke(Method.java:601)
>>     at
>> org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
>>     at
>> org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
>>     at $Proxy9.registerDatanode(Unknown Source)
>>     at
>> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.registerDatanode(DatanodeProtocolClientSideTranslatorPB.java:146)
>>     at
>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.register(BPServiceActor.java:623)
>>     at
>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:225)
>>     at
>> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:664)
>>     at java.lang.Thread.run(Thread.java:722)
>> 14/03/24 09:31:00 WARN datanode.DataNode: Ending block pool service for:
>> Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id
>> DS-1380795562-10.0.3.201-50010-1395650455122) service to /10.0.3.200:9000
>> 14/03/24 09:31:00 INFO datanode.DataNode: Removed Block pool
>> BP-1489452897-10.0.3.253-1395650301038 (storage id
>> DS-1380795562-10.0.3.201-50010-1395650455122)
>> 14/03/24 09:31:00 INFO datanode.DataBlockScanner: Removed
>> bpid=BP-1489452897-10.0.3.253-1395650301038 from blockPoolScannerMap
>> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Removing block pool
>> BP-1489452897-10.0.3.253-1395650301038
>> 14/03/24 09:31:02 WARN datanode.DataNode: Exiting Datanode
>> 14/03/24 09:31:02 INFO util.ExitUtil: Exiting with status 0
>> 14/03/24 09:31:02 INFO datanode.DataNode: SHUTDOWN_MSG:
>> /************************************************************
>> SHUTDOWN_MSG: Shutting down DataNode at Hadoop2/10.0.3.148
>> ************************************************************/
>>
>>
>> ****************************************************************************************
>>
>>
>> And here is the corresponding error at the NameNode (10.0.3.200):
>>
>>
>> ****************************************************************************************
>> 14/03/24 09:31:00 WARN blockmanagement.DatanodeManager: Unresolved
>> datanode registration from 10.0.3.201
>> 14/03/24 09:31:00 ERROR security.UserGroupInformation:
>> PriviledgedActionException as:ubuntu (auth:SIMPLE)
>> cause:org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>> ipcPort=50020,
>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>> 14/03/24 09:31:00 INFO ipc.Server: IPC Server handler 3 on 9000, call
>> org.apache.hadoop.hdfs.server.protocol.DatanodeProtocol.registerDatanode
>> from 10.0.3.201:60951 Call#1 Retry#0: error:
>> org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>> ipcPort=50020,
>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>> org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
>> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
>> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
>> ipcPort=50020,
>> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>>     at
>> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3929)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
>>     at
>> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
>>     at
>> org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
>>     at
>> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
>>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
>>     at java.security.AccessController.doPrivileged(Native Method)
>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>     at
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
>>
>> ****************************************************************************************
>>
>> I don't know yet where the 10.0.3.148 IP is coming from; it could be due
>> to some LXC configuration. What can be interpreted from the Hadoop error
>> output?
>>
>> Let me know if you need more info about my environment to provide some
>> insights.
>>
>> Regards,
>> Vicky
>>
>>
>>
>>
>>
>
>
> --
> Jay Vyas
> http://jayunit100.blogspot.com
>

Re: Setting Hadoop on LinuxContainers Fails.

Posted by Jay Vyas <ja...@gmail.com>.
Are your Linux containers networked properly (i.e. can they see each other,
and the outside world, etc.)?
www.linux.org/threads/linux-containers-part-4-getting-to-the-universe-ping-google-com.4428/
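
As a concrete version of that check, here is a minimal sketch to run inside each container. The hadoop1/hadoop2 names and 10.0.3.x addresses mirror the thread and should be adjusted for your setup; a mismatch here is consistent with the "Unresolved datanode registration" error in the logs:

```shell
#!/bin/sh
# Verify that each container hostname resolves to the IP the cluster
# expects. A stale or wrong mapping shows up as a MISMATCH line.
check() {
    host="$1"; expected="$2"
    # getent consults the same resolver path the JVM uses
    # (/etc/hosts first, then DNS/dnsmasq); ahostsv4 limits it to IPv4.
    actual="$(getent ahostsv4 "$host" | awk '{print $1; exit}')"
    if [ "$actual" = "$expected" ]; then
        echo "OK: $host -> $actual"
    else
        echo "MISMATCH: $host resolves to '$actual', expected $expected"
    fi
}

# Hostnames and IPs taken from the thread; adjust for your containers.
check hadoop1 10.0.3.200
check hadoop2 10.0.3.201
```

If a name resolves to an unexpected address, pinning the correct mappings in each container's /etc/hosts is a simple workaround while the DHCP/dnsmasq side is sorted out.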


On Mon, Mar 24, 2014 at 6:02 AM, Vicky Kak <vi...@gmail.com> wrote:

> Hi All,
>
> I am using Linux containers (http://linuxcontainers.org/) to configure a
> Hadoop cluster for testing.
> I have created two Linux application containers, called hadoop1 and
> hadoop2. The IP associated with hadoop1 is 10.0.3.200 and with hadoop2 is
> 10.0.3.201.
>
> I am able to start the NameNode on 10.0.3.200, but when I try to start the
> DataNode on 10.0.3.201, I see the following error on 10.0.3.201:
>
>
> ****************************************************************************************
> $ hdfs datanode
> 14/03/24 09:30:57 INFO datanode.DataNode: STARTUP_MSG:
> /************************************************************
> STARTUP_MSG: Starting DataNode
> STARTUP_MSG:   host = Hadoop2/10.0.3.148
> STARTUP_MSG:   args = []
> STARTUP_MSG:   version = 2.2.0
> STARTUP_MSG:   classpath =
> [long Hadoop 2.2.0 classpath omitted; identical to the listing in the first message]
re/hadoop/common/lib/jackson-core-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-net-3.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/stax-api-1.0.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/xmlenc-0.52.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/hadoop-nfs-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/hadoop-common-2.2.0-tests.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/hadoop-common-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-el-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-logging-1.1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jackson-mapper-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/guava-11.0.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-lang-2.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jasper-runtime-5.5.23.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/asm-
3.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-io-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jackson-core-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/hadoop-hdfs-nfs-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/hadoop-hdfs-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/hadoop-hdfs-2.2.0-tests.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/javax.inject-1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/log4j-1.2.17.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jersey-server-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/hamcrest-core-1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/junit-4.10.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jackson-mapper-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/paranamer-2.3.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jersey-core-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/avro-1.7.4.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/aopalliance-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/xz-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/guice-3.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yar
n/lib/asm-3.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/commons-io-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/snappy-java-1.0.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/hadoop-annotations-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jackson-core-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-client-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-common-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-tests-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-api-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-site-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-common-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/javax.inject-1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/hamcrest-core-1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/junit-4.10.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/li
b/jackson-mapper-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/xz-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/guice-3.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/asm-3.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/commons-io-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/hadoop-annotations-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jackson-core-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.2.0.jar:/home/ubuntu/Installed/hadoop-2
.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.2.0-tests.jar:/home/ubuntu/Installed/hadoop-2.2.0/contrib/capacity-scheduler/*.jar
> STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common -r
> 1529768; compiled by 'hortonmu' on 2013-10-07T06:28Z
> STARTUP_MSG:   java = 1.7.0
> ************************************************************/
> 14/03/24 09:30:57 INFO datanode.DataNode: registered UNIX signal handlers
> for [TERM, HUP, INT]
> 14/03/24 09:30:57 WARN common.Util: Path
> /home/ubuntu/dallaybatta-data/hdfs/datanode should be specified as a URI in
> configuration files. Please update hdfs configuration.
> 14/03/24 09:30:58 INFO impl.MetricsConfig: loaded properties from
> hadoop-metrics2.properties
> 14/03/24 09:30:58 INFO impl.MetricsSystemImpl: Scheduled snapshot period
> at 10 second(s).
> 14/03/24 09:30:58 INFO impl.MetricsSystemImpl: DataNode metrics system
> started
> 14/03/24 09:30:58 INFO datanode.DataNode: Configured hostname is Hadoop2
> 14/03/24 09:30:58 INFO datanode.DataNode: Opened streaming server at /
> 0.0.0.0:50010
> 14/03/24 09:30:58 INFO datanode.DataNode: Balancing bandwith is 1048576
> bytes/s
> 14/03/24 09:30:58 INFO mortbay.log: Logging to
> org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via
> org.mortbay.log.Slf4jLog
> 14/03/24 09:30:58 INFO http.HttpServer: Added global filter 'safety'
> (class=org.apache.hadoop.http.HttpServer$QuotingInputFilter)
> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
> context datanode
> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
> context logs
> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
> context static
> 14/03/24 09:30:58 INFO datanode.DataNode: Opened info server at
> localhost:50075
> 14/03/24 09:30:58 INFO datanode.DataNode: dfs.webhdfs.enabled = false
> 14/03/24 09:30:58 INFO http.HttpServer: Jetty bound to port 50075
> 14/03/24 09:30:58 INFO mortbay.log: jetty-6.1.26
> 14/03/24 09:30:59 INFO mortbay.log: Started
> SelectChannelConnector@localhost:50075
> 14/03/24 09:30:59 INFO ipc.Server: Starting Socket Reader #1 for port 50020
> 14/03/24 09:30:59 INFO datanode.DataNode: Opened IPC server at /
> 0.0.0.0:50020
> 14/03/24 09:30:59 INFO datanode.DataNode: Refresh request received for
> nameservices: null
> 14/03/24 09:30:59 INFO datanode.DataNode: Starting BPOfferServices for
> nameservices: <default>
> 14/03/24 09:30:59 WARN common.Util: Path
> /home/ubuntu/dallaybatta-data/hdfs/datanode should be specified as a URI in
> configuration files. Please update hdfs configuration.
> 14/03/24 09:30:59 INFO datanode.DataNode: Block pool <registering>
> (storage id unknown) service to /10.0.3.200:9000 starting to offer service
> 14/03/24 09:30:59 INFO ipc.Server: IPC Server Responder: starting
> 14/03/24 09:30:59 INFO ipc.Server: IPC Server listener on 50020: starting
> 14/03/24 09:30:59 INFO common.Storage: Lock on
> /home/ubuntu/dallaybatta-data/hdfs/datanode/in_use.lock acquired by
> nodename 2618@Hadoop2
> 14/03/24 09:31:00 INFO common.Storage: Locking is disabled
> 14/03/24 09:31:00 INFO datanode.DataNode: Setting up storage:
> nsid=1367523242;bpid=BP-1489452897-10.0.3.253-1395650301038;lv=-47;nsInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0;bpid=BP-1489452897-10.0.3.253-1395650301038
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Added volume -
> /home/ubuntu/dallaybatta-data/hdfs/datanode/current
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Registered FSDatasetState MBean
> 14/03/24 09:31:00 INFO datanode.DirectoryScanner: Periodic Directory Tree
> Verification scan starting at 1395674259100 with interval 21600000
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Adding block pool
> BP-1489452897-10.0.3.253-1395650301038
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Scanning block pool
> BP-1489452897-10.0.3.253-1395650301038 on volume
> /home/ubuntu/dallaybatta-data/hdfs/datanode/current...
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Time taken to scan block pool
> BP-1489452897-10.0.3.253-1395650301038 on
> /home/ubuntu/dallaybatta-data/hdfs/datanode/current: 11ms
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Total time to scan all replicas
> for block pool BP-1489452897-10.0.3.253-1395650301038: 13ms
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Adding replicas to map for
> block pool BP-1489452897-10.0.3.253-1395650301038 on volume
> /home/ubuntu/dallaybatta-data/hdfs/datanode/current...
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Time to add replicas to map for
> block pool BP-1489452897-10.0.3.253-1395650301038 on volume
> /home/ubuntu/dallaybatta-data/hdfs/datanode/current: 0ms
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Total time to add all replicas
> to map: 1ms
> 14/03/24 09:31:00 INFO datanode.DataNode: Block pool
> BP-1489452897-10.0.3.253-1395650301038 (storage id
> DS-1380795562-10.0.3.201-50010-1395650455122) service to /10.0.3.200:9000 beginning handshake with NN
> 14/03/24 09:31:00 FATAL datanode.DataNode: Initialization failed for block
> pool Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id
> DS-1380795562-10.0.3.201-50010-1395650455122) service to /10.0.3.200:9000
> org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException):
> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
> ipcPort=50020,
> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>     at
> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
>     at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3929)
>     at
> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
>     at
> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
>     at
> org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
>     at
> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:415)
>     at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
>
>     at org.apache.hadoop.ipc.Client.call(Client.java:1347)
>     at org.apache.hadoop.ipc.Client.call(Client.java:1300)
>     at
> org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
>     at $Proxy9.registerDatanode(Unknown Source)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:601)
>     at
> org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
>     at
> org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
>     at $Proxy9.registerDatanode(Unknown Source)
>     at
> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.registerDatanode(DatanodeProtocolClientSideTranslatorPB.java:146)
>     at
> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.register(BPServiceActor.java:623)
>     at
> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:225)
>     at
> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:664)
>     at java.lang.Thread.run(Thread.java:722)
> 14/03/24 09:31:00 WARN datanode.DataNode: Ending block pool service for:
> Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id
> DS-1380795562-10.0.3.201-50010-1395650455122) service to /10.0.3.200:9000
> 14/03/24 09:31:00 INFO datanode.DataNode: Removed Block pool
> BP-1489452897-10.0.3.253-1395650301038 (storage id
> DS-1380795562-10.0.3.201-50010-1395650455122)
> 14/03/24 09:31:00 INFO datanode.DataBlockScanner: Removed
> bpid=BP-1489452897-10.0.3.253-1395650301038 from blockPoolScannerMap
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Removing block pool
> BP-1489452897-10.0.3.253-1395650301038
> 14/03/24 09:31:02 WARN datanode.DataNode: Exiting Datanode
> 14/03/24 09:31:02 INFO util.ExitUtil: Exiting with status 0
> 14/03/24 09:31:02 INFO datanode.DataNode: SHUTDOWN_MSG:
> /************************************************************
> SHUTDOWN_MSG: Shutting down DataNode at Hadoop2/10.0.3.148
> ************************************************************/
>
>
> ****************************************************************************************
>
>
> And here is the corresponding error at the NameNode (10.0.3.200):
>
>
> ****************************************************************************************
> 14/03/24 09:31:00 WARN blockmanagement.DatanodeManager: Unresolved
> datanode registration from 10.0.3.201
> 14/03/24 09:31:00 ERROR security.UserGroupInformation:
> PriviledgedActionException as:ubuntu (auth:SIMPLE)
> cause:org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
> ipcPort=50020,
> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
> 14/03/24 09:31:00 INFO ipc.Server: IPC Server handler 3 on 9000, call
> org.apache.hadoop.hdfs.server.protocol.DatanodeProtocol.registerDatanode
> from 10.0.3.201:60951 Call#1 Retry#0: error:
> org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
> ipcPort=50020,
> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
> org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
> ipcPort=50020,
> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>     at
> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
>     at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3929)
>     at
> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
>     at
> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
>     at
> org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
>     at
> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:415)
>     at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
>
> ****************************************************************************************
>
> I don't know where the 10.0.3.148 IP is coming from yet; it could be due
> to some LXC configuration. What can be interpreted from the Hadoop error
> output?
>
> Let me know if you need more info about my environment to provide some
> insights.
>
> Regards,
> Vicky
>


-- 
Jay Vyas
http://jayunit100.blogspot.com
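The DisallowedDatanodeException above is thrown by DatanodeManager.registerDatanode when the NameNode cannot match the registering DataNode against its host lists; the preceding "Unresolved datanode registration from 10.0.3.201" warning suggests the NameNode failed to reverse-resolve the DataNode's IP. A minimal sketch of the usual remedies, assuming a Hadoop 2.2 setup like the one in this thread (not a confirmed fix for this poster's environment):

```xml
<!-- hdfs-site.xml on the NameNode (sketch, not a drop-in fix).
     Preferred remedy: give each container a resolvable hostname,
     e.g. /etc/hosts entries "10.0.3.200 hadoop1" and
     "10.0.3.201 hadoop2" on both containers.
     Workaround: relax the reverse-DNS check that rejects
     datanodes whose IPs do not resolve to hostnames. -->
<property>
  <name>dfs.namenode.datanode.registration.ip-hostname-check</name>
  <value>false</value>
</property>
```

Restart the NameNode after changing this; fixing /etc/hosts or DNS inside the containers is preferable to disabling the check.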

Re: Setting Hadoop on LinuxContainers Fails.

Posted by Jay Vyas <ja...@gmail.com>.
Are your Linux containers networked properly (i.e., can they see each other
and the outside world)?
www.linux.org/threads/linux-containers-part-4-getting-to-the-universe-ping-google-com.4428/
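A quick way to test the reverse-DNS side of this from the NameNode container is sketched below; the IPs are the ones from this thread, and the helper name is ours, not a Hadoop API:

```python
import socket

def reverse_resolves(ip):
    """Return the hostname `ip` reverse-resolves to, or None if it doesn't."""
    try:
        return socket.gethostbyaddr(ip)[0]
    except (socket.herror, socket.gaierror):
        return None

# Run on the NameNode container: if this prints None for the DataNode's
# IP (10.0.3.201 in this thread), the NameNode cannot map the address to
# a hostname, which matches the "Unresolved datanode registration" warning.
print(reverse_resolves("127.0.0.1"))
```

If the lookup fails, adding matching entries to /etc/hosts on both containers (or fixing the LXC DNS setup) usually clears the registration error.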


On Mon, Mar 24, 2014 at 6:02 AM, Vicky Kak <vi...@gmail.com> wrote:

> Hi All,
>
> I am using Linux containers (http://linuxcontainers.org/) to configure a
> Hadoop cluster for testing.
> I have created two Linux application containers, called hadoop1 and
> hadoop2. The IP associated with hadoop1 is 10.0.3.200 and with hadoop2 is
> 10.0.3.201.
>
> I am able to start the NameNode on 10.0.3.200, but when I try to start the
> DataNode on 10.0.3.201 I see the following error at 10.0.3.201:
b/jackson-mapper-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/xz-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/guice-3.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/asm-3.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/commons-io-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/hadoop-annotations-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jackson-core-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.2.0.jar:/home/ubuntu/Installed/hadoop-2
.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.2.0-tests.jar:/home/ubuntu/Installed/hadoop-2.2.0/contrib/capacity-scheduler/*.jar
> STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common -r
> 1529768; compiled by 'hortonmu' on 2013-10-07T06:28Z
> STARTUP_MSG:   java = 1.7.0
> ************************************************************/
> 14/03/24 09:30:57 INFO datanode.DataNode: registered UNIX signal handlers
> for [TERM, HUP, INT]
> 14/03/24 09:30:57 WARN common.Util: Path
> /home/ubuntu/dallaybatta-data/hdfs/datanode should be specified as a URI in
> configuration files. Please update hdfs configuration.
> 14/03/24 09:30:58 INFO impl.MetricsConfig: loaded properties from
> hadoop-metrics2.properties
> 14/03/24 09:30:58 INFO impl.MetricsSystemImpl: Scheduled snapshot period
> at 10 second(s).
> 14/03/24 09:30:58 INFO impl.MetricsSystemImpl: DataNode metrics system
> started
> 14/03/24 09:30:58 INFO datanode.DataNode: Configured hostname is Hadoop2
> 14/03/24 09:30:58 INFO datanode.DataNode: Opened streaming server at /
> 0.0.0.0:50010
> 14/03/24 09:30:58 INFO datanode.DataNode: Balancing bandwith is 1048576
> bytes/s
> 14/03/24 09:30:58 INFO mortbay.log: Logging to
> org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via
> org.mortbay.log.Slf4jLog
> 14/03/24 09:30:58 INFO http.HttpServer: Added global filter 'safety'
> (class=org.apache.hadoop.http.HttpServer$QuotingInputFilter)
> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
> context datanode
> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
> context logs
> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
> context static
> 14/03/24 09:30:58 INFO datanode.DataNode: Opened info server at
> localhost:50075
> 14/03/24 09:30:58 INFO datanode.DataNode: dfs.webhdfs.enabled = false
> 14/03/24 09:30:58 INFO http.HttpServer: Jetty bound to port 50075
> 14/03/24 09:30:58 INFO mortbay.log: jetty-6.1.26
> 14/03/24 09:30:59 INFO mortbay.log: Started
> SelectChannelConnector@localhost:50075
> 14/03/24 09:30:59 INFO ipc.Server: Starting Socket Reader #1 for port 50020
> 14/03/24 09:30:59 INFO datanode.DataNode: Opened IPC server at /
> 0.0.0.0:50020
> 14/03/24 09:30:59 INFO datanode.DataNode: Refresh request received for
> nameservices: null
> 14/03/24 09:30:59 INFO datanode.DataNode: Starting BPOfferServices for
> nameservices: <default>
> 14/03/24 09:30:59 WARN common.Util: Path
> /home/ubuntu/dallaybatta-data/hdfs/datanode should be specified as a URI in
> configuration files. Please update hdfs configuration.
> 14/03/24 09:30:59 INFO datanode.DataNode: Block pool <registering>
> (storage id unknown) service to /10.0.3.200:9000 starting to offer service
> 14/03/24 09:30:59 INFO ipc.Server: IPC Server Responder: starting
> 14/03/24 09:30:59 INFO ipc.Server: IPC Server listener on 50020: starting
> 14/03/24 09:30:59 INFO common.Storage: Lock on
> /home/ubuntu/dallaybatta-data/hdfs/datanode/in_use.lock acquired by
> nodename 2618@Hadoop2
> 14/03/24 09:31:00 INFO common.Storage: Locking is disabled
> 14/03/24 09:31:00 INFO datanode.DataNode: Setting up storage:
> nsid=1367523242;bpid=BP-1489452897-10.0.3.253-1395650301038;lv=-47;nsInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0;bpid=BP-1489452897-10.0.3.253-1395650301038
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Added volume -
> /home/ubuntu/dallaybatta-data/hdfs/datanode/current
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Registered FSDatasetState MBean
> 14/03/24 09:31:00 INFO datanode.DirectoryScanner: Periodic Directory Tree
> Verification scan starting at 1395674259100 with interval 21600000
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Adding block pool
> BP-1489452897-10.0.3.253-1395650301038
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Scanning block pool
> BP-1489452897-10.0.3.253-1395650301038 on volume
> /home/ubuntu/dallaybatta-data/hdfs/datanode/current...
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Time taken to scan block pool
> BP-1489452897-10.0.3.253-1395650301038 on
> /home/ubuntu/dallaybatta-data/hdfs/datanode/current: 11ms
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Total time to scan all replicas
> for block pool BP-1489452897-10.0.3.253-1395650301038: 13ms
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Adding replicas to map for
> block pool BP-1489452897-10.0.3.253-1395650301038 on volume
> /home/ubuntu/dallaybatta-data/hdfs/datanode/current...
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Time to add replicas to map for
> block pool BP-1489452897-10.0.3.253-1395650301038 on volume
> /home/ubuntu/dallaybatta-data/hdfs/datanode/current: 0ms
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Total time to add all replicas
> to map: 1ms
> 14/03/24 09:31:00 INFO datanode.DataNode: Block pool
> BP-1489452897-10.0.3.253-1395650301038 (storage id
> DS-1380795562-10.0.3.201-50010-1395650455122) service to /10.0.3.200:9000 beginning handshake with NN
> 14/03/24 09:31:00 FATAL datanode.DataNode: Initialization failed for block
> pool Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id
> DS-1380795562-10.0.3.201-50010-1395650455122) service to /10.0.3.200:9000
> org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException):
> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
> ipcPort=50020,
> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>     at
> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
>     at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3929)
>     at
> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
>     at
> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
>     at
> org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
>     at
> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:415)
>     at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
>
>     at org.apache.hadoop.ipc.Client.call(Client.java:1347)
>     at org.apache.hadoop.ipc.Client.call(Client.java:1300)
>     at
> org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
>     at $Proxy9.registerDatanode(Unknown Source)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:601)
>     at
> org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
>     at
> org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
>     at $Proxy9.registerDatanode(Unknown Source)
>     at
> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.registerDatanode(DatanodeProtocolClientSideTranslatorPB.java:146)
>     at
> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.register(BPServiceActor.java:623)
>     at
> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:225)
>     at
> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:664)
>     at java.lang.Thread.run(Thread.java:722)
> 14/03/24 09:31:00 WARN datanode.DataNode: Ending block pool service for:
> Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id
> DS-1380795562-10.0.3.201-50010-1395650455122) service to /10.0.3.200:9000
> 14/03/24 09:31:00 INFO datanode.DataNode: Removed Block pool
> BP-1489452897-10.0.3.253-1395650301038 (storage id
> DS-1380795562-10.0.3.201-50010-1395650455122)
> 14/03/24 09:31:00 INFO datanode.DataBlockScanner: Removed
> bpid=BP-1489452897-10.0.3.253-1395650301038 from blockPoolScannerMap
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Removing block pool
> BP-1489452897-10.0.3.253-1395650301038
> 14/03/24 09:31:02 WARN datanode.DataNode: Exiting Datanode
> 14/03/24 09:31:02 INFO util.ExitUtil: Exiting with status 0
> 14/03/24 09:31:02 INFO datanode.DataNode: SHUTDOWN_MSG:
> /************************************************************
> SHUTDOWN_MSG: Shutting down DataNode at Hadoop2/10.0.3.148
> ************************************************************/
>
>
> ****************************************************************************************
>
>
> And here is the corresponding error coming at the NameNode (10.0.3.200):
>
>
> ****************************************************************************************
> 14/03/24 09:31:00 WARN blockmanagement.DatanodeManager: Unresolved
> datanode registration from 10.0.3.201
> 14/03/24 09:31:00 ERROR security.UserGroupInformation:
> PriviledgedActionException as:ubuntu (auth:SIMPLE)
> cause:org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
> ipcPort=50020,
> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
> 14/03/24 09:31:00 INFO ipc.Server: IPC Server handler 3 on 9000, call
> org.apache.hadoop.hdfs.server.protocol.DatanodeProtocol.registerDatanode
> from 10.0.3.201:60951 Call#1 Retry#0: error:
> org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
> ipcPort=50020,
> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
> org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
> ipcPort=50020,
> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>     at
> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
>     at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3929)
>     at
> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
>     at
> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
>     at
> org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
>     at
> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:415)
>     at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
>
> ****************************************************************************************
>
> I don't know where the 10.0.3.148 IP is coming from yet; it could be due
> to some lxc configuration. What can be interpreted from the Hadoop error
> information?
>
> Let me know if you need more info about my environment to provide some
> insights.
>
> Regards,
> Vicky
>
>
>
>
>


-- 
Jay Vyas
http://jayunit100.blogspot.com

Re: Setting Hadoop on LinuxContainers Fails.

Posted by Jay Vyas <ja...@gmail.com>.
Are your linux containers networked properly (i.e. can they see each other,
the outside world, etc.)? See:
www.linux.org/threads/linux-containers-part-4-getting-to-the-universe-ping-google-com.4428/
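
The NameNode-side warning "Unresolved datanode registration from 10.0.3.201" suggests the NameNode could not resolve the datanode's address to a hostname, which is what makes registerDatanode throw DisallowedDatanodeException. A minimal sketch of the host mappings that typically fixes this, assuming the container hostnames hadoop1 and Hadoop2 (the names appearing in your logs) and that /etc/hosts is editable inside each container:

```
# /etc/hosts on both containers (hypothetical entries; adjust names/IPs to your setup)
10.0.3.200   hadoop1
10.0.3.201   Hadoop2
```

After updating, verify resolution from each container (e.g. `getent hosts 10.0.3.201` and `ping hadoop1`) before restarting the datanode. If your Hadoop version supports it, dfs.namenode.datanode.registration.ip-hostname-check in hdfs-site.xml can also relax this check, but fixing name resolution is the cleaner route.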


On Mon, Mar 24, 2014 at 6:02 AM, Vicky Kak <vi...@gmail.com> wrote:

> BP-1489452897-10.0.3.253-1395650301038
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Scanning block pool
> BP-1489452897-10.0.3.253-1395650301038 on volume
> /home/ubuntu/dallaybatta-data/hdfs/datanode/current...
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Time taken to scan block pool
> BP-1489452897-10.0.3.253-1395650301038 on
> /home/ubuntu/dallaybatta-data/hdfs/datanode/current: 11ms
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Total time to scan all replicas
> for block pool BP-1489452897-10.0.3.253-1395650301038: 13ms
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Adding replicas to map for
> block pool BP-1489452897-10.0.3.253-1395650301038 on volume
> /home/ubuntu/dallaybatta-data/hdfs/datanode/current...
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Time to add replicas to map for
> block pool BP-1489452897-10.0.3.253-1395650301038 on volume
> /home/ubuntu/dallaybatta-data/hdfs/datanode/current: 0ms
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Total time to add all replicas
> to map: 1ms
> 14/03/24 09:31:00 INFO datanode.DataNode: Block pool
> BP-1489452897-10.0.3.253-1395650301038 (storage id
> DS-1380795562-10.0.3.201-50010-1395650455122) service to /10.0.3.200:9000 beginning handshake with NN
> 14/03/24 09:31:00 FATAL datanode.DataNode: Initialization failed for block
> pool Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id
> DS-1380795562-10.0.3.201-50010-1395650455122) service to /10.0.3.200:9000
> org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException):
> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
> ipcPort=50020,
> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>     at
> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
>     at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3929)
>     at
> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
>     at
> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
>     at
> org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
>     at
> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:415)
>     at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
>
>     at org.apache.hadoop.ipc.Client.call(Client.java:1347)
>     at org.apache.hadoop.ipc.Client.call(Client.java:1300)
>     at
> org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
>     at $Proxy9.registerDatanode(Unknown Source)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:601)
>     at
> org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
>     at
> org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
>     at $Proxy9.registerDatanode(Unknown Source)
>     at
> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.registerDatanode(DatanodeProtocolClientSideTranslatorPB.java:146)
>     at
> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.register(BPServiceActor.java:623)
>     at
> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:225)
>     at
> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:664)
>     at java.lang.Thread.run(Thread.java:722)
> 14/03/24 09:31:00 WARN datanode.DataNode: Ending block pool service for:
> Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id
> DS-1380795562-10.0.3.201-50010-1395650455122) service to /10.0.3.200:9000
> 14/03/24 09:31:00 INFO datanode.DataNode: Removed Block pool
> BP-1489452897-10.0.3.253-1395650301038 (storage id
> DS-1380795562-10.0.3.201-50010-1395650455122)
> 14/03/24 09:31:00 INFO datanode.DataBlockScanner: Removed
> bpid=BP-1489452897-10.0.3.253-1395650301038 from blockPoolScannerMap
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Removing block pool
> BP-1489452897-10.0.3.253-1395650301038
> 14/03/24 09:31:02 WARN datanode.DataNode: Exiting Datanode
> 14/03/24 09:31:02 INFO util.ExitUtil: Exiting with status 0
> 14/03/24 09:31:02 INFO datanode.DataNode: SHUTDOWN_MSG:
> /************************************************************
> SHUTDOWN_MSG: Shutting down DataNode at Hadoop2/10.0.3.148
> ************************************************************/
>
>
> ****************************************************************************************
>
>
> And here is the corresponding error on the NameNode (10.0.3.200):
>
>
> ****************************************************************************************
> 14/03/24 09:31:00 WARN blockmanagement.DatanodeManager: Unresolved
> datanode registration from 10.0.3.201
> 14/03/24 09:31:00 ERROR security.UserGroupInformation:
> PriviledgedActionException as:ubuntu (auth:SIMPLE)
> cause:org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
> ipcPort=50020,
> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
> 14/03/24 09:31:00 INFO ipc.Server: IPC Server handler 3 on 9000, call
> org.apache.hadoop.hdfs.server.protocol.DatanodeProtocol.registerDatanode
> from 10.0.3.201:60951 Call#1 Retry#0: error:
> org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
> ipcPort=50020,
> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
> org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
> ipcPort=50020,
> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>     at
> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
>     at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3929)
>     at
> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
>     at
> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
>     at
> org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
>     at
> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:415)
>     at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
>
> ****************************************************************************************
>
> I don't know where the 10.0.3.148 IP is coming from yet; it could be
> due to some lxc configuration. What can be interpreted from the Hadoop
> error information?
>
> Let me know if you need more info about my environment to provide some
> insights.
>
> Regards,
> Vicky
>


-- 
Jay Vyas
http://jayunit100.blogspot.com

Re: Setting Hadoop on LinuxContainers Fails.

Posted by Jay Vyas <ja...@gmail.com>.
Are your Linux containers networked properly (i.e., can they see each other,
the outside world, etc.)?
www.linux.org/threads/linux-containers-part-4-getting-to-the-universe-ping-google-com.4428/
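The NameNode-side warning "Unresolved datanode registration from 10.0.3.201" usually means the NameNode could not reverse-resolve the DataNode's IP to a hostname during registration, which then surfaces on the DataNode as DisallowedDatanodeException. A rough way to check whether reverse lookup works from the NameNode host is sketched below; this only approximates the lookup, not Hadoop's exact code path, and the 10.0.3.x address is the one from this thread:

```python
import socket

def reverse_resolves(ip):
    """Return the hostname an IP reverse-resolves to, or None.

    Approximates the reverse-DNS check the NameNode performs when a
    DataNode registers. If the container IPs return None here, adding
    them to /etc/hosts on the NameNode (e.g. "10.0.3.201  hadoop2")
    is the usual fix.
    """
    try:
        hostname, _aliases, _addrs = socket.gethostbyaddr(ip)
        return hostname
    except (socket.herror, socket.gaierror):
        return None

# Loopback should resolve on any sane system; the container IP will
# only resolve if /etc/hosts or DNS on the NameNode knows about it.
print(reverse_resolves("127.0.0.1"))
print(reverse_resolves("10.0.3.201"))
```

Run this on the NameNode container: if the DataNode's IP comes back as None, fix name resolution (or DNS inside the containers) before restarting the DataNode.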


On Mon, Mar 24, 2014 at 6:02 AM, Vicky Kak <vi...@gmail.com> wrote:

> Hi All,
>
> I am using linuxcontainer(http://linuxcontainers.org/) for configuring
> the hadoop cluster for the testing.
> I have create two linux application containers which are called
> hadoop1/hadoop2. The IP's associated with the hadoop1 is 10.0.3.200 and
> with hadoop2 is 10.0.3.201.
>
> I am able to start the Namenode on 10.0.3.200 but when i try to start the
> DataNode on 10.0.3.201 I see the following error at 10.0.3.201
>
>
> ****************************************************************************************
> $ hdfs datanode
> 14/03/24 09:30:57 INFO datanode.DataNode: STARTUP_MSG:
> /************************************************************
> STARTUP_MSG: Starting DataNode
> STARTUP_MSG:   host = Hadoop2/10.0.3.148
> STARTUP_MSG:   args = []
> STARTUP_MSG:   version = 2.2.0
> STARTUP_MSG:   classpath =
> /home/ubuntu/Installed/hadoop-2.2.0/etc/hadoop:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/servlet-api-2.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-el-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-logging-1.1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/mockito-all-1.8.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/log4j-1.2.17.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jersey-server-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jsr305-1.3.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jackson-mapper-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jackson-jaxrs-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/guava-11.0.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-collections-3.2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-codec-1.4.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/paranamer-2.3.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/zookeeper-3.4.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jersey-core-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jersey-json-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jettison-1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/activation-1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jets3t-0.6.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/avro-1.7.4.jar:/ho
me/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-httpclient-3.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/xz-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-math-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jetty-util-6.1.26.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-lang-2.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-configuration-1.6.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/netty-3.6.2.Final.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/asm-3.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/junit-4.8.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-cli-1.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jsch-0.1.42.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jackson-xc-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-io-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jsp-api-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jetty-6.1.26.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/hadoop-auth-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-compress-1.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/hadoop-annotations-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-digester-1.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/sha
re/hadoop/common/lib/jackson-core-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/commons-net-3.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/stax-api-1.0.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/lib/xmlenc-0.52.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/hadoop-nfs-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/hadoop-common-2.2.0-tests.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/common/hadoop-common-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-el-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-logging-1.1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jackson-mapper-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/guava-11.0.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-lang-2.5.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jasper-runtime-5.5.23.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/asm-
3.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/commons-io-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/jackson-core-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/hadoop-hdfs-nfs-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/hadoop-hdfs-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/hdfs/hadoop-hdfs-2.2.0-tests.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/javax.inject-1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/log4j-1.2.17.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jersey-server-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/hamcrest-core-1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/junit-4.10.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jackson-mapper-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/paranamer-2.3.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jersey-core-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/avro-1.7.4.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/aopalliance-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/xz-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/guice-3.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yar
n/lib/asm-3.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/commons-io-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/snappy-java-1.0.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/hadoop-annotations-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/lib/jackson-core-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-client-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-common-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-tests-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-api-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-site-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/yarn/hadoop-yarn-server-common-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/javax.inject-1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/hamcrest-core-1.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/junit-4.10.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/li
b/jackson-mapper-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/xz-1.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/guice-3.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/asm-3.2.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/commons-io-2.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/hadoop-annotations-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/lib/jackson-core-asl-1.8.8.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.2.0.jar:/home/ubuntu/Installed/hadoop-2
.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.2.0.jar:/home/ubuntu/Installed/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.2.0-tests.jar:/home/ubuntu/Installed/hadoop-2.2.0/contrib/capacity-scheduler/*.jar
> STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common -r
> 1529768; compiled by 'hortonmu' on 2013-10-07T06:28Z
> STARTUP_MSG:   java = 1.7.0
> ************************************************************/
> 14/03/24 09:30:57 INFO datanode.DataNode: registered UNIX signal handlers
> for [TERM, HUP, INT]
> 14/03/24 09:30:57 WARN common.Util: Path
> /home/ubuntu/dallaybatta-data/hdfs/datanode should be specified as a URI in
> configuration files. Please update hdfs configuration.
> 14/03/24 09:30:58 INFO impl.MetricsConfig: loaded properties from
> hadoop-metrics2.properties
> 14/03/24 09:30:58 INFO impl.MetricsSystemImpl: Scheduled snapshot period
> at 10 second(s).
> 14/03/24 09:30:58 INFO impl.MetricsSystemImpl: DataNode metrics system
> started
> 14/03/24 09:30:58 INFO datanode.DataNode: Configured hostname is Hadoop2
> 14/03/24 09:30:58 INFO datanode.DataNode: Opened streaming server at /
> 0.0.0.0:50010
> 14/03/24 09:30:58 INFO datanode.DataNode: Balancing bandwith is 1048576
> bytes/s
> 14/03/24 09:30:58 INFO mortbay.log: Logging to
> org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via
> org.mortbay.log.Slf4jLog
> 14/03/24 09:30:58 INFO http.HttpServer: Added global filter 'safety'
> (class=org.apache.hadoop.http.HttpServer$QuotingInputFilter)
> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
> context datanode
> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
> context logs
> 14/03/24 09:30:58 INFO http.HttpServer: Added filter static_user_filter
> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
> context static
> 14/03/24 09:30:58 INFO datanode.DataNode: Opened info server at
> localhost:50075
> 14/03/24 09:30:58 INFO datanode.DataNode: dfs.webhdfs.enabled = false
> 14/03/24 09:30:58 INFO http.HttpServer: Jetty bound to port 50075
> 14/03/24 09:30:58 INFO mortbay.log: jetty-6.1.26
> 14/03/24 09:30:59 INFO mortbay.log: Started
> SelectChannelConnector@localhost:50075
> 14/03/24 09:30:59 INFO ipc.Server: Starting Socket Reader #1 for port 50020
> 14/03/24 09:30:59 INFO datanode.DataNode: Opened IPC server at /
> 0.0.0.0:50020
> 14/03/24 09:30:59 INFO datanode.DataNode: Refresh request received for
> nameservices: null
> 14/03/24 09:30:59 INFO datanode.DataNode: Starting BPOfferServices for
> nameservices: <default>
> 14/03/24 09:30:59 WARN common.Util: Path
> /home/ubuntu/dallaybatta-data/hdfs/datanode should be specified as a URI in
> configuration files. Please update hdfs configuration.
> 14/03/24 09:30:59 INFO datanode.DataNode: Block pool <registering>
> (storage id unknown) service to /10.0.3.200:9000 starting to offer service
> 14/03/24 09:30:59 INFO ipc.Server: IPC Server Responder: starting
> 14/03/24 09:30:59 INFO ipc.Server: IPC Server listener on 50020: starting
> 14/03/24 09:30:59 INFO common.Storage: Lock on
> /home/ubuntu/dallaybatta-data/hdfs/datanode/in_use.lock acquired by
> nodename 2618@Hadoop2
> 14/03/24 09:31:00 INFO common.Storage: Locking is disabled
> 14/03/24 09:31:00 INFO datanode.DataNode: Setting up storage:
> nsid=1367523242;bpid=BP-1489452897-10.0.3.253-1395650301038;lv=-47;nsInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0;bpid=BP-1489452897-10.0.3.253-1395650301038
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Added volume -
> /home/ubuntu/dallaybatta-data/hdfs/datanode/current
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Registered FSDatasetState MBean
> 14/03/24 09:31:00 INFO datanode.DirectoryScanner: Periodic Directory Tree
> Verification scan starting at 1395674259100 with interval 21600000
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Adding block pool
> BP-1489452897-10.0.3.253-1395650301038
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Scanning block pool
> BP-1489452897-10.0.3.253-1395650301038 on volume
> /home/ubuntu/dallaybatta-data/hdfs/datanode/current...
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Time taken to scan block pool
> BP-1489452897-10.0.3.253-1395650301038 on
> /home/ubuntu/dallaybatta-data/hdfs/datanode/current: 11ms
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Total time to scan all replicas
> for block pool BP-1489452897-10.0.3.253-1395650301038: 13ms
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Adding replicas to map for
> block pool BP-1489452897-10.0.3.253-1395650301038 on volume
> /home/ubuntu/dallaybatta-data/hdfs/datanode/current...
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Time to add replicas to map for
> block pool BP-1489452897-10.0.3.253-1395650301038 on volume
> /home/ubuntu/dallaybatta-data/hdfs/datanode/current: 0ms
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Total time to add all replicas
> to map: 1ms
> 14/03/24 09:31:00 INFO datanode.DataNode: Block pool
> BP-1489452897-10.0.3.253-1395650301038 (storage id
> DS-1380795562-10.0.3.201-50010-1395650455122) service to /10.0.3.200:9000beginning handshake with NN
> 14/03/24 09:31:00 FATAL datanode.DataNode: Initialization failed for block
> pool Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id
> DS-1380795562-10.0.3.201-50010-1395650455122) service to /10.0.3.200:9000
> org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException):
> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
> ipcPort=50020,
> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>     at
> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
>     at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3929)
>     at
> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
>     at
> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
>     at
> org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
>     at
> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:415)
>     at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
>
>     at org.apache.hadoop.ipc.Client.call(Client.java:1347)
>     at org.apache.hadoop.ipc.Client.call(Client.java:1300)
>     at
> org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
>     at $Proxy9.registerDatanode(Unknown Source)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:601)
>     at
> org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
>     at
> org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
>     at $Proxy9.registerDatanode(Unknown Source)
>     at
> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.registerDatanode(DatanodeProtocolClientSideTranslatorPB.java:146)
>     at
> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.register(BPServiceActor.java:623)
>     at
> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:225)
>     at
> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:664)
>     at java.lang.Thread.run(Thread.java:722)
> 14/03/24 09:31:00 WARN datanode.DataNode: Ending block pool service for:
> Block pool BP-1489452897-10.0.3.253-1395650301038 (storage id
> DS-1380795562-10.0.3.201-50010-1395650455122) service to /10.0.3.200:9000
> 14/03/24 09:31:00 INFO datanode.DataNode: Removed Block pool
> BP-1489452897-10.0.3.253-1395650301038 (storage id
> DS-1380795562-10.0.3.201-50010-1395650455122)
> 14/03/24 09:31:00 INFO datanode.DataBlockScanner: Removed
> bpid=BP-1489452897-10.0.3.253-1395650301038 from blockPoolScannerMap
> 14/03/24 09:31:00 INFO impl.FsDatasetImpl: Removing block pool
> BP-1489452897-10.0.3.253-1395650301038
> 14/03/24 09:31:02 WARN datanode.DataNode: Exiting Datanode
> 14/03/24 09:31:02 INFO util.ExitUtil: Exiting with status 0
> 14/03/24 09:31:02 INFO datanode.DataNode: SHUTDOWN_MSG:
> /************************************************************
> SHUTDOWN_MSG: Shutting down DataNode at Hadoop2/10.0.3.148
> ************************************************************/
>
>
> ****************************************************************************************
>
>
> And here is the corresponding error on the NameNode (10.0.3.200):
>
>
> ****************************************************************************************
> 14/03/24 09:31:00 WARN blockmanagement.DatanodeManager: Unresolved
> datanode registration from 10.0.3.201
> 14/03/24 09:31:00 ERROR security.UserGroupInformation:
> PriviledgedActionException as:ubuntu (auth:SIMPLE)
> cause:org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
> ipcPort=50020,
> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
> 14/03/24 09:31:00 INFO ipc.Server: IPC Server handler 3 on 9000, call
> org.apache.hadoop.hdfs.server.protocol.DatanodeProtocol.registerDatanode
> from 10.0.3.201:60951 Call#1 Retry#0: error:
> org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
> ipcPort=50020,
> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
> org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0,
> storageID=DS-1380795562-10.0.3.201-50010-1395650455122, infoPort=50075,
> ipcPort=50020,
> storageInfo=lv=-47;cid=CID-b9e031fa-ebeb-4d52-9ead-4e65f49246ce;nsid=1367523242;c=0)
>     at
> org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
>     at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3929)
>     at
> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
>     at
> org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
>     at
> org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
>     at
> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:415)
>     at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
>
> ****************************************************************************************
>
> I don't know where the 10.0.3.148 IP is coming from yet; it could be due
> to some LXC configuration. What can be interpreted from the Hadoop error
> messages?
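>
> From what I can tell so far, the "Unresolved datanode registration" warning
> means the NameNode could not reverse-resolve the DataNode's IP (10.0.3.201)
> to a hostname, so it refused the registration. One workaround I am
> considering (untested in this setup; the hostnames below are just my
> container names) is to map both container IPs in /etc/hosts on each
> container:
>
> ```
> # /etc/hosts on both hadoop1 and hadoop2 (hypothetical entries for my setup)
> 10.0.3.200  hadoop1
> 10.0.3.201  hadoop2
> ```
>
> Alternatively, hdfs-site.xml on the NameNode has a switch that relaxes this
> check:
>
> ```xml
> <!-- hdfs-site.xml: skip the reverse-DNS check on DataNode registration -->
> <property>
>   <name>dfs.namenode.datanode.registration.ip-hostname-check</name>
>   <value>false</value>
> </property>
> ```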
>
> Let me know if you need more information about my environment in order to
> provide some insight.
>
> Regards,
> Vicky
>


-- 
Jay Vyas
http://jayunit100.blogspot.com