Posted to common-user@hadoop.apache.org by Sujit Dhamale <su...@gmail.com> on 2012/04/06 20:13:21 UTC

Data Node is not Started

Hi all,
My DataNode is not starting.

Even after deleting the hadoop*.pid files from /tmp, the DataNode still does not start.


Hadoop version: hadoop-1.0.1.tar.gz
Java version: "1.6.0_26"
Operating system: Ubuntu 11.10


I followed the procedure below:


*hduser@sujit:~/Desktop/hadoop/bin$ jps*
11455 Jps


*hduser@sujit:~/Desktop/hadoop/bin$ start-all.sh*
Warning: $HADOOP_HOME is deprecated.

starting namenode, logging to
/home/hduser/Desktop/hadoop/libexec/../logs/hadoop-hduser-namenode-sujit.out
localhost: starting datanode, logging to
/home/hduser/Desktop/hadoop/libexec/../logs/hadoop-hduser-datanode-sujit.out
localhost: starting secondarynamenode, logging to
/home/hduser/Desktop/hadoop/libexec/../logs/hadoop-hduser-secondarynamenode-sujit.out
starting jobtracker, logging to
/home/hduser/Desktop/hadoop/libexec/../logs/hadoop-hduser-jobtracker-sujit.out
localhost: starting tasktracker, logging to
/home/hduser/Desktop/hadoop/libexec/../logs/hadoop-hduser-tasktracker-sujit.out

*hduser@sujit:~/Desktop/hadoop/bin$ jps*
11528 NameNode
12019 SecondaryNameNode
12355 TaskTracker
12115 JobTracker
12437 Jps


*hduser@sujit:~/Desktop/hadoop/bin$ stop-all.sh*
Warning: $HADOOP_HOME is deprecated.

stopping jobtracker
localhost: stopping tasktracker
stopping namenode
localhost: no datanode to stop
localhost: stopping secondarynamenode


*hduser@sujit:~/Desktop/hadoop/bin$ jps*
13127 Jps


*hduser@sujit:~/Desktop/hadoop/bin$ ls /tmp*
hadoop-hduser-datanode.pid
hadoop-hduser-jobtracker.pid
hadoop-hduser-namenode.pid
hadoop-hduser-secondarynamenode.pid
hadoop-hduser-tasktracker.pid
hsperfdata_hduser
hsperfdata_sujit
Jetty_0_0_0_0_50030_job____yn7qmk
Jetty_0_0_0_0_50070_hdfs____w2cu08
Jetty_0_0_0_0_50090_secondary____y6aanv
keyring-meecr7
plugtmp
pulse-2L9K88eMlGn7
pulse-Ob9vyJcXyHZz
pulse-PKdhtXMmr18n
ssh-JXYCAJsX1324
unity_support_test.0
virtual-hduser.Q8j5nJ

Deleted the *.pid files :)

*hduser@sujit:~$ ls /tmp*
hsperfdata_hduser                        pulse-2L9K88eMlGn7
hsperfdata_sujit                         pulse-Ob9vyJcXyHZz
Jetty_0_0_0_0_50030_job____yn7qmk        pulse-PKdhtXMmr18n
Jetty_0_0_0_0_50070_hdfs____w2cu08       ssh-JXYCAJsX1324
Jetty_0_0_0_0_50090_secondary____y6aanv  unity_support_test.0
keyring-meecr7                           virtual-hduser.Q8j5nJ
plugtmp





*hduser@sujit:~/Desktop/hadoop$ bin/hadoop namenode -format*
Warning: $HADOOP_HOME is deprecated.

12/04/06 23:23:22 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = sujit.(null)/127.0.1.1
STARTUP_MSG:   args = [-format]
STARTUP_MSG:   version = 1.0.1
STARTUP_MSG:   build =
https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r
1243785; compiled by 'hortonfo' on Tue Feb 14 08:15:38 UTC 2012
************************************************************/
Re-format filesystem in /app/hadoop/tmp/dfs/name ? (Y or N) Y
12/04/06 23:23:25 INFO util.GSet: VM type       = 32-bit
12/04/06 23:23:25 INFO util.GSet: 2% max memory = 17.77875 MB
12/04/06 23:23:25 INFO util.GSet: capacity      = 2^22 = 4194304 entries
12/04/06 23:23:25 INFO util.GSet: recommended=4194304, actual=4194304
12/04/06 23:23:25 INFO namenode.FSNamesystem: fsOwner=hduser
12/04/06 23:23:25 INFO namenode.FSNamesystem: supergroup=supergroup
12/04/06 23:23:25 INFO namenode.FSNamesystem: isPermissionEnabled=true
12/04/06 23:23:25 INFO namenode.FSNamesystem: dfs.block.invalidate.limit=100
12/04/06 23:23:25 INFO namenode.FSNamesystem: isAccessTokenEnabled=false
accessKeyUpdateInterval=0 min(s), accessTokenLifetime=0 min(s)
12/04/06 23:23:25 INFO namenode.NameNode: Caching file names occuring more
than 10 times
12/04/06 23:23:26 INFO common.Storage: Image file of size 112 saved in 0
seconds.
12/04/06 23:23:26 INFO common.Storage: Storage directory
/app/hadoop/tmp/dfs/name has been successfully formatted.
12/04/06 23:23:26 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at sujit.(null)/127.0.1.1
************************************************************/
*hduser@sujit:~/Desktop/hadoop$ bin/start-all.sh*
Warning: $HADOOP_HOME is deprecated.

starting namenode, logging to
/home/hduser/Desktop/hadoop/libexec/../logs/hadoop-hduser-namenode-sujit.out
localhost: starting datanode, logging to
/home/hduser/Desktop/hadoop/libexec/../logs/hadoop-hduser-datanode-sujit.out
localhost: starting secondarynamenode, logging to
/home/hduser/Desktop/hadoop/libexec/../logs/hadoop-hduser-secondarynamenode-sujit.out
starting jobtracker, logging to
/home/hduser/Desktop/hadoop/libexec/../logs/hadoop-hduser-jobtracker-sujit.out
localhost: starting tasktracker, logging to
/home/hduser/Desktop/hadoop/libexec/../logs/hadoop-hduser-tasktracker-sujit.out


*hduser@sujit:~/Desktop/hadoop$ jps*
14157 JobTracker
14492 Jps
14397 TaskTracker
14063 SecondaryNameNode
13574 NameNode
hduser@sujit:~/Desktop/hadoop$
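
When a daemon disappears like this with no error on the console, its log under the logs/ directory usually explains why. Below is a minimal, self-contained sketch of pulling the most recent ERROR out of a datanode log; the sample log text is abbreviated from the log quoted later in this thread, and the temp-file setup is only there so the snippet runs on its own.

```shell
# Write a tiny sample datanode log (abbreviated from this thread),
# then extract the most recent ERROR line plus its continuation line.
LOG=$(mktemp)
cat > "$LOG" <<'EOF'
2012-04-06 22:11:36,149 INFO org.apache.hadoop.ipc.Client: Retrying connect
2012-04-06 22:11:41,923 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: java.io.IOException:
Incompatible namespaceIDs in /app/hadoop/tmp/dfs/data
EOF
LAST_ERR=$(grep -A1 ' ERROR ' "$LOG" | tail -n 2)
echo "$LAST_ERR"
```

On a live node the same one-liner would be pointed at the real file, e.g. `grep -A1 ' ERROR ' logs/hadoop-*-datanode-*.log | tail -n 2`.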

Re: Data Node is not Started

Posted by Arpit Gupta <ar...@hortonworks.com>.
According to the logs, the namespaceID stored in the datanode's data directory is incompatible with the namenode's.

Since you formatted the namenode, these IDs no longer match. Clean up the contents of the data directory (/app/hadoop/tmp/dfs/data) and then start the datanode.
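
That cleanup can be sketched as follows. This simulates the mismatch and the fix on throwaway scratch directories rather than a live cluster; the dfs layout and the two namespaceIDs are taken from this thread, and on a real node you would run stop-all.sh first and accept that wiping the data directory discards that datanode's blocks.

```shell
# Recreate the situation from the logs on throwaway dirs:
TMP=$(mktemp -d)
mkdir -p "$TMP/dfs/name/current" "$TMP/dfs/data/current"
echo "namespaceID=1320262146" > "$TMP/dfs/name/current/VERSION"  # namenode, after re-format
echo "namespaceID=1269725409" > "$TMP/dfs/data/current/VERSION"  # datanode, stale
# The datanode refuses to start while these differ. The fix: clear the
# data dir so the datanode re-registers and adopts the namenode's ID.
rm -rf "$TMP/dfs/data"/*
ls -A "$TMP/dfs/data"   # now empty; a restarted datanode recreates it
```

An alternative that keeps existing blocks is to edit the namespaceID in the datanode's current/VERSION file to match the namenode's, but on a freshly formatted single-node cluster there is nothing left to preserve.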

--
Arpit Gupta
Hortonworks Inc.
http://hortonworks.com/

On Apr 6, 2012, at 11:27 AM, Sujit Dhamale wrote:

> Below are DataNode logs
> 
> 
> 
> 
> 
> 
> 
> 
> 
> hduser@sujit:~/Desktop/hadoop/logs$ cat hadoop-hduser-datanode-sujit.log
> 2012-04-06 22:11:34,566 INFO
> org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
> /************************************************************
> STARTUP_MSG: Starting DataNode
> STARTUP_MSG:   host = sujit.(null)/127.0.1.1
> STARTUP_MSG:   args = []
> STARTUP_MSG:   version = 1.0.1
> STARTUP_MSG:   build =
> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r
> 1243785; compiled by 'hortonfo' on Tue Feb 14 08:15:38 UTC 2012
> ************************************************************/
> 2012-04-06 22:11:34,749 INFO org.apache.hadoop.metrics2.impl.MetricsConfig:
> loaded properties from hadoop-metrics2.properties
> 2012-04-06 22:11:34,768 INFO
> org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source
> MetricsSystem,sub=Stats registered.
> 2012-04-06 22:11:34,769 INFO
> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot
> period at 10 second(s).
> 2012-04-06 22:11:34,769 INFO
> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system
> started
> 2012-04-06 22:11:34,950 INFO
> org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source ugi
> registered.
> 2012-04-06 22:11:34,956 WARN
> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Source name ugi already
> exists!
> 2012-04-06 22:11:36,149 INFO org.apache.hadoop.ipc.Client: Retrying connect
> to server: localhost/127.0.0.1:54310. Already tried 0 time(s).
> 2012-04-06 22:11:41,923 ERROR
> org.apache.hadoop.hdfs.server.datanode.DataNode: java.io.IOException:
> Incompatible namespaceIDs in /app/hadoop/tmp/dfs/data: namenode namespaceID
> = 387652554; datanode namespaceID = 1269725409
>    at
> org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:232)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:147)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:385)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:299)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1582)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1521)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1539)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1665)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1682)
> 
> 2012-04-06 22:11:41,924 INFO
> org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
> /************************************************************
> SHUTDOWN_MSG: Shutting down DataNode at sujit.(null)/127.0.1.1
> ************************************************************/
> 2012-04-06 22:30:04,713 INFO
> org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
> /************************************************************
> STARTUP_MSG: Starting DataNode
> STARTUP_MSG:   host = sujit.(null)/127.0.1.1
> STARTUP_MSG:   args = []
> STARTUP_MSG:   version = 1.0.1
> STARTUP_MSG:   build =
> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r
> 1243785; compiled by 'hortonfo' on Tue Feb 14 08:15:38 UTC 2012
> ************************************************************/
> 2012-04-06 22:30:04,870 INFO org.apache.hadoop.metrics2.impl.MetricsConfig:
> loaded properties from hadoop-metrics2.properties
> 2012-04-06 22:30:04,883 INFO
> org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source
> MetricsSystem,sub=Stats registered.
> 2012-04-06 22:30:04,884 INFO
> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot
> period at 10 second(s).
> 2012-04-06 22:30:04,884 INFO
> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system
> started
> 2012-04-06 22:30:05,046 INFO
> org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source ugi
> registered.
> 2012-04-06 22:30:05,051 WARN
> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Source name ugi already
> exists!
> 2012-04-06 22:30:06,273 INFO org.apache.hadoop.ipc.Client: Retrying connect
> to server: localhost/127.0.0.1:54310. Already tried 0 time(s).
> 2012-04-06 22:30:11,073 ERROR
> org.apache.hadoop.hdfs.server.datanode.DataNode: java.io.IOException:
> Incompatible namespaceIDs in /app/hadoop/tmp/dfs/data: namenode namespaceID
> = 361514980; datanode namespaceID = 1269725409
>    at
> org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:232)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:147)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:385)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:299)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1582)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1521)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1539)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1665)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1682)
> 
> 2012-04-06 22:30:11,075 INFO
> org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
> /************************************************************
> SHUTDOWN_MSG: Shutting down DataNode at sujit.(null)/127.0.1.1
> ************************************************************/
> 2012-04-06 22:36:46,946 INFO
> org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
> /************************************************************
> STARTUP_MSG: Starting DataNode
> STARTUP_MSG:   host = sujit.(null)/127.0.1.1
> STARTUP_MSG:   args = []
> STARTUP_MSG:   version = 1.0.1
> STARTUP_MSG:   build =
> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r
> 1243785; compiled by 'hortonfo' on Tue Feb 14 08:15:38 UTC 2012
> ************************************************************/
> 2012-04-06 22:36:47,113 INFO org.apache.hadoop.metrics2.impl.MetricsConfig:
> loaded properties from hadoop-metrics2.properties
> 2012-04-06 22:36:47,125 INFO
> org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source
> MetricsSystem,sub=Stats registered.
> 2012-04-06 22:36:47,126 INFO
> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot
> period at 10 second(s).
> 2012-04-06 22:36:47,126 INFO
> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system
> started
> 2012-04-06 22:36:47,284 INFO
> org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source ugi
> registered.
> 2012-04-06 22:36:47,290 WARN
> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Source name ugi already
> exists!
> 2012-04-06 22:36:48,501 INFO org.apache.hadoop.ipc.Client: Retrying connect
> to server: localhost/127.0.0.1:54310. Already tried 0 time(s).
> 2012-04-06 22:36:53,401 ERROR
> org.apache.hadoop.hdfs.server.datanode.DataNode: java.io.IOException:
> Incompatible namespaceIDs in /app/hadoop/tmp/dfs/data: namenode namespaceID
> = 361514980; datanode namespaceID = 1269725409
>    at
> org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:232)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:147)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:385)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:299)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1582)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1521)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1539)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1665)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1682)
> 
> 2012-04-06 22:36:53,403 INFO
> org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
> /************************************************************
> SHUTDOWN_MSG: Shutting down DataNode at sujit.(null)/127.0.1.1
> ************************************************************/
> 2012-04-06 22:47:06,904 INFO
> org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
> /************************************************************
> STARTUP_MSG: Starting DataNode
> STARTUP_MSG:   host = sujit.(null)/127.0.1.1
> STARTUP_MSG:   args = []
> STARTUP_MSG:   version = 1.0.1
> STARTUP_MSG:   build =
> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r
> 1243785; compiled by 'hortonfo' on Tue Feb 14 08:15:38 UTC 2012
> ************************************************************/
> 2012-04-06 22:47:07,060 INFO org.apache.hadoop.metrics2.impl.MetricsConfig:
> loaded properties from hadoop-metrics2.properties
> 2012-04-06 22:47:07,072 INFO
> org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source
> MetricsSystem,sub=Stats registered.
> 2012-04-06 22:47:07,073 INFO
> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot
> period at 10 second(s).
> 2012-04-06 22:47:07,073 INFO
> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system
> started
> 2012-04-06 22:47:07,288 INFO
> org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source ugi
> registered.
> 2012-04-06 22:47:07,299 WARN
> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Source name ugi already
> exists!
> 2012-04-06 22:47:12,826 ERROR
> org.apache.hadoop.hdfs.server.datanode.DataNode: java.io.IOException:
> Incompatible namespaceIDs in /app/hadoop/tmp/dfs/data: namenode namespaceID
> = 361514980; datanode namespaceID = 1269725409
>    at
> org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:232)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:147)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:385)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:299)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1582)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1521)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1539)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1665)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1682)
> 
> 2012-04-06 22:47:12,828 INFO
> org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
> /************************************************************
> SHUTDOWN_MSG: Shutting down DataNode at sujit.(null)/127.0.1.1
> ************************************************************/
> 2012-04-06 23:11:15,062 INFO
> org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
> /************************************************************
> STARTUP_MSG: Starting DataNode
> STARTUP_MSG:   host = sujit.(null)/127.0.1.1
> STARTUP_MSG:   args = []
> STARTUP_MSG:   version = 1.0.1
> STARTUP_MSG:   build =
> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r
> 1243785; compiled by 'hortonfo' on Tue Feb 14 08:15:38 UTC 2012
> ************************************************************/
> 2012-04-06 23:11:15,220 INFO org.apache.hadoop.metrics2.impl.MetricsConfig:
> loaded properties from hadoop-metrics2.properties
> 2012-04-06 23:11:15,234 INFO
> org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source
> MetricsSystem,sub=Stats registered.
> 2012-04-06 23:11:15,235 INFO
> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot
> period at 10 second(s).
> 2012-04-06 23:11:15,235 INFO
> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system
> started
> 2012-04-06 23:11:15,402 INFO
> org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source ugi
> registered.
> 2012-04-06 23:11:15,407 WARN
> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Source name ugi already
> exists!
> 2012-04-06 23:11:15,713 ERROR
> org.apache.hadoop.hdfs.server.datanode.DataNode: java.io.IOException:
> Incompatible namespaceIDs in /app/hadoop/tmp/dfs/data: namenode namespaceID
> = 361514980; datanode namespaceID = 1269725409
>    at
> org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:232)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:147)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:385)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:299)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1582)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1521)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1539)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1665)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1682)
> 
> 2012-04-06 23:11:15,715 INFO
> org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
> /************************************************************
> SHUTDOWN_MSG: Shutting down DataNode at sujit.(null)/127.0.1.1
> ************************************************************/
> 2012-04-06 23:12:07,345 INFO
> org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
> /************************************************************
> STARTUP_MSG: Starting DataNode
> STARTUP_MSG:   host = sujit.(null)/127.0.1.1
> STARTUP_MSG:   args = []
> STARTUP_MSG:   version = 1.0.1
> STARTUP_MSG:   build =
> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r
> 1243785; compiled by 'hortonfo' on Tue Feb 14 08:15:38 UTC 2012
> ************************************************************/
> 2012-04-06 23:12:07,511 INFO org.apache.hadoop.metrics2.impl.MetricsConfig:
> loaded properties from hadoop-metrics2.properties
> 2012-04-06 23:12:07,523 INFO
> org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source
> MetricsSystem,sub=Stats registered.
> 2012-04-06 23:12:07,524 INFO
> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot
> period at 10 second(s).
> 2012-04-06 23:12:07,524 INFO
> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system
> started
> 2012-04-06 23:12:07,706 INFO
> org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source ugi
> registered.
> 2012-04-06 23:12:07,712 WARN
> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Source name ugi already
> exists!
> 2012-04-06 23:12:08,900 INFO org.apache.hadoop.ipc.Client: Retrying connect
> to server: localhost/127.0.0.1:54310. Already tried 0 time(s).
> 2012-04-06 23:12:13,765 ERROR
> org.apache.hadoop.hdfs.server.datanode.DataNode: java.io.IOException:
> Incompatible namespaceIDs in /app/hadoop/tmp/dfs/data: namenode namespaceID
> = 361514980; datanode namespaceID = 1269725409
>    at
> org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:232)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:147)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:385)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:299)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1582)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1521)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1539)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1665)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1682)
> 
> 2012-04-06 23:12:13,767 INFO
> org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
> /************************************************************
> SHUTDOWN_MSG: Shutting down DataNode at sujit.(null)/127.0.1.1
> ************************************************************/
> 2012-04-06 23:23:46,591 INFO
> org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
> /************************************************************
> STARTUP_MSG: Starting DataNode
> STARTUP_MSG:   host = sujit.(null)/127.0.1.1
> STARTUP_MSG:   args = []
> STARTUP_MSG:   version = 1.0.1
> STARTUP_MSG:   build =
> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r
> 1243785; compiled by 'hortonfo' on Tue Feb 14 08:15:38 UTC 2012
> ************************************************************/
> 2012-04-06 23:23:46,747 INFO org.apache.hadoop.metrics2.impl.MetricsConfig:
> loaded properties from hadoop-metrics2.properties
> 2012-04-06 23:23:46,759 INFO
> org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source
> MetricsSystem,sub=Stats registered.
> 2012-04-06 23:23:46,760 INFO
> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot
> period at 10 second(s).
> 2012-04-06 23:23:46,760 INFO
> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system
> started
> 2012-04-06 23:23:46,913 INFO
> org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source ugi
> registered.
> 2012-04-06 23:23:46,919 WARN
> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Source name ugi already
> exists!
> 2012-04-06 23:23:48,136 INFO org.apache.hadoop.ipc.Client: Retrying connect
> to server: localhost/127.0.0.1:54310. Already tried 0 time(s).
> 2012-04-06 23:23:53,122 ERROR
> org.apache.hadoop.hdfs.server.datanode.DataNode: java.io.IOException:
> Incompatible namespaceIDs in /app/hadoop/tmp/dfs/data: namenode namespaceID
> = 1320262146; datanode namespaceID = 1269725409
>    at
> org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:232)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:147)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:385)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:299)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1582)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1521)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1539)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1665)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1682)
> 
> 2012-04-06 23:23:53,124 INFO
> org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
> /************************************************************
> SHUTDOWN_MSG: Shutting down DataNode at sujit.(null)/127.0.1.1
> ************************************************************/
> hduser@sujit:~/Desktop/hadoop/logs$
> 
> 
> 
> 
> On Apr 6, 2012 at 11:46 PM, Prashant Kommireddi <pr...@gmail.com> wrote:
> 
>> Can you check the datanode logs? Maybe it's an incompatible namespace issue.


STARTUP_MSG:   args = []
STARTUP_MSG:   version = 1.0.1
STARTUP_MSG:   build =
https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r
1243785; compiled by 'hortonfo' on Tue Feb 14 08:15:38 UTC 2012
************************************************************/
2012-04-06 22:30:04,870 INFO org.apache.hadoop.metrics2.impl.MetricsConfig:
loaded properties from hadoop-metrics2.properties
2012-04-06 22:30:04,883 INFO
org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source
MetricsSystem,sub=Stats registered.
2012-04-06 22:30:04,884 INFO
org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot
period at 10 second(s).
2012-04-06 22:30:04,884 INFO
org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system
started
2012-04-06 22:30:05,046 INFO
org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source ugi
registered.
2012-04-06 22:30:05,051 WARN
org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Source name ugi already
exists!
2012-04-06 22:30:06,273 INFO org.apache.hadoop.ipc.Client: Retrying connect
to server: localhost/127.0.0.1:54310. Already tried 0 time(s).
2012-04-06 22:30:11,073 ERROR
org.apache.hadoop.hdfs.server.datanode.DataNode: java.io.IOException:
Incompatible namespaceIDs in /app/hadoop/tmp/dfs/data: namenode namespaceID
= 361514980; datanode namespaceID = 1269725409
    at
org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:232)
    at
org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:147)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:385)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:299)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1582)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1521)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1539)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1665)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1682)

2012-04-06 22:30:11,075 INFO
org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at sujit.(null)/127.0.1.1
************************************************************/
2012-04-06 22:36:46,946 INFO
org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG:   host = sujit.(null)/127.0.1.1
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 1.0.1
STARTUP_MSG:   build =
https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r
1243785; compiled by 'hortonfo' on Tue Feb 14 08:15:38 UTC 2012
************************************************************/
2012-04-06 22:36:47,113 INFO org.apache.hadoop.metrics2.impl.MetricsConfig:
loaded properties from hadoop-metrics2.properties
2012-04-06 22:36:47,125 INFO
org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source
MetricsSystem,sub=Stats registered.
2012-04-06 22:36:47,126 INFO
org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot
period at 10 second(s).
2012-04-06 22:36:47,126 INFO
org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system
started
2012-04-06 22:36:47,284 INFO
org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source ugi
registered.
2012-04-06 22:36:47,290 WARN
org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Source name ugi already
exists!
2012-04-06 22:36:48,501 INFO org.apache.hadoop.ipc.Client: Retrying connect
to server: localhost/127.0.0.1:54310. Already tried 0 time(s).
2012-04-06 22:36:53,401 ERROR
org.apache.hadoop.hdfs.server.datanode.DataNode: java.io.IOException:
Incompatible namespaceIDs in /app/hadoop/tmp/dfs/data: namenode namespaceID
= 361514980; datanode namespaceID = 1269725409
    at
org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:232)
    at
org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:147)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:385)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:299)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1582)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1521)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1539)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1665)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1682)

2012-04-06 22:36:53,403 INFO
org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at sujit.(null)/127.0.1.1
************************************************************/
2012-04-06 22:47:06,904 INFO
org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG:   host = sujit.(null)/127.0.1.1
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 1.0.1
STARTUP_MSG:   build =
https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r
1243785; compiled by 'hortonfo' on Tue Feb 14 08:15:38 UTC 2012
************************************************************/
2012-04-06 22:47:07,060 INFO org.apache.hadoop.metrics2.impl.MetricsConfig:
loaded properties from hadoop-metrics2.properties
2012-04-06 22:47:07,072 INFO
org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source
MetricsSystem,sub=Stats registered.
2012-04-06 22:47:07,073 INFO
org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot
period at 10 second(s).
2012-04-06 22:47:07,073 INFO
org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system
started
2012-04-06 22:47:07,288 INFO
org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source ugi
registered.
2012-04-06 22:47:07,299 WARN
org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Source name ugi already
exists!
2012-04-06 22:47:12,826 ERROR
org.apache.hadoop.hdfs.server.datanode.DataNode: java.io.IOException:
Incompatible namespaceIDs in /app/hadoop/tmp/dfs/data: namenode namespaceID
= 361514980; datanode namespaceID = 1269725409
    at
org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:232)
    at
org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:147)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:385)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:299)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1582)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1521)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1539)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1665)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1682)

2012-04-06 22:47:12,828 INFO
org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at sujit.(null)/127.0.1.1
************************************************************/
2012-04-06 23:11:15,062 INFO
org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG:   host = sujit.(null)/127.0.1.1
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 1.0.1
STARTUP_MSG:   build =
https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r
1243785; compiled by 'hortonfo' on Tue Feb 14 08:15:38 UTC 2012
************************************************************/
2012-04-06 23:11:15,220 INFO org.apache.hadoop.metrics2.impl.MetricsConfig:
loaded properties from hadoop-metrics2.properties
2012-04-06 23:11:15,234 INFO
org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source
MetricsSystem,sub=Stats registered.
2012-04-06 23:11:15,235 INFO
org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot
period at 10 second(s).
2012-04-06 23:11:15,235 INFO
org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system
started
2012-04-06 23:11:15,402 INFO
org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source ugi
registered.
2012-04-06 23:11:15,407 WARN
org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Source name ugi already
exists!
2012-04-06 23:11:15,713 ERROR
org.apache.hadoop.hdfs.server.datanode.DataNode: java.io.IOException:
Incompatible namespaceIDs in /app/hadoop/tmp/dfs/data: namenode namespaceID
= 361514980; datanode namespaceID = 1269725409
    at
org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:232)
    at
org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:147)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:385)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:299)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1582)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1521)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1539)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1665)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1682)

2012-04-06 23:11:15,715 INFO
org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at sujit.(null)/127.0.1.1
************************************************************/
2012-04-06 23:12:07,345 INFO
org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG:   host = sujit.(null)/127.0.1.1
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 1.0.1
STARTUP_MSG:   build =
https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r
1243785; compiled by 'hortonfo' on Tue Feb 14 08:15:38 UTC 2012
************************************************************/
2012-04-06 23:12:07,511 INFO org.apache.hadoop.metrics2.impl.MetricsConfig:
loaded properties from hadoop-metrics2.properties
2012-04-06 23:12:07,523 INFO
org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source
MetricsSystem,sub=Stats registered.
2012-04-06 23:12:07,524 INFO
org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot
period at 10 second(s).
2012-04-06 23:12:07,524 INFO
org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system
started
2012-04-06 23:12:07,706 INFO
org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source ugi
registered.
2012-04-06 23:12:07,712 WARN
org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Source name ugi already
exists!
2012-04-06 23:12:08,900 INFO org.apache.hadoop.ipc.Client: Retrying connect
to server: localhost/127.0.0.1:54310. Already tried 0 time(s).
2012-04-06 23:12:13,765 ERROR
org.apache.hadoop.hdfs.server.datanode.DataNode: java.io.IOException:
Incompatible namespaceIDs in /app/hadoop/tmp/dfs/data: namenode namespaceID
= 361514980; datanode namespaceID = 1269725409
    at
org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:232)
    at
org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:147)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:385)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:299)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1582)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1521)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1539)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1665)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1682)

2012-04-06 23:12:13,767 INFO
org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at sujit.(null)/127.0.1.1
************************************************************/
2012-04-06 23:23:46,591 INFO
org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG:   host = sujit.(null)/127.0.1.1
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 1.0.1
STARTUP_MSG:   build =
https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r
1243785; compiled by 'hortonfo' on Tue Feb 14 08:15:38 UTC 2012
************************************************************/
2012-04-06 23:23:46,747 INFO org.apache.hadoop.metrics2.impl.MetricsConfig:
loaded properties from hadoop-metrics2.properties
2012-04-06 23:23:46,759 INFO
org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source
MetricsSystem,sub=Stats registered.
2012-04-06 23:23:46,760 INFO
org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot
period at 10 second(s).
2012-04-06 23:23:46,760 INFO
org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system
started
2012-04-06 23:23:46,913 INFO
org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source ugi
registered.
2012-04-06 23:23:46,919 WARN
org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Source name ugi already
exists!
2012-04-06 23:23:48,136 INFO org.apache.hadoop.ipc.Client: Retrying connect
to server: localhost/127.0.0.1:54310. Already tried 0 time(s).
2012-04-06 23:23:53,122 ERROR
org.apache.hadoop.hdfs.server.datanode.DataNode: java.io.IOException:
Incompatible namespaceIDs in /app/hadoop/tmp/dfs/data: namenode namespaceID
= 1320262146; datanode namespaceID = 1269725409
    at
org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:232)
    at
org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:147)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:385)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:299)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1582)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1521)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1539)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1665)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1682)

2012-04-06 23:23:53,124 INFO
org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at sujit.(null)/127.0.1.1
************************************************************/
hduser@sujit:~/Desktop/hadoop/logs$
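

[Editor's note] Every ERROR in the log above is the same root cause: the datanode's on-disk namespaceID (1269725409) no longer matches the namenode's, because the namenode was re-formatted. One way to resolve it without discarding block data is to rewrite the namespaceID in the datanode's VERSION file, which in this thread lives under the dfs.data.dir shown in the error, /app/hadoop/tmp/dfs/data/current/VERSION. The sketch below runs against a throwaway sandbox directory rather than the real cluster paths; on the real cluster you would run bin/stop-all.sh first:

```shell
# Sandbox demonstration of the VERSION-file fix. The file contents are a
# minimal stand-in; a real VERSION file has a few more fields.
DATA_DIR=$(mktemp -d)/dfs/data/current
mkdir -p "$DATA_DIR"

# A VERSION file as the datanode left it, still carrying the old namespaceID.
cat > "$DATA_DIR/VERSION" <<'EOF'
namespaceID=1269725409
storageID=DS-demo-storage
layoutVersion=-32
EOF

# Rewrite the datanode's namespaceID to the namenode's current one
# (1320262146 in the last log excerpt above), keeping block data intact.
sed -i 's/^namespaceID=.*/namespaceID=1320262146/' "$DATA_DIR/VERSION"

grep '^namespaceID=' "$DATA_DIR/VERSION"   # namespaceID=1320262146
```

On a disposable single-node setup, stopping the cluster and deleting /app/hadoop/tmp/dfs/data before restarting has the same effect (the datanode re-registers and adopts the namenode's new ID), at the cost of all stored blocks.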




On Apr 6, 2012 at 11:46 PM, Prashant Kommireddi <pr...@gmail.com> wrote:

> Can you check the datanode logs? Maybe it's an incompatible namespace issue.
>
> On Apr 6, 2012, at 11:13 AM, Sujit Dhamale <su...@gmail.com>
> wrote:
>
> > Hi all,
> > my DataNode is not started .
> >
> > even after deleting hadoop*.pid file from /tmp , But still Data node is
> not
> > started ,
> >
> >
> > Hadoop Version: hadoop-1.0.1.tar.gz
> > Java version : java version "1.6.0_26
> > Operating System : Ubuntu 11.10
> >
> >
> > i did below procedure
> >
> >
> > *hduser@sujit:~/Desktop/hadoop/bin$ jps*
> > 11455 Jps
> >
> >
> > *hduser@sujit:~/Desktop/hadoop/bin$ start-all.sh*
> > Warning: $HADOOP_HOME is deprecated.
> >
> > starting namenode, logging to
> >
> /home/hduser/Desktop/hadoop/libexec/../logs/hadoop-hduser-namenode-sujit.out
> > localhost: starting datanode, logging to
> >
> /home/hduser/Desktop/hadoop/libexec/../logs/hadoop-hduser-datanode-sujit.out
> > localhost: starting secondarynamenode, logging to
> >
> /home/hduser/Desktop/hadoop/libexec/../logs/hadoop-hduser-secondarynamenode-sujit.out
> > starting jobtracker, logging to
> >
> /home/hduser/Desktop/hadoop/libexec/../logs/hadoop-hduser-jobtracker-sujit.out
> > localhost: starting tasktracker, logging to
> >
> /home/hduser/Desktop/hadoop/libexec/../logs/hadoop-hduser-tasktracker-sujit.out
> >
> > *hduser@sujit:~/Desktop/hadoop/bin$ jps*
> > 11528 NameNode
> > 12019 SecondaryNameNode
> > 12355 TaskTracker
> > 12115 JobTracker
> > 12437 Jps
> >
> >
> > *hduser@sujit:~/Desktop/hadoop/bin$ stop-all.sh*
> > Warning: $HADOOP_HOME is deprecated.
> >
> > stopping jobtracker
> > localhost: stopping tasktracker
> > stopping namenode
> > localhost: no datanode to stop
> > localhost: stopping secondarynamenode
> >
> >
> > *hduser@sujit:~/Desktop/hadoop/bin$ jps*
> > 13127 Jps
> >
> >
> > *hduser@sujit:~/Desktop/hadoop/bin$ ls /tmp*
> > hadoop-hduser-datanode.pid
> > hsperfdata_hduser                        keyring-meecr7
> > ssh-JXYCAJsX1324
> > hadoop-hduser-jobtracker.pid
> > hsperfdata_sujit                         plugtmp
> > unity_support_test.0
> > hadoop-hduser-namenode.pid
> > Jetty_0_0_0_0_50030_job____yn7qmk        pulse-2L9K88eMlGn7
> > virtual-hduser.Q8j5nJ
> > hadoop-hduser-secondarynamenode.pid
> > Jetty_0_0_0_0_50070_hdfs____w2cu08       pulse-Ob9vyJcXyHZz
> > hadoop-hduser-tasktracker.pid
> > Jetty_0_0_0_0_50090_secondary____y6aanv  pulse-PKdhtXMmr18n
> >
> > *Deleted *.pid file :)
> >
> > hduser@sujit:~$ ls /tmp*
> > hsperfdata_hduser                        pulse-2L9K88eMlGn7
> > hsperfdata_sujit                         pulse-Ob9vyJcXyHZz
> > Jetty_0_0_0_0_50030_job____yn7qmk        pulse-PKdhtXMmr18n
> > Jetty_0_0_0_0_50070_hdfs____w2cu08       ssh-JXYCAJsX1324
> > Jetty_0_0_0_0_50090_secondary____y6aanv  unity_support_test.0
> > keyring-meecr7                           virtual-hduser.Q8j5nJ
> > plugtmp
> >
> >
> >
> >
> >
> > *hduser@sujit:~/Desktop/hadoop$ bin/hadoop namenode -format*
> > Warning: $HADOOP_HOME is deprecated.
> >
> > 12/04/06 23:23:22 INFO namenode.NameNode: STARTUP_MSG:
> > /************************************************************
> > STARTUP_MSG: Starting NameNode
> > STARTUP_MSG:   host = sujit.(null)/127.0.1.1
> > STARTUP_MSG:   args = [-format]
> > STARTUP_MSG:   version = 1.0.1
> > STARTUP_MSG:   build =
> > https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r
> > 1243785; compiled by 'hortonfo' on Tue Feb 14 08:15:38 UTC 2012
> > ************************************************************/
> > Re-format filesystem in /app/hadoop/tmp/dfs/name ? (Y or N) Y
> > 12/04/06 23:23:25 INFO util.GSet: VM type       = 32-bit
> > 12/04/06 23:23:25 INFO util.GSet: 2% max memory = 17.77875 MB
> > 12/04/06 23:23:25 INFO util.GSet: capacity      = 2^22 = 4194304 entries
> > 12/04/06 23:23:25 INFO util.GSet: recommended=4194304, actual=4194304
> > 12/04/06 23:23:25 INFO namenode.FSNamesystem: fsOwner=hduser
> > 12/04/06 23:23:25 INFO namenode.FSNamesystem: supergroup=supergroup
> > 12/04/06 23:23:25 INFO namenode.FSNamesystem: isPermissionEnabled=true
> > 12/04/06 23:23:25 INFO namenode.FSNamesystem:
> dfs.block.invalidate.limit=100
> > 12/04/06 23:23:25 INFO namenode.FSNamesystem: isAccessTokenEnabled=false
> > accessKeyUpdateInterval=0 min(s), accessTokenLifetime=0 min(s)
> > 12/04/06 23:23:25 INFO namenode.NameNode: Caching file names occuring
> more
> > than 10 times
> > 12/04/06 23:23:26 INFO common.Storage: Image file of size 112 saved in 0
> > seconds.
> > 12/04/06 23:23:26 INFO common.Storage: Storage directory
> > /app/hadoop/tmp/dfs/name has been successfully formatted.
> > 12/04/06 23:23:26 INFO namenode.NameNode: SHUTDOWN_MSG:
> > /************************************************************
> > SHUTDOWN_MSG: Shutting down NameNode at sujit.(null)/127.0.1.1
> > ************************************************************/
> > hduser@sujit:~/Desktop/hadoop$ bin/start-all.sh
> > Warning: $HADOOP_HOME is deprecated.
> >
> > starting namenode, logging to
> >
> /home/hduser/Desktop/hadoop/libexec/../logs/hadoop-hduser-namenode-sujit.out
> > localhost: starting datanode, logging to
> >
> /home/hduser/Desktop/hadoop/libexec/../logs/hadoop-hduser-datanode-sujit.out
> > localhost: starting secondarynamenode, logging to
> >
> /home/hduser/Desktop/hadoop/libexec/../logs/hadoop-hduser-secondarynamenode-sujit.out
> > starting jobtracker, logging to
> >
> /home/hduser/Desktop/hadoop/libexec/../logs/hadoop-hduser-jobtracker-sujit.out
> > localhost: starting tasktracker, logging to
> >
> /home/hduser/Desktop/hadoop/libexec/../logs/hadoop-hduser-tasktracker-sujit.out
> >
> >
> > *hduser@sujit:~/Desktop/hadoop$ jps*
> > 14157 JobTracker
> > 14492 Jps
> > 14397 TaskTracker
> > 14063 SecondaryNameNode
> > 13574 NameNode
> > hduser@sujit:~/Desktop/hadoop$
>
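
[Editor's note] The `bin/hadoop namenode -format` step in the transcript above is what keeps re-triggering the failure: each format assigns the namenode a brand-new namespaceID, while the datanode's storage directory keeps the old one, so the two refuse to pair up. The check is a comparison of the two VERSION files; in this thread the real paths would be /app/hadoop/tmp/dfs/name/current/VERSION and /app/hadoop/tmp/dfs/data/current/VERSION, but the sketch below mocks them in a temp directory so it is self-contained:

```shell
# Mock the two VERSION files; the IDs are the ones from the last log excerpt.
ROOT=$(mktemp -d)
mkdir -p "$ROOT/name/current" "$ROOT/data/current"
echo 'namespaceID=1320262146' > "$ROOT/name/current/VERSION"   # fresh, post-format
echo 'namespaceID=1269725409' > "$ROOT/data/current/VERSION"   # stale, pre-format

# Extract and compare the IDs; a mismatch is exactly the condition that
# produces the "Incompatible namespaceIDs" error seen in the datanode log.
nn_id=$(cut -d= -f2 "$ROOT/name/current/VERSION")
dn_id=$(cut -d= -f2 "$ROOT/data/current/VERSION")
if [ "$nn_id" != "$dn_id" ]; then
  echo "namespaceID mismatch: namenode=$nn_id datanode=$dn_id"
fi
```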

Re: Data Node is not Started

Posted by Prashant Kommireddi <pr...@gmail.com>.
Can you check the datanode logs? Maybe it's an incompatible namespace issue.

On Apr 6, 2012, at 11:13 AM, Sujit Dhamale <su...@gmail.com> wrote:

> Hi all,
> my DataNode is not started .
>
> even after deleting hadoop*.pid file from /tmp , But still Data node is not
> started ,
>
>
> Hadoop Version: hadoop-1.0.1.tar.gz
> Java version : java version "1.6.0_26
> Operating System : Ubuntu 11.10
>
>
> i did below procedure
>
>
> *hduser@sujit:~/Desktop/hadoop/bin$ jps*
> 11455 Jps
>
>
> *hduser@sujit:~/Desktop/hadoop/bin$ start-all.sh*
> Warning: $HADOOP_HOME is deprecated.
>
> starting namenode, logging to
> /home/hduser/Desktop/hadoop/libexec/../logs/hadoop-hduser-namenode-sujit.out
> localhost: starting datanode, logging to
> /home/hduser/Desktop/hadoop/libexec/../logs/hadoop-hduser-datanode-sujit.out
> localhost: starting secondarynamenode, logging to
> /home/hduser/Desktop/hadoop/libexec/../logs/hadoop-hduser-secondarynamenode-sujit.out
> starting jobtracker, logging to
> /home/hduser/Desktop/hadoop/libexec/../logs/hadoop-hduser-jobtracker-sujit.out
> localhost: starting tasktracker, logging to
> /home/hduser/Desktop/hadoop/libexec/../logs/hadoop-hduser-tasktracker-sujit.out
>
> *hduser@sujit:~/Desktop/hadoop/bin$ jps*
> 11528 NameNode
> 12019 SecondaryNameNode
> 12355 TaskTracker
> 12115 JobTracker
> 12437 Jps
>
>
> *hduser@sujit:~/Desktop/hadoop/bin$ stop-all.sh*
> Warning: $HADOOP_HOME is deprecated.
>
> stopping jobtracker
> localhost: stopping tasktracker
> stopping namenode
> localhost: no datanode to stop
> localhost: stopping secondarynamenode
>
>
> *hduser@sujit:~/Desktop/hadoop/bin$ jps*
> 13127 Jps
>
>
> *hduser@sujit:~/Desktop/hadoop/bin$ ls /tmp*
> hadoop-hduser-datanode.pid
> hsperfdata_hduser                        keyring-meecr7
> ssh-JXYCAJsX1324
> hadoop-hduser-jobtracker.pid
> hsperfdata_sujit                         plugtmp
> unity_support_test.0
> hadoop-hduser-namenode.pid
> Jetty_0_0_0_0_50030_job____yn7qmk        pulse-2L9K88eMlGn7
> virtual-hduser.Q8j5nJ
> hadoop-hduser-secondarynamenode.pid
> Jetty_0_0_0_0_50070_hdfs____w2cu08       pulse-Ob9vyJcXyHZz
> hadoop-hduser-tasktracker.pid
> Jetty_0_0_0_0_50090_secondary____y6aanv  pulse-PKdhtXMmr18n
>
> *Deleted *.pid file :)
>
> hduser@sujit:~$ ls /tmp*
> hsperfdata_hduser                        pulse-2L9K88eMlGn7
> hsperfdata_sujit                         pulse-Ob9vyJcXyHZz
> Jetty_0_0_0_0_50030_job____yn7qmk        pulse-PKdhtXMmr18n
> Jetty_0_0_0_0_50070_hdfs____w2cu08       ssh-JXYCAJsX1324
> Jetty_0_0_0_0_50090_secondary____y6aanv  unity_support_test.0
> keyring-meecr7                           virtual-hduser.Q8j5nJ
> plugtmp
>
>
>
>
>
> *hduser@sujit:~/Desktop/hadoop$ bin/hadoop namenode -format*
> Warning: $HADOOP_HOME is deprecated.
>
> 12/04/06 23:23:22 INFO namenode.NameNode: STARTUP_MSG:
> /************************************************************
> STARTUP_MSG: Starting NameNode
> STARTUP_MSG:   host = sujit.(null)/127.0.1.1
> STARTUP_MSG:   args = [-format]
> STARTUP_MSG:   version = 1.0.1
> STARTUP_MSG:   build =
> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r
> 1243785; compiled by 'hortonfo' on Tue Feb 14 08:15:38 UTC 2012
> ************************************************************/
> Re-format filesystem in /app/hadoop/tmp/dfs/name ? (Y or N) Y
> 12/04/06 23:23:25 INFO util.GSet: VM type       = 32-bit
> 12/04/06 23:23:25 INFO util.GSet: 2% max memory = 17.77875 MB
> 12/04/06 23:23:25 INFO util.GSet: capacity      = 2^22 = 4194304 entries
> 12/04/06 23:23:25 INFO util.GSet: recommended=4194304, actual=4194304
> 12/04/06 23:23:25 INFO namenode.FSNamesystem: fsOwner=hduser
> 12/04/06 23:23:25 INFO namenode.FSNamesystem: supergroup=supergroup
> 12/04/06 23:23:25 INFO namenode.FSNamesystem: isPermissionEnabled=true
> 12/04/06 23:23:25 INFO namenode.FSNamesystem: dfs.block.invalidate.limit=100
> 12/04/06 23:23:25 INFO namenode.FSNamesystem: isAccessTokenEnabled=false
> accessKeyUpdateInterval=0 min(s), accessTokenLifetime=0 min(s)
> 12/04/06 23:23:25 INFO namenode.NameNode: Caching file names occuring more
> than 10 times
> 12/04/06 23:23:26 INFO common.Storage: Image file of size 112 saved in 0
> seconds.
> 12/04/06 23:23:26 INFO common.Storage: Storage directory
> /app/hadoop/tmp/dfs/name has been successfully formatted.
> 12/04/06 23:23:26 INFO namenode.NameNode: SHUTDOWN_MSG:
> /************************************************************
> SHUTDOWN_MSG: Shutting down NameNode at sujit.(null)/127.0.1.1
> ************************************************************/
> hduser@sujit:~/Desktop/hadoop$ bin/start-all.sh
> Warning: $HADOOP_HOME is deprecated.
>
> starting namenode, logging to
> /home/hduser/Desktop/hadoop/libexec/../logs/hadoop-hduser-namenode-sujit.out
> localhost: starting datanode, logging to
> /home/hduser/Desktop/hadoop/libexec/../logs/hadoop-hduser-datanode-sujit.out
> localhost: starting secondarynamenode, logging to
> /home/hduser/Desktop/hadoop/libexec/../logs/hadoop-hduser-secondarynamenode-sujit.out
> starting jobtracker, logging to
> /home/hduser/Desktop/hadoop/libexec/../logs/hadoop-hduser-jobtracker-sujit.out
> localhost: starting tasktracker, logging to
> /home/hduser/Desktop/hadoop/libexec/../logs/hadoop-hduser-tasktracker-sujit.out
>
>
> *hduser@sujit:~/Desktop/hadoop$ jps*
> 14157 JobTracker
> 14492 Jps
> 14397 TaskTracker
> 14063 SecondaryNameNode
> 13574 NameNode
> hduser@sujit:~/Desktop/hadoop$
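
[Editor's note] Checking the datanode log as suggested comes down to filtering it for ERROR entries; in this thread the real file is logs/hadoop-hduser-datanode-sujit.log under the Hadoop install directory. A self-contained illustration against a two-line mock log:

```shell
# Write a minimal mock of a datanode log, then filter it for ERROR entries.
LOG=$(mktemp)
cat > "$LOG" <<'EOF'
2012-04-06 22:11:34,566 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
2012-04-06 22:11:41,923 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: java.io.IOException: Incompatible namespaceIDs
EOF

# The spaces around ERROR keep the match on the log level field.
grep ' ERROR ' "$LOG"
```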