Posted to user@nutch.apache.org by cybercouf <cy...@free.fr> on 2007/05/02 14:40:00 UTC
nutch and hadoop: can't launch properly the name node
I'm trying to set up Hadoop using these guides:
http://wiki.apache.org/nutch/NutchHadoopTutorial and
http://www.nabble.com/Nutch-Step-by-Step-Maybe-someone-will-find-this-useful---tf3526281.html
But I'm stuck at an early step: getting a single machine running.
I'm using Nutch 0.8.1 and therefore the Hadoop it ships with, "hadoop-0.4.0-patched.jar",
on Sun JVM 1.5.0_11.
When I start the namenode (using ./bin/start-all.sh) I see this in the
namenode log:
2007-05-02 12:39:51,335 INFO util.Credential - Checking Resource aliases
2007-05-02 12:39:51,349 INFO http.HttpServer - Version Jetty/5.1.4
2007-05-02 12:39:51,350 WARN servlet.WebApplicationContext - Web
application not found
/home/nutch/search/file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/webapps/dfs
2007-05-02 12:39:51,351 WARN servlet.WebApplicationContext - Configuration
error on
/home/nutch/search/file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/webapps/dfs
java.io.FileNotFoundException:
/home/nutch/search/file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/webapps/dfs
at
org.mortbay.jetty.servlet.WebApplicationContext.resolveWebApp(WebApplicationContext.java:266)
at
org.mortbay.jetty.servlet.WebApplicationContext.doStart(WebApplicationContext.java:449)
at org.mortbay.util.Container.start(Container.java:72)
at org.mortbay.http.HttpServer.doStart(HttpServer.java:753)
at org.mortbay.util.Container.start(Container.java:72)
at
org.apache.hadoop.mapred.StatusHttpServer.start(StatusHttpServer.java:138)
at org.apache.hadoop.dfs.FSNamesystem.<init>(FSNamesystem.java:173)
at org.apache.hadoop.dfs.NameNode.<init>(NameNode.java:91)
at org.apache.hadoop.dfs.NameNode.<init>(NameNode.java:82)
at org.apache.hadoop.dfs.NameNode.main(NameNode.java:491)
2007-05-02 12:39:51,353 INFO util.Container - Started
HttpContext[/logs,/logs]
2007-05-02 12:39:51,353 INFO util.Container - Started
HttpContext[/static,/static]
2007-05-02 12:39:51,357 INFO http.SocketListener - Started SocketListener
on 0.0.0.0:50070
and afterwards I can't access it:
$ ./bin/hadoop dfs -ls
ls: Connection refused
hadoop.log:
2007-05-02 12:41:40,030 WARN fs.DFSClient - Problem renewing lease for
DFSClient_2015604182: java.net.ConnectException: Connection refused
at java.net.PlainSocketImpl.socketConnect(Native Method)
[...]
1. I can't understand why there is this FileNotFoundException; I didn't
change anything in the Hadoop jar shipped with Nutch.
2. It looks like the namenode is running (when I stop it I get the message
"stopping namenode"), so why can't I access it? (Is this IP from the log
correct? 0.0.0.0:50070)
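(One thing I suppose I could also check, assuming the Hadoop daemons are the only java
processes on this box, is which ports are actually being listened on, something like:
$ netstat -tlnp | grep java
The web UI port 50070 from the log should show up there, and if the namenode RPC
service really came up, the port from fs.default.name (9000 here) should be listed too.)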
Everything is on the same machine, and my conf file looks OK:
fs.default.name myhostname:9000
mapred.job.tracker myhostname:9001
mapred.map.tasks 2
mapred.reduce.tasks 2
dfs.name.dir /home/nutch/filesystem/name
dfs.data.dir /home/nutch/filesystem/data
mapred.system.dir /home/nutch/filesystem/mapreduce/system
mapred.local.dir /home/nutch/filesystem/mapreduce/local
dfs.replication 1
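For reference, in conf/hadoop-site.xml those entries would presumably look roughly like
this (a sketch only, assuming the usual hadoop-site.xml property syntax):
$ cat conf/hadoop-site.xml
<?xml version="1.0"?>
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>myhostname:9000</value>
  </property>
  <property>
    <name>mapred.job.tracker</name>
    <value>myhostname:9001</value>
  </property>
  <!-- ...and the same pattern for the other properties listed above -->
</configuration>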
Re: nutch and hadoop: can't launch properly the name node
Posted by cybercouf <cy...@free.fr>.
The exclamation point means it's looking inside the jar file.
Finally, it's working.
I extracted all the files under /webapps/* from the Hadoop jar file to the root of
my project.
So because Hadoop couldn't find these files (even though it was looking at the right
path, which is really bizarre), it threw an IOException that kept the namenode from
starting properly. And as netstat showed, port 9000 was not open.
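For anyone hitting the same thing, the extraction itself was nothing fancy; roughly the
following, assuming the deployment root is /home/nutch/search and that unzip is
installed (jar xf would do the same job):
$ cd /home/nutch/search
$ unzip -o lib/hadoop-0.4.0-patched.jar 'webapps/*'
$ ls webapps
dfs  task  ...
$ bin/stop-all.sh
$ bin/start-all.sh
After that the Jetty status servers could find webapps/dfs and webapps/task, and the
namenode's port came up.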
Dennis Kubes wrote:
>
> Is your hadoop jar in the lib directory named
> "hadoop-0.4.0-patched.jar!" with the exclamation point? If it is, that
> may be causing the error. Also let me know if you can ping the namenode
> from any of the data nodes.
>
> Dennis Kubes
>
> cybercouf wrote:
>> I tried both "localhost" and "myhostname.domaine" in the slaves file, but
>> still have the same problem.
>> I used the DEBUG log level to look for a clue.
>>
>>
>> hadoop-nutch-datanode.log
>> 2007-05-02 17:27:57,575 DEBUG conf.Configuration - java.io.IOException:
>> config()
>> at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:67)
>> at org.apache.hadoop.dfs.DataNode.main(DataNode.java:951)
>> 2007-05-02 17:27:57,709 INFO conf.Configuration - parsing
>> jar:file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/hadoop-default.xml
>> 2007-05-02 17:27:57,773 INFO conf.Configuration - parsing
>> file:/home/nutch/search/conf/hadoop-site.xml
>> 2007-05-02 17:27:57,814 INFO dfs.DataNode - Opened server at 50010
>> 2007-05-02 17:27:57,838 INFO dfs.DataNode - Namenode not available yet,
>> Zzzzz...
>> 2007-05-02 17:28:07,848 INFO dfs.DataNode - Namenode not available yet,
>> Zzzzz...
>>
>> hadoop-nutch-namenode.log
>> 2007-05-02 17:27:56,422 DEBUG conf.Configuration - java.io.IOException:
>> config()
>> at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:67)
>> at org.apache.hadoop.dfs.NameNode.main(NameNode.java:475)
>> 2007-05-02 17:27:56,553 INFO conf.Configuration - parsing
>> jar:file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/hadoop-default.xml
>> 2007-05-02 17:27:56,618 INFO conf.Configuration - parsing
>> file:/home/nutch/search/conf/hadoop-site.xml
>> 2007-05-02 17:27:56,716 INFO util.Credential - Checking Resource aliases
>> 2007-05-02 17:27:56,729 INFO http.HttpServer - Version Jetty/5.1.4
>> 2007-05-02 17:27:56,730 WARN servlet.WebApplicationContext - Web
>> application not found
>> /home/nutch/search/file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/webapps/dfs
>> 2007-05-02 17:27:56,731 WARN servlet.WebApplicationContext -
>> Configuration
>> error on
>> /home/nutch/search/file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/webapps/dfs
>> java.io.FileNotFoundException:
>> /home/nutch/search/file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/webapps/dfs
>> at
>> org.mortbay.jetty.servlet.WebApplicationContext.resolveWebApp(WebApplicationContext.java:266)
>> [...]
>> at org.apache.hadoop.dfs.NameNode.main(NameNode.java:491)
>> 2007-05-02 17:27:56,732 INFO util.Container - Started
>> HttpContext[/logs,/logs]
>> 2007-05-02 17:27:56,732 INFO util.Container - Started
>> HttpContext[/static,/static]
>> 2007-05-02 17:27:56,736 INFO http.SocketListener - Started
>> SocketListener
>> on 0.0.0.0:50070
>>
>> hadoop-nutch-tasktracker.log
>> 2007-05-02 17:27:59,783 DEBUG conf.Configuration - java.io.IOException:
>> config()
>> at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:67)
>> at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:66)
>> at org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:1129)
>> 2007-05-02 17:27:59,934 INFO conf.Configuration - parsing
>> jar:file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/hadoop-default.xml
>> 2007-05-02 17:28:00,038 INFO conf.Configuration - parsing
>> jar:file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/mapred-default.xml
>> 2007-05-02 17:28:00,040 INFO conf.Configuration - parsing
>> file:/home/nutch/search/conf/hadoop-site.xml
>> 2007-05-02 17:28:00,156 INFO util.Credential - Checking Resource aliases
>> 2007-05-02 17:28:00,169 INFO http.HttpServer - Version Jetty/5.1.4
>> 2007-05-02 17:28:00,170 WARN servlet.WebApplicationContext - Web
>> application not found
>> /home/nutch/search/file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/webapps/task
>> 2007-05-02 17:28:00,171 WARN servlet.WebApplicationContext -
>> Configuration
>> error on
>> /home/nutch/search/file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/webapps/task
>> java.io.FileNotFoundException:
>> /home/nutch/search/file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/webapps/task
>> at
>> org.mortbay.jetty.servlet.WebApplicationContext.resolveWebApp(WebApplicationContext.java:266)
>> [...]
>> at org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:1130)
>> 2007-05-02 17:28:00,172 INFO util.Container - Started
>> HttpContext[/logs,/logs]
>> 2007-05-02 17:28:00,173 INFO util.Container - Started
>> HttpContext[/static,/static]
>> 2007-05-02 17:28:00,176 INFO http.SocketListener - Started
>> SocketListener
>> on 0.0.0.0:50060
>> 2007-05-02 17:28:00,176 WARN mapred.TaskTracker - Can not start task
>> tracker because Problem starting http server
>>
>> hadoop-nutch-jobtracker.log
>> 2007-05-02 17:27:58,631 DEBUG conf.Configuration - java.io.IOException:
>> config()
>> at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:67)
>> at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:1132)
>> 2007-05-02 17:27:58,785 INFO conf.Configuration - parsing
>> jar:file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/hadoop-default.xml
>> 2007-05-02 17:27:58,862 INFO conf.Configuration - parsing
>> file:/home/nutch/search/conf/hadoop-site.xml
>> 2007-05-02 17:27:58,893 DEBUG conf.Configuration - java.io.IOException:
>> config(config)
>> at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:76)
>> at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:86)
>> at org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:427)
>> at org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:65)
>> at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:1133)
>> 2007-05-02 17:27:58,896 INFO conf.Configuration - parsing
>> jar:file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/hadoop-default.xml
>> 2007-05-02 17:27:58,907 INFO conf.Configuration - parsing
>> jar:file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/mapred-default.xml
>> 2007-05-02 17:27:58,922 INFO conf.Configuration - parsing
>> file:/home/nutch/search/conf/hadoop-site.xml
>> 2007-05-02 17:27:58,989 WARN mapred.JobTracker - Starting tracker
>> java.net.ConnectException: Connection refused
>> at java.net.PlainSocketImpl.socketConnect(Native Method)
>> [...]
>> at org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:65)
>> at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:1133)
>> 2007-05-02 17:27:58,990 WARN fs.DFSClient - Problem renewing lease for
>> DFSClient_515895079: java.net.ConnectException: Connection refused
>> at java.net.PlainSocketImpl.socketConnect(Native Method)
>> [...]
>> at org.apache.hadoop.dfs.$Proxy0.renewLease(Unknown Source)
>> at org.apache.hadoop.dfs.DFSClient$LeaseChecker.run(DFSClient.java:437)
>> [...]
>>
>>
>>
>>
>> What errors are you seeing in your hadoop-namenode and datanode logs?
>>
>> Dennis Kubes
>>
>
>
Re: nutch and hadoop: can't launch properly the name node
Posted by Dennis Kubes <nu...@dragonflymc.com>.
Is your hadoop jar in the lib directory named
"hadoop-0.4.0-patched.jar!" with the exclamation point? If it is, that
may be causing the error. Also let me know if you can ping the namenode
from any of the data nodes.
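(Or, since ICMP is sometimes restricted for non-root users, a plain TCP check of the
namenode ports from a datanode would tell us the same thing; something along these
lines, assuming fs.default.name points at myhostname:9000 and telnet is installed:
$ telnet myhostname 9000
$ telnet myhostname 50070
"Connected to ..." means something is listening there; "Connection refused" means
nothing is.)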
Dennis Kubes
cybercouf wrote:
> I tried both "localhost" and "myhostname.domaine" in the slaves file, but
> still have the same problem.
> I used the DEBUG log level to look for a clue.
>
>
> hadoop-nutch-datanode.log
> 2007-05-02 17:27:57,575 DEBUG conf.Configuration - java.io.IOException:
> config()
> at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:67)
> at org.apache.hadoop.dfs.DataNode.main(DataNode.java:951)
> 2007-05-02 17:27:57,709 INFO conf.Configuration - parsing
> jar:file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/hadoop-default.xml
> 2007-05-02 17:27:57,773 INFO conf.Configuration - parsing
> file:/home/nutch/search/conf/hadoop-site.xml
> 2007-05-02 17:27:57,814 INFO dfs.DataNode - Opened server at 50010
> 2007-05-02 17:27:57,838 INFO dfs.DataNode - Namenode not available yet,
> Zzzzz...
> 2007-05-02 17:28:07,848 INFO dfs.DataNode - Namenode not available yet,
> Zzzzz...
>
> hadoop-nutch-namenode.log
> 2007-05-02 17:27:56,422 DEBUG conf.Configuration - java.io.IOException:
> config()
> at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:67)
> at org.apache.hadoop.dfs.NameNode.main(NameNode.java:475)
> 2007-05-02 17:27:56,553 INFO conf.Configuration - parsing
> jar:file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/hadoop-default.xml
> 2007-05-02 17:27:56,618 INFO conf.Configuration - parsing
> file:/home/nutch/search/conf/hadoop-site.xml
> 2007-05-02 17:27:56,716 INFO util.Credential - Checking Resource aliases
> 2007-05-02 17:27:56,729 INFO http.HttpServer - Version Jetty/5.1.4
> 2007-05-02 17:27:56,730 WARN servlet.WebApplicationContext - Web
> application not found
> /home/nutch/search/file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/webapps/dfs
> 2007-05-02 17:27:56,731 WARN servlet.WebApplicationContext - Configuration
> error on
> /home/nutch/search/file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/webapps/dfs
> java.io.FileNotFoundException:
> /home/nutch/search/file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/webapps/dfs
> at
> org.mortbay.jetty.servlet.WebApplicationContext.resolveWebApp(WebApplicationContext.java:266)
> [...]
> at org.apache.hadoop.dfs.NameNode.main(NameNode.java:491)
> 2007-05-02 17:27:56,732 INFO util.Container - Started
> HttpContext[/logs,/logs]
> 2007-05-02 17:27:56,732 INFO util.Container - Started
> HttpContext[/static,/static]
> 2007-05-02 17:27:56,736 INFO http.SocketListener - Started SocketListener
> on 0.0.0.0:50070
>
> hadoop-nutch-tasktracker.log
> 2007-05-02 17:27:59,783 DEBUG conf.Configuration - java.io.IOException:
> config()
> at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:67)
> at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:66)
> at org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:1129)
> 2007-05-02 17:27:59,934 INFO conf.Configuration - parsing
> jar:file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/hadoop-default.xml
> 2007-05-02 17:28:00,038 INFO conf.Configuration - parsing
> jar:file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/mapred-default.xml
> 2007-05-02 17:28:00,040 INFO conf.Configuration - parsing
> file:/home/nutch/search/conf/hadoop-site.xml
> 2007-05-02 17:28:00,156 INFO util.Credential - Checking Resource aliases
> 2007-05-02 17:28:00,169 INFO http.HttpServer - Version Jetty/5.1.4
> 2007-05-02 17:28:00,170 WARN servlet.WebApplicationContext - Web
> application not found
> /home/nutch/search/file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/webapps/task
> 2007-05-02 17:28:00,171 WARN servlet.WebApplicationContext - Configuration
> error on
> /home/nutch/search/file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/webapps/task
> java.io.FileNotFoundException:
> /home/nutch/search/file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/webapps/task
> at
> org.mortbay.jetty.servlet.WebApplicationContext.resolveWebApp(WebApplicationContext.java:266)
> [...]
> at org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:1130)
> 2007-05-02 17:28:00,172 INFO util.Container - Started
> HttpContext[/logs,/logs]
> 2007-05-02 17:28:00,173 INFO util.Container - Started
> HttpContext[/static,/static]
> 2007-05-02 17:28:00,176 INFO http.SocketListener - Started SocketListener
> on 0.0.0.0:50060
> 2007-05-02 17:28:00,176 WARN mapred.TaskTracker - Can not start task
> tracker because Problem starting http server
>
> hadoop-nutch-jobtracker.log
> 2007-05-02 17:27:58,631 DEBUG conf.Configuration - java.io.IOException:
> config()
> at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:67)
> at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:1132)
> 2007-05-02 17:27:58,785 INFO conf.Configuration - parsing
> jar:file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/hadoop-default.xml
> 2007-05-02 17:27:58,862 INFO conf.Configuration - parsing
> file:/home/nutch/search/conf/hadoop-site.xml
> 2007-05-02 17:27:58,893 DEBUG conf.Configuration - java.io.IOException:
> config(config)
> at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:76)
> at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:86)
> at org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:427)
> at org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:65)
> at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:1133)
> 2007-05-02 17:27:58,896 INFO conf.Configuration - parsing
> jar:file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/hadoop-default.xml
> 2007-05-02 17:27:58,907 INFO conf.Configuration - parsing
> jar:file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/mapred-default.xml
> 2007-05-02 17:27:58,922 INFO conf.Configuration - parsing
> file:/home/nutch/search/conf/hadoop-site.xml
> 2007-05-02 17:27:58,989 WARN mapred.JobTracker - Starting tracker
> java.net.ConnectException: Connection refused
> at java.net.PlainSocketImpl.socketConnect(Native Method)
> [...]
> at org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:65)
> at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:1133)
> 2007-05-02 17:27:58,990 WARN fs.DFSClient - Problem renewing lease for
> DFSClient_515895079: java.net.ConnectException: Connection refused
> at java.net.PlainSocketImpl.socketConnect(Native Method)
> [...]
> at org.apache.hadoop.dfs.$Proxy0.renewLease(Unknown Source)
> at org.apache.hadoop.dfs.DFSClient$LeaseChecker.run(DFSClient.java:437)
> [...]
>
>
>
>
> What errors are you seeing in your hadoop-namenode and datanode logs?
>
> Dennis Kubes
>
Re: nutch and hadoop: can't launch properly the name node
Posted by cybercouf <cy...@free.fr>.
I tried both "localhost" and "myhostname.domaine" in the slaves file, but
still have the same problem.
I used the DEBUG log level to look for a clue.
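(For the record, turning on DEBUG was just a log4j change: in conf/log4j.properties I
set the root logger to DEBUG and restarted the daemons, something like
log4j.rootLogger=DEBUG,DRFA
where DRFA is simply the appender name already defined in that file; adjust it to
whatever appender your copy uses.)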
hadoop-nutch-datanode.log
2007-05-02 17:27:57,575 DEBUG conf.Configuration - java.io.IOException:
config()
at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:67)
at org.apache.hadoop.dfs.DataNode.main(DataNode.java:951)
2007-05-02 17:27:57,709 INFO conf.Configuration - parsing
jar:file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/hadoop-default.xml
2007-05-02 17:27:57,773 INFO conf.Configuration - parsing
file:/home/nutch/search/conf/hadoop-site.xml
2007-05-02 17:27:57,814 INFO dfs.DataNode - Opened server at 50010
2007-05-02 17:27:57,838 INFO dfs.DataNode - Namenode not available yet,
Zzzzz...
2007-05-02 17:28:07,848 INFO dfs.DataNode - Namenode not available yet,
Zzzzz...
hadoop-nutch-namenode.log
2007-05-02 17:27:56,422 DEBUG conf.Configuration - java.io.IOException:
config()
at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:67)
at org.apache.hadoop.dfs.NameNode.main(NameNode.java:475)
2007-05-02 17:27:56,553 INFO conf.Configuration - parsing
jar:file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/hadoop-default.xml
2007-05-02 17:27:56,618 INFO conf.Configuration - parsing
file:/home/nutch/search/conf/hadoop-site.xml
2007-05-02 17:27:56,716 INFO util.Credential - Checking Resource aliases
2007-05-02 17:27:56,729 INFO http.HttpServer - Version Jetty/5.1.4
2007-05-02 17:27:56,730 WARN servlet.WebApplicationContext - Web
application not found
/home/nutch/search/file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/webapps/dfs
2007-05-02 17:27:56,731 WARN servlet.WebApplicationContext - Configuration
error on
/home/nutch/search/file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/webapps/dfs
java.io.FileNotFoundException:
/home/nutch/search/file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/webapps/dfs
at
org.mortbay.jetty.servlet.WebApplicationContext.resolveWebApp(WebApplicationContext.java:266)
[...]
at org.apache.hadoop.dfs.NameNode.main(NameNode.java:491)
2007-05-02 17:27:56,732 INFO util.Container - Started
HttpContext[/logs,/logs]
2007-05-02 17:27:56,732 INFO util.Container - Started
HttpContext[/static,/static]
2007-05-02 17:27:56,736 INFO http.SocketListener - Started SocketListener
on 0.0.0.0:50070
hadoop-nutch-tasktracker.log
2007-05-02 17:27:59,783 DEBUG conf.Configuration - java.io.IOException:
config()
at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:67)
at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:66)
at org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:1129)
2007-05-02 17:27:59,934 INFO conf.Configuration - parsing
jar:file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/hadoop-default.xml
2007-05-02 17:28:00,038 INFO conf.Configuration - parsing
jar:file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/mapred-default.xml
2007-05-02 17:28:00,040 INFO conf.Configuration - parsing
file:/home/nutch/search/conf/hadoop-site.xml
2007-05-02 17:28:00,156 INFO util.Credential - Checking Resource aliases
2007-05-02 17:28:00,169 INFO http.HttpServer - Version Jetty/5.1.4
2007-05-02 17:28:00,170 WARN servlet.WebApplicationContext - Web
application not found
/home/nutch/search/file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/webapps/task
2007-05-02 17:28:00,171 WARN servlet.WebApplicationContext - Configuration
error on
/home/nutch/search/file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/webapps/task
java.io.FileNotFoundException:
/home/nutch/search/file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/webapps/task
at
org.mortbay.jetty.servlet.WebApplicationContext.resolveWebApp(WebApplicationContext.java:266)
[...]
at org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:1130)
2007-05-02 17:28:00,172 INFO util.Container - Started
HttpContext[/logs,/logs]
2007-05-02 17:28:00,173 INFO util.Container - Started
HttpContext[/static,/static]
2007-05-02 17:28:00,176 INFO http.SocketListener - Started SocketListener
on 0.0.0.0:50060
2007-05-02 17:28:00,176 WARN mapred.TaskTracker - Can not start task
tracker because Problem starting http server
hadoop-nutch-jobtracker.log
2007-05-02 17:27:58,631 DEBUG conf.Configuration - java.io.IOException:
config()
at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:67)
at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:1132)
2007-05-02 17:27:58,785 INFO conf.Configuration - parsing
jar:file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/hadoop-default.xml
2007-05-02 17:27:58,862 INFO conf.Configuration - parsing
file:/home/nutch/search/conf/hadoop-site.xml
2007-05-02 17:27:58,893 DEBUG conf.Configuration - java.io.IOException:
config(config)
at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:76)
at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:86)
at org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:427)
at org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:65)
at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:1133)
2007-05-02 17:27:58,896 INFO conf.Configuration - parsing
jar:file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/hadoop-default.xml
2007-05-02 17:27:58,907 INFO conf.Configuration - parsing
jar:file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/mapred-default.xml
2007-05-02 17:27:58,922 INFO conf.Configuration - parsing
file:/home/nutch/search/conf/hadoop-site.xml
2007-05-02 17:27:58,989 WARN mapred.JobTracker - Starting tracker
java.net.ConnectException: Connection refused
at java.net.PlainSocketImpl.socketConnect(Native Method)
[...]
at org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:65)
at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:1133)
2007-05-02 17:27:58,990 WARN fs.DFSClient - Problem renewing lease for
DFSClient_515895079: java.net.ConnectException: Connection refused
at java.net.PlainSocketImpl.socketConnect(Native Method)
[...]
at org.apache.hadoop.dfs.$Proxy0.renewLease(Unknown Source)
at org.apache.hadoop.dfs.DFSClient$LeaseChecker.run(DFSClient.java:437)
[...]
What errors are you seeing in your hadoop-namenode and datanode logs?
Dennis Kubes
Re: nutch and hadoop: can't launch properly the name node
Posted by Dennis Kubes <nu...@dragonflymc.com>.
What errors are you seeing in your hadoop-namenode and datanode logs?
Dennis Kubes
cybercouf wrote:
> Yes it is.
>
> Here are more details:
>
> $ cat /etc/hosts
> 127.0.0.1 localhost
> 84.x.x.x myhostname.mydomain.com myhostname
>
> # ping myhostname
> PING myhostname.mydomain.com (84.x.x.x) 56(84) bytes of data.
> 64 bytes from myhostname.mydomain.com (84.x.x.x): icmp_seq=1 ttl=64
> time=0.017 ms
>
> and when I run start-all.sh, the namenode seems to be running:
> # netstat -tupl
> Active Internet connections (only servers)
> Proto Recv-Q Send-Q Local Address   Foreign Address  State    PID/Program name
> tcp6       0      0 *:50070         *:*              LISTEN   18241/java
> tcp6       0      0 *:ssh           *:*              LISTEN   3350/sshd
> tcp6       0      0 *:50010         *:*              LISTEN   18279/java
>
> I also noticed that my nutch user (the one I launch all the scripts as) is
> not allowed to ping:
> nutch:~/search$ ping myhostname
> ping: icmp open socket: Operation not permitted
>
> but that shouldn't be related to the failure to open a Java socket, should it?
> (java.net.ConnectException: Connection refused)
>
> thanks for your help!
>
>
> Dennis Kubes wrote:
>> Make sure your hosts file on your namenode is setup correctly:
>>
>> 127.0.0.1 localhost.localdomain localhost
>> 10.x.x.x myhostname.mydomain.com myhostname
>>
>> As opposed to:
>>
>> 127.0.0.1 localhost.localdomain localhost
>> myhostname.mydomain.com myhostname
>>
>> The problem may be that the machine is listening on only the local
>> interface. If you do a ping myhostname from the local box you should
>> receive the real IP and not the loopback address.
>>
>> Let me know if this was the problem or if you need more help.
>>
>> Dennis Kubes
>>
>> cybercouf wrote:
>>> I'm trying to setup hadoop using these guides:
>>> http://wiki.apache.org/nutch/NutchHadoopTutorial and
>>>
>> http://www.nabble.com/Nutch-Step-by-Step-Maybe-someone-will-find-this-useful---tf3526281.html
>>> But i'm stuck at the early step: having a single machine running.
>>> using nutch 0.8.1 and so the provided hadoop "hadoop-0.4.0-patched.jar"
>>> JVM sun 1.5.0_11
>>>
>>> When I start the namenode (using ./bin/start-all.sh) I have this in the
>>> namenode-log:
>>>
>>> 2007-05-02 12:39:51,335 INFO util.Credential - Checking Resource
>> aliases
>>> 2007-05-02 12:39:51,349 INFO http.HttpServer - Version Jetty/5.1.4
>>> 2007-05-02 12:39:51,350 WARN servlet.WebApplicationContext - Web
>>> application not found
>>>
>> /home/nutch/search/file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/webapps/dfs
>>> 2007-05-02 12:39:51,351 WARN servlet.WebApplicationContext -
>> Configuration
>>> error on
>>>
>> /home/nutch/search/file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/webapps/dfs
>>> java.io.FileNotFoundException:
>>>
>> /home/nutch/search/file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/webapps/dfs
>>> at
>>>
>> org.mortbay.jetty.servlet.WebApplicationContext.resolveWebApp(WebApplicationContext.java:266)
>>> at
>>>
>> org.mortbay.jetty.servlet.WebApplicationContext.doStart(WebApplicationContext.java:449)
>>> at org.mortbay.util.Container.start(Container.java:72)
>>> at org.mortbay.http.HttpServer.doStart(HttpServer.java:753)
>>> at org.mortbay.util.Container.start(Container.java:72)
>>> at
>>>
>> org.apache.hadoop.mapred.StatusHttpServer.start(StatusHttpServer.java:138)
>>> at org.apache.hadoop.dfs.FSNamesystem.<init>(FSNamesystem.java:173)
>>> at org.apache.hadoop.dfs.NameNode.<init>(NameNode.java:91)
>>> at org.apache.hadoop.dfs.NameNode.<init>(NameNode.java:82)
>>> at org.apache.hadoop.dfs.NameNode.main(NameNode.java:491)
>>> 2007-05-02 12:39:51,353 INFO util.Container - Started
>>> HttpContext[/logs,/logs]
>>> 2007-05-02 12:39:51,353 INFO util.Container - Started
>>> HttpContext[/static,/static]
>>> 2007-05-02 12:39:51,357 INFO http.SocketListener - Started
>> SocketListener
>>> on 0.0.0.0:50070
>>>
>>> and after I can't access it:
>>> $ ./bin/hadoop dfs -ls
>>> ls: Connection refused
>>>
>>> hadoop.log:
>>> 2007-05-02 12:41:40,030 WARN fs.DFSClient - Problem renewing lease for
>>> DFSClient_2015604182: java.net.ConnectException: Connection refused
>>> at java.net.PlainSocketImpl.socketConnect(Native Method)
>>> [...]
>>>
>>>
>>>
>>> 1. I can't understand why there is this FileNotFoundException; I didn't
>>> change anything in the Hadoop jar shipped with Nutch.
>>>
>>> 2. It looks like the namenode is running (when I stop it I get the message
>>> "stopping namenode"), so why can't I access it? (Is this IP from the log
>>> correct? 0.0.0.0:50070)
>>> Everything is on the same machine, and my conf file looks OK:
>>> fs.default.name myhostname:9000
>>> mapred.job.tracker myhostname:9001
>>> mapred.map.tasks 2
>>> mapred.reduce.tasks 2
>>> dfs.name.dir /home/nutch/filesystem/name
>>> dfs.data.dir /home/nutch/filesystem/data
>>> mapred.system.dir /home/nutch/filesystem/mapreduce/system
>>> mapred.local.dir /home/nutch/filesystem/mapreduce/local
>>> dfs.replication 1
>
Re: nutch and hadoop: can't launch properly the name node
Posted by cybercouf <cy...@free.fr>.
Yes it is.
Here are more details:
$ cat /etc/hosts
127.0.0.1 localhost
84.x.x.x myhostname.mydomain.com myhostname
# ping myhostname
PING myhostname.mydomain.com (84.x.x.x) 56(84) bytes of data.
64 bytes from myhostname.mydomain.com (84.x.x.x): icmp_seq=1 ttl=64
time=0.017 ms
and when I run start-all.sh, the namenode seems to be running:
# netstat -tupl
Active Internet connections (only servers)
Proto Recv-Q Send-Q Local Address   Foreign Address  State    PID/Program name
tcp6       0      0 *:50070         *:*              LISTEN   18241/java
tcp6       0      0 *:ssh           *:*              LISTEN   3350/sshd
tcp6       0      0 *:50010         *:*              LISTEN   18279/java
I also noticed that my nutch user (the one I launch all the scripts as) is
not allowed to ping:
nutch:~/search$ ping myhostname
ping: icmp open socket: Operation not permitted
but that shouldn't be related to the failure to open a Java socket, should it?
(java.net.ConnectException: Connection refused)
Thanks for your help!
Dennis Kubes wrote:
>
> Make sure your hosts file on your namenode is setup correctly:
>
> 127.0.0.1 localhost.localdomain localhost
> 10.x.x.x myhostname.mydomain.com myhostname
>
> As opposed to:
>
> 127.0.0.1 localhost.localdomain localhost
> myhostname.mydomain.com myhostname
>
> The problem may be that the machine is listening on only the local
> interface. If you do a ping myhostname from the local box you should
> receive the real IP and not the loopback address.
>
> Let me know if this was the problem or if you need more help.
>
> Dennis Kubes
>
> cybercouf wrote:
>> I'm trying to setup hadoop using these guides:
>> http://wiki.apache.org/nutch/NutchHadoopTutorial and
>> http://www.nabble.com/Nutch-Step-by-Step-Maybe-someone-will-find-this-useful---tf3526281.html
>>
>> But i'm stuck at the early step: having a single machine running.
>> using nutch 0.8.1 and so the provided hadoop "hadoop-0.4.0-patched.jar"
>> JVM sun 1.5.0_11
>>
>> When I start the namenode (using ./bin/start-all.sh) I have this in the
>> namenode-log:
>>
>> 2007-05-02 12:39:51,335 INFO util.Credential - Checking Resource aliases
>> 2007-05-02 12:39:51,349 INFO http.HttpServer - Version Jetty/5.1.4
>> 2007-05-02 12:39:51,350 WARN servlet.WebApplicationContext - Web
>> application not found
>> /home/nutch/search/file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/webapps/dfs
>> 2007-05-02 12:39:51,351 WARN servlet.WebApplicationContext -
>> Configuration
>> error on
>> /home/nutch/search/file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/webapps/dfs
>> java.io.FileNotFoundException:
>> /home/nutch/search/file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/webapps/dfs
>> at
>> org.mortbay.jetty.servlet.WebApplicationContext.resolveWebApp(WebApplicationContext.java:266)
>> at
>> org.mortbay.jetty.servlet.WebApplicationContext.doStart(WebApplicationContext.java:449)
>> at org.mortbay.util.Container.start(Container.java:72)
>> at org.mortbay.http.HttpServer.doStart(HttpServer.java:753)
>> at org.mortbay.util.Container.start(Container.java:72)
>> at
>> org.apache.hadoop.mapred.StatusHttpServer.start(StatusHttpServer.java:138)
>> at org.apache.hadoop.dfs.FSNamesystem.<init>(FSNamesystem.java:173)
>> at org.apache.hadoop.dfs.NameNode.<init>(NameNode.java:91)
>> at org.apache.hadoop.dfs.NameNode.<init>(NameNode.java:82)
>> at org.apache.hadoop.dfs.NameNode.main(NameNode.java:491)
>> 2007-05-02 12:39:51,353 INFO util.Container - Started
>> HttpContext[/logs,/logs]
>> 2007-05-02 12:39:51,353 INFO util.Container - Started
>> HttpContext[/static,/static]
>> 2007-05-02 12:39:51,357 INFO http.SocketListener - Started
>> SocketListener
>> on 0.0.0.0:50070
>>
>> and after I can't access it:
>> $ ./bin/hadoop dfs -ls
>> ls: Connection refused
>>
>> hadoop.log:
>> 2007-05-02 12:41:40,030 WARN fs.DFSClient - Problem renewing lease for
>> DFSClient_2015604182: java.net.ConnectException: Connection refused
>> at java.net.PlainSocketImpl.socketConnect(Native Method)
>> [...]
>>
>>
>>
>> 1. I can't understand why there is this FileNotFoundException; I didn't
>> change anything in the Hadoop jar shipped with Nutch.
>>
>> 2. It looks like the namenode is running (when I stop it I get the message
>> "stopping namenode"), so why can't I access it? (Is this IP from the log
>> correct? 0.0.0.0:50070)
>> Everything is on the same machine, and my conf file looks OK:
>> fs.default.name myhostname:9000
>> mapred.job.tracker myhostname:9001
>> mapred.map.tasks 2
>> mapred.reduce.tasks 2
>> dfs.name.dir /home/nutch/filesystem/name
>> dfs.data.dir /home/nutch/filesystem/data
>> mapred.system.dir /home/nutch/filesystem/mapreduce/system
>> mapred.local.dir /home/nutch/filesystem/mapreduce/local
>> dfs.replication 1
>
>
Re: nutch and hadoop: can't launch properly the name node
Posted by Dennis Kubes <nu...@dragonflymc.com>.
Make sure the hosts file on your namenode is set up correctly:
127.0.0.1 localhost.localdomain localhost
10.x.x.x myhostname.mydomain.com myhostname
As opposed to:
127.0.0.1 localhost.localdomain localhost
myhostname.mydomain.com myhostname
The problem may be that the machine is listening only on the local
interface. If you ping myhostname from the local box, you should
see the real IP and not the loopback address.
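(A quick way to sanity-check the resolution, assuming standard tools are available;
getent or host would both do:
$ getent hosts myhostname
10.x.x.x   myhostname.mydomain.com myhostname
If that comes back as 127.0.0.1 instead, the namenode may end up listening only on the
loopback interface and everything else will get "Connection refused".)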
Let me know if this was the problem or if you need more help.
Dennis Kubes
cybercouf wrote:
> I'm trying to setup hadoop using these guides:
> http://wiki.apache.org/nutch/NutchHadoopTutorial and
> http://www.nabble.com/Nutch-Step-by-Step-Maybe-someone-will-find-this-useful---tf3526281.html
>
> But i'm stuck at the early step: having a single machine running.
> using nutch 0.8.1 and so the provided hadoop "hadoop-0.4.0-patched.jar"
> JVM sun 1.5.0_11
>
> When I start the namenode (using ./bin/start-all.sh) I have this in the
> namenode-log:
>
> 2007-05-02 12:39:51,335 INFO util.Credential - Checking Resource aliases
> 2007-05-02 12:39:51,349 INFO http.HttpServer - Version Jetty/5.1.4
> 2007-05-02 12:39:51,350 WARN servlet.WebApplicationContext - Web
> application not found
> /home/nutch/search/file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/webapps/dfs
> 2007-05-02 12:39:51,351 WARN servlet.WebApplicationContext - Configuration
> error on
> /home/nutch/search/file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/webapps/dfs
> java.io.FileNotFoundException:
> /home/nutch/search/file:/home/nutch/search/lib/hadoop-0.4.0-patched.jar!/webapps/dfs
> at
> org.mortbay.jetty.servlet.WebApplicationContext.resolveWebApp(WebApplicationContext.java:266)
> at
> org.mortbay.jetty.servlet.WebApplicationContext.doStart(WebApplicationContext.java:449)
> at org.mortbay.util.Container.start(Container.java:72)
> at org.mortbay.http.HttpServer.doStart(HttpServer.java:753)
> at org.mortbay.util.Container.start(Container.java:72)
> at
> org.apache.hadoop.mapred.StatusHttpServer.start(StatusHttpServer.java:138)
> at org.apache.hadoop.dfs.FSNamesystem.<init>(FSNamesystem.java:173)
> at org.apache.hadoop.dfs.NameNode.<init>(NameNode.java:91)
> at org.apache.hadoop.dfs.NameNode.<init>(NameNode.java:82)
> at org.apache.hadoop.dfs.NameNode.main(NameNode.java:491)
> 2007-05-02 12:39:51,353 INFO util.Container - Started
> HttpContext[/logs,/logs]
> 2007-05-02 12:39:51,353 INFO util.Container - Started
> HttpContext[/static,/static]
> 2007-05-02 12:39:51,357 INFO http.SocketListener - Started SocketListener
> on 0.0.0.0:50070
>
> and after I can't access it:
> $ ./bin/hadoop dfs -ls
> ls: Connection refused
>
> hadoop.log:
> 2007-05-02 12:41:40,030 WARN fs.DFSClient - Problem renewing lease for
> DFSClient_2015604182: java.net.ConnectException: Connection refused
> at java.net.PlainSocketImpl.socketConnect(Native Method)
> [...]
>
>
>
> 1. I can't understand why there is this FileNotFoundException; I didn't
> change anything in the Hadoop jar shipped with Nutch.
>
> 2. It looks like the namenode is running (when I stop it I get the message
> "stopping namenode"), so why can't I access it? (Is this IP from the log
> correct? 0.0.0.0:50070)
> Everything is on the same machine, and my conf file looks OK:
> fs.default.name myhostname:9000
> mapred.job.tracker myhostname:9001
> mapred.map.tasks 2
> mapred.reduce.tasks 2
> dfs.name.dir /home/nutch/filesystem/name
> dfs.data.dir /home/nutch/filesystem/data
> mapred.system.dir /home/nutch/filesystem/mapreduce/system
> mapred.local.dir /home/nutch/filesystem/mapreduce/local
> dfs.replication 1