Posted to common-dev@hadoop.apache.org by "stack@archive.org (JIRA)" <ji...@apache.org> on 2006/02/23 19:43:37 UTC
[jira] Created: (HADOOP-58) Hadoop requires configuration of hadoop-site.xml or won't run
Hadoop requires configuration of hadoop-site.xml or won't run
-------------------------------------------------------------
Key: HADOOP-58
URL: http://issues.apache.org/jira/browse/HADOOP-58
Project: Hadoop
Type: Bug
Reporter: stack@archive.org
Priority: Minor
On a new install, I would expect '${HADOOP_HOME}/bin/start-all.sh' to bring up a basic instance, one that uses the local filesystem (or, failing that, a DFS homed in localhost:/tmp) and that has all four daemons running on localhost. Currently this is not the case. Hadoop complains 'java.lang.RuntimeException: Not a host:port pair: local'; it doesn't like the 'local' default value of the mapred.job.tracker and fs.default.name properties.
Revision: 379930
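For context, the defaults the report objects to would look something like the following in hadoop-default.xml (a sketch; the exact formatting and surrounding properties in the shipped file may differ):

```xml
<!-- Sketch of the 'local' defaults that trigger
     "Not a host:port pair: local" when the daemons start -->
<property>
  <name>fs.default.name</name>
  <value>local</value>
</property>
<property>
  <name>mapred.job.tracker</name>
  <value>local</value>
</property>
```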
--
This message is automatically generated by JIRA.
-
If you think it was sent incorrectly contact one of the administrators:
http://issues.apache.org/jira/secure/Administrators.jspa
-
For more information on JIRA, see:
http://www.atlassian.com/software/jira
[jira] Updated: (HADOOP-58) Hadoop requires configuration of hadoop-site.xml or won't run
Posted by "stack@archive.org (JIRA)" <ji...@apache.org>.
[ http://issues.apache.org/jira/browse/HADOOP-58?page=all ]
stack@archive.org updated HADOOP-58:
------------------------------------
Attachment: local2localhostPort.patch
Suggested patch that makes localhost and DFS the default. The local filesystem might be a better default, but it looks like more work is needed to make 'local' work again.
[jira] Commented: (HADOOP-58) Hadoop requires configuration of hadoop-site.xml or won't run
Posted by "Doug Cutting (JIRA)" <ji...@apache.org>.
[ http://issues.apache.org/jira/browse/HADOOP-58?page=comments#action_12367568 ]
Doug Cutting commented on HADOOP-58:
------------------------------------
I don't think we want things to use DFS and TaskTracker/JobTracker by default, since this slows things down and uses more resources than needed when running on a single node.
The javadoc provides a recommended configuration for "pseudo-distributed" use:
http://lucene.apache.org/hadoop/docs/api/overview-summary.html
With that in place, bin/start-all.sh works fine, no?
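The pseudo-distributed setup Doug points to amounts to overriding a few properties in conf/hadoop-site.xml so both addresses become host:port pairs on localhost. A sketch of such an override (the port numbers here are illustrative, not mandated; see the linked overview for the recommended values):

```xml
<?xml version="1.0"?>
<!-- Hypothetical hadoop-site.xml for pseudo-distributed operation:
     all daemons on localhost, host:port values instead of 'local' -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>localhost:9000</value>
  </property>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
  <property>
    <!-- a single node can hold only one replica of each block -->
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
```

With host:port values in place, bin/start-all.sh no longer hits the 'Not a host:port pair' failure.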
[jira] Commented: (HADOOP-58) Hadoop requires configuration of hadoop-site.xml or won't run
Posted by "stack@archive.org (JIRA)" <ji...@apache.org>.
[ http://issues.apache.org/jira/browse/HADOOP-58?page=comments#action_12367572 ]
stack@archive.org commented on HADOOP-58:
-----------------------------------------
Feels like the 'pseudo-distributed' additions to hadoop-site.xml should be the default configuration in hadoop-default.xml, but then I suppose that would preclude standalone operation.
Thanks for pointer to the doc. It clarifies how things are meant to work.
Please close this issue.
[jira] Closed: (HADOOP-58) Hadoop requires configuration of hadoop-site.xml or won't run
Posted by "Doug Cutting (JIRA)" <ji...@apache.org>.
[ http://issues.apache.org/jira/browse/HADOOP-58?page=all ]
Doug Cutting closed HADOOP-58:
------------------------------
Resolution: Won't Fix
I'm closing this. The default configuration supports standalone, in-process operation, and there's a well-documented way to achieve standalone multi-process operation (aka "pseudo-distributed").