Posted to common-issues@hadoop.apache.org by "Allen Wittenauer (JIRA)" <ji...@apache.org> on 2015/01/02 20:04:34 UTC
[jira] [Commented] (HADOOP-11058) Missing HADOOP_CONF_DIR generates strange results
[ https://issues.apache.org/jira/browse/HADOOP-11058?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14263131#comment-14263131 ]
Allen Wittenauer commented on HADOOP-11058:
-------------------------------------------
OK, after playing with this some more, it doesn't cover one particularly interesting case: if log4j.settings is missing, one can't reset the log level either. So doing something like:
{code}
$ hdfs --loglevel DEBUG namenode
{code}
(note the lack of --daemon!) doesn't work as expected.
This also means it won't work for interactive commands either. :(
I think this if statement needs to get pushed into hadoop_basic_init so that we catch all the cases.
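A rough sketch of that idea (the function name hadoop_basic_init comes from the comment above, but the variable names and exact checks here are illustrative assumptions, not the actual patch):

{code}
# Hypothetical sketch: validate HADOOP_CONF_DIR and apply the requested
# log level inside hadoop_basic_init, so daemon, client, and interactive
# commands all go through the same check.
hadoop_basic_init() {
  # Fail fast if the configuration directory is missing entirely.
  if [[ ! -d "${HADOOP_CONF_DIR}" ]]; then
    echo "ERROR: Cannot find HADOOP_CONF_DIR (${HADOOP_CONF_DIR})." 1>&2
    exit 1
  fi

  # Without a readable log4j configuration, --loglevel is silently
  # ignored, so at least warn rather than proceed quietly.
  if [[ ! -f "${HADOOP_CONF_DIR}/log4j.properties" ]]; then
    echo "WARNING: ${HADOOP_CONF_DIR}/log4j.properties is missing;" \
         "--loglevel ${HADOOP_LOGLEVEL:-INFO} will have no effect." 1>&2
  fi

  # Apply the log level for every command type, not just daemons.
  HADOOP_LOGLEVEL="${HADOOP_LOGLEVEL:-INFO}"
  HADOOP_OPTS="${HADOOP_OPTS} -Dhadoop.root.logger=${HADOOP_LOGLEVEL},console"
}
{code}

With the check in hadoop_basic_init, {{hdfs --loglevel DEBUG namenode}} (no --daemon) would pick up the level the same way the daemon path does.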
> Missing HADOOP_CONF_DIR generates strange results
> -------------------------------------------------
>
> Key: HADOOP-11058
> URL: https://issues.apache.org/jira/browse/HADOOP-11058
> Project: Hadoop Common
> Issue Type: Improvement
> Components: scripts
> Reporter: Allen Wittenauer
> Assignee: Masatake Iwasaki
> Labels: newbie
> Attachments: HADOOP-11058.1.patch, HADOOP-11058.2.patch
>
>
> If HADOOP_CONF_DIR is defined but points to a directory that either doesn't exist or isn't actually a viable configuration directory, all sorts of weird things happen, especially for logging. The shell code should do a better job of verifying that the directory is valid and exit if it detects that it is broken in some way.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)