Posted to dev@mahout.apache.org by "Joe Prasanna Kumar (JIRA)" <ji...@apache.org> on 2010/08/18 03:22:16 UTC
[jira] Updated: (MAHOUT-482) Defaulting $HADOOP_CONF_DIR to $HADOOP_HOME/conf
[ https://issues.apache.org/jira/browse/MAHOUT-482?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Joe Prasanna Kumar updated MAHOUT-482:
--------------------------------------
Attachment: MAHOUT-482.patch
Contains the patch as described in the issue description
> Defaulting $HADOOP_CONF_DIR to $HADOOP_HOME/conf
> ------------------------------------------------
>
> Key: MAHOUT-482
> URL: https://issues.apache.org/jira/browse/MAHOUT-482
> Project: Mahout
> Issue Type: Bug
> Reporter: Joe Prasanna Kumar
> Priority: Trivial
> Attachments: MAHOUT-482.patch
>
> Original Estimate: 0.33h
> Remaining Estimate: 0.33h
>
> Hi all,
> While trying to understand the clusterdump utility, I encountered the message "no HADOOP_CONF_DIR or HADOOP_HOME set, running locally". As per some of the discussions in Apr '10 (http://lucene.472066.n3.nabble.com/Dealing-with-kmean-and-meanshift-output-td708824.html#a709066), it would be good to default the Hadoop conf directory to $HADOOP_HOME/conf.
> So I've changed the script to accommodate this.
> Existing code:
> if [ "$HADOOP_CONF_DIR" = "" ] || [ "$HADOOP_HOME" = "" ] || [ "$MAHOUT_LOCAL" != "" ] ; then
>   if [ "$HADOOP_CONF_DIR" = "" ] || [ "$HADOOP_HOME" = "" ] ; then
>     echo "no HADOOP_CONF_DIR or HADOOP_HOME set, running locally"
>   elif [ "$MAHOUT_LOCAL" != "" ] ; then
>     echo "MAHOUT_LOCAL is set, running locally"
>   fi
>   exec "$JAVA" $JAVA_HEAP_MAX $MAHOUT_OPTS -classpath "$CLASSPATH" $CLASS "$@"
> else
>   echo "running on hadoop, using HADOOP_HOME=$HADOOP_HOME and HADOOP_CONF_DIR=$HADOOP_CONF_DIR"
>
> New code:
> if [ "$HADOOP_HOME" = "" ] || [ "$MAHOUT_LOCAL" != "" ] ; then
>   if [ "$HADOOP_HOME" = "" ] ; then
>     echo "no HADOOP_HOME set, running locally"
>   elif [ "$MAHOUT_LOCAL" != "" ] ; then
>     echo "MAHOUT_LOCAL is set, running locally"
>   fi
>   exec "$JAVA" $JAVA_HEAP_MAX $MAHOUT_OPTS -classpath "$CLASSPATH" $CLASS "$@"
> else
>   if [ "$HADOOP_CONF_DIR" = "" ] ; then
>     HADOOP_CONF_DIR=$HADOOP_HOME/conf
>     echo "no HADOOP_CONF_DIR set, so defaulting HADOOP_CONF_DIR to $HADOOP_HOME/conf"
>   fi
>   echo "running on hadoop, using HADOOP_HOME=$HADOOP_HOME and HADOOP_CONF_DIR=$HADOOP_CONF_DIR"
> On my local machine, I've set only HADOOP_HOME, and the script defaults $HADOOP_CONF_DIR and runs on Hadoop.
> If this issue / fix looks valid, I'll open a JIRA issue and attach the patch.
> I'd appreciate your thoughts / feedback,
> Joe.
> Drew Farris to dev
> On Tue, Aug 17, 2010 at 12:26 AM, Joe Kumar <jo...@gmail.com> wrote:
> > Hi all,
> >
> > while trying to understand the clusterdump utility, I encountered a message
> > "no HADOOP_CONF_DIR or HADOOP_HOME set, running locally". As per some of the
> > discussions in Apr'10 (
> > http://lucene.472066.n3.nabble.com/Dealing-with-kmean-and-meanshift-output-td708824.html#a709066),
> > it would be good to default the hadoop conf directory to hadoop_home/conf.
> Agreed, and your solution looks good. I'll keep an eye out for the patch.
> Jake Mannix to dev
> +1
> Good idea. I thought we already did this...
> -jake
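As an aside, the same defaulting logic quoted above can be written more compactly with POSIX parameter expansion. This is a minimal sketch of the idiom, not the attached patch; the paths are illustrative:

```shell
#!/bin/sh
# Sketch (assumed values, not the real patch): default HADOOP_CONF_DIR
# to $HADOOP_HOME/conf when it is unset or empty, via ${var:-default}.
unset HADOOP_CONF_DIR                  # simulate a user who never set it
HADOOP_HOME=/opt/hadoop                # illustrative install location

# ${HADOOP_CONF_DIR:-...} expands to the default only when the
# variable is unset or empty, so an explicit setting always wins.
HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-$HADOOP_HOME/conf}

echo "HADOOP_CONF_DIR=$HADOOP_CONF_DIR"   # prints HADOOP_CONF_DIR=/opt/hadoop/conf
```

The explicit `if [ "$HADOOP_CONF_DIR" = "" ]` test in the patch behaves the same way but also lets the script print a message when it falls back to the default, which the one-line expansion cannot do on its own.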
--
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.