Posted to mapreduce-user@hadoop.apache.org by Kunal Gupta <ku...@techlead-india.com> on 2009/12/08 13:14:23 UTC

Outputs to stdout and stderr not visible

I am printing messages to stdout and stderr from inside my map
function, but the stdout and stderr files in the log folder are empty.
I also tried viewing these log files from the web UI; they are empty
there as well.

Why are these messages not getting printed?
Alternatively, where can I print messages for debugging my MapReduce
application?


Re: Outputs to stdout and stderr not visible

Posted by Ed Mazur <ma...@cs.umass.edu>.
Not sure why the stdout/stderr files in $HADOOP_LOG_DIR/userlogs/xxx
would be empty, but an alternative I use is the same logging framework
(log4j, via Commons Logging) that Hadoop itself uses. You can create a
per-class logger like this:

import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;

private static final Log LOG = LogFactory.getLog(YourClass.class);

And then use it like this:

LOG.info("...");
LOG.debug("...");
etc.

If you use this in your map/reduce class, the messages will be in the
logs of individual tasks.
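As a self-contained sketch of the pattern described above: a real Hadoop mapper would extend the Mapper base class and use org.apache.commons.logging.Log/LogFactory as shown, but since Hadoop and commons-logging may not be on a plain classpath, java.util.logging stands in here purely to illustrate the one-static-logger-per-class idiom (the class and method names are made up for the example):

```java
import java.util.logging.Logger;

public class WordCountMapper {
    // One static logger per class, mirroring
    // LogFactory.getLog(WordCountMapper.class) in the Hadoop version.
    private static final Logger LOG =
            Logger.getLogger(WordCountMapper.class.getName());

    // Stand-in for a Hadoop map() method; a real Mapper would also
    // receive an output context to emit key/value pairs.
    public void map(String key, String value) {
        LOG.info("map called with key=" + key);
        if (value.isEmpty()) {
            LOG.warning("empty value for key=" + key);
        }
    }

    public static void main(String[] args) {
        new WordCountMapper().map("line-1", "hello world");
        new WordCountMapper().map("line-2", "");
    }
}
```

In the real job, these messages end up in each task attempt's syslog file under the task logs, viewable per-task in the web UI.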

Ed

On Tue, Dec 8, 2009 at 7:14 AM, Kunal Gupta <ku...@techlead-india.com> wrote:
> I am printing messages to stdout and stderr from inside my map
> function, but the stdout and stderr files in the log folder are empty.
> I also tried viewing these log files from the web UI; they are empty
> there as well.
>
> Why are these messages not getting printed?
> Alternatively, where can I print messages for debugging my MapReduce
> application?
>
>