Posted to mapreduce-user@hadoop.apache.org by seven garfee <ga...@gmail.com> on 2011/10/14 09:26:55 UTC
debug script not found
Hi guys,
I'm debugging a pipes program on MapReduce, and I'm trying to use a debug script to print some debug info.
I used the default pipes script under src/c++/pipes/debug, put it on HDFS, and created a symlink via the distributed cache.
But it does not work: the task prints "exec: pipes-default-script : no found".
Does that message mean pipes-default-script is not under the task's work dir?
The work dir looks like this:
attempt_201110141512_0001_m_000000_1
│ ├── job.xml
│ ├── split.info
│ └── work
│ ├── jobTokenPassword
│ ├── pipes-default-script ->
/home/test/hadoop/mapred_tmp/taskTracker/distcache/4497768195926406270_-287867232_17995480/localhost/resource/pipes-default-script
.....
So pipes-default-script is present under the work dir.
What's the solution?
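For context, the setup described above (script on HDFS, distributed-cache symlink, debug-script property) is usually wired together roughly like this. This is a sketch using the 0.20-era property names; the HDFS paths and program name here are made up for illustration and are not from the original message:

```shell
# Upload the debug script to HDFS (paths are hypothetical).
hadoop fs -put pipes-default-script /debug/pipes-default-script

# Run the pipes job, shipping the script via the distributed cache
# (the "#name" suffix requests a symlink in the task work dir) and
# pointing the debug-script property at that symlink.
hadoop pipes \
  -D mapred.cache.files=hdfs:///debug/pipes-default-script#pipes-default-script \
  -D mapred.create.symlink=yes \
  -D mapred.map.task.debug.script=./pipes-default-script \
  -D mapred.reduce.task.debug.script=./pipes-default-script \
  -input in -output out -program bin/wordcount
```

A common cause of "no found" despite the symlink existing is that the script lacks the execute bit or has a bad interpreter line, so checking `chmod +x` on the uploaded file is worth a try.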
Re: debug script not found
Posted by seven garfee <ga...@gmail.com>.
I switched from Cloudera CDH3 to hadoop-0.20.2, and now the debug script works, except that the script does not receive the "$program" argument documented as:
$script $stdout $stderr $syslog $jobconf $program
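A debug script consuming the documented argument list can be sketched as the function below. This is a minimal sketch, not the shipped default script; the function name is hypothetical, `$program` is treated as optional since (as noted above) some versions don't pass it, and gdb is assumed to be on the PATH:

```shell
# Sketch of a pipes debug-script body for the documented argument order:
#   $stdout $stderr $syslog $jobconf [$program]
print_task_logs() {
  stdout_file=$1
  stderr_file=$2
  syslog_file=$3
  jobconf_file=$4
  program=$5            # may be empty on versions that omit $program

  echo "=== task stdout ==="
  cat "$stdout_file"
  echo "=== task stderr ==="
  cat "$stderr_file"
  echo "=== task syslog ==="
  cat "$syslog_file"

  # If the failed task left a core dump and we know the binary,
  # print a backtrace with gdb (assumed to be installed).
  if [ -n "$program" ] && [ -f core ]; then
    gdb -batch -ex bt "$program" core
  fi
}
```

Hadoop runs the script in the task's work directory, so a `core` file from a crashed C++ task, if core dumps are enabled, would be found there.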