Posted to common-user@hadoop.apache.org by Saptarshi Guha <sa...@gmail.com> on 2009/04/01 20:29:13 UTC

Building LZO on hadoop

I checked out hadoop-core-0.19
export CFLAGS=$CUSTROOT/include
export LDFLAGS=$CUSTROOT/lib

(these contain LZO, which was built with --shared)
>ls $CUSTROOT/include/lzo/
lzo1a.h  lzo1b.h  lzo1c.h  lzo1f.h  lzo1.h  lzo1x.h  lzo1y.h  lzo1z.h
lzo2a.h  lzo_asm.h  lzoconf.h  lzodefs.h  lzoutil.h

>ls $CUSTROOT/lib/
liblzo2.so  liblzo.a  liblzo.la  liblzo.so  liblzo.so.1  liblzo.so.2
liblzo.so.2.0.0

I then run (from hadoop-core-0.19.1/)
ant -Dcompile.native=true

I get messages like the following (among many others):
     [exec] configure: WARNING: lzo/lzo1x.h: accepted by the compiler,
rejected by the preprocessor!
     [exec] configure: WARNING: lzo/lzo1x.h: proceeding with the
compiler's result
     [exec] checking for lzo/lzo1x.h... yes
     [exec] checking Checking for the 'actual' dynamic-library for
'-llzo2'... (cached)
     [exec] checking lzo/lzo1y.h usability... yes
     [exec] checking lzo/lzo1y.h presence... no
     [exec] configure: WARNING: lzo/lzo1y.h: accepted by the compiler,
rejected by the preprocessor!
     [exec] configure: WARNING: lzo/lzo1y.h: proceeding with the
compiler's result
     [exec] checking for lzo/lzo1y.h... yes
     [exec] checking Checking for the 'actual' dynamic-library for
'-llzo2'... (cached)

and finally,
ive/src/org/apache/hadoop/io/compress/lzo/LzoCompressor.c  -fPIC -DPIC
-o .libs/LzoCompressor.o
     [exec] /ln/meraki/custom/hadoop-core-0.19.1/src/native/src/org/apache/hadoop/io/compress/lzo/LzoCompressor.c:
In function 'Java_org_apache_hadoop_io_compress_lzo_LzoCompressor_initIDs':
     [exec] /ln/meraki/custom/hadoop-core-0.19.1/src/native/src/org/apache/hadoop/io/compress/lzo/LzoCompressor.c:137:
error: expected expression before ',' token


Any ideas?
Saptarshi Guha

Re: job status from command prompt

Posted by Amareshwari Sriramadasu <am...@yahoo-inc.com>.
Elia Mazzawi wrote:
> Is there a command I can run from the shell that says whether this job
> passed or failed?
>
> I found these, but they don't really say pass/fail; they only say what
> is running and the percent complete.
>
> This shows what is running:
> ./hadoop job -list
>
> and this shows the completion:
> ./hadoop job -status job_200903061521_0045
The following command lists all jobs, whether in prep, running, or completed:
./hadoop job -list all

-Amareshwari
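To turn that listing into a plain pass/fail answer from a script, here is a minimal sketch (not from the thread). It assumes `hadoop job -list all` prints one whitespace-separated line per job with a numeric state column, and that the state codes follow org.apache.hadoop.mapred.JobStatus of this era (1=RUNNING, 2=SUCCEEDED, 3=FAILED, 4=PREP); verify both against your Hadoop version before relying on them.

```shell
# Map the assumed numeric state column from `hadoop job -list all` to a name.
state_name() {
  case "$1" in
    1) echo "RUNNING" ;;
    2) echo "SUCCEEDED" ;;
    3) echo "FAILED" ;;
    4) echo "PREP" ;;
    *) echo "UNKNOWN" ;;
  esac
}

# Hypothetical sample line in the assumed format: jobid, state, start time, user.
line="job_200903061521_0045  2  1238617753000  hadoop"
set -- $line
echo "$1 $(state_name "$2")"
```

In a real controller you would replace the sample line with the output of `./hadoop job -list all | grep "$JOBID"` and branch on the resulting state name.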


job status from command prompt

Posted by Elia Mazzawi <el...@casalemedia.com>.
Is there a command I can run from the shell that says whether this job
passed or failed?

I found these, but they don't really say pass/fail; they only say what is
running and the percent complete.

This shows what is running:
./hadoop job -list

and this shows the completion:
./hadoop job -status job_200903061521_0045

Re: hadoop job controller

Posted by Stefan Podkowinski <sp...@gmail.com>.
You can get the job progress and completion status through an instance
of org.apache.hadoop.mapred.JobClient. If you really want to use Perl,
I guess you still need to write a small Java application that sits
between Perl on one side and JobClient on the other.
There's also some support for Thrift in the Hadoop contrib package, but
I'm not sure whether it exposes any job-client-related methods.

On Thu, Apr 2, 2009 at 12:46 AM, Elia Mazzawi
<el...@casalemedia.com> wrote:
>
> I'm writing a Perl program to submit jobs to the cluster,
> then wait for the jobs to finish, and check that they have completed
> successfully.
>
> I have some questions,
>
> this shows what is running
> ./hadoop job  -list
>
> and this shows the completion
> ./hadoop job -status  job_200903061521_0045
>
>
> but I want something that just says pass/fail,
> because with these I have to check that it's done and then check that
> it's 100% complete.
>
> Something like that must exist, since the webapp jobtracker.jsp knows
> the state of each job.
>
> Also, a controller like that must have been written many times already;
> are there any around?
>
> Regards,
> Elia
>

hadoop job controller

Posted by Elia Mazzawi <el...@casalemedia.com>.
I'm writing a Perl program to submit jobs to the cluster,
then wait for the jobs to finish, and check that they have completed 
successfully.

I have some questions,

this shows what is running
./hadoop job  -list

and this shows the completion
./hadoop job -status  job_200903061521_0045


but I want something that just says pass/fail,
because with these I have to check that it's done and then check that
it's 100% complete.

Something like that must exist, since the webapp jobtracker.jsp knows
the state of each job.

Also, a controller like that must have been written many times already;
are there any around?

Regards,
Elia

Re: Building LZO on hadoop

Posted by Saptarshi Guha <sa...@gmail.com>.
Actually, if one installs the latest liblzo and sets CFLAGS, LDFLAGS
and LFLAGS correctly, things work fine.
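For what it's worth, one likely cause of the earlier "accepted by the compiler, rejected by the preprocessor" warnings is that the original post exported bare directories: autoconf's preprocessor-only header checks use CPPFLAGS, and CFLAGS/CPPFLAGS/LDFLAGS normally carry -I/-L switches rather than raw paths. A hedged sketch of the flag setup (the install path is an example, not from the thread):

```shell
# Pass -I/-L switches, not bare directories, and set CPPFLAGS as well so
# configure's preprocessor-only header checks can also find the LZO headers.
CUSTROOT=${CUSTROOT:-/usr/local}      # wherever your LZO build was installed
export CFLAGS="-I$CUSTROOT/include"
export CPPFLAGS="-I$CUSTROOT/include"
export LDFLAGS="-L$CUSTROOT/lib"
echo "$CFLAGS | $CPPFLAGS | $LDFLAGS"
# then, from hadoop-core-0.19.1/:  ant -Dcompile.native=true
```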
Saptarshi Guha



On Wed, Apr 1, 2009 at 3:55 PM, Saptarshi Guha <sa...@gmail.com> wrote:
> Fixed. In the configure script in src/native/, change the block below:
>
>  echo 'int main(int argc, char **argv){return 0;}' > conftest.c
>  if test -z "`${CC} ${LDFLAGS} -o conftest conftest.c -llzo2 2>&1`"; then
>    if test ! -z "`which objdump | grep -v 'no objdump'`"; then
>      ac_cv_libname_lzo2="`objdump -p conftest | grep NEEDED | grep lzo2 | sed 's/\W*NEEDED\W*\(.*\)\W*$/\"\1\"/'`"
>    elif test ! -z "`which ldd | grep -v 'no ldd'`"; then
>      ac_cv_libname_lzo2="`ldd conftest | grep lzo2 | sed 's/^[^A-Za-z0-9]*\([A-Za-z0-9\.]*\)[^A-Za-z0-9]*=>.*$/\"\1\"/'`"
>    else
>      { { echo "$as_me:$LINENO: error: Can't find either 'objdump' or 'ldd' to compute the dynamic library for '-llzo2'" >&5
> echo "$as_me: error: Can't find either 'objdump' or 'ldd' to compute the dynamic library for '-llzo2'" >&2;}
>      { (exit 1); exit 1; }; }
>    fi
>  else
>    ac_cv_libname_lzo2=libnotfound.so
>  fi
>  rm -f conftest*
>
> changing lzo2 to lzo.so.2 (again, this depends on what the user has);
> also set CFLAGS and LDFLAGS to point at your LZO libraries and headers.
>
>
>
> Saptarshi Guha
>
>
>
> On Wed, Apr 1, 2009 at 2:29 PM, Saptarshi Guha <sa...@gmail.com> wrote:
>> I checked out hadoop-core-0.19
>> export CFLAGS=$CUSTROOT/include
>> export LDFLAGS=$CUSTROOT/lib
>>
>> (they contain lzo which was built with --shared)
>>>ls $CUSTROOT/include/lzo/
>> lzo1a.h  lzo1b.h  lzo1c.h  lzo1f.h  lzo1.h  lzo1x.h  lzo1y.h  lzo1z.h
>> lzo2a.h  lzo_asm.h  lzoconf.h  lzodefs.h  lzoutil.h
>>
>>>ls $CUSTROOT/lib/
>> liblzo2.so  liblzo.a  liblzo.la  liblzo.so  liblzo.so.1  liblzo.so.2
>> liblzo.so.2.0.0
>>
>> I then run (from hadoop-core-0.19.1/)
>> ant -Dcompile.native=true
>>
>> I get messages like : (many others like this)
>> exec] configure: WARNING: lzo/lzo1x.h: accepted by the compiler,
>> rejected by the preprocessor!
>>     [exec] configure: WARNING: lzo/lzo1x.h: proceeding with the
>> compiler's result
>>     [exec] checking for lzo/lzo1x.h... yes
>>     [exec] checking Checking for the 'actual' dynamic-library for
>> '-llzo2'... (cached)
>>     [exec] checking lzo/lzo1y.h usability... yes
>>     [exec] checking lzo/lzo1y.h presence... no
>>     [exec] configure: WARNING: lzo/lzo1y.h: accepted by the compiler,
>> rejected by the preprocessor!
>>     [exec] configure: WARNING: lzo/lzo1y.h: proceeding with the
>> compiler's result
>>     [exec] checking for lzo/lzo1y.h... yes
>>     [exec] checking Checking for the 'actual' dynamic-library for
>> '-llzo2'... (cached)
>>
>> and finally,
>> ive/src/org/apache/hadoop/io/compress/lzo/LzoCompressor.c  -fPIC -DPIC
>> -o .libs/LzoCompressor.o
>>     [exec] /ln/meraki/custom/hadoop-core-0.19.1/src/native/src/org/apache/hadoop/io/compress/lzo/LzoCompressor.c:
>> In function 'Java_org_apache_hadoop_io_compress_lzo_LzoCompressor_initIDs':
>>     [exec] /ln/meraki/custom/hadoop-core-0.19.1/src/native/src/org/apache/hadoop/io/compress/lzo/LzoCompressor.c:137:
>> error: expected expression before ',' token
>>
>>
>> Any ideas?
>> Saptarshi Guha
>>
>

Re: Building LZO on hadoop

Posted by Saptarshi Guha <sa...@gmail.com>.
Fixed. In the configure script in src/native/, change the block below:
 echo 'int main(int argc, char **argv){return 0;}' > conftest.c
 if test -z "`${CC} ${LDFLAGS} -o conftest conftest.c -llzo2 2>&1`"; then
   if test ! -z "`which objdump | grep -v 'no objdump'`"; then
     ac_cv_libname_lzo2="`objdump -p conftest | grep NEEDED | grep lzo2 | sed 's/\W*NEEDED\W*\(.*\)\W*$/\"\1\"/'`"
   elif test ! -z "`which ldd | grep -v 'no ldd'`"; then
     ac_cv_libname_lzo2="`ldd conftest | grep lzo2 | sed 's/^[^A-Za-z0-9]*\([A-Za-z0-9\.]*\)[^A-Za-z0-9]*=>.*$/\"\1\"/'`"
   else
     { { echo "$as_me:$LINENO: error: Can't find either 'objdump' or 'ldd' to compute the dynamic library for '-llzo2'" >&5
 echo "$as_me: error: Can't find either 'objdump' or 'ldd' to compute the dynamic library for '-llzo2'" >&2;}
     { (exit 1); exit 1; }; }
   fi
 else
   ac_cv_libname_lzo2=libnotfound.so
 fi
 rm -f conftest*

changing lzo2 to lzo.so.2 (again, this depends on what the user has);
also set CFLAGS and LDFLAGS to point at your LZO libraries and headers.
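To see what that sed expression actually extracts, and hence what name to substitute for lzo2 on your system, here is the same expression run on a canned ldd-style line (the sample path and load address are invented for illustration):

```shell
# The configure check pulls the real soname out of ldd output; running the
# same sed expression on a sample line shows the kind of value that ends up
# in ac_cv_libname_lzo2 (quotes omitted here for readability).
sample="        liblzo2.so.2 => /usr/lib/liblzo2.so.2 (0x00007f0000000000)"
soname=$(echo "$sample" | sed 's/^[^A-Za-z0-9]*\([A-Za-z0-9\.]*\)[^A-Za-z0-9]*=>.*$/\1/')
echo "$soname"
```

On a box where only liblzo.so.2 exists (as in the directory listing above), the extracted name would differ, which is exactly why the hard-coded lzo2 needs changing.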



Saptarshi Guha



On Wed, Apr 1, 2009 at 2:29 PM, Saptarshi Guha <sa...@gmail.com> wrote:
> I checked out hadoop-core-0.19
> export CFLAGS=$CUSTROOT/include
> export LDFLAGS=$CUSTROOT/lib
>
> (they contain lzo which was built with --shared)
>>ls $CUSTROOT/include/lzo/
> lzo1a.h  lzo1b.h  lzo1c.h  lzo1f.h  lzo1.h  lzo1x.h  lzo1y.h  lzo1z.h
> lzo2a.h  lzo_asm.h  lzoconf.h  lzodefs.h  lzoutil.h
>
>>ls $CUSTROOT/lib/
> liblzo2.so  liblzo.a  liblzo.la  liblzo.so  liblzo.so.1  liblzo.so.2
> liblzo.so.2.0.0
>
> I then run (from hadoop-core-0.19.1/)
> ant -Dcompile.native=true
>
> I get messages like : (many others like this)
> exec] configure: WARNING: lzo/lzo1x.h: accepted by the compiler,
> rejected by the preprocessor!
>     [exec] configure: WARNING: lzo/lzo1x.h: proceeding with the
> compiler's result
>     [exec] checking for lzo/lzo1x.h... yes
>     [exec] checking Checking for the 'actual' dynamic-library for
> '-llzo2'... (cached)
>     [exec] checking lzo/lzo1y.h usability... yes
>     [exec] checking lzo/lzo1y.h presence... no
>     [exec] configure: WARNING: lzo/lzo1y.h: accepted by the compiler,
> rejected by the preprocessor!
>     [exec] configure: WARNING: lzo/lzo1y.h: proceeding with the
> compiler's result
>     [exec] checking for lzo/lzo1y.h... yes
>     [exec] checking Checking for the 'actual' dynamic-library for
> '-llzo2'... (cached)
>
> and finally,
> ive/src/org/apache/hadoop/io/compress/lzo/LzoCompressor.c  -fPIC -DPIC
> -o .libs/LzoCompressor.o
>     [exec] /ln/meraki/custom/hadoop-core-0.19.1/src/native/src/org/apache/hadoop/io/compress/lzo/LzoCompressor.c:
> In function 'Java_org_apache_hadoop_io_compress_lzo_LzoCompressor_initIDs':
>     [exec] /ln/meraki/custom/hadoop-core-0.19.1/src/native/src/org/apache/hadoop/io/compress/lzo/LzoCompressor.c:137:
> error: expected expression before ',' token
>
>
> Any ideas?
> Saptarshi Guha
>