Posted to user@hadoop.apache.org by Yongzhi Wang <wa...@gmail.com> on 2012/09/18 05:04:38 UTC
No 32-bit taskcontroller on Hadoop 1.0.3
Dear All,
I am currently deploying Hadoop 1.0.3 on my 32-bit Debian Linux system. I
believe I need a 32-bit taskcontroller binary. However, the binary
files provided in the Hadoop 1.0.3 release are 64-bit. I also downloaded the
build output from the Jenkins server
(https://builds.apache.org/job/Hadoop-1.0-Build/ws/trunk/build/c++-build/Linux-i386-32/task-controller/),
and it is still a 64-bit file.
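A quick way to confirm the mismatch is to compare the binary's architecture
with the machine's (the install path below is taken from the error output in
this setup; adjust to yours):

```shell
# Compare the task-controller's architecture with the machine's.
# HADOOP_HOME is assumed from the paths in the error below; adjust as needed.
HADOOP_HOME=/opt/ywang/hadoop-1.0.3
file "$HADOOP_HOME/bin/task-controller"
# On the shipped binary this reports something like
# "ELF 64-bit LSB executable, x86-64 ..."
uname -m
# Reports "i686" (or similar) on 32-bit Debian; a mismatch here
# explains why the loader refuses to run the file.
```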
I got the following errors when I started the task tracker with the
64-bit taskcontroller:
12/09/17 11:59:58 ERROR mapred.TaskTracker: Can not start task tracker
because java.io.IOException: Task controller setup failed because of
invalidpermissions/ownership with exit code 126
at org.apache.hadoop.mapred.LinuxTaskController.setup(LinuxTaskController.java:143)
at org.apache.hadoop.mapred.TaskTracker.<init>(TaskTracker.java:1452)
at org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3742)
Caused by: org.apache.hadoop.util.Shell$ExitCodeException:
/opt/ywang/hadoop-1.0.3/libexec/../bin/task-controller:
/opt/ywang/hadoop-1.0.3/libexec/../bin/task-controller: cannot execute
binary file
at org.apache.hadoop.util.Shell.runCommand(Shell.java:255)
at org.apache.hadoop.util.Shell.run(Shell.java:182)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:375)
at org.apache.hadoop.mapred.LinuxTaskController.setup(LinuxTaskController.java:137)
I am wondering whether the missing 32-bit taskcontroller is a build
bug, or whether the 64-bit taskcontroller can somehow be used on a 32-bit
platform. If no 32-bit executable is provided in the daily build of
hadoop, how can I build one myself?
Thanks!
Re: No 32-bit taskcontroller on Hadoop 1.0.3
Posted by Arpit Gupta <ar...@hortonworks.com>.
Take a look at
http://hadoop.apache.org/docs/r1.0.3/cluster_setup.html
and look for 'Using the LinuxTaskController'
It has the info on what the permissions and ownership of the task-controller executable should be.
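For reference, the requirement on that page boils down to: the task-controller
must be a setuid binary owned by root and group-owned by the group the
TaskTracker daemon runs as. A sketch, assuming a group named `mapred` (the
group name and `HADOOP_HOME` are placeholders for your setup):

```shell
# task-controller must be root-owned, group-owned by the TaskTracker's
# group (assumed "mapred" here), and setuid/setgid with mode 6050.
sudo chown root:mapred "$HADOOP_HOME/bin/task-controller"
sudo chmod 6050 "$HADOOP_HOME/bin/task-controller"
ls -l "$HADOOP_HOME/bin/task-controller"
# Expect permissions of the form ---Sr-s--- owned by root:mapred.
```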
--
Arpit Gupta
Hortonworks Inc.
http://hortonworks.com/
On Sep 17, 2012, at 8:04 PM, Yongzhi Wang <wa...@gmail.com> wrote:
> [original message snipped]
Re: No 32-bit taskcontroller on Hadoop 1.0.3
Posted by Harsh J <ha...@cloudera.com>.
Thanks for following up with such detail, Yongzhi; this will be
useful to others! Good to know you are now unblocked as well.
On Wed, Sep 19, 2012 at 9:12 AM, Yongzhi Wang
<wa...@gmail.com> wrote:
> Hi, Harsh
>
> Thanks for your suggestion. It works!
>
> However, you will meet an error while building the taskcontroller:
>
> [exec] /usr/include/fts.h:41:3: error: #error "<fts.h> cannot be used
> with -D_FILE_OFFSET_BITS==64"
>
> This is a known bug, you can find more information in
> https://issues.apache.org/jira/browse/MAPREDUCE-2178
>
> The solution is as follows:
>
> Another amendment: AC_SYS_LARGEFILE in configure.ac causes a build
> failure on 32-bit RHEL5:
>
> [exec] /usr/include/fts.h:41:3: error: #error "<fts.h> cannot be
> used with -D_FILE_OFFSET_BITS==64"
>
> Since the task-controller doesn't need to deal with very large
> (>2G) files, I don't think this flag is necessary. (it just copies
> job.xml, tokens, and taskjvm.sh)
>
> Thank you very much!
> Yongzhi
>
> On Mon, Sep 17, 2012 at 11:38 PM, Harsh J <ha...@cloudera.com> wrote:
>> Hey Yongzhi,
>>
>> You'll need to build it manually on your 32-bit machine. Run the build
>> from your $HADOOP_HOME/$HADOOP_PREFIX as:
>>
>> (Depending on your version you may or may not require JDK 5; we
>> removed that dependency recently. You should need the rest of the
>> tools mentioned for a full tar generation.)
>>
>> ant -Dcompile.native=true -Dcompile.c++=true -Dlibhdfs=true
>> -Dlibrecordio=true -Dhadoop.conf.dir=/etc/hadoop
>> -Dxercescroot=$HOME/tools/xerces/latest
>> -Declipse.home=$HOME/tools/eclipse/latest
>> -Djava5.home=$HOME/tools/java5/latest
>> -Dforrest.home=$HOME/tools/forrest/apache-forrest-0.8
>> -Dfindbugs.home=$HOME/tools/findbugs/latest veryclean clean
>> create-c++-configure task-controller tar
>>
>> This should build a fully formed tarball that should have 32-bit
>> natives in it. Use this tarball to deploy your cluster.
>>
>> Or if you just want the task-controller and the natives for your arch,
>> do (Shouldn't require any tool dependencies):
>>
>> ant -Dcompile.native=true -Dcompile.c++=true -Dlibhdfs=true
>> -Dlibrecordio=true veryclean clean compile create-c++-configure
>> task-controller
>>
>> Alternatively, I encourage looking at using packages for your arch,
>> and at the Apache Bigtop (incubating) project:
>> http://incubator.apache.org/bigtop
>>
>> On Tue, Sep 18, 2012 at 8:34 AM, Yongzhi Wang
>> <wa...@gmail.com> wrote:
>>> [original message snipped]
>>
>>
>>
>> --
>> Harsh J
--
Harsh J
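A small addendum to Harsh's recipe above: after the build completes, it is
worth confirming that the freshly built task-controller really is 32-bit
before deploying it. The build path below is assumed from the default
Hadoop 1.0 ant layout; adjust if yours differs:

```shell
# Sanity-check the rebuilt binary's architecture before deployment.
# Build path assumed from the default Hadoop 1.0 ant layout.
BUILT=build/c++-build/Linux-i386-32/task-controller/task-controller
if file "$BUILT" | grep -q 'ELF 32-bit'; then
    echo "OK: 32-bit task-controller"
else
    echo "Wrong architecture -- check your toolchain" >&2
fi
```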
Re: No 32-bit taskcontroller on Hadoop 1.0.3
Posted by Yongzhi Wang <wa...@gmail.com>.
Hi, Harsh
Thanks for your suggestion. It works!
However, you will meet an error while building the taskcontroller:
[exec] /usr/include/fts.h:41:3: error: #error "<fts.h> cannot be used
with -D_FILE_OFFSET_BITS==64"
This is a known bug, you can find more information in
https://issues.apache.org/jira/browse/MAPREDUCE-2178
The solution is as follows:
Another amendment: AC_SYS_LARGEFILE in configure.ac causes a build
failure on 32-bit RHEL5:
[exec] /usr/include/fts.h:41:3: error: #error "<fts.h> cannot be
used with -D_FILE_OFFSET_BITS==64"
Since the task-controller doesn't need to deal with very large
(>2G) files, I don't think this flag is necessary. (it just copies
job.xml, tokens, and taskjvm.sh)
Thank you very much!
Yongzhi
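To make the workaround concrete: the fix from MAPREDUCE-2178 amounts to
dropping the AC_SYS_LARGEFILE macro from the task-controller's configure.ac
and rebuilding. A sketch, with the source path assumed from the branch-1
layout (treat it as illustrative):

```shell
# Drop AC_SYS_LARGEFILE, which injects -D_FILE_OFFSET_BITS=64 and
# conflicts with <fts.h> on 32-bit glibc (see MAPREDUCE-2178).
cd src/c++/task-controller        # path assumed from the branch-1 layout
sed -i '/AC_SYS_LARGEFILE/d' configure.ac
cd -
# Regenerate the configure script and rebuild just the task-controller.
ant -Dcompile.c++=true create-c++-configure task-controller
```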
On Mon, Sep 17, 2012 at 11:38 PM, Harsh J <ha...@cloudera.com> wrote:
> Hey Yongzhi,
>
> You'll need to build it manually on your 32-bit machine. Run the build
> from your $HADOOP_HOME/$HADOOP_PREFIX as:
>
> (Depending on your version you may or may not require JDK 5; we
> removed that dependency recently. You should need the rest of the
> tools mentioned for a full tar generation.)
>
> ant -Dcompile.native=true -Dcompile.c++=true -Dlibhdfs=true
> -Dlibrecordio=true -Dhadoop.conf.dir=/etc/hadoop
> -Dxercescroot=$HOME/tools/xerces/latest
> -Declipse.home=$HOME/tools/eclipse/latest
> -Djava5.home=$HOME/tools/java5/latest
> -Dforrest.home=$HOME/tools/forrest/apache-forrest-0.8
> -Dfindbugs.home=$HOME/tools/findbugs/latest veryclean clean
> create-c++-configure task-controller tar
>
> This should build a fully formed tarball that should have 32-bit
> natives in it. Use this tarball to deploy your cluster.
>
> Or if you just want the task-controller and the natives for your arch,
> do (Shouldn't require any tool dependencies):
>
> ant -Dcompile.native=true -Dcompile.c++=true -Dlibhdfs=true
> -Dlibrecordio=true veryclean clean compile create-c++-configure
> task-controller
>
> Alternatively, I encourage looking at using packages for your arch,
> and at the Apache Bigtop (incubating) project:
> http://incubator.apache.org/bigtop
>
> On Tue, Sep 18, 2012 at 8:34 AM, Yongzhi Wang
> <wa...@gmail.com> wrote:
>> [original message snipped]
>
>
>
> --
> Harsh J
>> because java.io.IOException: Task controller setup failed because of
>> invalidpermissions/ownership with exit code 126
>> at org.apache.hadoop.mapred.LinuxTaskController.setup(LinuxTaskController.java:143)
>> at org.apache.hadoop.mapred.TaskTracker.<init>(TaskTracker.java:1452)
>> at org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3742)
>> Caused by: org.apache.hadoop.util.Shell$ExitCodeException:
>> /opt/ywang/hadoop-1.0.3/libexec/../bin/task-controller:
>> /opt/ywang/hadoop-1.0.3/libexec/../bin/task-controller: cannot execute
>> binary file
>>
>> at org.apache.hadoop.util.Shell.runCommand(Shell.java:255)
>> at org.apache.hadoop.util.Shell.run(Shell.java:182)
>> at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:375)
>> at org.apache.hadoop.mapred.LinuxTaskController.setup(LinuxTaskController.java:137)
>>
>> I am wondering if not providing 32-bit of taskcontroller is a build
>> bug, or 64-bit taskcontroller can be used somehow on the 32-bit
>> platform? If no 32-bit executable is provided in the daily build of
>> hadoop, how can I build one by myself?
>>
>> Thanks!
>
>
>
> --
> Harsh J
Re: No 32-bit taskcontroller on Hadoop 1.0.3
Posted by Harsh J <ha...@cloudera.com>.
Hey Yongzhi,
You'll need to build it manually on your 32-bit machine. Run the build
from your $HADOOP_HOME/$HADOOP_PREFIX as:
(Depending on your version you may or may not require jdk5; we removed
that dependency recently. The rest of the tools mentioned are required
for a full tar generation.)
ant -Dcompile.native=true -Dcompile.c++=true -Dlibhdfs=true
-Dlibrecordio=true -Dhadoop.conf.dir=/etc/hadoop
-Dxercescroot=$HOME/tools/xerces/latest
-Declipse.home=$HOME/tools/eclipse/latest
-Djava5.home=$HOME/tools/java5/latest
-Dforrest.home=$HOME/tools/forrest/apache-forrest-0.8
-Dfindbugs.home=$HOME/tools/findbugs/latest veryclean clean
create-c++-configure task-controller tar
This should build a fully formed tarball that should have 32-bit
natives in it. Use this tarball to deploy your cluster.
Or if you just want the task-controller and the natives for your arch,
run this instead (it shouldn't require any tool dependencies):
ant -Dcompile.native=true -Dcompile.c++=true -Dlibhdfs=true
-Dlibrecordio=true veryclean clean compile create-c++-configure
task-controller
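Before deploying, it is worth confirming the built binary's word size with file(1) — the same distinction behind the "cannot execute binary file" failure in the original report. The hadoop build path below is an assumption about the 1.0.x layout:

```shell
# file(1) reports the ELF class, which is what matters here.
# Demonstrated against a binary present on any Linux host
# (-L dereferences the /bin/sh symlink):
file -L /bin/sh    # e.g. "ELF 64-bit LSB ..." on an amd64 box

# Against the freshly built task-controller (path is an assumption):
# file build/c++-build/Linux-i386-32/task-controller/task-controller
# On 32-bit Debian it must report "ELF 32-bit LSB executable, Intel 80386".
```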
Alternatively, I encourage looking at using packages for your arch,
and at the Apache Bigtop (incubating) project:
http://incubator.apache.org/bigtop
On Tue, Sep 18, 2012 at 8:34 AM, Yongzhi Wang
<wa...@gmail.com> wrote:
> Dear All,
>
> I am currently deploying hadoop 1.0.3 on my Debian 32-bit Linux. I
> think need a 32-bit binary file taskcontroller. However, I found the
> binary
> files provided in hadoop 1.0.3 is 64 bit. I downloaded the hadoop
> build file from server jenkins
> (https://builds.apache.org/job/Hadoop-1.0-Build/ws/trunk/build/c++-build/Linux-i386-32/task-controller/).
> It's still a 64 bit file.
>
> I got the following errors when I start task tracker using the hadoop
> 64-bit taskcontroller:
>
> 12/09/17 11:59:58 ERROR mapred.TaskTracker: Can not start task tracker
> because java.io.IOException: Task controller setup failed because of
> invalidpermissions/ownership with exit code 126
> at org.apache.hadoop.mapred.LinuxTaskController.setup(LinuxTaskController.java:143)
> at org.apache.hadoop.mapred.TaskTracker.<init>(TaskTracker.java:1452)
> at org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3742)
> Caused by: org.apache.hadoop.util.Shell$ExitCodeException:
> /opt/ywang/hadoop-1.0.3/libexec/../bin/task-controller:
> /opt/ywang/hadoop-1.0.3/libexec/../bin/task-controller: cannot execute
> binary file
>
> at org.apache.hadoop.util.Shell.runCommand(Shell.java:255)
> at org.apache.hadoop.util.Shell.run(Shell.java:182)
> at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:375)
> at org.apache.hadoop.mapred.LinuxTaskController.setup(LinuxTaskController.java:137)
>
> I am wondering if not providing 32-bit of taskcontroller is a build
> bug, or 64-bit taskcontroller can be used somehow on the 32-bit
> platform? If no 32-bit executable is provided in the daily build of
> hadoop, how can I build one by myself?
>
> Thanks!
--
Harsh J
Re: No 32-bit taskcontroller on Hadoop 1.0.3
Posted by Arpit Gupta <ar...@hortonworks.com>.
Take a look at
http://hadoop.apache.org/docs/r1.0.3/cluster_setup.html
and look for 'Using the LinuxTaskController'
It has the info on what the permission and ownership of the task controller executable should be.
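For orientation, the required setup is a root-owned setuid binary restricted to the TaskTracker's group. The mode 4754 and group name 'mapred' below are examples only — confirm the exact values against the linked cluster_setup page. Demonstrated on a stand-in file, with the real commands commented out since they need root:

```shell
# Stand-in file; the real target would be $HADOOP_HOME/bin/task-controller
touch /tmp/task-controller.demo

# Example mode 4754 = rwsr-xr--: setuid, group may execute, others read-only
chmod 4754 /tmp/task-controller.demo
stat -c '%a' /tmp/task-controller.demo    # prints "4754"

# Real deployment (requires root; owner/group/mode are examples --
# check the linked docs):
# chown root:mapred $HADOOP_HOME/bin/task-controller
# chmod 4754 $HADOOP_HOME/bin/task-controller
```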
--
Arpit Gupta
Hortonworks Inc.
http://hortonworks.com/
On Sep 17, 2012, at 8:04 PM, Yongzhi Wang <wa...@gmail.com> wrote:
> Dear All,
>
> I am currently deploying hadoop 1.0.3 on my Debian 32-bit Linux. I
> think need a 32-bit binary file taskcontroller. However, I found the
> binary
> files provided in hadoop 1.0.3 is 64 bit. I downloaded the hadoop
> build file from server jenkins
> (https://builds.apache.org/job/Hadoop-1.0-Build/ws/trunk/build/c++-build/Linux-i386-32/task-controller/).
> It's still a 64 bit file.
>
> I got the following errors when I start task tracker using the hadoop
> 64-bit taskcontroller:
>
> 12/09/17 11:59:58 ERROR mapred.TaskTracker: Can not start task tracker
> because java.io.IOException: Task controller setup failed because of
> invalidpermissions/ownership with exit code 126
> at org.apache.hadoop.mapred.LinuxTaskController.setup(LinuxTaskController.java:143)
> at org.apache.hadoop.mapred.TaskTracker.<init>(TaskTracker.java:1452)
> at org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3742)
> Caused by: org.apache.hadoop.util.Shell$ExitCodeException:
> /opt/ywang/hadoop-1.0.3/libexec/../bin/task-controller:
> /opt/ywang/hadoop-1.0.3/libexec/../bin/task-controller: cannot execute
> binary file
>
> at org.apache.hadoop.util.Shell.runCommand(Shell.java:255)
> at org.apache.hadoop.util.Shell.run(Shell.java:182)
> at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:375)
> at org.apache.hadoop.mapred.LinuxTaskController.setup(LinuxTaskController.java:137)
>
> I am wondering if not providing 32-bit of taskcontroller is a build
> bug, or 64-bit taskcontroller can be used somehow on the 32-bit
> platform? If no 32-bit executable is provided in the daily build of
> hadoop, how can I build one by myself?
>
> Thanks!