Posted to common-issues@hadoop.apache.org by "Chris Douglas (JIRA)" <ji...@apache.org> on 2010/02/12 11:08:28 UTC
[jira] Updated: (HADOOP-5612) Some c++ scripts are not chmodded before ant execution
[ https://issues.apache.org/jira/browse/HADOOP-5612?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Chris Douglas updated HADOOP-5612:
----------------------------------
Attachment: C5612-1.patch
More changes to 0.20 were required to get this to build on MacOS (chmod install-sh as well). The error.h include in hdfsJniHelper.c also caused problems on MacOS, but doesn't appear to be required (on RHEL, anyway; errno.h is included in hdfsJniHelper.h, which covered all the defs I found). Todd, would you mind confirming in your environment?
> Some c++ scripts are not chmodded before ant execution
> ------------------------------------------------------
>
> Key: HADOOP-5612
> URL: https://issues.apache.org/jira/browse/HADOOP-5612
> Project: Hadoop Common
> Issue Type: Bug
> Components: build
> Reporter: Todd Lipcon
> Assignee: Todd Lipcon
> Fix For: 0.21.0
>
> Attachments: 0001-HADOOP-5612-Add-chmod-rules-to-build.xml-to-make.patch, C5612-1.patch
>
>
> Before executing many of the configure scripts, there are lines like:
> <chmod file="${c++.libhdfs.src}/configure" perm="ugo+x"/>
> that ensure the configure script is executable (since the executable bit does not always survive into the distribution).
> These chmods are missing for Pipes and C++ Utils, which makes the build fail.
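A sketch of what the missing rules might look like, mirroring the existing libhdfs rule above; the property names ${c++.pipes.src} and ${c++.utils.src} are assumed by analogy with ${c++.libhdfs.src} and may differ in the actual build.xml:

```xml
<!-- Hypothetical sketch: chmod the configure scripts that the libhdfs rule
     already covers, but for Pipes and C++ Utils as well.
     Property names are assumptions modeled on ${c++.libhdfs.src}. -->
<chmod file="${c++.pipes.src}/configure" perm="ugo+x"/>
<chmod file="${c++.utils.src}/configure" perm="ugo+x"/>
<!-- Per the comment above, install-sh may also need the executable bit on MacOS. -->
<chmod file="${c++.utils.src}/install-sh" perm="ugo+x"/>
```

These would go alongside the existing <chmod> tasks in build.xml, before the target that invokes the configure scripts.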
--
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.