Posted to mapreduce-dev@hadoop.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2011/10/26 21:28:18 UTC

Hadoop-Mapreduce-trunk-Commit - Build # 1174 - Failure

See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Commit/1174/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 14773 lines...]
    [junit] Running org.apache.hadoop.mapred.TestMapRed
    [junit] Tests run: 5, Failures: 0, Errors: 0, Time elapsed: 22.079 sec
    [junit] Running org.apache.hadoop.mapred.TestMiniMRDFSCaching
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 35.003 sec
    [junit] Running org.apache.hadoop.mapred.TestQueueAclsForCurrentUser
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 0.62 sec
    [junit] Running org.apache.hadoop.mapred.TestRackAwareTaskPlacement
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 1.518 sec
    [junit] Running org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 30.992 sec
    [junit] Running org.apache.hadoop.mapred.TestReduceTask
    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.624 sec
    [junit] Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryInputFormat
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.776 sec
    [junit] Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 0.975 sec
    [junit] Running org.apache.hadoop.mapred.TestSequenceFileInputFormat
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 5.811 sec
    [junit] Running org.apache.hadoop.mapred.TestSeveral
    [junit] Running org.apache.hadoop.mapred.TestSeveral
    [junit] Tests run: 1, Failures: 0, Errors: 1, Time elapsed: 0 sec
    [junit] Test org.apache.hadoop.mapred.TestSeveral FAILED (timeout)
    [junit] Running org.apache.hadoop.mapred.TestSpeculativeExecution
    [junit] Tests run: 5, Failures: 0, Errors: 0, Time elapsed: 4.552 sec
    [junit] Running org.apache.hadoop.mapred.TestTaskLimits
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 3.949 sec
    [junit] Running org.apache.hadoop.mapred.TestTaskTrackerBlacklisting
    [junit] Tests run: 7, Failures: 0, Errors: 0, Time elapsed: 1.804 sec
    [junit] Running org.apache.hadoop.mapred.TestTextInputFormat
    [junit] Tests run: 8, Failures: 0, Errors: 0, Time elapsed: 62.263 sec
    [junit] Running org.apache.hadoop.mapred.TestTextOutputFormat
    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.184 sec
    [junit] Running org.apache.hadoop.mapred.TestTrackerBlacklistAcrossJobs
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 41.975 sec
    [junit] Running org.apache.hadoop.mapreduce.TestCounters
    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.346 sec
    [junit] Running org.apache.hadoop.mapreduce.TestMapCollection
    [junit] Tests run: 11, Failures: 0, Errors: 0, Time elapsed: 21.929 sec
    [junit] Running org.apache.hadoop.mapreduce.TestMapReduceLocal
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 27.247 sec
    [junit] Running org.apache.hadoop.mapreduce.lib.input.TestFileInputFormat
    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.977 sec
    [junit] Running org.apache.hadoop.mapreduce.lib.output.TestFileOutputCommitter
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 0.62 sec

checkfailure:
    [touch] Creating /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/build/test/testsfailed

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/build.xml:792: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/build.xml:755: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/build.xml:816: Tests failed!

Total time: 21 minutes 37 seconds
Build step 'Execute shell' marked build as failure
Recording test results
Updating HDFS-2494
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
All tests passed

Hadoop-Mapreduce-trunk-Commit - Build # 1177 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Commit/1177/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9724 lines...]
[ivy:resolve] downloading http://repo1.maven.org/maven2/commons-lang/commons-lang/1.0/commons-lang-1.0.jar ...
[ivy:resolve] .................................................................................................................... (62kB)
[ivy:resolve] .. (0kB)
[ivy:resolve] 	[SUCCESSFUL ] commons-lang#commons-lang;1.0!commons-lang.jar (988ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/junit/junit/3.7/junit-3.7.jar ...
[ivy:resolve] ...................................................................................................................................................................................................................... (114kB)
[ivy:resolve] .. (0kB)
[ivy:resolve] 	[SUCCESSFUL ] junit#junit;3.7!junit.jar (1065ms)

ivy-retrieve-checkstyle:
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:cachepath] :: loading settings :: file = /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/ivy/ivysettings.xml

check-for-checkstyle:

checkstyle:
[checkstyle] Running Checkstyle 4.2 on 71 files
     [xslt] Processing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/build/test/checkstyle-errors.xml to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/build/test/checkstyle-errors.html
     [xslt] Loading stylesheet /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/src/test/checkstyle-noframes-sorted.xsl

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.2.0/ivy-2.2.0.jar
      [get] To: /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/ivy/ivy-2.2.0.jar
      [get] Not modified - so not downloaded

ivy-init-dirs:

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:

ivy-resolve-common:

ivy-retrieve-common:

ivy-resolve-mapred:

ivy-retrieve-mapred:

init:
    [touch] Creating /tmp/null1217553691
   [delete] Deleting: /tmp/null1217553691
    [unzip] Expanding: /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/build/ivy/lib/Hadoop/common/hadoop-hdfs-0.24.0-SNAPSHOT.jar into /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/build

check-c++-configure:

create-c++-utils-configure:

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/build.xml:1849: Execute failed: java.io.IOException: Cannot run program "autoreconf" (in directory "/home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/src/c++/utils"): java.io.IOException: error=2, No such file or directory

Total time: 30 seconds
Build step 'Execute shell' marked build as failure
Recording test results
Updating HADOOP-7768
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
All tests passed

Hadoop-Mapreduce-trunk-Commit - Build # 1176 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Commit/1176/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9725 lines...]
[ivy:resolve] .................................................................................................................... (62kB)
[ivy:resolve] .. (0kB)
[ivy:resolve] 	[SUCCESSFUL ] commons-lang#commons-lang;1.0!commons-lang.jar (1015ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/junit/junit/3.7/junit-3.7.jar ...
[ivy:resolve] ...................................................................................................................................................................................................................... (114kB)
[ivy:resolve] .. (0kB)
[ivy:resolve] 	[SUCCESSFUL ] junit#junit;3.7!junit.jar (1041ms)

ivy-retrieve-checkstyle:
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:cachepath] :: loading settings :: file = /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/ivy/ivysettings.xml

check-for-checkstyle:

checkstyle:
[checkstyle] Running Checkstyle 4.2 on 71 files
     [xslt] Processing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/build/test/checkstyle-errors.xml to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/build/test/checkstyle-errors.html
     [xslt] Loading stylesheet /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/src/test/checkstyle-noframes-sorted.xsl

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.2.0/ivy-2.2.0.jar
      [get] To: /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/ivy/ivy-2.2.0.jar
      [get] Not modified - so not downloaded

ivy-init-dirs:

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:

ivy-resolve-common:

ivy-retrieve-common:

ivy-resolve-mapred:

ivy-retrieve-mapred:

init:
    [touch] Creating /tmp/null495614801
   [delete] Deleting: /tmp/null495614801
    [unzip] Expanding: /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/build/ivy/lib/Hadoop/common/hadoop-hdfs-0.24.0-SNAPSHOT.jar into /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/build

check-c++-configure:

create-c++-utils-configure:

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/build.xml:1849: Execute failed: java.io.IOException: Cannot run program "autoreconf" (in directory "/home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/src/c++/utils"): java.io.IOException: error=2, No such file or directory

Total time: 30 seconds
Build step 'Execute shell' marked build as failure
Recording test results
Updating HDFS-1869
Updating MAPREDUCE-3205
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
All tests passed

Re: Hadoop-Mapreduce-trunk-Commit - Build # 1175 - Still Failing

Posted by Todd Lipcon <to...@cloudera.com>.
For config management purposes: we also need libc6-dev-i386 for the
32-bit cross-compiles to work.

-Todd
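
For context, a quick way to verify that a slave can do the 32-bit
cross-compiles is to build a trivial C file with gcc -m32; that only works
once the 32-bit libc development files are present (on Debian/Ubuntu,
libc6-dev-i386, usually pulled in via gcc-multilib). A minimal sketch
follows; the file name and command line are illustrative and not part of
the Hadoop build.

    /* m32_probe.c: trivial translation unit; the -m32 build line is the test.
     * Build: gcc -m32 m32_probe.c -o m32_probe
     * On a 64-bit Debian/Ubuntu slave this fails unless the 32-bit libc
     * development files (libc6-dev-i386, typically via gcc-multilib) are
     * installed. */
    #include <stdio.h>

    int main(void)
    {
        /* A 4-byte pointer confirms the toolchain produced a 32-bit binary. */
        printf("sizeof(void *) = %zu\n", sizeof(void *));
        return 0;
    }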

On Fri, Oct 28, 2011 at 11:51 AM, giridharan kesavan
<gk...@hortonworks.com> wrote:
> I've added hadoop5 to the mapreduce build pool, which is causing this. I'm
> looking into this.
> -Giri
>
> On 10/28/11 11:05 AM, Todd Lipcon wrote:
>>
>> Not sure exactly what changed on 10/26, but our MR builds are now
>> failing because they can't find libcrypto.so. The odd thing is that
>> libcrypto.so is indeed installed on the build boxes (both 32-bit and
>> 64-bit).
>>
>> Any ideas?
>>
>> On Wed, Oct 26, 2011 at 5:07 PM, Apache Jenkins Server
>> <je...@builds.apache.org>  wrote:
>>>
>>> See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Commit/1175/
>>>
>>>
>>> ###################################################################################
>>> ########################## LAST 60 LINES OF THE CONSOLE
>>> ###########################
>>> [...truncated 10036 lines...]
>>>    [touch] Creating /tmp/null1185950683
>>>   [delete] Deleting: /tmp/null1185950683
>>>    [unzip] Expanding:
>>> /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/build/ivy/lib/Hadoop/common/hadoop-hdfs-0.24.0-SNAPSHOT.jar
>>> into
>>> /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/build
>>>
>>> check-c++-configure:
>>>
>>> create-c++-pipes-configure:
>>>     [exec] checking for a BSD-compatible install... /usr/bin/install -c
>>>     [exec] checking whether build environment is sane... yes
>>>     [exec] checking for a thread-safe mkdir -p... /bin/mkdir -p
>>>     [exec] checking for gawk... no
>>>     [exec] checking for mawk... mawk
>>>     [exec] checking whether make sets $(MAKE)... yes
>>>     [exec] checking for style of include used by make... GNU
>>>     [exec] checking for gcc... gcc
>>>     [exec] checking whether the C compiler works... yes
>>>     [exec] checking for C compiler default output file name... a.out
>>>     [exec] checking for suffix of executables...
>>>     [exec] checking whether we are cross compiling... no
>>>     [exec] checking for suffix of object files... o
>>>     [exec] checking whether we are using the GNU C compiler... yes
>>>     [exec] checking whether gcc accepts -g... yes
>>>     [exec] checking for gcc option to accept ISO C89... none needed
>>>     [exec] checking dependency style of gcc... gcc3
>>>     [exec] checking how to run the C preprocessor... gcc -E
>>>     [exec] checking for grep that handles long lines and -e... /bin/grep
>>>     [exec] checking for egrep... /bin/grep -E
>>>     [exec] checking for ANSI C header files... yes
>>>     [exec] checking for sys/types.h... yes
>>>     [exec] checking for sys/stat.h... yes
>>>     [exec] checking for stdlib.h... yes
>>>     [exec] checking for string.h... yes
>>>     [exec] checking for memory.h... yes
>>>     [exec] checking for strings.h... yes
>>>     [exec] checking for inttypes.h... yes
>>>     [exec] checking for stdint.h... yes
>>>     [exec] checking for unistd.h... yes
>>>     [exec] checking minix/config.h usability... no
>>>     [exec] checking minix/config.h presence... no
>>>     [exec] checking for minix/config.h... no
>>>     [exec] checking whether it is safe to define __EXTENSIONS__... yes
>>>     [exec] checking for special C compiler options needed for large
>>> files... no
>>>     [exec] checking for _FILE_OFFSET_BITS value needed for large files...
>>> no
>>>     [exec] checking pthread.h usability... yes
>>>     [exec] checking pthread.h presence... yes
>>>     [exec] checking for pthread.h... yes
>>>     [exec] checking for pthread_create in -lpthread... yes
>>>     [exec] checking for HMAC_Init in -lcrypto... no
>>>     [exec]
>>> /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/src/c++/pipes/configure:
>>> line 266: return: please: numeric argument required
>>>     [exec] configure: error: Cannot find libcrypto.so
>>>
>>> BUILD FAILED
>>>
>>> /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/build.xml:1916:
>>> exec returned: 255
>>>
>>> Total time: 29 seconds
>>> Build step 'Execute shell' marked build as failure
>>> Recording test results
>>> Email was triggered for: Failure
>>> Sending email for trigger: Failure
>>>
>>>
>>>
>>>
>>> ###################################################################################
>>> ############################## FAILED TESTS (if any)
>>> ##############################
>>> All tests passed
>>>
>>
>>
>
>
> --
> -Giri
>
>



-- 
Todd Lipcon
Software Engineer, Cloudera

Re: Hadoop-Mapreduce-trunk-Commit - Build # 1175 - Still Failing

Posted by giridharan kesavan <gk...@hortonworks.com>.
I've added hadoop5 to the mapreduce build pool, which is causing this.
I'm looking into this.
-Giri

On 10/28/11 11:05 AM, Todd Lipcon wrote:
> Not sure exactly what changed on 10/26, but our MR builds are now
> failing because they can't find libcrypto.so. The odd thing is that
> libcrypto.so is indeed installed on the build boxes (both 32-bit and
> 64-bit).
>
> Any ideas?
>
> On Wed, Oct 26, 2011 at 5:07 PM, Apache Jenkins Server
> <je...@builds.apache.org>  wrote:
>> See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Commit/1175/
>>
>> ###################################################################################
>> ########################## LAST 60 LINES OF THE CONSOLE ###########################
>> [...truncated 10036 lines...]
>>     [touch] Creating /tmp/null1185950683
>>    [delete] Deleting: /tmp/null1185950683
>>     [unzip] Expanding: /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/build/ivy/lib/Hadoop/common/hadoop-hdfs-0.24.0-SNAPSHOT.jar into /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/build
>>
>> check-c++-configure:
>>
>> create-c++-pipes-configure:
>>      [exec] checking for a BSD-compatible install... /usr/bin/install -c
>>      [exec] checking whether build environment is sane... yes
>>      [exec] checking for a thread-safe mkdir -p... /bin/mkdir -p
>>      [exec] checking for gawk... no
>>      [exec] checking for mawk... mawk
>>      [exec] checking whether make sets $(MAKE)... yes
>>      [exec] checking for style of include used by make... GNU
>>      [exec] checking for gcc... gcc
>>      [exec] checking whether the C compiler works... yes
>>      [exec] checking for C compiler default output file name... a.out
>>      [exec] checking for suffix of executables...
>>      [exec] checking whether we are cross compiling... no
>>      [exec] checking for suffix of object files... o
>>      [exec] checking whether we are using the GNU C compiler... yes
>>      [exec] checking whether gcc accepts -g... yes
>>      [exec] checking for gcc option to accept ISO C89... none needed
>>      [exec] checking dependency style of gcc... gcc3
>>      [exec] checking how to run the C preprocessor... gcc -E
>>      [exec] checking for grep that handles long lines and -e... /bin/grep
>>      [exec] checking for egrep... /bin/grep -E
>>      [exec] checking for ANSI C header files... yes
>>      [exec] checking for sys/types.h... yes
>>      [exec] checking for sys/stat.h... yes
>>      [exec] checking for stdlib.h... yes
>>      [exec] checking for string.h... yes
>>      [exec] checking for memory.h... yes
>>      [exec] checking for strings.h... yes
>>      [exec] checking for inttypes.h... yes
>>      [exec] checking for stdint.h... yes
>>      [exec] checking for unistd.h... yes
>>      [exec] checking minix/config.h usability... no
>>      [exec] checking minix/config.h presence... no
>>      [exec] checking for minix/config.h... no
>>      [exec] checking whether it is safe to define __EXTENSIONS__... yes
>>      [exec] checking for special C compiler options needed for large files... no
>>      [exec] checking for _FILE_OFFSET_BITS value needed for large files... no
>>      [exec] checking pthread.h usability... yes
>>      [exec] checking pthread.h presence... yes
>>      [exec] checking for pthread.h... yes
>>      [exec] checking for pthread_create in -lpthread... yes
>>      [exec] checking for HMAC_Init in -lcrypto... no
>>      [exec] /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/src/c++/pipes/configure: line 266: return: please: numeric argument required
>>      [exec] configure: error: Cannot find libcrypto.so
>>
>> BUILD FAILED
>> /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/build.xml:1916: exec returned: 255
>>
>> Total time: 29 seconds
>> Build step 'Execute shell' marked build as failure
>> Recording test results
>> Email was triggered for: Failure
>> Sending email for trigger: Failure
>>
>>
>>
>> ###################################################################################
>> ############################## FAILED TESTS (if any) ##############################
>> All tests passed
>>
>
>


-- 
-Giri


Re: Hadoop-Mapreduce-trunk-Commit - Build # 1175 - Still Failing

Posted by Mahadev Konar <ma...@hortonworks.com>.
Adding Giri. He's fixing some issues with autoreconf and other
software installations.

mahadev

On Fri, Oct 28, 2011 at 11:12 AM, Todd Lipcon <to...@cloudera.com> wrote:
> Actually, it looks like the libssl0.9.8 package is installed but not
> the libssl-dev package.
>
> Rajiv, would you mind if I installed the libssl-dev package on the
> build machines? It looks like some of them have it and some don't
> (which explains why the build randomly started failing two days ago).
> Or, if there is config management in place, can you please install the
> packages?
>
> Thanks
> -Todd
>
> On Fri, Oct 28, 2011 at 11:05 AM, Todd Lipcon <to...@cloudera.com> wrote:
>> Not sure exactly what changed on 10/26, but our MR builds are now
>> failing because they can't find libcrypto.so. The odd thing is that
>> libcrypto.so is indeed installed on the build boxes (both 32-bit and
>> 64-bit).
>>
>> Any ideas?
>>
>> On Wed, Oct 26, 2011 at 5:07 PM, Apache Jenkins Server
>> <je...@builds.apache.org> wrote:
>>> See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Commit/1175/
>>>
>>> ###################################################################################
>>> ########################## LAST 60 LINES OF THE CONSOLE ###########################
>>> [...truncated 10036 lines...]
>>>    [touch] Creating /tmp/null1185950683
>>>   [delete] Deleting: /tmp/null1185950683
>>>    [unzip] Expanding: /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/build/ivy/lib/Hadoop/common/hadoop-hdfs-0.24.0-SNAPSHOT.jar into /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/build
>>>
>>> check-c++-configure:
>>>
>>> create-c++-pipes-configure:
>>>     [exec] checking for a BSD-compatible install... /usr/bin/install -c
>>>     [exec] checking whether build environment is sane... yes
>>>     [exec] checking for a thread-safe mkdir -p... /bin/mkdir -p
>>>     [exec] checking for gawk... no
>>>     [exec] checking for mawk... mawk
>>>     [exec] checking whether make sets $(MAKE)... yes
>>>     [exec] checking for style of include used by make... GNU
>>>     [exec] checking for gcc... gcc
>>>     [exec] checking whether the C compiler works... yes
>>>     [exec] checking for C compiler default output file name... a.out
>>>     [exec] checking for suffix of executables...
>>>     [exec] checking whether we are cross compiling... no
>>>     [exec] checking for suffix of object files... o
>>>     [exec] checking whether we are using the GNU C compiler... yes
>>>     [exec] checking whether gcc accepts -g... yes
>>>     [exec] checking for gcc option to accept ISO C89... none needed
>>>     [exec] checking dependency style of gcc... gcc3
>>>     [exec] checking how to run the C preprocessor... gcc -E
>>>     [exec] checking for grep that handles long lines and -e... /bin/grep
>>>     [exec] checking for egrep... /bin/grep -E
>>>     [exec] checking for ANSI C header files... yes
>>>     [exec] checking for sys/types.h... yes
>>>     [exec] checking for sys/stat.h... yes
>>>     [exec] checking for stdlib.h... yes
>>>     [exec] checking for string.h... yes
>>>     [exec] checking for memory.h... yes
>>>     [exec] checking for strings.h... yes
>>>     [exec] checking for inttypes.h... yes
>>>     [exec] checking for stdint.h... yes
>>>     [exec] checking for unistd.h... yes
>>>     [exec] checking minix/config.h usability... no
>>>     [exec] checking minix/config.h presence... no
>>>     [exec] checking for minix/config.h... no
>>>     [exec] checking whether it is safe to define __EXTENSIONS__... yes
>>>     [exec] checking for special C compiler options needed for large files... no
>>>     [exec] checking for _FILE_OFFSET_BITS value needed for large files... no
>>>     [exec] checking pthread.h usability... yes
>>>     [exec] checking pthread.h presence... yes
>>>     [exec] checking for pthread.h... yes
>>>     [exec] checking for pthread_create in -lpthread... yes
>>>     [exec] checking for HMAC_Init in -lcrypto... no
>>>     [exec] /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/src/c++/pipes/configure: line 266: return: please: numeric argument required
>>>     [exec] configure: error: Cannot find libcrypto.so
>>>
>>> BUILD FAILED
>>> /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/build.xml:1916: exec returned: 255
>>>
>>> Total time: 29 seconds
>>> Build step 'Execute shell' marked build as failure
>>> Recording test results
>>> Email was triggered for: Failure
>>> Sending email for trigger: Failure
>>>
>>>
>>>
>>> ###################################################################################
>>> ############################## FAILED TESTS (if any) ##############################
>>> All tests passed
>>>
>>
>>
>>
>> --
>> Todd Lipcon
>> Software Engineer, Cloudera
>>
>
>
>
> --
> Todd Lipcon
> Software Engineer, Cloudera
>

Re: Hadoop-Mapreduce-trunk-Commit - Build # 1175 - Still Failing

Posted by Todd Lipcon <to...@cloudera.com>.
Done. I installed it on the machines I could find that were up (two
appear to be down). If there is config management in place, it would be
good to add the package there as well so that, when they come back up,
we don't have this problem again.

-Todd

On Fri, Oct 28, 2011 at 11:20 AM, Rajiv Chittajallu
<ra...@yahoo-inc.com> wrote:
> sure, go ahead.
>
> Todd Lipcon wrote on 10/28/11 at 11:12:26 -0700:
>>Actually, it looks like the libssl0.9.8 package is installed but not
>>the libssl-dev package.
>>
>>Rajiv, would you mind if I installed the libssl-dev package on the
>>build machines? It looks like some of them have it and some don't
>>(which explains why the build randomly started failing two days ago).
>>Or, if there is config management in place, can you please install the
>>packages?
>>
>>Thanks
>>-Todd
>>
>>On Fri, Oct 28, 2011 at 11:05 AM, Todd Lipcon <to...@cloudera.com> wrote:
>>> Not sure exactly what changed on 10/26, but our MR builds are now
>>> failing because they can't find libcrypto.so. The odd thing is that
>>> libcrypto.so is indeed installed on the build boxes (both 32-bit and
>>> 64-bit).
>>>
>>> Any ideas?
>>>
>>> On Wed, Oct 26, 2011 at 5:07 PM, Apache Jenkins Server
>>> <je...@builds.apache.org> wrote:
>>>> See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Commit/1175/
>>>>
>>>> ###################################################################################
>>>> ########################## LAST 60 LINES OF THE CONSOLE ###########################
>>>> [...truncated 10036 lines...]
>>>>    [touch] Creating /tmp/null1185950683
>>>>   [delete] Deleting: /tmp/null1185950683
>>>>    [unzip] Expanding: /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/build/ivy/lib/Hadoop/common/hadoop-hdfs-0.24.0-SNAPSHOT.jar into /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/build
>>>>
>>>> check-c++-configure:
>>>>
>>>> create-c++-pipes-configure:
>>>>     [exec] checking for a BSD-compatible install... /usr/bin/install -c
>>>>     [exec] checking whether build environment is sane... yes
>>>>     [exec] checking for a thread-safe mkdir -p... /bin/mkdir -p
>>>>     [exec] checking for gawk... no
>>>>     [exec] checking for mawk... mawk
>>>>     [exec] checking whether make sets $(MAKE)... yes
>>>>     [exec] checking for style of include used by make... GNU
>>>>     [exec] checking for gcc... gcc
>>>>     [exec] checking whether the C compiler works... yes
>>>>     [exec] checking for C compiler default output file name... a.out
>>>>     [exec] checking for suffix of executables...
>>>>     [exec] checking whether we are cross compiling... no
>>>>     [exec] checking for suffix of object files... o
>>>>     [exec] checking whether we are using the GNU C compiler... yes
>>>>     [exec] checking whether gcc accepts -g... yes
>>>>     [exec] checking for gcc option to accept ISO C89... none needed
>>>>     [exec] checking dependency style of gcc... gcc3
>>>>     [exec] checking how to run the C preprocessor... gcc -E
>>>>     [exec] checking for grep that handles long lines and -e... /bin/grep
>>>>     [exec] checking for egrep... /bin/grep -E
>>>>     [exec] checking for ANSI C header files... yes
>>>>     [exec] checking for sys/types.h... yes
>>>>     [exec] checking for sys/stat.h... yes
>>>>     [exec] checking for stdlib.h... yes
>>>>     [exec] checking for string.h... yes
>>>>     [exec] checking for memory.h... yes
>>>>     [exec] checking for strings.h... yes
>>>>     [exec] checking for inttypes.h... yes
>>>>     [exec] checking for stdint.h... yes
>>>>     [exec] checking for unistd.h... yes
>>>>     [exec] checking minix/config.h usability... no
>>>>     [exec] checking minix/config.h presence... no
>>>>     [exec] checking for minix/config.h... no
>>>>     [exec] checking whether it is safe to define __EXTENSIONS__... yes
>>>>     [exec] checking for special C compiler options needed for large files... no
>>>>     [exec] checking for _FILE_OFFSET_BITS value needed for large files... no
>>>>     [exec] checking pthread.h usability... yes
>>>>     [exec] checking pthread.h presence... yes
>>>>     [exec] checking for pthread.h... yes
>>>>     [exec] checking for pthread_create in -lpthread... yes
>>>>     [exec] checking for HMAC_Init in -lcrypto... no
>>>>     [exec] /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/src/c++/pipes/configure: line 266: return: please: numeric argument required
>>>>     [exec] configure: error: Cannot find libcrypto.so
>>>>
>>>> BUILD FAILED
>>>> /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/build.xml:1916: exec returned: 255
>>>>
>>>> Total time: 29 seconds
>>>> Build step 'Execute shell' marked build as failure
>>>> Recording test results
>>>> Email was triggered for: Failure
>>>> Sending email for trigger: Failure
>>>>
>>>>
>>>>
>>>> ###################################################################################
>>>> ############################## FAILED TESTS (if any) ##############################
>>>> All tests passed
>>>>
>>>
>>>
>>>
>>> --
>>> Todd Lipcon
>>> Software Engineer, Cloudera
>>>
>>
>>
>>
>>--
>>Todd Lipcon
>>Software Engineer, Cloudera
>>
>



-- 
Todd Lipcon
Software Engineer, Cloudera

Re: Hadoop-Mapreduce-trunk-Commit - Build # 1175 - Still Failing

Posted by Rajiv Chittajallu <ra...@yahoo-inc.com>.
sure, go ahead. 

Todd Lipcon wrote on 10/28/11 at 11:12:26 -0700:
>Actually, it looks like the libssl0.9.8 package is installed but not
>the libssl-dev package.
>
>Rajiv, would you mind if I installed the libssl-dev package on the
>build machines? It looks like some of them have it and some don't
>(which explains why the build randomly started failing two days ago).
>Or, if there is config management in place, can you please install the
>packages?
>
>Thanks
>-Todd
>
>On Fri, Oct 28, 2011 at 11:05 AM, Todd Lipcon <to...@cloudera.com> wrote:
>> Not sure exactly what changed on 10/26, but our MR builds are now
>> failing because they can't find libcrypto.so. The odd thing is that
>> libcrypto.so is indeed installed on the build boxes (both 32-bit and
>> 64-bit).
>>
>> Any ideas?
>>
>> On Wed, Oct 26, 2011 at 5:07 PM, Apache Jenkins Server
>> <je...@builds.apache.org> wrote:
>>> See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Commit/1175/
>>>
>>> ###################################################################################
>>> ########################## LAST 60 LINES OF THE CONSOLE ###########################
>>> [...truncated 10036 lines...]
>>>    [touch] Creating /tmp/null1185950683
>>>   [delete] Deleting: /tmp/null1185950683
>>>    [unzip] Expanding: /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/build/ivy/lib/Hadoop/common/hadoop-hdfs-0.24.0-SNAPSHOT.jar into /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/build
>>>
>>> check-c++-configure:
>>>
>>> create-c++-pipes-configure:
>>>     [exec] checking for a BSD-compatible install... /usr/bin/install -c
>>>     [exec] checking whether build environment is sane... yes
>>>     [exec] checking for a thread-safe mkdir -p... /bin/mkdir -p
>>>     [exec] checking for gawk... no
>>>     [exec] checking for mawk... mawk
>>>     [exec] checking whether make sets $(MAKE)... yes
>>>     [exec] checking for style of include used by make... GNU
>>>     [exec] checking for gcc... gcc
>>>     [exec] checking whether the C compiler works... yes
>>>     [exec] checking for C compiler default output file name... a.out
>>>     [exec] checking for suffix of executables...
>>>     [exec] checking whether we are cross compiling... no
>>>     [exec] checking for suffix of object files... o
>>>     [exec] checking whether we are using the GNU C compiler... yes
>>>     [exec] checking whether gcc accepts -g... yes
>>>     [exec] checking for gcc option to accept ISO C89... none needed
>>>     [exec] checking dependency style of gcc... gcc3
>>>     [exec] checking how to run the C preprocessor... gcc -E
>>>     [exec] checking for grep that handles long lines and -e... /bin/grep
>>>     [exec] checking for egrep... /bin/grep -E
>>>     [exec] checking for ANSI C header files... yes
>>>     [exec] checking for sys/types.h... yes
>>>     [exec] checking for sys/stat.h... yes
>>>     [exec] checking for stdlib.h... yes
>>>     [exec] checking for string.h... yes
>>>     [exec] checking for memory.h... yes
>>>     [exec] checking for strings.h... yes
>>>     [exec] checking for inttypes.h... yes
>>>     [exec] checking for stdint.h... yes
>>>     [exec] checking for unistd.h... yes
>>>     [exec] checking minix/config.h usability... no
>>>     [exec] checking minix/config.h presence... no
>>>     [exec] checking for minix/config.h... no
>>>     [exec] checking whether it is safe to define __EXTENSIONS__... yes
>>>     [exec] checking for special C compiler options needed for large files... no
>>>     [exec] checking for _FILE_OFFSET_BITS value needed for large files... no
>>>     [exec] checking pthread.h usability... yes
>>>     [exec] checking pthread.h presence... yes
>>>     [exec] checking for pthread.h... yes
>>>     [exec] checking for pthread_create in -lpthread... yes
>>>     [exec] checking for HMAC_Init in -lcrypto... no
>>>     [exec] /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/src/c++/pipes/configure: line 266: return: please: numeric argument required
>>>     [exec] configure: error: Cannot find libcrypto.so
>>>
>>> BUILD FAILED
>>> /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/build.xml:1916: exec returned: 255
>>>
>>> Total time: 29 seconds
>>> Build step 'Execute shell' marked build as failure
>>> Recording test results
>>> Email was triggered for: Failure
>>> Sending email for trigger: Failure
>>>
>>>
>>>
>>> ###################################################################################
>>> ############################## FAILED TESTS (if any) ##############################
>>> All tests passed
>>>
>>
>>
>>
>> --
>> Todd Lipcon
>> Software Engineer, Cloudera
>>
>
>
>
>-- 
>Todd Lipcon
>Software Engineer, Cloudera
>

Re: Hadoop-Mapreduce-trunk-Commit - Build # 1175 - Still Failing

Posted by Todd Lipcon <to...@cloudera.com>.
Actually, it looks like the libssl0.9.8 package is installed but not
the libssl-dev package.

Rajiv, would you mind if I installed the libssl-dev package on the
build machines? It looks like some of them have it and some don't
(which explains why the build randomly started failing two days ago).
Or, if there is config management in place, can you please install the
packages?

Thanks
-Todd
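
For context, the failing "checking for HMAC_Init in -lcrypto" step is an
autoconf link test roughly like the sketch below; the program is only
linked, never run. The -lcrypto flag resolves only against an unversioned
libcrypto.so (or libcrypto.a), which on these Debian/Ubuntu slaves ships in
libssl-dev, while libssl0.9.8 provides only the versioned
libcrypto.so.0.9.8, so the probe fails even though the runtime library is
installed. The file name and compile command below are illustrative, not
taken from the Hadoop build.

    /* lcrypto_probe.c: a minimal sketch of what AC_CHECK_LIB(crypto, HMAC_Init)
     * compiles and links during configure.
     * Build: gcc lcrypto_probe.c -lcrypto -o lcrypto_probe
     * A successful link is the whole test; the binary is never executed. */
    char HMAC_Init(void);   /* dummy declaration; the real symbol must come from -lcrypto */

    int main(void)
    {
        /* Referencing the symbol forces the linker to resolve it. Without the
         * unversioned libcrypto.so symlink (shipped by libssl-dev) the link,
         * and therefore the configure check, fails. */
        return HMAC_Init != 0;
    }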

On Fri, Oct 28, 2011 at 11:05 AM, Todd Lipcon <to...@cloudera.com> wrote:
> Not sure exactly what changed on 10/26, but our MR builds are now
> failing because they can't find libcrypto.so. The odd thing is that
> libcrypto.so is indeed installed on the build boxes (both 32-bit and
> 64-bit).
>
> Any ideas?
>
> On Wed, Oct 26, 2011 at 5:07 PM, Apache Jenkins Server
> <je...@builds.apache.org> wrote:
>> See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Commit/1175/
>>
>> ###################################################################################
>> ########################## LAST 60 LINES OF THE CONSOLE ###########################
>> [...truncated 10036 lines...]
>>    [touch] Creating /tmp/null1185950683
>>   [delete] Deleting: /tmp/null1185950683
>>    [unzip] Expanding: /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/build/ivy/lib/Hadoop/common/hadoop-hdfs-0.24.0-SNAPSHOT.jar into /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/build
>>
>> check-c++-configure:
>>
>> create-c++-pipes-configure:
>>     [exec] checking for a BSD-compatible install... /usr/bin/install -c
>>     [exec] checking whether build environment is sane... yes
>>     [exec] checking for a thread-safe mkdir -p... /bin/mkdir -p
>>     [exec] checking for gawk... no
>>     [exec] checking for mawk... mawk
>>     [exec] checking whether make sets $(MAKE)... yes
>>     [exec] checking for style of include used by make... GNU
>>     [exec] checking for gcc... gcc
>>     [exec] checking whether the C compiler works... yes
>>     [exec] checking for C compiler default output file name... a.out
>>     [exec] checking for suffix of executables...
>>     [exec] checking whether we are cross compiling... no
>>     [exec] checking for suffix of object files... o
>>     [exec] checking whether we are using the GNU C compiler... yes
>>     [exec] checking whether gcc accepts -g... yes
>>     [exec] checking for gcc option to accept ISO C89... none needed
>>     [exec] checking dependency style of gcc... gcc3
>>     [exec] checking how to run the C preprocessor... gcc -E
>>     [exec] checking for grep that handles long lines and -e... /bin/grep
>>     [exec] checking for egrep... /bin/grep -E
>>     [exec] checking for ANSI C header files... yes
>>     [exec] checking for sys/types.h... yes
>>     [exec] checking for sys/stat.h... yes
>>     [exec] checking for stdlib.h... yes
>>     [exec] checking for string.h... yes
>>     [exec] checking for memory.h... yes
>>     [exec] checking for strings.h... yes
>>     [exec] checking for inttypes.h... yes
>>     [exec] checking for stdint.h... yes
>>     [exec] checking for unistd.h... yes
>>     [exec] checking minix/config.h usability... no
>>     [exec] checking minix/config.h presence... no
>>     [exec] checking for minix/config.h... no
>>     [exec] checking whether it is safe to define __EXTENSIONS__... yes
>>     [exec] checking for special C compiler options needed for large files... no
>>     [exec] checking for _FILE_OFFSET_BITS value needed for large files... no
>>     [exec] checking pthread.h usability... yes
>>     [exec] checking pthread.h presence... yes
>>     [exec] checking for pthread.h... yes
>>     [exec] checking for pthread_create in -lpthread... yes
>>     [exec] checking for HMAC_Init in -lcrypto... no
>>     [exec] /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/src/c++/pipes/configure: line 266: return: please: numeric argument required
>>     [exec] configure: error: Cannot find libcrypto.so
>>
>> BUILD FAILED
>> /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/build.xml:1916: exec returned: 255
>>
>> Total time: 29 seconds
>> Build step 'Execute shell' marked build as failure
>> Recording test results
>> Email was triggered for: Failure
>> Sending email for trigger: Failure
>>
>>
>>
>> ###################################################################################
>> ############################## FAILED TESTS (if any) ##############################
>> All tests passed
>>
>
>
>
> --
> Todd Lipcon
> Software Engineer, Cloudera
>



-- 
Todd Lipcon
Software Engineer, Cloudera

Re: Hadoop-Mapreduce-trunk-Commit - Build # 1175 - Still Failing

Posted by Todd Lipcon <to...@cloudera.com>.
Not sure exactly what changed on 10/26, but our MR builds are now
failing because they can't find libcrypto.so. The odd thing is that
libcrypto.so is indeed installed on the build boxes (both 32-bit and
64-bit).

Any ideas?

On Wed, Oct 26, 2011 at 5:07 PM, Apache Jenkins Server
<je...@builds.apache.org> wrote:
> See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Commit/1175/
>
> ###################################################################################
> ########################## LAST 60 LINES OF THE CONSOLE ###########################
> [...truncated 10036 lines...]
>    [touch] Creating /tmp/null1185950683
>   [delete] Deleting: /tmp/null1185950683
>    [unzip] Expanding: /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/build/ivy/lib/Hadoop/common/hadoop-hdfs-0.24.0-SNAPSHOT.jar into /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/build
>
> check-c++-configure:
>
> create-c++-pipes-configure:
>     [exec] checking for a BSD-compatible install... /usr/bin/install -c
>     [exec] checking whether build environment is sane... yes
>     [exec] checking for a thread-safe mkdir -p... /bin/mkdir -p
>     [exec] checking for gawk... no
>     [exec] checking for mawk... mawk
>     [exec] checking whether make sets $(MAKE)... yes
>     [exec] checking for style of include used by make... GNU
>     [exec] checking for gcc... gcc
>     [exec] checking whether the C compiler works... yes
>     [exec] checking for C compiler default output file name... a.out
>     [exec] checking for suffix of executables...
>     [exec] checking whether we are cross compiling... no
>     [exec] checking for suffix of object files... o
>     [exec] checking whether we are using the GNU C compiler... yes
>     [exec] checking whether gcc accepts -g... yes
>     [exec] checking for gcc option to accept ISO C89... none needed
>     [exec] checking dependency style of gcc... gcc3
>     [exec] checking how to run the C preprocessor... gcc -E
>     [exec] checking for grep that handles long lines and -e... /bin/grep
>     [exec] checking for egrep... /bin/grep -E
>     [exec] checking for ANSI C header files... yes
>     [exec] checking for sys/types.h... yes
>     [exec] checking for sys/stat.h... yes
>     [exec] checking for stdlib.h... yes
>     [exec] checking for string.h... yes
>     [exec] checking for memory.h... yes
>     [exec] checking for strings.h... yes
>     [exec] checking for inttypes.h... yes
>     [exec] checking for stdint.h... yes
>     [exec] checking for unistd.h... yes
>     [exec] checking minix/config.h usability... no
>     [exec] checking minix/config.h presence... no
>     [exec] checking for minix/config.h... no
>     [exec] checking whether it is safe to define __EXTENSIONS__... yes
>     [exec] checking for special C compiler options needed for large files... no
>     [exec] checking for _FILE_OFFSET_BITS value needed for large files... no
>     [exec] checking pthread.h usability... yes
>     [exec] checking pthread.h presence... yes
>     [exec] checking for pthread.h... yes
>     [exec] checking for pthread_create in -lpthread... yes
>     [exec] checking for HMAC_Init in -lcrypto... no
>     [exec] /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/src/c++/pipes/configure: line 266: return: please: numeric argument required
>     [exec] configure: error: Cannot find libcrypto.so
>
> BUILD FAILED
> /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/build.xml:1916: exec returned: 255
>
> Total time: 29 seconds
> Build step 'Execute shell' marked build as failure
> Recording test results
> Email was triggered for: Failure
> Sending email for trigger: Failure
>
>
>
> ###################################################################################
> ############################## FAILED TESTS (if any) ##############################
> All tests passed
>



-- 
Todd Lipcon
Software Engineer, Cloudera

Hadoop-Mapreduce-trunk-Commit - Build # 1175 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Commit/1175/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 10036 lines...]
    [touch] Creating /tmp/null1185950683
   [delete] Deleting: /tmp/null1185950683
    [unzip] Expanding: /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/build/ivy/lib/Hadoop/common/hadoop-hdfs-0.24.0-SNAPSHOT.jar into /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/build

check-c++-configure:

create-c++-pipes-configure:
     [exec] checking for a BSD-compatible install... /usr/bin/install -c
     [exec] checking whether build environment is sane... yes
     [exec] checking for a thread-safe mkdir -p... /bin/mkdir -p
     [exec] checking for gawk... no
     [exec] checking for mawk... mawk
     [exec] checking whether make sets $(MAKE)... yes
     [exec] checking for style of include used by make... GNU
     [exec] checking for gcc... gcc
     [exec] checking whether the C compiler works... yes
     [exec] checking for C compiler default output file name... a.out
     [exec] checking for suffix of executables... 
     [exec] checking whether we are cross compiling... no
     [exec] checking for suffix of object files... o
     [exec] checking whether we are using the GNU C compiler... yes
     [exec] checking whether gcc accepts -g... yes
     [exec] checking for gcc option to accept ISO C89... none needed
     [exec] checking dependency style of gcc... gcc3
     [exec] checking how to run the C preprocessor... gcc -E
     [exec] checking for grep that handles long lines and -e... /bin/grep
     [exec] checking for egrep... /bin/grep -E
     [exec] checking for ANSI C header files... yes
     [exec] checking for sys/types.h... yes
     [exec] checking for sys/stat.h... yes
     [exec] checking for stdlib.h... yes
     [exec] checking for string.h... yes
     [exec] checking for memory.h... yes
     [exec] checking for strings.h... yes
     [exec] checking for inttypes.h... yes
     [exec] checking for stdint.h... yes
     [exec] checking for unistd.h... yes
     [exec] checking minix/config.h usability... no
     [exec] checking minix/config.h presence... no
     [exec] checking for minix/config.h... no
     [exec] checking whether it is safe to define __EXTENSIONS__... yes
     [exec] checking for special C compiler options needed for large files... no
     [exec] checking for _FILE_OFFSET_BITS value needed for large files... no
     [exec] checking pthread.h usability... yes
     [exec] checking pthread.h presence... yes
     [exec] checking for pthread.h... yes
     [exec] checking for pthread_create in -lpthread... yes
     [exec] checking for HMAC_Init in -lcrypto... no
     [exec] /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/src/c++/pipes/configure: line 266: return: please: numeric argument required
     [exec] configure: error: Cannot find libcrypto.so

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/hadoop-mapreduce-project/build.xml:1916: exec returned: 255

Total time: 29 seconds
Build step 'Execute shell' marked build as failure
Recording test results
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
All tests passed