Posted to common-dev@hadoop.apache.org by Nigel Daley <nd...@yahoo-inc.com> on 2008/09/04 17:48:54 UTC

Hive checkin broke trunk

I've disabled the Hadoop patch process until trunk compilation is fixed.

Nige

Begin forwarded message:

> From: Apache Hudson Server <hu...@hudson.zones.apache.org>
> Date: September 4, 2008 5:43:16 AM PDT
> To: core-dev@hadoop.apache.org
> Subject: Build failed in Hudson: Hadoop-trunk #592
> Reply-To: core-dev@hadoop.apache.org
>
> See http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/592/changes
>
> Changes:
>
> [ddas] HADOOP-3866. Adding a file missed in the earlier commit.
>
> [nigel] use the version of GNU grep already on the system
>
> [omalley] HADOOP-4050. Fix TestFailScheduler to use absolute paths  
> for the work
> directory. (Matei Zaharia via omalley)
>
> [omalley] HADOOP-3866. Added sort and multi-job updates in the  
> JobTracker web ui.
> (Craig Weisenfluh via omalley)
>
> ------------------------------------------
> [...truncated 202373 lines...]
>    [junit] Shutting down DataNode 0
>    [junit] 2008-09-04 12:42:30,540 INFO  util.ThreadedServer  
> (ThreadedServer.java:run(656)) - Stopping Acceptor  
> ServerSocket[addr=localhost/127.0.0.1,port=0,localport=62237]
>    [junit] 2008-09-04 12:42:30,540 INFO  http.SocketListener  
> (SocketListener.java:stop(212)) - Stopped SocketListener on  
> 127.0.0.1:62237
>    [junit] 2008-09-04 12:42:30,610 INFO  util.Container  
> (Container.java:stop(156)) - Stopped HttpContext[/static,/static]
>    [junit] 2008-09-04 12:42:30,673 INFO  util.Container  
> (Container.java:stop(156)) - Stopped HttpContext[/logs,/logs]
>    [junit] 2008-09-04 12:42:30,673 INFO  util.Container  
> (Container.java:stop(156)) - Stopped  
> org.mortbay.jetty.servlet.WebApplicationHandler@1bbbafc
>    [junit] 2008-09-04 12:42:30,739 INFO  util.Container  
> (Container.java:stop(156)) - Stopped WebApplicationContext[/,/]
>    [junit] 2008-09-04 12:42:30,739 INFO  util.Container  
> (Container.java:stop(156)) - Stopped org.mortbay.jetty.Server@2f1e75
>    [junit] 2008-09-04 12:42:30,744 INFO  ipc.Server  
> (Server.java:stop(992)) - Stopping server on 62238
>    [junit] 2008-09-04 12:42:30,744 INFO  ipc.Server  
> (Server.java:run(920)) - IPC Server handler 0 on 62238: exiting
>    [junit] 2008-09-04 12:42:30,745 INFO  ipc.Server  
> (Server.java:run(920)) - IPC Server handler 2 on 62238: exiting
>    [junit] 2008-09-04 12:42:30,745 INFO  ipc.Server  
> (Server.java:run(920)) - IPC Server handler 1 on 62238: exiting
>    [junit] 2008-09-04 12:42:30,745 INFO  ipc.Server  
> (Server.java:run(330)) - Stopping IPC Server listener on 62238
>    [junit] 2008-09-04 12:42:30,746 INFO  datanode.DataNode  
> (DataNode.java:shutdown(563)) - Waiting for threadgroup to exit,  
> active threads is 1
>    [junit] 2008-09-04 12:42:30,747 INFO  datanode.DataBlockScanner  
> (DataBlockScanner.java:run(599)) - Exiting DataBlockScanner thread.
>    [junit] 2008-09-04 12:42:30,747 INFO  datanode.DataNode  
> (DataNode.java:run(1119)) - DatanodeRegistration(127.0.0.1:62236,  
> storageID=DS-141150402-140.211.11.106-62236-1220532147971,  
> infoPort=62237, ipcPort=62238):Finishing DataNode in:  
> FSDataset{dirpath='/zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/test/data/dfs/data/data1/current,/ 
> zonestorage/hudson/home/hudson/hudson/jobs/Hadoop-trunk/workspace/ 
> trunk/build/test/data/dfs/data/data2/current'}
>    [junit] 2008-09-04 12:42:30,748 INFO  ipc.Server  
> (Server.java:stop(992)) - Stopping server on 62238
>    [junit] 2008-09-04 12:42:30,748 INFO  datanode.DataNode  
> (DataNode.java:shutdown(563)) - Waiting for threadgroup to exit,  
> active threads is 0
>    [junit] 2008-09-04 12:42:30,749 INFO  util.ThreadedServer  
> (ThreadedServer.java:run(656)) - Stopping Acceptor  
> ServerSocket[addr=localhost/127.0.0.1,port=0,localport=62233]
>    [junit] 2008-09-04 12:42:30,750 INFO  http.SocketListener  
> (SocketListener.java:stop(212)) - Stopped SocketListener on  
> 127.0.0.1:62233
>    [junit] 2008-09-04 12:42:30,813 INFO  util.Container  
> (Container.java:stop(156)) - Stopped HttpContext[/static,/static]
>    [junit] 2008-09-04 12:42:30,875 INFO  util.Container  
> (Container.java:stop(156)) - Stopped HttpContext[/logs,/logs]
>    [junit] 2008-09-04 12:42:30,876 INFO  util.Container  
> (Container.java:stop(156)) - Stopped  
> org.mortbay.jetty.servlet.WebApplicationHandler@c8c7d6
>    [junit] 2008-09-04 12:42:30,936 INFO  util.Container  
> (Container.java:stop(156)) - Stopped WebApplicationContext[/,/]
>    [junit] 2008-09-04 12:42:30,937 INFO  util.Container  
> (Container.java:stop(156)) - Stopped org.mortbay.jetty.Server@688954
>    [junit] 2008-09-04 12:42:30,937 WARN  namenode.FSNamesystem  
> (FSNamesystem.java:run(2204)) - ReplicationMonitor thread received  
> InterruptedException.java.lang.InterruptedException: sleep interrupted
>    [junit] 2008-09-04 12:42:30,937 INFO  namenode.FSNamesystem  
> (FSEditLog.java:printStatistics(931)) - Number of transactions: 12  
> Total time for transactions(ms): 2 Number of syncs: 9 SyncTimes(ms):  
> 245 130
>    [junit] 2008-09-04 12:42:30,968 INFO  ipc.Server  
> (Server.java:stop(992)) - Stopping server on 62232
>    [junit] 2008-09-04 12:42:30,969 INFO  ipc.Server  
> (Server.java:run(920)) - IPC Server handler 0 on 62232: exiting
>    [junit] 2008-09-04 12:42:30,969 INFO  ipc.Server  
> (Server.java:run(920)) - IPC Server handler 2 on 62232: exiting
>    [junit] 2008-09-04 12:42:30,969 INFO  ipc.Server  
> (Server.java:run(920)) - IPC Server handler 1 on 62232: exiting
>    [junit] 2008-09-04 12:42:30,969 INFO  ipc.Server  
> (Server.java:run(920)) - IPC Server handler 3 on 62232: exiting
>    [junit] 2008-09-04 12:42:30,970 INFO  ipc.Server  
> (Server.java:run(920)) - IPC Server handler 4 on 62232: exiting
>    [junit] 2008-09-04 12:42:30,970 INFO  ipc.Server  
> (Server.java:run(920)) - IPC Server handler 5 on 62232: exiting
>    [junit] 2008-09-04 12:42:30,970 INFO  ipc.Server  
> (Server.java:run(920)) - IPC Server handler 8 on 62232: exiting
>    [junit] 2008-09-04 12:42:30,970 INFO  ipc.Server  
> (Server.java:run(920)) - IPC Server handler 7 on 62232: exiting
>    [junit] 2008-09-04 12:42:30,970 INFO  ipc.Server  
> (Server.java:run(920)) - IPC Server handler 6 on 62232: exiting
>    [junit] 2008-09-04 12:42:30,971 INFO  ipc.Server  
> (Server.java:run(330)) - Stopping IPC Server listener on 62232
>    [junit] 2008-09-04 12:42:30,971 INFO  ipc.Server  
> (Server.java:run(920)) - IPC Server handler 9 on 62232: exiting
>    [junit] 2008-09-04 12:42:30,972 INFO  ipc.Server  
> (Server.java:run(502)) - Stopping IPC Server Responder
>    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed:  
> 12.607 sec
>    [junit] Running  
> org.apache.hadoop.security.TestUnixUserGroupInformation
>    [junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 0.492  
> sec
>    [junit] Running org.apache.hadoop.util.TestGenericsUtil
>    [junit] 2008-09-04 12:42:32,546 WARN  util.GenericOptionsParser  
> (GenericOptionsParser.java:parseGeneralOptions(318)) - options  
> parsing failed: Missing argument for option:jt
>    [junit] usage: general options are:
>    [junit]  -archives <paths>             comma separated archives  
> to be unarchived
>    [junit]                                on the compute machines.
>    [junit]  -conf <configuration file>    specify an application  
> configuration file
>    [junit]  -D <property=value>           use value for given property
>    [junit]  -files <paths>                comma separated files to  
> be copied to the
>    [junit]                                map reduce cluster
>    [junit]  -fs <local|namenode:port>     specify a namenode
>    [junit]  -jt <local|jobtracker:port>   specify a job tracker
>    [junit]  -libjars <paths>              comma separated jar files  
> to include in the
>    [junit]                                classpath.
>    [junit] Tests run: 6, Failures: 0, Errors: 0, Time elapsed: 0.198  
> sec
>    [junit] Running org.apache.hadoop.util.TestIndexedSort
>    [junit] sortRandom seed:  
> -5966691301579468509(org.apache.hadoop.util.QuickSort)
>    [junit] testSorted seed:  
> 138229944466522474(org.apache.hadoop.util.QuickSort)
>    [junit] testAllEqual setting min/max at  
> 366/489(org.apache.hadoop.util.QuickSort)
>    [junit] sortWritable seed:  
> -1723696620242293862(org.apache.hadoop.util.QuickSort)
>    [junit] QuickSort degen cmp/swp:  
> 23252/3713(org.apache.hadoop.util.QuickSort)
>    [junit] sortRandom seed:  
> -5548688146960665304(org.apache.hadoop.util.HeapSort)
>    [junit] testSorted seed:  
> 5415873229771813945(org.apache.hadoop.util.HeapSort)
>    [junit] testAllEqual setting min/max at  
> 26/458(org.apache.hadoop.util.HeapSort)
>    [junit] sortWritable seed:  
> -4669593479856437178(org.apache.hadoop.util.HeapSort)
>    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 1.473  
> sec
>    [junit] Running org.apache.hadoop.util.TestReflectionUtils
>    [junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 0.788  
> sec
>    [junit] Running org.apache.hadoop.util.TestShell
>    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.254  
> sec
>    [junit] Running org.apache.hadoop.util.TestStringUtils
>    [junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 0.069  
> sec
>
> compile-libhdfs:
>
> compile-contrib:
>
> compile:
> Trying to override old definition of task macro_tar
>
> init:
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/chukwa
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/chukwa/test
>
> compile:
>    [javac] Compiling 113 source files to /zonestorage/hudson/home/ 
> hudson/hudson/jobs/Hadoop-trunk/workspace/trunk/build/contrib/chukwa
>    [javac] Note: /zonestorage/hudson/home/hudson/hudson/jobs/Hadoop- 
> trunk/workspace/trunk/src/contrib/chukwa/src/java/org/apache/hadoop/ 
> chukwa/extraction/engine/ChukwaRecordJT.java uses unchecked or  
> unsafe operations.
>    [javac] Note: Recompile with -Xlint:unchecked for details.
>
> check-contrib:
>
> init:
>     [echo] contrib: datajoin
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/datajoin
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/datajoin/classes
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/datajoin/test
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/datajoin/examples
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/datajoin/test/logs
>
> init-contrib:
>
> compile:
>     [echo] contrib: datajoin
>    [javac] Compiling 7 source files to /zonestorage/hudson/home/ 
> hudson/hudson/jobs/Hadoop-trunk/workspace/trunk/build/contrib/ 
> datajoin/classes
>    [javac] Note: /zonestorage/hudson/home/hudson/hudson/jobs/Hadoop- 
> trunk/workspace/trunk/src/contrib/data_join/src/java/org/apache/ 
> hadoop/contrib/utils/join/DataJoinJob.java uses or overrides a  
> deprecated API.
>    [javac] Note: Recompile with -Xlint:deprecation for details.
>    [javac] Note: Some input files use unchecked or unsafe operations.
>    [javac] Note: Recompile with -Xlint:unchecked for details.
>
> check-contrib:
>     [echo] eclipse.home unset: skipping eclipse plugin
>
> init:
>
> compile:
>
> check-contrib:
>
> init:
>     [echo] contrib: failmon
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/failmon
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/failmon/classes
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/failmon/test
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/failmon/examples
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/failmon/test/logs
>
> init-contrib:
>
> compile:
>     [echo] contrib: failmon
>    [javac] Compiling 21 source files to /zonestorage/hudson/home/ 
> hudson/hudson/jobs/Hadoop-trunk/workspace/trunk/build/contrib/ 
> failmon/classes
>    [javac] Note: /zonestorage/hudson/home/hudson/hudson/jobs/Hadoop- 
> trunk/workspace/trunk/src/contrib/failmon/src/java/org/apache/hadoop/ 
> contrib/failmon/HDFSMerger.java uses or overrides a deprecated API.
>    [javac] Note: Recompile with -Xlint:deprecation for details.
>
> check-contrib:
>
> init:
>     [echo] contrib: fairscheduler
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/fairscheduler
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/fairscheduler/classes
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/fairscheduler/test
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/fairscheduler/examples
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/fairscheduler/test/logs
>
> init-contrib:
>
> compile:
>     [echo] contrib: fairscheduler
>    [javac] Compiling 13 source files to /zonestorage/hudson/home/ 
> hudson/hudson/jobs/Hadoop-trunk/workspace/trunk/build/contrib/ 
> fairscheduler/classes
>
> check-libhdfs-fuse:
>
> check-libhdfs-exists:
>
> compile:
>
> compile:
>
> init:
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/hive
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/hive/common
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/hive/common/classes
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/hive/jexl/classes
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/hive/common/test
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/hive/common/test/src
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/hive/common/test/classes
>
> compile:
>     [echo] Compiling: common
>    [javac] Compiling 1 source file to /zonestorage/hudson/home/ 
> hudson/hudson/jobs/Hadoop-trunk/workspace/trunk/build/contrib/hive/ 
> common/classes
>
> init:
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/hive/serde
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/hive/serde/classes
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/hive/serde/test
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/hive/serde/test/src
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/hive/serde/test/classes
>
> compile:
>     [echo] Compiling: serde
>    [javac] Compiling 92 source files to /zonestorage/hudson/home/ 
> hudson/hudson/jobs/Hadoop-trunk/workspace/trunk/build/contrib/hive/ 
> serde/classes
>    [javac] Note: Some input files use unchecked or unsafe operations.
>    [javac] Note: Recompile with -Xlint:unchecked for details.
>
> init:
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/hive/metastore
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/hive/metastore/classes
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/hive/metastore/test
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/hive/metastore/test/src
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/hive/metastore/test/classes
>
> model-compile:
>    [javac] Compiling 8 source files to /zonestorage/hudson/home/ 
> hudson/hudson/jobs/Hadoop-trunk/workspace/trunk/build/contrib/hive/ 
> metastore/classes
>     [copy] Copying 1 file to /zonestorage/hudson/home/hudson/hudson/ 
> jobs/Hadoop-trunk/workspace/trunk/build/contrib/hive/metastore/classes
>
> core-compile:
>     [echo] Compiling:
>    [javac] Compiling 38 source files to /zonestorage/hudson/home/ 
> hudson/hudson/jobs/Hadoop-trunk/workspace/trunk/build/contrib/hive/ 
> metastore/classes
>    [javac] Note: /zonestorage/hudson/home/hudson/hudson/jobs/Hadoop- 
> trunk/workspace/trunk/src/contrib/hive/metastore/src/java/org/apache/ 
> hadoop/hive/metastore/ObjectStore.java uses unchecked or unsafe  
> operations.
>    [javac] Note: Recompile with -Xlint:unchecked for details.
>
> model-enhance:
>     [echo] Enhancing model classes with JPOX stuff....
>     [java] JPOX Enhancer (version 1.2.2) : Enhancement of classes
>     [java]
>     [java] JPOX Enhancer completed with success for 8 classes.  
> Timings : input=211 ms, enhance=356 ms, total=567 ms. Consult the  
> log for full details
>
> compile:
>
> init:
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/hive/ql
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/hive/ql/classes
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/hive/ql/test
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/hive/ql/test/src
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/hive/ql/test/classes
>
> ql-init:
>    [mkdir] Created dir: /zonestorage/hudson/home/hudson/hudson/jobs/ 
> Hadoop-trunk/workspace/trunk/build/contrib/hive/ql/gen-java/org/ 
> apache/hadoop/hive/ql/parse
>
> build-grammar:
>     [echo] Building Grammar /zonestorage/hudson/home/hudson/hudson/ 
> jobs/Hadoop-trunk/workspace/trunk/src/contrib/hive/ql/src/java/org/ 
> apache/hadoop/hive/ql/parse/Hive.g  ....
>     [java] ANTLR Parser Generator  Version 3.0.1 (August 13, 2007)   
> 1989-2007
>
> compile:
>     [echo] Compiling: ql
>    [javac] Compiling 189 source files to /zonestorage/hudson/home/ 
> hudson/hudson/jobs/Hadoop-trunk/workspace/trunk/build/contrib/hive/ 
> ql/classes
>    [javac] /zonestorage/hudson/home/hudson/hudson/jobs/Hadoop-trunk/ 
> workspace/trunk/src/contrib/hive/ql/src/java/org/apache/hadoop/hive/ 
> ql/exec/Utilities.java:361: cannot find symbol
>    [javac] symbol  : method  
> abbreviate(java.lang.String,int,int,java.lang.String)
>    [javac] location: class org.apache.commons.lang.WordUtils
>    [javac]     String suffix = WordUtils.abbreviate(rev, 0,  
> suffixlength, "");
>    [javac]                              ^
>    [javac] Note: /zonestorage/hudson/home/hudson/hudson/jobs/Hadoop- 
> trunk/workspace/trunk/src/contrib/hive/ql/src/java/org/apache/hadoop/ 
> hive/ql/exec/ExecDriver.java uses or overrides a deprecated API.
>    [javac] Note: Recompile with -Xlint:deprecation for details.
>    [javac] Note: Some input files use unchecked or unsafe operations.
>    [javac] Note: Recompile with -Xlint:unchecked for details.
>    [javac] 1 error
>
> BUILD FAILED
> /zonestorage/hudson/home/hudson/hudson/jobs/Hadoop-trunk/workspace/ 
> trunk/build.xml:415: The following error occurred while executing  
> this line:
> /zonestorage/hudson/home/hudson/hudson/jobs/Hadoop-trunk/workspace/ 
> trunk/src/contrib/build.xml:30: The following error occurred while  
> executing this line:
> /zonestorage/hudson/home/hudson/hudson/jobs/Hadoop-trunk/workspace/ 
> trunk/src/contrib/hive/build.xml:67: The following error occurred  
> while executing this line:
> /zonestorage/hudson/home/hudson/hudson/jobs/Hadoop-trunk/workspace/ 
> trunk/src/contrib/hive/ql/build.xml:95: Compile failed; see the  
> compiler error output for details.
>
> Total time: 85 minutes 32 seconds
> Recording fingerprints
> Publishing Javadoc
> Recording test results
>


Re: Hive checkin broke trunk

Posted by Nigel Daley <nd...@yahoo-inc.com>.
Ok, after a couple of hours of banging against this, it turns out that
rat-0.5.1.jar (used for release auditing -- license header checks and the
like), which we copy into the Hadoop lib directory, contains an older
version of org/apache/commons/lang/WordUtils.class. I filed HADOOP-4074
to fix this. In the meantime, I've commented out the release audit step
in the test-patch process and no longer copy the rat jar into the Hadoop
lib dir.
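
Fwiw, a quick way to confirm which jar a class is actually being resolved
from is to ask its protection domain. This is a hypothetical standalone
check, not part of the Hadoop build; run it on the same classpath the
failing compile uses:

    // Hypothetical diagnostic, not in the Hadoop tree: prints the jar that
    // supplies WordUtils (e.g. rat-0.5.1.jar instead of commons-lang-2.4.jar).
    public class WhichJar {
        public static void main(String[] args) throws Exception {
            Class<?> c = Class.forName("org.apache.commons.lang.WordUtils");
            // getCodeSource() can return null for bootstrap classes, but
            // that is fine for a quick classpath sanity check.
            System.out.println(c.getProtectionDomain().getCodeSource().getLocation());
        }
    }

Something like "java -cp lib/rat-0.5.1.jar:lib/commons-lang-2.4.jar WhichJar"
prints whichever jar wins; the first WordUtils on the classpath is the one
javac sees too.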

n.

On Sep 4, 2008, at 11:19 AM, Ashish Thusoo wrote:

> Looks like HadoopQA is picking up some other version of commons-lang.
>
> I just checked out a fresh version of trunk and was able to successfully
> compile it in my environment. Can you check if the classpath in your
> environment has a different version of commons-lang?
>
> Ashish
>
> -----Original Message-----
> From: Dhruba Borthakur [mailto:dhruba@gmail.com]
> Sent: Thursday, September 04, 2008 11:06 AM
> To: core-dev@hadoop.apache.org
> Subject: Re: Hive checkin broke trunk
>
> The compilation should use version 2.4 of the Apache commons-lang.jar
> file, and it should be picked up from
> src/contrib/hive/lib/commons-lang-2.4.jar. I am wondering if the
> HadoopQA test process is picking up an older version of this jar.
>
> -dhruba
>
> On Thu, Sep 4, 2008 at 10:20 AM, Owen O'Malley <om...@apache.org>
> wrote:
>>
>> On Sep 4, 2008, at 8:48 AM, Nigel Daley wrote:
>>
>>> I've disabled the Hadoop patch process until trunk compilation is
>>> fixed.
>>
>> I can't get the TestUnixUserGroupInformation to fail for me on
>> current trunk. I've tested it on Linux and on hudson.zones.apache.org.
>>
>> -- Owen
>>


RE: Hive checkin broke trunk

Posted by Ashish Thusoo <at...@facebook.com>.
Looks like HadoopQA is picking up some other version of commons-lang.

I just checked out a fresh version of trunk and was able to successfully
compile it in my environment. Can you check if the classpath in your
environment has a different version of commons-lang?

Ashish 

-----Original Message-----
From: Dhruba Borthakur [mailto:dhruba@gmail.com] 
Sent: Thursday, September 04, 2008 11:06 AM
To: core-dev@hadoop.apache.org
Subject: Re: Hive checkin broke trunk

The compilation should use version 2.4 of the Apache commons-lang.jar
file, and it should be picked up from
src/contrib/hive/lib/commons-lang-2.4.jar. I am wondering if the
HadoopQA test process is picking up an older version of this jar.

-dhruba

On Thu, Sep 4, 2008 at 10:20 AM, Owen O'Malley <om...@apache.org>
wrote:
>
> On Sep 4, 2008, at 8:48 AM, Nigel Daley wrote:
>
>> I've disabled the Hadoop patch process until trunk compilation is
>> fixed.
>
> I can't get the TestUnixUserGroupInformation to fail for me on current
> trunk. I've tested it on Linux and on hudson.zones.apache.org.
>
> -- Owen
>

Re: Hive checkin broke trunk

Posted by Dhruba Borthakur <dh...@gmail.com>.
The compilation should use version 2.4 of the Apache commons-lang.jar
file, and it should be picked up from
src/contrib/hive/lib/commons-lang-2.4.jar. I am wondering if the
HadoopQA test process is picking up an older version of this jar.
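
Fwiw, the method that failing line calls,
WordUtils.abbreviate(String, int, int, String), was only added in
commons-lang 2.4, so compiling against 2.3 or earlier fails with exactly
the "cannot find symbol" error in the log above. A minimal sketch that
compiles only against 2.4; the values below are made up, but the call has
the same shape as the one at Utilities.java:361:

    // Assumes commons-lang 2.4 on the classpath; the four-argument
    // abbreviate(String, int, int, String) does not exist in older releases.
    import org.apache.commons.lang.WordUtils;

    public class AbbreviateCheck {
        public static void main(String[] args) {
            String rev = "some-long-revision-string";  // made-up input
            int suffixlength = 12;                     // made-up length
            // Same shape as the failing line in Utilities.java:361.
            String suffix = WordUtils.abbreviate(rev, 0, suffixlength, "");
            System.out.println(suffix);
        }
    }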

-dhruba

On Thu, Sep 4, 2008 at 10:20 AM, Owen O'Malley <om...@apache.org> wrote:
>
> On Sep 4, 2008, at 8:48 AM, Nigel Daley wrote:
>
>> I've disabled the Hadoop patch process until trunk compilation is fixed.
>
> I can't get the TestUnixUserGroupInformation to fail for me on current
> trunk. I've tested it on Linux and on hudson.zones.apache.org.
>
> -- Owen
>

Re: Hive checkin broke trunk

Posted by Owen O'Malley <om...@apache.org>.
On Sep 4, 2008, at 8:48 AM, Nigel Daley wrote:

> I've disabled the Hadoop patch process until trunk compilation is  
> fixed.

I can't get the TestUnixUserGroupInformation to fail for me on current  
trunk. I've tested it on Linux and on hudson.zones.apache.org.

-- Owen