You are viewing a plain text version of this content. The canonical link for it is here.
Posted to dev@pig.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2012/01/12 23:17:55 UTC

Build failed in Jenkins: Pig-trunk #1169

See <https://builds.apache.org/job/Pig-trunk/1169/changes>

Changes:

[daijy] PIG-2431: Upgrade bundled hadoop version to 1.0.0

[dvryaboy] PIG-2468: Speed up TestBuiltin

------------------------------------------
[...truncated 6943 lines...]
 [findbugs]   org.mozilla.javascript.NativeJavaObject
 [findbugs]   jline.ConsoleReaderInputStream
 [findbugs]   org.apache.log4j.PropertyConfigurator
 [findbugs]   org.apache.hadoop.mapred.TaskID
 [findbugs]   org.apache.commons.cli.CommandLine
 [findbugs]   org.python.core.Py
 [findbugs]   org.apache.hadoop.io.BooleanWritable$Comparator
 [findbugs]   org.apache.hadoop.io.LongWritable
 [findbugs]   org.antlr.runtime.BitSet
 [findbugs]   org.apache.hadoop.mapred.jobcontrol.Job
 [findbugs]   org.apache.hadoop.hbase.filter.CompareFilter$CompareOp
 [findbugs]   org.apache.hadoop.io.file.tfile.TFile$Reader
 [findbugs]   org.mozilla.javascript.NativeFunction
 [findbugs]   org.apache.hadoop.mapreduce.Counter
 [findbugs]   org.codehaus.jackson.JsonEncoding
 [findbugs]   org.codehaus.jackson.JsonParseException
 [findbugs]   org.python.core.PyCode
 [findbugs]   com.jcraft.jsch.HostKey
 [findbugs]   org.apache.hadoop.hbase.filter.Filter
 [findbugs]   org.apache.commons.logging.Log
 [findbugs]   com.google.common.util.concurrent.ListenableFuture
 [findbugs]   org.apache.hadoop.util.RunJar
 [findbugs]   org.apache.hadoop.mapred.Counters$Group
 [findbugs]   com.jcraft.jsch.ChannelExec
 [findbugs]   org.apache.hadoop.hbase.util.Base64
 [findbugs]   org.antlr.runtime.TokenStream
 [findbugs]   com.google.common.util.concurrent.CheckedFuture
 [findbugs]   org.apache.hadoop.io.file.tfile.TFile$Reader$Scanner$Entry
 [findbugs]   org.apache.hadoop.fs.FSDataInputStream
 [findbugs]   org.python.core.PyObject
 [findbugs]   jline.History
 [findbugs]   org.apache.hadoop.io.BooleanWritable
 [findbugs]   org.apache.log4j.Logger
 [findbugs]   org.apache.hadoop.hbase.filter.FamilyFilter
 [findbugs]   org.antlr.runtime.IntStream
 [findbugs]   org.apache.hadoop.util.ReflectionUtils
 [findbugs]   org.apache.hadoop.fs.ContentSummary
 [findbugs]   org.python.core.PyTuple
 [findbugs]   org.apache.hadoop.conf.Configuration
 [findbugs]   org.apache.hadoop.mapreduce.lib.input.FileSplit
 [findbugs]   org.apache.hadoop.mapred.Counters$Counter
 [findbugs]   com.jcraft.jsch.Channel
 [findbugs]   org.apache.hadoop.mapred.JobPriority
 [findbugs]   org.apache.commons.cli.Options
 [findbugs]   org.apache.hadoop.mapred.JobID
 [findbugs]   org.apache.hadoop.util.bloom.BloomFilter
 [findbugs]   org.python.core.PyFrame
 [findbugs]   org.apache.hadoop.hbase.filter.CompareFilter
 [findbugs]   org.apache.hadoop.util.VersionInfo
 [findbugs]   org.python.core.PyString
 [findbugs]   org.apache.hadoop.io.Text$Comparator
 [findbugs]   org.antlr.runtime.MismatchedSetException
 [findbugs]   org.apache.hadoop.io.BytesWritable
 [findbugs]   org.apache.hadoop.fs.FsShell
 [findbugs]   org.mozilla.javascript.ImporterTopLevel
 [findbugs]   org.apache.hadoop.hbase.mapreduce.TableOutputFormat
 [findbugs]   org.apache.hadoop.mapred.TaskReport
 [findbugs]   org.antlr.runtime.tree.RewriteRuleSubtreeStream
 [findbugs]   org.apache.commons.cli.HelpFormatter
 [findbugs]   org.mozilla.javascript.NativeObject
 [findbugs]   org.apache.hadoop.hbase.HConstants
 [findbugs]   org.apache.hadoop.io.serializer.Deserializer
 [findbugs]   org.antlr.runtime.FailedPredicateException
 [findbugs]   org.apache.hadoop.io.compress.CompressionCodec
 [findbugs]   org.apache.hadoop.fs.FileStatus
 [findbugs]   org.apache.hadoop.hbase.client.Result
 [findbugs]   org.apache.hadoop.mapreduce.JobContext
 [findbugs]   org.codehaus.jackson.JsonGenerator
 [findbugs]   org.apache.hadoop.mapreduce.TaskAttemptContext
 [findbugs]   org.apache.hadoop.io.BytesWritable$Comparator
 [findbugs]   org.apache.hadoop.io.LongWritable$Comparator
 [findbugs]   org.codehaus.jackson.map.util.LRUMap
 [findbugs]   org.apache.hadoop.hbase.util.Bytes
 [findbugs]   org.antlr.runtime.MismatchedTokenException
 [findbugs]   org.codehaus.jackson.JsonParser
 [findbugs]   com.jcraft.jsch.UserInfo
 [findbugs]   org.python.core.PyException
 [findbugs]   org.apache.commons.cli.ParseException
 [findbugs]   org.apache.hadoop.io.compress.CompressionOutputStream
 [findbugs]   org.apache.hadoop.hbase.filter.WritableByteArrayComparable
 [findbugs]   org.antlr.runtime.tree.CommonTreeNodeStream
 [findbugs]   org.apache.log4j.Level
 [findbugs]   org.apache.hadoop.hbase.client.Scan
 [findbugs]   org.apache.hadoop.mapreduce.Job
 [findbugs]   com.google.common.util.concurrent.Futures
 [findbugs]   org.apache.commons.logging.LogFactory
 [findbugs]   org.apache.commons.codec.binary.Base64
 [findbugs]   org.codehaus.jackson.map.ObjectMapper
 [findbugs]   org.apache.hadoop.fs.FileSystem
 [findbugs]   org.apache.hadoop.hbase.filter.FilterList$Operator
 [findbugs]   org.apache.hadoop.hbase.io.ImmutableBytesWritable
 [findbugs]   org.apache.hadoop.io.serializer.SerializationFactory
 [findbugs]   org.antlr.runtime.tree.TreeAdaptor
 [findbugs]   org.apache.hadoop.mapred.RunningJob
 [findbugs]   org.antlr.runtime.CommonTokenStream
 [findbugs]   org.apache.hadoop.io.DataInputBuffer
 [findbugs]   org.apache.hadoop.io.file.tfile.TFile
 [findbugs]   org.apache.commons.cli.GnuParser
 [findbugs]   org.mozilla.javascript.Context
 [findbugs]   org.apache.hadoop.io.FloatWritable
 [findbugs]   org.antlr.runtime.tree.RewriteEarlyExitException
 [findbugs]   org.apache.hadoop.hbase.HBaseConfiguration
 [findbugs]   org.codehaus.jackson.JsonGenerationException
 [findbugs]   org.apache.hadoop.mapreduce.TaskInputOutputContext
 [findbugs]   org.apache.hadoop.io.compress.GzipCodec
 [findbugs]   org.apache.hadoop.mapred.jobcontrol.JobControl
 [findbugs]   org.antlr.runtime.BaseRecognizer
 [findbugs]   org.apache.hadoop.fs.FileUtil
 [findbugs]   org.apache.hadoop.fs.Path
 [findbugs]   org.apache.hadoop.hbase.client.Put
 [findbugs]   org.apache.hadoop.io.file.tfile.TFile$Writer
 [findbugs]   jline.ConsoleReader
 [findbugs]   com.google.common.collect.Lists
 [findbugs]   org.apache.hadoop.mapreduce.MapContext
 [findbugs]   org.python.core.PyJavaPackage
 [findbugs]   org.apache.hadoop.hbase.filter.ColumnPrefixFilter
 [findbugs]   org.python.core.PyStringMap
 [findbugs]   org.apache.hadoop.mapreduce.TaskID
 [findbugs]   org.apache.hadoop.hbase.client.HTable
 [findbugs]   org.apache.hadoop.io.FloatWritable$Comparator
 [findbugs]   org.apache.zookeeper.ZooKeeper
 [findbugs]   org.codehaus.jackson.map.JsonMappingException
 [findbugs]   org.python.core.PyFunction
 [findbugs]   org.antlr.runtime.TokenSource
 [findbugs]   com.jcraft.jsch.ChannelDirectTCPIP
 [findbugs]   com.jcraft.jsch.JSchException
 [findbugs]   org.python.util.PythonInterpreter
 [findbugs]   org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil
 [findbugs]   org.python.core.PyInteger
 [findbugs]   org.apache.hadoop.mapred.JobConf
 [findbugs]   org.apache.hadoop.util.bloom.Key
 [findbugs]   org.apache.hadoop.io.Text
 [findbugs]   org.antlr.runtime.NoViableAltException
 [findbugs]   org.apache.hadoop.util.GenericOptionsParser
 [findbugs]   org.apache.hadoop.mapreduce.JobID
 [findbugs]   org.apache.hadoop.mapreduce.TaskAttemptID
 [findbugs]   org.apache.hadoop.filecache.DistributedCache
 [findbugs]   org.apache.hadoop.fs.FSDataOutputStream
 [findbugs]   org.python.core.PyList
 [findbugs]   org.antlr.runtime.tree.TreeNodeStream
 [findbugs]   org.apache.hadoop.hbase.filter.BinaryComparator
 [findbugs]   dk.brics.automaton.RegExp
 [findbugs]   org.mozilla.javascript.Scriptable
 [findbugs]   org.mozilla.javascript.EcmaError
 [findbugs]   org.apache.hadoop.io.serializer.Serializer
 [findbugs]   org.apache.hadoop.util.bloom.Filter
 [findbugs]   org.python.core.PyNone
 [findbugs]   org.mozilla.javascript.Function
 [findbugs]   org.python.core.PySystemState
 [findbugs]   org.antlr.runtime.RecognizerSharedState
 [findbugs]   org.codehaus.jackson.JsonFactory
 [findbugs]   org.antlr.runtime.EarlyExitException
 [findbugs]   org.apache.hadoop.hdfs.DistributedFileSystem
 [findbugs]   org.apache.hadoop.util.LineReader
 [findbugs] Warnings generated: 18
 [findbugs] Missing classes: 230
 [findbugs] Calculating exit code...
 [findbugs] Setting 'missing class' flag (2)
 [findbugs] Setting 'bugs found' flag (1)
 [findbugs] Exit code set to: 3
 [findbugs] Java Result: 3
 [findbugs] Classes needed for analysis were missing
 [findbugs] Output saved to <https://builds.apache.org/job/Pig-trunk/ws/trunk/build/test/findbugs/pig-findbugs-report.xml>
     [xslt] Processing <https://builds.apache.org/job/Pig-trunk/ws/trunk/build/test/findbugs/pig-findbugs-report.xml> to <https://builds.apache.org/job/Pig-trunk/ws/trunk/build/test/findbugs/pig-findbugs-report.html>
     [xslt] Loading stylesheet /home/jenkins/tools/findbugs/latest/src/xsl/default.xsl

BUILD SUCCESSFUL
Total time: 7 minutes 55 seconds


======================================================================
======================================================================
STORE: saving artifacts
======================================================================
======================================================================




======================================================================
======================================================================
CLEAN: cleaning workspace
======================================================================
======================================================================


Buildfile: build.xml

clean:
   [delete] Deleting directory <https://builds.apache.org/job/Pig-trunk/ws/trunk/src-gen>
   [delete] Deleting directory <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/docs/build>
   [delete] Deleting directory <https://builds.apache.org/job/Pig-trunk/ws/trunk/build>
   [delete] Deleting directory <https://builds.apache.org/job/Pig-trunk/ws/trunk/test/org/apache/pig/test/utils/dotGraph/parser>
   [delete] Deleting: <https://builds.apache.org/job/Pig-trunk/ws/trunk/pig.jar>
   [delete] Deleting: <https://builds.apache.org/job/Pig-trunk/ws/trunk/pig-withouthadoop.jar>

clean:

clean:

BUILD SUCCESSFUL
Total time: 0 seconds


======================================================================
======================================================================
ANALYSIS: ant -Drun.clover=true -Dclover.home=/homes/hudson/tools/clover/latest clover test-commit generate-clover-reports -Dtest.junit.output.format=xml -Dtest.output=yes -Dversion=${BUILD_ID} -Dfindbugs.home=$FINDBUGS_HOME -Djava5.home=$JAVA5_HOME -Dforrest.home=$FORREST_HOME -Dclover.home=$CLOVER_HOME -Declipse.home=$ECLIPSE_HOME
======================================================================
======================================================================


Buildfile: build.xml

clover.setup:
    [mkdir] Created dir: <https://builds.apache.org/job/Pig-trunk/ws/trunk/build/test/clover/db>
[clover-setup] Clover Version 3.1.0, built on May 31 2011 (build-821)
[clover-setup] Loaded from: /home/jenkins/tools/clover/latest/lib/clover.jar

BUILD FAILED
java.lang.RuntimeException: Clover upgrades for your license ended December 14 2010, and this version of Clover was built May 31 2011. Please visit http://www.atlassian.com/clover/renew for information on upgrading your license.
	at com.cenqua.clover.CloverStartup.loadLicense(CloverStartup.java:103)
	at com.cenqua.clover.CloverStartup.loadLicense(CloverStartup.java:25)
	at com.cenqua.clover.tasks.AbstractCloverTask.execute(AbstractCloverTask.java:52)
	at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:288)
	at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
	at org.apache.tools.ant.Task.perform(Task.java:348)
	at org.apache.tools.ant.Target.execute(Target.java:357)
	at org.apache.tools.ant.Target.performTasks(Target.java:385)
	at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1337)
	at org.apache.tools.ant.Project.executeTarget(Project.java:1306)
	at org.apache.tools.ant.helper.DefaultExecutor.executeTargets(DefaultExecutor.java:41)
	at org.apache.tools.ant.Project.executeTargets(Project.java:1189)
	at org.apache.tools.ant.Main.runBuild(Main.java:758)
	at org.apache.tools.ant.Main.startAnt(Main.java:217)
	at org.apache.tools.ant.launch.Launcher.run(Launcher.java:257)
	at org.apache.tools.ant.launch.Launcher.main(Launcher.java:104)

Total time: 1 second
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Recording test results
Publishing Javadoc
Archiving artifacts
Recording fingerprints
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure


Build failed in Jenkins: Pig-trunk #1175

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Pig-trunk/1175/changes>

Changes:

[daijy] PIG-2347: Fix Pig Unit tests for hadoop 23 (PIG-2347-4.patch)

[daijy] PIG-2477: TestBuiltin testLFText/testSFPig failing against 23 due to invalid test setup -- InvalidInputException

[daijy] PIG-2477: TestBuiltin testLFText/testSFPig failing against 23 due to invalid test setup -- InvalidInputException

------------------------------------------
[...truncated 6984 lines...]
 [findbugs]   jline.ConsoleReaderInputStream
 [findbugs]   org.apache.log4j.PropertyConfigurator
 [findbugs]   org.apache.hadoop.mapred.TaskID
 [findbugs]   org.apache.commons.cli.CommandLine
 [findbugs]   org.python.core.Py
 [findbugs]   org.apache.hadoop.io.BooleanWritable$Comparator
 [findbugs]   org.apache.hadoop.io.LongWritable
 [findbugs]   org.antlr.runtime.BitSet
 [findbugs]   org.apache.hadoop.mapred.jobcontrol.Job
 [findbugs]   org.apache.hadoop.hbase.filter.CompareFilter$CompareOp
 [findbugs]   org.apache.hadoop.io.file.tfile.TFile$Reader
 [findbugs]   org.mozilla.javascript.NativeFunction
 [findbugs]   org.apache.hadoop.mapreduce.Counter
 [findbugs]   org.codehaus.jackson.JsonEncoding
 [findbugs]   org.codehaus.jackson.JsonParseException
 [findbugs]   org.python.core.PyCode
 [findbugs]   com.jcraft.jsch.HostKey
 [findbugs]   org.apache.hadoop.hbase.filter.Filter
 [findbugs]   org.apache.commons.logging.Log
 [findbugs]   com.google.common.util.concurrent.ListenableFuture
 [findbugs]   org.apache.hadoop.util.RunJar
 [findbugs]   org.apache.hadoop.mapred.Counters$Group
 [findbugs]   com.jcraft.jsch.ChannelExec
 [findbugs]   org.apache.hadoop.hbase.util.Base64
 [findbugs]   org.antlr.runtime.TokenStream
 [findbugs]   com.google.common.util.concurrent.CheckedFuture
 [findbugs]   org.apache.hadoop.io.file.tfile.TFile$Reader$Scanner$Entry
 [findbugs]   org.apache.hadoop.fs.FSDataInputStream
 [findbugs]   org.python.core.PyObject
 [findbugs]   jline.History
 [findbugs]   org.apache.hadoop.io.BooleanWritable
 [findbugs]   org.apache.log4j.Logger
 [findbugs]   org.apache.hadoop.hbase.filter.FamilyFilter
 [findbugs]   org.antlr.runtime.IntStream
 [findbugs]   org.apache.hadoop.util.ReflectionUtils
 [findbugs]   org.apache.hadoop.fs.ContentSummary
 [findbugs]   org.python.core.PyTuple
 [findbugs]   org.apache.hadoop.conf.Configuration
 [findbugs]   com.google.common.base.Joiner
 [findbugs]   org.apache.hadoop.mapreduce.lib.input.FileSplit
 [findbugs]   org.apache.hadoop.mapred.Counters$Counter
 [findbugs]   com.jcraft.jsch.Channel
 [findbugs]   org.apache.hadoop.mapred.JobPriority
 [findbugs]   org.apache.commons.cli.Options
 [findbugs]   org.apache.hadoop.mapred.JobID
 [findbugs]   org.apache.hadoop.util.bloom.BloomFilter
 [findbugs]   org.python.core.PyFrame
 [findbugs]   org.apache.hadoop.hbase.filter.CompareFilter
 [findbugs]   org.apache.hadoop.util.VersionInfo
 [findbugs]   org.python.core.PyString
 [findbugs]   org.apache.hadoop.io.Text$Comparator
 [findbugs]   org.antlr.runtime.MismatchedSetException
 [findbugs]   org.apache.hadoop.io.BytesWritable
 [findbugs]   org.apache.hadoop.fs.FsShell
 [findbugs]   org.mozilla.javascript.ImporterTopLevel
 [findbugs]   org.apache.hadoop.hbase.mapreduce.TableOutputFormat
 [findbugs]   org.apache.hadoop.mapred.TaskReport
 [findbugs]   org.antlr.runtime.tree.RewriteRuleSubtreeStream
 [findbugs]   org.apache.commons.cli.HelpFormatter
 [findbugs]   org.mozilla.javascript.NativeObject
 [findbugs]   org.apache.hadoop.hbase.HConstants
 [findbugs]   org.apache.hadoop.io.serializer.Deserializer
 [findbugs]   org.antlr.runtime.FailedPredicateException
 [findbugs]   org.apache.hadoop.io.compress.CompressionCodec
 [findbugs]   org.apache.hadoop.fs.FileStatus
 [findbugs]   org.apache.hadoop.hbase.client.Result
 [findbugs]   org.apache.hadoop.mapreduce.JobContext
 [findbugs]   org.codehaus.jackson.JsonGenerator
 [findbugs]   org.apache.hadoop.mapreduce.TaskAttemptContext
 [findbugs]   org.apache.hadoop.io.BytesWritable$Comparator
 [findbugs]   org.apache.hadoop.io.LongWritable$Comparator
 [findbugs]   org.codehaus.jackson.map.util.LRUMap
 [findbugs]   org.apache.hadoop.hbase.util.Bytes
 [findbugs]   org.antlr.runtime.MismatchedTokenException
 [findbugs]   org.codehaus.jackson.JsonParser
 [findbugs]   com.jcraft.jsch.UserInfo
 [findbugs]   org.python.core.PyException
 [findbugs]   org.apache.commons.cli.ParseException
 [findbugs]   org.apache.hadoop.io.compress.CompressionOutputStream
 [findbugs]   org.apache.hadoop.hbase.filter.WritableByteArrayComparable
 [findbugs]   org.antlr.runtime.tree.CommonTreeNodeStream
 [findbugs]   org.apache.log4j.Level
 [findbugs]   org.apache.hadoop.hbase.client.Scan
 [findbugs]   org.apache.hadoop.mapreduce.Job
 [findbugs]   com.google.common.util.concurrent.Futures
 [findbugs]   org.apache.commons.logging.LogFactory
 [findbugs]   org.apache.commons.codec.binary.Base64
 [findbugs]   org.codehaus.jackson.map.ObjectMapper
 [findbugs]   org.apache.hadoop.fs.FileSystem
 [findbugs]   org.apache.hadoop.hbase.filter.FilterList$Operator
 [findbugs]   org.apache.hadoop.hbase.io.ImmutableBytesWritable
 [findbugs]   org.apache.hadoop.io.serializer.SerializationFactory
 [findbugs]   org.antlr.runtime.tree.TreeAdaptor
 [findbugs]   org.apache.hadoop.mapred.RunningJob
 [findbugs]   org.antlr.runtime.CommonTokenStream
 [findbugs]   org.apache.hadoop.io.DataInputBuffer
 [findbugs]   org.apache.hadoop.io.file.tfile.TFile
 [findbugs]   org.apache.commons.cli.GnuParser
 [findbugs]   org.mozilla.javascript.Context
 [findbugs]   org.apache.hadoop.io.FloatWritable
 [findbugs]   org.antlr.runtime.tree.RewriteEarlyExitException
 [findbugs]   org.apache.hadoop.hbase.HBaseConfiguration
 [findbugs]   org.codehaus.jackson.JsonGenerationException
 [findbugs]   org.apache.hadoop.mapreduce.TaskInputOutputContext
 [findbugs]   org.apache.hadoop.io.compress.GzipCodec
 [findbugs]   org.apache.hadoop.mapred.jobcontrol.JobControl
 [findbugs]   org.antlr.runtime.BaseRecognizer
 [findbugs]   org.apache.hadoop.fs.FileUtil
 [findbugs]   org.apache.hadoop.fs.Path
 [findbugs]   org.apache.hadoop.hbase.client.Put
 [findbugs]   org.apache.hadoop.io.file.tfile.TFile$Writer
 [findbugs]   jline.ConsoleReader
 [findbugs]   com.google.common.collect.Lists
 [findbugs]   org.apache.hadoop.mapreduce.MapContext
 [findbugs]   org.python.core.PyJavaPackage
 [findbugs]   org.apache.hadoop.hbase.filter.ColumnPrefixFilter
 [findbugs]   org.python.core.PyStringMap
 [findbugs]   org.apache.hadoop.mapreduce.TaskID
 [findbugs]   org.apache.hadoop.hbase.client.HTable
 [findbugs]   org.apache.hadoop.io.FloatWritable$Comparator
 [findbugs]   org.apache.zookeeper.ZooKeeper
 [findbugs]   org.codehaus.jackson.map.JsonMappingException
 [findbugs]   org.python.core.PyFunction
 [findbugs]   org.antlr.runtime.TokenSource
 [findbugs]   com.jcraft.jsch.ChannelDirectTCPIP
 [findbugs]   com.jcraft.jsch.JSchException
 [findbugs]   org.python.util.PythonInterpreter
 [findbugs]   org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil
 [findbugs]   org.python.core.PyInteger
 [findbugs]   org.apache.hadoop.mapred.JobConf
 [findbugs]   org.apache.hadoop.util.bloom.Key
 [findbugs]   org.apache.hadoop.io.Text
 [findbugs]   org.antlr.runtime.NoViableAltException
 [findbugs]   org.apache.hadoop.util.GenericOptionsParser
 [findbugs]   org.apache.hadoop.mapreduce.JobID
 [findbugs]   org.apache.hadoop.mapreduce.TaskAttemptID
 [findbugs]   org.apache.hadoop.filecache.DistributedCache
 [findbugs]   org.apache.hadoop.fs.FSDataOutputStream
 [findbugs]   org.python.core.PyList
 [findbugs]   org.antlr.runtime.tree.TreeNodeStream
 [findbugs]   org.apache.hadoop.hbase.filter.BinaryComparator
 [findbugs]   dk.brics.automaton.RegExp
 [findbugs]   org.mozilla.javascript.Scriptable
 [findbugs]   org.mozilla.javascript.EcmaError
 [findbugs]   org.apache.hadoop.io.serializer.Serializer
 [findbugs]   org.apache.hadoop.util.bloom.Filter
 [findbugs]   org.python.core.PyNone
 [findbugs]   org.mozilla.javascript.Function
 [findbugs]   org.python.core.PySystemState
 [findbugs]   org.antlr.runtime.RecognizerSharedState
 [findbugs]   org.codehaus.jackson.JsonFactory
 [findbugs]   org.antlr.runtime.EarlyExitException
 [findbugs]   org.apache.hadoop.hdfs.DistributedFileSystem
 [findbugs]   org.apache.hadoop.util.LineReader
 [findbugs] Warnings generated: 23
 [findbugs] Missing classes: 231
 [findbugs] Calculating exit code...
 [findbugs] Setting 'missing class' flag (2)
 [findbugs] Setting 'bugs found' flag (1)
 [findbugs] Exit code set to: 3
 [findbugs] Java Result: 3
 [findbugs] Classes needed for analysis were missing
 [findbugs] Output saved to <https://builds.apache.org/job/Pig-trunk/ws/trunk/build/test/findbugs/pig-findbugs-report.xml>
     [xslt] Processing <https://builds.apache.org/job/Pig-trunk/ws/trunk/build/test/findbugs/pig-findbugs-report.xml> to <https://builds.apache.org/job/Pig-trunk/ws/trunk/build/test/findbugs/pig-findbugs-report.html>
     [xslt] Loading stylesheet /home/jenkins/tools/findbugs/latest/src/xsl/default.xsl

BUILD SUCCESSFUL
Total time: 8 minutes 20 seconds


======================================================================
======================================================================
STORE: saving artifacts
======================================================================
======================================================================




======================================================================
======================================================================
CLEAN: cleaning workspace
======================================================================
======================================================================


Buildfile: build.xml

clean:
   [delete] Deleting directory <https://builds.apache.org/job/Pig-trunk/ws/trunk/src-gen>
   [delete] Deleting directory <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/docs/build>
   [delete] Deleting directory <https://builds.apache.org/job/Pig-trunk/ws/trunk/build>
   [delete] Deleting directory <https://builds.apache.org/job/Pig-trunk/ws/trunk/test/org/apache/pig/test/utils/dotGraph/parser>
   [delete] Deleting: <https://builds.apache.org/job/Pig-trunk/ws/trunk/pig.jar>
   [delete] Deleting: <https://builds.apache.org/job/Pig-trunk/ws/trunk/pig-withouthadoop.jar>

clean:

clean:

BUILD SUCCESSFUL
Total time: 0 seconds


======================================================================
======================================================================
ANALYSIS: ant -Drun.clover=true -Dclover.home=/homes/hudson/tools/clover/latest clover test-commit generate-clover-reports -Dtest.junit.output.format=xml -Dtest.output=yes -Dversion=${BUILD_ID} -Dfindbugs.home=$FINDBUGS_HOME -Djava5.home=$JAVA5_HOME -Dforrest.home=$FORREST_HOME -Dclover.home=$CLOVER_HOME -Declipse.home=$ECLIPSE_HOME
======================================================================
======================================================================


Buildfile: build.xml

clover.setup:
    [mkdir] Created dir: <https://builds.apache.org/job/Pig-trunk/ws/trunk/build/test/clover/db>
[clover-setup] Clover Version 3.1.0, built on May 31 2011 (build-821)
[clover-setup] Loaded from: /home/jenkins/tools/clover/latest/lib/clover.jar

BUILD FAILED
java.lang.RuntimeException: Clover upgrades for your license ended December 14 2010, and this version of Clover was built May 31 2011. Please visit http://www.atlassian.com/clover/renew for information on upgrading your license.
	at com.cenqua.clover.CloverStartup.loadLicense(CloverStartup.java:103)
	at com.cenqua.clover.CloverStartup.loadLicense(CloverStartup.java:25)
	at com.cenqua.clover.tasks.AbstractCloverTask.execute(AbstractCloverTask.java:52)
	at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:288)
	at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
	at org.apache.tools.ant.Task.perform(Task.java:348)
	at org.apache.tools.ant.Target.execute(Target.java:357)
	at org.apache.tools.ant.Target.performTasks(Target.java:385)
	at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1337)
	at org.apache.tools.ant.Project.executeTarget(Project.java:1306)
	at org.apache.tools.ant.helper.DefaultExecutor.executeTargets(DefaultExecutor.java:41)
	at org.apache.tools.ant.Project.executeTargets(Project.java:1189)
	at org.apache.tools.ant.Main.runBuild(Main.java:758)
	at org.apache.tools.ant.Main.startAnt(Main.java:217)
	at org.apache.tools.ant.launch.Launcher.run(Launcher.java:257)
	at org.apache.tools.ant.launch.Launcher.main(Launcher.java:104)

Total time: 1 second
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Recording test results
Publishing Javadoc
Archiving artifacts
Recording fingerprints
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure


Re: Build failed in Jenkins: Pig-trunk #1174

Posted by Dmitriy Ryaboy <dv...@gmail.com>.
Hm. I just tried TestStore and it passed from a clean checkout on my
machine.

Looking at the failing test in Jenkins, it's got stack traces like this one
below.. anyone know what could be the cause?

12/01/17 22:33:13 INFO mapred.TaskInProgress: Error from
attempt_20120117222456151_0012_m_000001_0:
java.lang.NumberFormatException: For input string:
"18446743988250694508"
	at java.lang.NumberFormatException.forInputString(NumberFormatException.java:48)
	at java.lang.Long.parseLong(Long.java:422)
	at java.lang.Long.parseLong(Long.java:468)
	at org.apache.hadoop.util.ProcfsBasedProcessTree.constructProcessInfo(ProcfsBasedProcessTree.java:413)
	at org.apache.hadoop.util.ProcfsBasedProcessTree.getProcessTree(ProcfsBasedProcessTree.java:148)
	at org.apache.hadoop.util.LinuxResourceCalculatorPlugin.getProcResourceValues(LinuxResourceCalculatorPlugin.java:401)
	at org.apache.hadoop.mapred.Task.initialize(Task.java:536)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:353)
	at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1083)
	at org.apache.hadoop.mapred.Child.main(Child.java:249)


On Tue, Jan 17, 2012 at 2:41 PM, Apache Jenkins Server <
jenkins@builds.apache.org> wrote:

> See <https://builds.apache.org/job/Pig-trunk/1174/changes>
>
> Changes:
>
> [dvryaboy] PIG-2359: Support more efficient Tuples when schemas are known
> (part 2)
>
> ------------------------------------------
> [...truncated 37037 lines...]
>    [junit]     at
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
>    [junit]     at
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
>    [junit]     at
> org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:37)
>    [junit]     at org.junit.runners.ParentRunner.run(ParentRunner.java:220)
>    [junit]     at
> junit.framework.JUnit4TestAdapter.run(JUnit4TestAdapter.java:39)
>    [junit]     at
> org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:420)
>    [junit]     at
> org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:911)
>    [junit]     at
> org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:768)
>    [junit] 12/01/17 22:33:18 WARN datanode.FSDatasetAsyncDiskService:
> AsyncDiskService has already shut down.
>    [junit] 12/01/17 22:33:18 INFO mortbay.log: Stopped
> SelectChannelConnector@localhost:0
>    [junit] 12/01/17 22:33:18 INFO ipc.Server: Stopping server on 56123
>    [junit] 12/01/17 22:33:18 INFO ipc.Server: IPC Server handler 0 on
> 56123: exiting
>    [junit] 12/01/17 22:33:18 INFO ipc.Server: Stopping IPC Server listener
> on 56123
>    [junit] 12/01/17 22:33:18 INFO ipc.Server: IPC Server handler 2 on
> 56123: exiting
>    [junit] 12/01/17 22:33:18 INFO ipc.Server: Stopping IPC Server Responder
>    [junit] 12/01/17 22:33:18 INFO metrics.RpcInstrumentation: shut down
>    [junit] 12/01/17 22:33:18 INFO ipc.Server: IPC Server handler 1 on
> 56123: exiting
>    [junit] 12/01/17 22:33:18 INFO datanode.DataNode: Waiting for
> threadgroup to exit, active threads is 1
>    [junit] 12/01/17 22:33:18 WARN datanode.DataNode: DatanodeRegistration(
> 127.0.0.1:47324,
> storageID=DS-1849659132-67.195.138.20-47324-1326839095771, infoPort=37487,
> ipcPort=56123):DataXceiveServer:java.nio.channels.AsynchronousCloseException
>    [junit]     at
> java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:185)
>    [junit]     at
> sun.nio.ch.ServerSocketChannelImpl.accept(ServerSocketChannelImpl.java:159)
>    [junit]     at
> sun.nio.ch.ServerSocketAdaptor.accept(ServerSocketAdaptor.java:84)
>    [junit]     at
> org.apache.hadoop.hdfs.server.datanode.DataXceiverServer.run(DataXceiverServer.java:131)
>    [junit]     at java.lang.Thread.run(Thread.java:662)
>    [junit]
>    [junit] 12/01/17 22:33:18 INFO datanode.DataNode: Exiting
> DataXceiveServer
>    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Scheduling block
> blk_-5914952160057491263_1095 file
> build/test/data/dfs/data/data2/current/blk_-5914952160057491263 for deletion
>    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Scheduling block
> blk_-3299861067875593375_1102 file
> build/test/data/dfs/data/data2/current/blk_-3299861067875593375 for deletion
>    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Scheduling block
> blk_-2378485608018695930_1101 file
> build/test/data/dfs/data/data2/current/blk_-2378485608018695930 for deletion
>    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Deleted block
> blk_-5914952160057491263_1095 at file
> build/test/data/dfs/data/data2/current/blk_-5914952160057491263
>    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Scheduling block
> blk_1266225547534690160_1102 file
> build/test/data/dfs/data/data1/current/blk_1266225547534690160 for deletion
>    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Deleted block
> blk_-3299861067875593375_1102 at file
> build/test/data/dfs/data/data2/current/blk_-3299861067875593375
>    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Deleted block
> blk_1266225547534690160_1102 at file
> build/test/data/dfs/data/data1/current/blk_1266225547534690160
>    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Deleted block
> blk_-2378485608018695930_1101 at file
> build/test/data/dfs/data/data2/current/blk_-2378485608018695930
>    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Scheduling block
> blk_-5914952160057491263_1095 file
> build/test/data/dfs/data/data3/current/blk_-5914952160057491263 for deletion
>    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Scheduling block
> blk_-3299861067875593375_1102 file
> build/test/data/dfs/data/data4/current/blk_-3299861067875593375 for deletion
>    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Deleted block
> blk_-5914952160057491263_1095 at file
> build/test/data/dfs/data/data3/current/blk_-5914952160057491263
>    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Scheduling block
> blk_-2378485608018695930_1101 file
> build/test/data/dfs/data/data4/current/blk_-2378485608018695930 for deletion
>    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Deleted block
> blk_-3299861067875593375_1102 at file
> build/test/data/dfs/data/data4/current/blk_-3299861067875593375
>    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Scheduling block
> blk_1266225547534690160_1102 file
> build/test/data/dfs/data/data3/current/blk_1266225547534690160 for deletion
>    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Deleted block
> blk_-2378485608018695930_1101 at file
> build/test/data/dfs/data/data4/current/blk_-2378485608018695930
>    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Deleted block
> blk_1266225547534690160_1102 at file
> build/test/data/dfs/data/data3/current/blk_1266225547534690160
>    [junit] 12/01/17 22:33:19 INFO datanode.DataBlockScanner: Exiting
> DataBlockScanner thread.
>    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Waiting for
> threadgroup to exit, active threads is 0
>    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: DatanodeRegistration(
> 127.0.0.1:47324,
> storageID=DS-1849659132-67.195.138.20-47324-1326839095771, infoPort=37487,
> ipcPort=56123):Finishing DataNode in: FSDataset{dirpath='<
> https://builds.apache.org/job/Pig-trunk/ws/trunk/build/test/data/dfs/data/data5/current,/home/jenkins/jenkins-slave/workspace/Pig-trunk/trunk/build/test/data/dfs/data/data6/current'
> }>
>    [junit] 12/01/17 22:33:19 INFO ipc.Server: Stopping server on 56123
>    [junit] 12/01/17 22:33:19 INFO metrics.RpcInstrumentation: shut down
>    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Waiting for
> threadgroup to exit, active threads is 0
>    [junit] 12/01/17 22:33:19 INFO datanode.FSDatasetAsyncDiskService:
> Shutting down all async disk service threads...
>    [junit] 12/01/17 22:33:19 INFO datanode.FSDatasetAsyncDiskService: All
> async disk service threads have been shut down.
>    [junit] 12/01/17 22:33:19 WARN util.MBeans:
> Hadoop:service=DataNode,name=FSDatasetState-UndefinedStorageId1509900428
>    [junit] Shutting down DataNode 1
>    [junit] javax.management.InstanceNotFoundException:
> Hadoop:service=DataNode,name=FSDatasetState-UndefinedStorageId1509900428
>    [junit]     at
> com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1094)
>    [junit]     at
> com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.exclusiveUnregisterMBean(DefaultMBeanServerInterceptor.java:415)
>    [junit]     at
> com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.unregisterMBean(DefaultMBeanServerInterceptor.java:403)
>    [junit]     at
> com.sun.jmx.mbeanserver.JmxMBeanServer.unregisterMBean(JmxMBeanServer.java:506)
>    [junit]     at
> org.apache.hadoop.metrics2.util.MBeans.unregister(MBeans.java:71)
>    [junit]     at
> org.apache.hadoop.hdfs.server.datanode.FSDataset.shutdown(FSDataset.java:1934)
>    [junit]     at
> org.apache.hadoop.hdfs.server.datanode.DataNode.shutdown(DataNode.java:788)
>    [junit]     at
> org.apache.hadoop.hdfs.MiniDFSCluster.shutdownDataNodes(MiniDFSCluster.java:566)
>    [junit]     at
> org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:550)
>    [junit]     at
> org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsClusters(MiniGenericCluster.java:87)
>    [junit]     at
> org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsAndMrClusters(MiniGenericCluster.java:77)
>    [junit]     at
> org.apache.pig.test.MiniGenericCluster.shutDown(MiniGenericCluster.java:68)
>    [junit]     at
> org.apache.pig.test.TestStore.oneTimeTearDown(TestStore.java:128)
>    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native
> Method)
>    [junit]     at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>    [junit]     at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
>    [junit]     at
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
>    [junit]     at
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
>    [junit]     at
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
>    [junit]     at
> org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:37)
>    [junit]     at org.junit.runners.ParentRunner.run(ParentRunner.java:220)
>    [junit]     at
> junit.framework.JUnit4TestAdapter.run(JUnit4TestAdapter.java:39)
>    [junit]     at
> org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:420)
>    [junit]     at
> org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:911)
>    [junit]     at
> org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:768)
>    [junit] 12/01/17 22:33:19 WARN datanode.FSDatasetAsyncDiskService:
> AsyncDiskService has already shut down.
>    [junit] 12/01/17 22:33:19 INFO mortbay.log: Stopped
> SelectChannelConnector@localhost:0
>    [junit] 12/01/17 22:33:19 INFO ipc.Server: Stopping server on 40885
>    [junit] 12/01/17 22:33:19 INFO ipc.Server: IPC Server handler 0 on
> 40885: exiting
>    [junit] 12/01/17 22:33:19 INFO ipc.Server: Stopping IPC Server listener
> on 40885
>    [junit] 12/01/17 22:33:19 INFO metrics.RpcInstrumentation: shut down
>    [junit] 12/01/17 22:33:19 INFO ipc.Server: IPC Server handler 1 on
> 40885: exiting
>    [junit] 12/01/17 22:33:19 INFO ipc.Server: Stopping IPC Server Responder
>    [junit] 12/01/17 22:33:19 WARN datanode.DataNode: DatanodeRegistration(
> 127.0.0.1:45088,
> storageID=DS-1863537504-67.195.138.20-45088-1326839095442, infoPort=49950,
> ipcPort=40885):DataXceiveServer:java.nio.channels.AsynchronousCloseException
>    [junit]     at
> java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:185)
>    [junit]     at
> sun.nio.ch.ServerSocketChannelImpl.accept(ServerSocketChannelImpl.java:159)
>    [junit]     at
> sun.nio.ch.ServerSocketAdaptor.accept(ServerSocketAdaptor.java:84)
>    [junit]     at
> org.apache.hadoop.hdfs.server.datanode.DataXceiverServer.run(DataXceiverServer.java:131)
>    [junit]     at java.lang.Thread.run(Thread.java:662)
>    [junit]
>    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Exiting
> DataXceiveServer
>    [junit] 12/01/17 22:33:19 INFO ipc.Server: IPC Server handler 2 on
> 40885: exiting
>    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Waiting for
> threadgroup to exit, active threads is 1
>    [junit] 12/01/17 22:33:19 INFO datanode.DataBlockScanner: Exiting
> DataBlockScanner thread.
>    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: DatanodeRegistration(
> 127.0.0.1:45088,
> storageID=DS-1863537504-67.195.138.20-45088-1326839095442, infoPort=49950,
> ipcPort=40885):Finishing DataNode in: FSDataset{dirpath='<
> https://builds.apache.org/job/Pig-trunk/ws/trunk/build/test/data/dfs/data/data3/current,/home/jenkins/jenkins-slave/workspace/Pig-trunk/trunk/build/test/data/dfs/data/data4/current'
> }>
>    [junit] 12/01/17 22:33:19 INFO ipc.Server: Stopping server on 40885
>    [junit] 12/01/17 22:33:19 INFO metrics.RpcInstrumentation: shut down
>    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Waiting for
> threadgroup to exit, active threads is 0
>    [junit] 12/01/17 22:33:19 INFO datanode.FSDatasetAsyncDiskService:
> Shutting down all async disk service threads...
>    [junit] 12/01/17 22:33:19 INFO datanode.FSDatasetAsyncDiskService: All
> async disk service threads have been shut down.
>    [junit] 12/01/17 22:33:19 WARN util.MBeans:
> Hadoop:service=DataNode,name=FSDatasetState-UndefinedStorageId-1895231096
>    [junit] javax.management.InstanceNotFoundException:
> Hadoop:service=DataNode,name=FSDatasetState-UndefinedStorageId-1895231096
>    [junit]     at
> com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1094)
>    [junit]     at
> com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.exclusiveUnregisterMBean(DefaultMBeanServerInterceptor.java:415)
>    [junit]     at
> com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.unregisterMBean(DefaultMBeanServerInterceptor.java:403)
>    [junit]     at
> com.sun.jmx.mbeanserver.JmxMBeanServer.unregisterMBean(JmxMBeanServer.java:506)
>    [junit]     at
> org.apache.hadoop.metrics2.util.MBeans.unregister(MBeans.java:71)
>    [junit]     at
> org.apache.hadoop.hdfs.server.datanode.FSDataset.shutdown(FSDataset.java:1934)
>    [junit]     at
> org.apache.hadoop.hdfs.server.datanode.DataNode.shutdown(DataNode.java:788)
>    [junit]     at
> org.apache.hadoop.hdfs.MiniDFSCluster.shutdownDataNodes(MiniDFSCluster.java:566)
>    [junit]     at
> org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:550)
>    [junit]     at
> org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsClusters(MiniGenericCluster.java:87)
>    [junit]     at
> org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsAndMrClusters(MiniGenericCluster.java:77)
>    [junit]     at
> org.apache.pig.test.MiniGenericCluster.shutDown(MiniGenericCluster.java:68)
>    [junit]     at
> org.apache.pig.test.TestStore.oneTimeTearDown(TestStore.java:128)
>    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native
> Method)
>    [junit]     at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>    [junit]     at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
>    [junit]     at
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
>    [junit]     at
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
>    [junit]     at
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
>    [junit]     at
> org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:37)
>    [junit]     at org.junit.runners.ParentRunner.run(ParentRunner.java:220)
>    [junit]     at
> junit.framework.JUnit4TestAdapter.run(JUnit4TestAdapter.java:39)
>    [junit]     at
> org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:420)
>    [junit]     at
> org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:911)
>    [junit]     at
> org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:768)
>    [junit] 12/01/17 22:33:19 WARN datanode.FSDatasetAsyncDiskService:
> AsyncDiskService has already shut down.
>    [junit] Shutting down DataNode 0
>    [junit] 12/01/17 22:33:19 INFO mortbay.log: Stopped
> SelectChannelConnector@localhost:0
>    [junit] 12/01/17 22:33:20 INFO ipc.Server: Stopping server on 41896
>    [junit] 12/01/17 22:33:20 INFO ipc.Server: IPC Server handler 0 on
> 41896: exiting
>    [junit] 12/01/17 22:33:20 INFO ipc.Server: Stopping IPC Server Responder
>    [junit] 12/01/17 22:33:20 INFO ipc.Server: IPC Server handler 1 on
> 41896: exiting
>    [junit] 12/01/17 22:33:20 INFO ipc.Server: IPC Server handler 2 on
> 41896: exiting
>    [junit] 12/01/17 22:33:20 INFO ipc.Server: Stopping IPC Server listener
> on 41896
>    [junit] 12/01/17 22:33:20 INFO metrics.RpcInstrumentation: shut down
>    [junit] 12/01/17 22:33:20 INFO datanode.DataNode: Waiting for
> threadgroup to exit, active threads is 1
>    [junit] 12/01/17 22:33:20 WARN datanode.DataNode: DatanodeRegistration(
> 127.0.0.1:49647,
> storageID=DS-578389708-67.195.138.20-49647-1326839095094, infoPort=59242,
> ipcPort=41896):DataXceiveServer:java.nio.channels.AsynchronousCloseException
>    [junit]     at
> java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:185)
>    [junit]     at
> sun.nio.ch.ServerSocketChannelImpl.accept(ServerSocketChannelImpl.java:159)
>    [junit]     at
> sun.nio.ch.ServerSocketAdaptor.accept(ServerSocketAdaptor.java:84)
>    [junit]     at
> org.apache.hadoop.hdfs.server.datanode.DataXceiverServer.run(DataXceiverServer.java:131)
>    [junit]     at java.lang.Thread.run(Thread.java:662)
>    [junit]
>    [junit] 12/01/17 22:33:20 INFO datanode.DataNode: Exiting
> DataXceiveServer
>    [junit] 12/01/17 22:33:20 INFO datanode.DataBlockScanner: Exiting
> DataBlockScanner thread.
>    [junit] 12/01/17 22:33:20 INFO hdfs.StateChange: BLOCK* ask
> 127.0.0.1:59660 to delete  blk_-5914952160057491263_1095
>    [junit] 12/01/17 22:33:20 INFO hdfs.StateChange: BLOCK* ask
> 127.0.0.1:47324 to delete  blk_-2378485608018695930_1101
> blk_-3299861067875593375_1102 blk_1266225547534690160_1102
>    [junit] 12/01/17 22:33:21 INFO datanode.DataNode: Waiting for
> threadgroup to exit, active threads is 0
>    [junit] 12/01/17 22:33:21 INFO datanode.DataNode: DatanodeRegistration(
> 127.0.0.1:49647,
> storageID=DS-578389708-67.195.138.20-49647-1326839095094, infoPort=59242,
> ipcPort=41896):Finishing DataNode in: FSDataset{dirpath='<
> https://builds.apache.org/job/Pig-trunk/ws/trunk/build/test/data/dfs/data/data1/current,/home/jenkins/jenkins-slave/workspace/Pig-trunk/trunk/build/test/data/dfs/data/data2/current'
> }>
>    [junit] 12/01/17 22:33:21 WARN util.MBeans:
> Hadoop:service=DataNode,name=DataNodeInfo
>    [junit] javax.management.InstanceNotFoundException:
> Hadoop:service=DataNode,name=DataNodeInfo
>    [junit]     at
> com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1094)
>    [junit]     at
> com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.exclusiveUnregisterMBean(DefaultMBeanServerInterceptor.java:415)
>    [junit]     at
> com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.unregisterMBean(DefaultMBeanServerInterceptor.java:403)
>    [junit]     at
> com.sun.jmx.mbeanserver.JmxMBeanServer.unregisterMBean(JmxMBeanServer.java:506)
>    [junit]     at
> org.apache.hadoop.metrics2.util.MBeans.unregister(MBeans.java:71)
>    [junit]     at
> org.apache.hadoop.hdfs.server.datanode.DataNode.unRegisterMXBean(DataNode.java:513)
>    [junit]     at
> org.apache.hadoop.hdfs.server.datanode.DataNode.shutdown(DataNode.java:726)
>    [junit]     at
> org.apache.hadoop.hdfs.server.datanode.DataNode.run(DataNode.java:1442)
>    [junit]     at java.lang.Thread.run(Thread.java:662)
>    [junit] 12/01/17 22:33:21 INFO ipc.Server: Stopping server on 41896
>    [junit] 12/01/17 22:33:21 INFO metrics.RpcInstrumentation: shut down
>    [junit] 12/01/17 22:33:21 INFO datanode.DataNode: Waiting for
> threadgroup to exit, active threads is 0
>    [junit] 12/01/17 22:33:21 INFO datanode.FSDatasetAsyncDiskService:
> Shutting down all async disk service threads...
>    [junit] 12/01/17 22:33:21 INFO datanode.FSDatasetAsyncDiskService: All
> async disk service threads have been shut down.
>    [junit] 12/01/17 22:33:21 WARN util.MBeans:
> Hadoop:service=DataNode,name=FSDatasetState-UndefinedStorageId-982927745
>    [junit] javax.management.InstanceNotFoundException:
> Hadoop:service=DataNode,name=FSDatasetState-UndefinedStorageId-982927745
>    [junit]     at
> com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1094)
>    [junit]     at
> com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.exclusiveUnregisterMBean(DefaultMBeanServerInterceptor.java:415)
>    [junit]     at
> com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.unregisterMBean(DefaultMBeanServerInterceptor.java:403)
>    [junit]     at
> com.sun.jmx.mbeanserver.JmxMBeanServer.unregisterMBean(JmxMBeanServer.java:506)
>    [junit]     at
> org.apache.hadoop.metrics2.util.MBeans.unregister(MBeans.java:71)
>    [junit]     at
> org.apache.hadoop.hdfs.server.datanode.FSDataset.shutdown(FSDataset.java:1934)
>    [junit]     at
> org.apache.hadoop.hdfs.server.datanode.DataNode.shutdown(DataNode.java:788)
>    [junit]     at
> org.apache.hadoop.hdfs.MiniDFSCluster.shutdownDataNodes(MiniDFSCluster.java:566)
>    [junit]     at
> org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:550)
>    [junit]     at
> org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsClusters(MiniGenericCluster.java:87)
>    [junit]     at
> org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsAndMrClusters(MiniGenericCluster.java:77)
>    [junit]     at
> org.apache.pig.test.MiniGenericCluster.shutDown(MiniGenericCluster.java:68)
>    [junit]     at
> org.apache.pig.test.TestStore.oneTimeTearDown(TestStore.java:128)
>    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native
> Method)
>    [junit]     at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>    [junit]     at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
>    [junit]     at
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
>    [junit]     at
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
>    [junit]     at
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
>    [junit]     at
> org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:37)
>    [junit]     at org.junit.runners.ParentRunner.run(ParentRunner.java:220)
>    [junit]     at
> junit.framework.JUnit4TestAdapter.run(JUnit4TestAdapter.java:39)
>    [junit]     at
> org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:420)
>    [junit]     at
> org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:911)
>    [junit]     at
> org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:768)
>    [junit] 12/01/17 22:33:21 WARN datanode.FSDatasetAsyncDiskService:
> AsyncDiskService has already shut down.
>    [junit] 12/01/17 22:33:21 INFO mortbay.log: Stopped
> SelectChannelConnector@localhost:0
>    [junit] 12/01/17 22:33:21 WARN namenode.FSNamesystem:
> ReplicationMonitor thread received
> InterruptedException.java.lang.InterruptedException: sleep interrupted
>    [junit] 12/01/17 22:33:21 INFO namenode.FSNamesystem: Number of
> transactions: 502 Total time for transactions(ms): 10Number of transactions
> batched in Syncs: 165 Number of syncs: 349 SyncTimes(ms): 3426 280
>    [junit] 12/01/17 22:33:21 INFO namenode.DecommissionManager:
> Interrupted Monitor
>    [junit] java.lang.InterruptedException: sleep interrupted
>    [junit]     at java.lang.Thread.sleep(Native Method)
>    [junit]     at
> org.apache.hadoop.hdfs.server.namenode.DecommissionManager$Monitor.run(DecommissionManager.java:65)
>    [junit]     at java.lang.Thread.run(Thread.java:662)
>    [junit] 12/01/17 22:33:21 INFO ipc.Server: Stopping server on 36202
>    [junit] 12/01/17 22:33:21 INFO ipc.Server: IPC Server handler 0 on
> 36202: exiting
>    [junit] 12/01/17 22:33:21 INFO ipc.Server: IPC Server handler 2 on
> 36202: exiting
>    [junit] 12/01/17 22:33:21 INFO ipc.Server: IPC Server handler 1 on
> 36202: exiting
>    [junit] 12/01/17 22:33:21 INFO ipc.Server: IPC Server handler 3 on
> 36202: exiting
>    [junit] 12/01/17 22:33:21 INFO ipc.Server: IPC Server handler 4 on
> 36202: exiting
>    [junit] 12/01/17 22:33:21 INFO ipc.Server: IPC Server handler 5 on
> 36202: exiting
>    [junit] 12/01/17 22:33:21 INFO ipc.Server: Stopping IPC Server listener
> on 36202
>    [junit] 12/01/17 22:33:21 INFO ipc.Server: IPC Server handler 6 on
> 36202: exiting
>    [junit] 12/01/17 22:33:21 INFO ipc.Server: IPC Server handler 7 on
> 36202: exiting
>    [junit] 12/01/17 22:33:21 INFO ipc.Server: IPC Server handler 9 on
> 36202: exiting
>    [junit] 12/01/17 22:33:21 INFO ipc.Server: IPC Server handler 8 on
> 36202: exiting
>    [junit] 12/01/17 22:33:21 INFO ipc.Server: Stopping IPC Server Responder
>    [junit] 12/01/17 22:33:21 INFO metrics.RpcInstrumentation: shut down
>    [junit] Tests run: 17, Failures: 3, Errors: 3, Time elapsed: 499.771 sec
>    [junit] Test org.apache.pig.test.TestStore FAILED
>    [junit] Running org.apache.pig.test.TestStringUDFs
>    [junit] 12/01/17 22:33:22 WARN builtin.SUBSTRING: No logger object
> provided to UDF: org.apache.pig.builtin.SUBSTRING.
> java.lang.NullPointerException
>    [junit] 12/01/17 22:33:22 WARN builtin.SUBSTRING: No logger object
> provided to UDF: org.apache.pig.builtin.SUBSTRING.
> java.lang.StringIndexOutOfBoundsException: String index out of range: -2
>    [junit] 12/01/17 22:33:22 WARN builtin.SUBSTRING: No logger object
> provided to UDF: org.apache.pig.builtin.SUBSTRING.
> java.lang.StringIndexOutOfBoundsException: String index out of range: -1
>    [junit] 12/01/17 22:33:22 WARN builtin.SUBSTRING: No logger object
> provided to UDF: org.apache.pig.builtin.SUBSTRING.
> java.lang.StringIndexOutOfBoundsException: String index out of range: -8
>    [junit] 12/01/17 22:33:22 WARN builtin.SUBSTRING: No logger object
> provided to UDF: org.apache.pig.builtin.SUBSTRING.
> java.lang.StringIndexOutOfBoundsException: String index out of range: -2
>    [junit] 12/01/17 22:33:22 WARN builtin.INDEXOF: No logger object
> provided to UDF: org.apache.pig.builtin.INDEXOF. Failed to process input;
> error - null
>    [junit] 12/01/17 22:33:22 WARN builtin.LAST_INDEX_OF: No logger object
> provided to UDF: org.apache.pig.builtin.LAST_INDEX_OF. Failed to process
> input; error - null
>    [junit] Tests run: 11, Failures: 0, Errors: 0, Time elapsed: 0.102 sec
>   [delete] Deleting directory /tmp/pig_junit_tmp1053430922
>
> BUILD FAILED
> <https://builds.apache.org/job/Pig-trunk/ws/trunk/build.xml>:774: The
> following error occurred while executing this line:
> <https://builds.apache.org/job/Pig-trunk/ws/trunk/build.xml>:831: Tests
> failed!
>
> Total time: 22 minutes 12 seconds
> Build step 'Execute shell' marked build as failure
> [FINDBUGS] Skipping publisher since build result is FAILURE
> Recording test results
> Publishing Javadoc
> Archiving artifacts
> Recording fingerprints
> Publishing Clover coverage report...
> No Clover report will be published due to a Build Failure
>
>

Build failed in Jenkins: Pig-trunk #1174

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Pig-trunk/1174/changes>

Changes:

[dvryaboy] PIG-2359: Support more efficient Tuples when schemas are known (part 2)

------------------------------------------
[...truncated 37037 lines...]
    [junit] 	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
    [junit] 	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
    [junit] 	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:37)
    [junit] 	at org.junit.runners.ParentRunner.run(ParentRunner.java:220)
    [junit] 	at junit.framework.JUnit4TestAdapter.run(JUnit4TestAdapter.java:39)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:420)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:911)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:768)
    [junit] 12/01/17 22:33:18 WARN datanode.FSDatasetAsyncDiskService: AsyncDiskService has already shut down.
    [junit] 12/01/17 22:33:18 INFO mortbay.log: Stopped SelectChannelConnector@localhost:0
    [junit] 12/01/17 22:33:18 INFO ipc.Server: Stopping server on 56123
    [junit] 12/01/17 22:33:18 INFO ipc.Server: IPC Server handler 0 on 56123: exiting
    [junit] 12/01/17 22:33:18 INFO ipc.Server: Stopping IPC Server listener on 56123
    [junit] 12/01/17 22:33:18 INFO ipc.Server: IPC Server handler 2 on 56123: exiting
    [junit] 12/01/17 22:33:18 INFO ipc.Server: Stopping IPC Server Responder
    [junit] 12/01/17 22:33:18 INFO metrics.RpcInstrumentation: shut down
    [junit] 12/01/17 22:33:18 INFO ipc.Server: IPC Server handler 1 on 56123: exiting
    [junit] 12/01/17 22:33:18 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 1
    [junit] 12/01/17 22:33:18 WARN datanode.DataNode: DatanodeRegistration(127.0.0.1:47324, storageID=DS-1849659132-67.195.138.20-47324-1326839095771, infoPort=37487, ipcPort=56123):DataXceiveServer:java.nio.channels.AsynchronousCloseException
    [junit] 	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:185)
    [junit] 	at sun.nio.ch.ServerSocketChannelImpl.accept(ServerSocketChannelImpl.java:159)
    [junit] 	at sun.nio.ch.ServerSocketAdaptor.accept(ServerSocketAdaptor.java:84)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.DataXceiverServer.run(DataXceiverServer.java:131)
    [junit] 	at java.lang.Thread.run(Thread.java:662)
    [junit] 
    [junit] 12/01/17 22:33:18 INFO datanode.DataNode: Exiting DataXceiveServer
    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Scheduling block blk_-5914952160057491263_1095 file build/test/data/dfs/data/data2/current/blk_-5914952160057491263 for deletion
    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Scheduling block blk_-3299861067875593375_1102 file build/test/data/dfs/data/data2/current/blk_-3299861067875593375 for deletion
    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Scheduling block blk_-2378485608018695930_1101 file build/test/data/dfs/data/data2/current/blk_-2378485608018695930 for deletion
    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Deleted block blk_-5914952160057491263_1095 at file build/test/data/dfs/data/data2/current/blk_-5914952160057491263
    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Scheduling block blk_1266225547534690160_1102 file build/test/data/dfs/data/data1/current/blk_1266225547534690160 for deletion
    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Deleted block blk_-3299861067875593375_1102 at file build/test/data/dfs/data/data2/current/blk_-3299861067875593375
    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Deleted block blk_1266225547534690160_1102 at file build/test/data/dfs/data/data1/current/blk_1266225547534690160
    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Deleted block blk_-2378485608018695930_1101 at file build/test/data/dfs/data/data2/current/blk_-2378485608018695930
    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Scheduling block blk_-5914952160057491263_1095 file build/test/data/dfs/data/data3/current/blk_-5914952160057491263 for deletion
    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Scheduling block blk_-3299861067875593375_1102 file build/test/data/dfs/data/data4/current/blk_-3299861067875593375 for deletion
    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Deleted block blk_-5914952160057491263_1095 at file build/test/data/dfs/data/data3/current/blk_-5914952160057491263
    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Scheduling block blk_-2378485608018695930_1101 file build/test/data/dfs/data/data4/current/blk_-2378485608018695930 for deletion
    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Deleted block blk_-3299861067875593375_1102 at file build/test/data/dfs/data/data4/current/blk_-3299861067875593375
    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Scheduling block blk_1266225547534690160_1102 file build/test/data/dfs/data/data3/current/blk_1266225547534690160 for deletion
    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Deleted block blk_-2378485608018695930_1101 at file build/test/data/dfs/data/data4/current/blk_-2378485608018695930
    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Deleted block blk_1266225547534690160_1102 at file build/test/data/dfs/data/data3/current/blk_1266225547534690160
    [junit] 12/01/17 22:33:19 INFO datanode.DataBlockScanner: Exiting DataBlockScanner thread.
    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 0
    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: DatanodeRegistration(127.0.0.1:47324, storageID=DS-1849659132-67.195.138.20-47324-1326839095771, infoPort=37487, ipcPort=56123):Finishing DataNode in: FSDataset{dirpath='<https://builds.apache.org/job/Pig-trunk/ws/trunk/build/test/data/dfs/data/data5/current,/home/jenkins/jenkins-slave/workspace/Pig-trunk/trunk/build/test/data/dfs/data/data6/current'}>
    [junit] 12/01/17 22:33:19 INFO ipc.Server: Stopping server on 56123
    [junit] 12/01/17 22:33:19 INFO metrics.RpcInstrumentation: shut down
    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 0
    [junit] 12/01/17 22:33:19 INFO datanode.FSDatasetAsyncDiskService: Shutting down all async disk service threads...
    [junit] 12/01/17 22:33:19 INFO datanode.FSDatasetAsyncDiskService: All async disk service threads have been shut down.
    [junit] 12/01/17 22:33:19 WARN util.MBeans: Hadoop:service=DataNode,name=FSDatasetState-UndefinedStorageId1509900428
    [junit] Shutting down DataNode 1
    [junit] javax.management.InstanceNotFoundException: Hadoop:service=DataNode,name=FSDatasetState-UndefinedStorageId1509900428
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1094)
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.exclusiveUnregisterMBean(DefaultMBeanServerInterceptor.java:415)
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.unregisterMBean(DefaultMBeanServerInterceptor.java:403)
    [junit] 	at com.sun.jmx.mbeanserver.JmxMBeanServer.unregisterMBean(JmxMBeanServer.java:506)
    [junit] 	at org.apache.hadoop.metrics2.util.MBeans.unregister(MBeans.java:71)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.FSDataset.shutdown(FSDataset.java:1934)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.DataNode.shutdown(DataNode.java:788)
    [junit] 	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdownDataNodes(MiniDFSCluster.java:566)
    [junit] 	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:550)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsClusters(MiniGenericCluster.java:87)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsAndMrClusters(MiniGenericCluster.java:77)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutDown(MiniGenericCluster.java:68)
    [junit] 	at org.apache.pig.test.TestStore.oneTimeTearDown(TestStore.java:128)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
    [junit] 	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
    [junit] 	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
    [junit] 	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:37)
    [junit] 	at org.junit.runners.ParentRunner.run(ParentRunner.java:220)
    [junit] 	at junit.framework.JUnit4TestAdapter.run(JUnit4TestAdapter.java:39)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:420)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:911)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:768)
    [junit] 12/01/17 22:33:19 WARN datanode.FSDatasetAsyncDiskService: AsyncDiskService has already shut down.
    [junit] 12/01/17 22:33:19 INFO mortbay.log: Stopped SelectChannelConnector@localhost:0
    [junit] 12/01/17 22:33:19 INFO ipc.Server: Stopping server on 40885
    [junit] 12/01/17 22:33:19 INFO ipc.Server: IPC Server handler 0 on 40885: exiting
    [junit] 12/01/17 22:33:19 INFO ipc.Server: Stopping IPC Server listener on 40885
    [junit] 12/01/17 22:33:19 INFO metrics.RpcInstrumentation: shut down
    [junit] 12/01/17 22:33:19 INFO ipc.Server: IPC Server handler 1 on 40885: exiting
    [junit] 12/01/17 22:33:19 INFO ipc.Server: Stopping IPC Server Responder
    [junit] 12/01/17 22:33:19 WARN datanode.DataNode: DatanodeRegistration(127.0.0.1:45088, storageID=DS-1863537504-67.195.138.20-45088-1326839095442, infoPort=49950, ipcPort=40885):DataXceiveServer:java.nio.channels.AsynchronousCloseException
    [junit] 	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:185)
    [junit] 	at sun.nio.ch.ServerSocketChannelImpl.accept(ServerSocketChannelImpl.java:159)
    [junit] 	at sun.nio.ch.ServerSocketAdaptor.accept(ServerSocketAdaptor.java:84)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.DataXceiverServer.run(DataXceiverServer.java:131)
    [junit] 	at java.lang.Thread.run(Thread.java:662)
    [junit] 
    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Exiting DataXceiveServer
    [junit] 12/01/17 22:33:19 INFO ipc.Server: IPC Server handler 2 on 40885: exiting
    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 1
    [junit] 12/01/17 22:33:19 INFO datanode.DataBlockScanner: Exiting DataBlockScanner thread.
    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: DatanodeRegistration(127.0.0.1:45088, storageID=DS-1863537504-67.195.138.20-45088-1326839095442, infoPort=49950, ipcPort=40885):Finishing DataNode in: FSDataset{dirpath='<https://builds.apache.org/job/Pig-trunk/ws/trunk/build/test/data/dfs/data/data3/current,/home/jenkins/jenkins-slave/workspace/Pig-trunk/trunk/build/test/data/dfs/data/data4/current'}>
    [junit] 12/01/17 22:33:19 INFO ipc.Server: Stopping server on 40885
    [junit] 12/01/17 22:33:19 INFO metrics.RpcInstrumentation: shut down
    [junit] 12/01/17 22:33:19 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 0
    [junit] 12/01/17 22:33:19 INFO datanode.FSDatasetAsyncDiskService: Shutting down all async disk service threads...
    [junit] 12/01/17 22:33:19 INFO datanode.FSDatasetAsyncDiskService: All async disk service threads have been shut down.
    [junit] 12/01/17 22:33:19 WARN util.MBeans: Hadoop:service=DataNode,name=FSDatasetState-UndefinedStorageId-1895231096
    [junit] javax.management.InstanceNotFoundException: Hadoop:service=DataNode,name=FSDatasetState-UndefinedStorageId-1895231096
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1094)
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.exclusiveUnregisterMBean(DefaultMBeanServerInterceptor.java:415)
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.unregisterMBean(DefaultMBeanServerInterceptor.java:403)
    [junit] 	at com.sun.jmx.mbeanserver.JmxMBeanServer.unregisterMBean(JmxMBeanServer.java:506)
    [junit] 	at org.apache.hadoop.metrics2.util.MBeans.unregister(MBeans.java:71)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.FSDataset.shutdown(FSDataset.java:1934)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.DataNode.shutdown(DataNode.java:788)
    [junit] 	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdownDataNodes(MiniDFSCluster.java:566)
    [junit] 	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:550)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsClusters(MiniGenericCluster.java:87)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsAndMrClusters(MiniGenericCluster.java:77)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutDown(MiniGenericCluster.java:68)
    [junit] 	at org.apache.pig.test.TestStore.oneTimeTearDown(TestStore.java:128)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
    [junit] 	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
    [junit] 	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
    [junit] 	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:37)
    [junit] 	at org.junit.runners.ParentRunner.run(ParentRunner.java:220)
    [junit] 	at junit.framework.JUnit4TestAdapter.run(JUnit4TestAdapter.java:39)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:420)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:911)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:768)
    [junit] 12/01/17 22:33:19 WARN datanode.FSDatasetAsyncDiskService: AsyncDiskService has already shut down.
    [junit] Shutting down DataNode 0
    [junit] 12/01/17 22:33:19 INFO mortbay.log: Stopped SelectChannelConnector@localhost:0
    [junit] 12/01/17 22:33:20 INFO ipc.Server: Stopping server on 41896
    [junit] 12/01/17 22:33:20 INFO ipc.Server: IPC Server handler 0 on 41896: exiting
    [junit] 12/01/17 22:33:20 INFO ipc.Server: Stopping IPC Server Responder
    [junit] 12/01/17 22:33:20 INFO ipc.Server: IPC Server handler 1 on 41896: exiting
    [junit] 12/01/17 22:33:20 INFO ipc.Server: IPC Server handler 2 on 41896: exiting
    [junit] 12/01/17 22:33:20 INFO ipc.Server: Stopping IPC Server listener on 41896
    [junit] 12/01/17 22:33:20 INFO metrics.RpcInstrumentation: shut down
    [junit] 12/01/17 22:33:20 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 1
    [junit] 12/01/17 22:33:20 WARN datanode.DataNode: DatanodeRegistration(127.0.0.1:49647, storageID=DS-578389708-67.195.138.20-49647-1326839095094, infoPort=59242, ipcPort=41896):DataXceiveServer:java.nio.channels.AsynchronousCloseException
    [junit] 	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:185)
    [junit] 	at sun.nio.ch.ServerSocketChannelImpl.accept(ServerSocketChannelImpl.java:159)
    [junit] 	at sun.nio.ch.ServerSocketAdaptor.accept(ServerSocketAdaptor.java:84)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.DataXceiverServer.run(DataXceiverServer.java:131)
    [junit] 	at java.lang.Thread.run(Thread.java:662)
    [junit] 
    [junit] 12/01/17 22:33:20 INFO datanode.DataNode: Exiting DataXceiveServer
    [junit] 12/01/17 22:33:20 INFO datanode.DataBlockScanner: Exiting DataBlockScanner thread.
    [junit] 12/01/17 22:33:20 INFO hdfs.StateChange: BLOCK* ask 127.0.0.1:59660 to delete  blk_-5914952160057491263_1095
    [junit] 12/01/17 22:33:20 INFO hdfs.StateChange: BLOCK* ask 127.0.0.1:47324 to delete  blk_-2378485608018695930_1101 blk_-3299861067875593375_1102 blk_1266225547534690160_1102
    [junit] 12/01/17 22:33:21 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 0
    [junit] 12/01/17 22:33:21 INFO datanode.DataNode: DatanodeRegistration(127.0.0.1:49647, storageID=DS-578389708-67.195.138.20-49647-1326839095094, infoPort=59242, ipcPort=41896):Finishing DataNode in: FSDataset{dirpath='<https://builds.apache.org/job/Pig-trunk/ws/trunk/build/test/data/dfs/data/data1/current,/home/jenkins/jenkins-slave/workspace/Pig-trunk/trunk/build/test/data/dfs/data/data2/current'}>
    [junit] 12/01/17 22:33:21 WARN util.MBeans: Hadoop:service=DataNode,name=DataNodeInfo
    [junit] javax.management.InstanceNotFoundException: Hadoop:service=DataNode,name=DataNodeInfo
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1094)
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.exclusiveUnregisterMBean(DefaultMBeanServerInterceptor.java:415)
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.unregisterMBean(DefaultMBeanServerInterceptor.java:403)
    [junit] 	at com.sun.jmx.mbeanserver.JmxMBeanServer.unregisterMBean(JmxMBeanServer.java:506)
    [junit] 	at org.apache.hadoop.metrics2.util.MBeans.unregister(MBeans.java:71)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.DataNode.unRegisterMXBean(DataNode.java:513)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.DataNode.shutdown(DataNode.java:726)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.DataNode.run(DataNode.java:1442)
    [junit] 	at java.lang.Thread.run(Thread.java:662)
    [junit] 12/01/17 22:33:21 INFO ipc.Server: Stopping server on 41896
    [junit] 12/01/17 22:33:21 INFO metrics.RpcInstrumentation: shut down
    [junit] 12/01/17 22:33:21 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 0
    [junit] 12/01/17 22:33:21 INFO datanode.FSDatasetAsyncDiskService: Shutting down all async disk service threads...
    [junit] 12/01/17 22:33:21 INFO datanode.FSDatasetAsyncDiskService: All async disk service threads have been shut down.
    [junit] 12/01/17 22:33:21 WARN util.MBeans: Hadoop:service=DataNode,name=FSDatasetState-UndefinedStorageId-982927745
    [junit] javax.management.InstanceNotFoundException: Hadoop:service=DataNode,name=FSDatasetState-UndefinedStorageId-982927745
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1094)
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.exclusiveUnregisterMBean(DefaultMBeanServerInterceptor.java:415)
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.unregisterMBean(DefaultMBeanServerInterceptor.java:403)
    [junit] 	at com.sun.jmx.mbeanserver.JmxMBeanServer.unregisterMBean(JmxMBeanServer.java:506)
    [junit] 	at org.apache.hadoop.metrics2.util.MBeans.unregister(MBeans.java:71)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.FSDataset.shutdown(FSDataset.java:1934)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.DataNode.shutdown(DataNode.java:788)
    [junit] 	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdownDataNodes(MiniDFSCluster.java:566)
    [junit] 	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:550)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsClusters(MiniGenericCluster.java:87)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsAndMrClusters(MiniGenericCluster.java:77)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutDown(MiniGenericCluster.java:68)
    [junit] 	at org.apache.pig.test.TestStore.oneTimeTearDown(TestStore.java:128)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
    [junit] 	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
    [junit] 	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
    [junit] 	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:37)
    [junit] 	at org.junit.runners.ParentRunner.run(ParentRunner.java:220)
    [junit] 	at junit.framework.JUnit4TestAdapter.run(JUnit4TestAdapter.java:39)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:420)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:911)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:768)
    [junit] 12/01/17 22:33:21 WARN datanode.FSDatasetAsyncDiskService: AsyncDiskService has already shut down.
    [junit] 12/01/17 22:33:21 INFO mortbay.log: Stopped SelectChannelConnector@localhost:0
    [junit] 12/01/17 22:33:21 WARN namenode.FSNamesystem: ReplicationMonitor thread received InterruptedException.java.lang.InterruptedException: sleep interrupted
    [junit] 12/01/17 22:33:21 INFO namenode.FSNamesystem: Number of transactions: 502 Total time for transactions(ms): 10Number of transactions batched in Syncs: 165 Number of syncs: 349 SyncTimes(ms): 3426 280 
    [junit] 12/01/17 22:33:21 INFO namenode.DecommissionManager: Interrupted Monitor
    [junit] java.lang.InterruptedException: sleep interrupted
    [junit] 	at java.lang.Thread.sleep(Native Method)
    [junit] 	at org.apache.hadoop.hdfs.server.namenode.DecommissionManager$Monitor.run(DecommissionManager.java:65)
    [junit] 	at java.lang.Thread.run(Thread.java:662)
    [junit] 12/01/17 22:33:21 INFO ipc.Server: Stopping server on 36202
    [junit] 12/01/17 22:33:21 INFO ipc.Server: IPC Server handler 0 on 36202: exiting
    [junit] 12/01/17 22:33:21 INFO ipc.Server: IPC Server handler 2 on 36202: exiting
    [junit] 12/01/17 22:33:21 INFO ipc.Server: IPC Server handler 1 on 36202: exiting
    [junit] 12/01/17 22:33:21 INFO ipc.Server: IPC Server handler 3 on 36202: exiting
    [junit] 12/01/17 22:33:21 INFO ipc.Server: IPC Server handler 4 on 36202: exiting
    [junit] 12/01/17 22:33:21 INFO ipc.Server: IPC Server handler 5 on 36202: exiting
    [junit] 12/01/17 22:33:21 INFO ipc.Server: Stopping IPC Server listener on 36202
    [junit] 12/01/17 22:33:21 INFO ipc.Server: IPC Server handler 6 on 36202: exiting
    [junit] 12/01/17 22:33:21 INFO ipc.Server: IPC Server handler 7 on 36202: exiting
    [junit] 12/01/17 22:33:21 INFO ipc.Server: IPC Server handler 9 on 36202: exiting
    [junit] 12/01/17 22:33:21 INFO ipc.Server: IPC Server handler 8 on 36202: exiting
    [junit] 12/01/17 22:33:21 INFO ipc.Server: Stopping IPC Server Responder
    [junit] 12/01/17 22:33:21 INFO metrics.RpcInstrumentation: shut down
    [junit] Tests run: 17, Failures: 3, Errors: 3, Time elapsed: 499.771 sec
    [junit] Test org.apache.pig.test.TestStore FAILED
    [junit] Running org.apache.pig.test.TestStringUDFs
    [junit] 12/01/17 22:33:22 WARN builtin.SUBSTRING: No logger object provided to UDF: org.apache.pig.builtin.SUBSTRING. java.lang.NullPointerException
    [junit] 12/01/17 22:33:22 WARN builtin.SUBSTRING: No logger object provided to UDF: org.apache.pig.builtin.SUBSTRING. java.lang.StringIndexOutOfBoundsException: String index out of range: -2
    [junit] 12/01/17 22:33:22 WARN builtin.SUBSTRING: No logger object provided to UDF: org.apache.pig.builtin.SUBSTRING. java.lang.StringIndexOutOfBoundsException: String index out of range: -1
    [junit] 12/01/17 22:33:22 WARN builtin.SUBSTRING: No logger object provided to UDF: org.apache.pig.builtin.SUBSTRING. java.lang.StringIndexOutOfBoundsException: String index out of range: -8
    [junit] 12/01/17 22:33:22 WARN builtin.SUBSTRING: No logger object provided to UDF: org.apache.pig.builtin.SUBSTRING. java.lang.StringIndexOutOfBoundsException: String index out of range: -2
    [junit] 12/01/17 22:33:22 WARN builtin.INDEXOF: No logger object provided to UDF: org.apache.pig.builtin.INDEXOF. Failed to process input; error - null
    [junit] 12/01/17 22:33:22 WARN builtin.LAST_INDEX_OF: No logger object provided to UDF: org.apache.pig.builtin.LAST_INDEX_OF. Failed to process input; error - null
    [junit] Tests run: 11, Failures: 0, Errors: 0, Time elapsed: 0.102 sec
   [delete] Deleting directory /tmp/pig_junit_tmp1053430922

BUILD FAILED
<https://builds.apache.org/job/Pig-trunk/ws/trunk/build.xml>:774: The following error occurred while executing this line:
<https://builds.apache.org/job/Pig-trunk/ws/trunk/build.xml>:831: Tests failed!

Total time: 22 minutes 12 seconds
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Recording test results
Publishing Javadoc
Archiving artifacts
Recording fingerprints
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure


Re: Build failed in Jenkins: Pig-trunk #1173

Posted by Dmitriy Ryaboy <dv...@gmail.com>.
Ok added. Updated into a separate checkout from svn, ran test-commit, it
passes.
This should be fixed now.

On Tue, Jan 17, 2012 at 5:34 AM, Dmitriy Ryaboy <dv...@gmail.com> wrote:

> Ugh. I forgot to "svn add" the new files. Will do that once  I'm online in
> a few hours.
>
> On Jan 17, 2012, at 2:06 AM, Apache Jenkins Server <
> jenkins@builds.apache.org> wrote:
>
> > See <https://builds.apache.org/job/Pig-trunk/1173/changes>
> >
> > Changes:
> >
> > [gates] Remove zero length file.
> >
> > [dvryaboy] PIG-2359: Support more efficient Tuples when schemas are known
> >
> > ------------------------------------------
> > [...truncated 2437 lines...]
> > [ivy:resolve]    found net.sf.kosmosfs#kfs;0.3 in default
> > [ivy:resolve]    found hsqldb#hsqldb;1.8.0.10 in maven2
> > [ivy:resolve]    found org.apache.hadoop#hadoop-test;1.0.0 in maven2
> > [ivy:resolve]    found org.apache.ftpserver#ftplet-api;1.0.0 in maven2
> > [ivy:resolve]    found org.apache.mina#mina-core;2.0.0-M5 in maven2
> > [ivy:resolve]    found org.slf4j#slf4j-api;1.5.2 in maven2
> > [ivy:resolve]    found org.apache.ftpserver#ftpserver-core;1.0.0 in
> maven2
> > [ivy:resolve]    found
> org.apache.ftpserver#ftpserver-deprecated;1.0.0-M2 in maven2
> > [ivy:resolve]    found org.apache.httpcomponents#httpclient;4.1 in maven2
> > [ivy:resolve]    found org.apache.httpcomponents#httpcore;4.1 in maven2
> > [ivy:resolve]    found log4j#log4j;1.2.16 in fs
> > [ivy:resolve]    found org.slf4j#slf4j-log4j12;1.6.1 in fs
> > [ivy:resolve]    found org.apache.avro#avro;1.5.3 in default
> > [ivy:resolve]    found org.codehaus.jackson#jackson-mapper-asl;1.7.3 in
> maven2
> > [ivy:resolve]    found org.codehaus.jackson#jackson-core-asl;1.7.3 in
> maven2
> > [ivy:resolve]    found com.thoughtworks.paranamer#paranamer;2.3 in
> default
> > [ivy:resolve]    found org.xerial.snappy#snappy-java;1.0.3.2 in default
> > [ivy:resolve]    found org.slf4j#slf4j-api;1.6.1 in default
> > [ivy:resolve]    found com.googlecode.json-simple#json-simple;1.1 in
> maven2
> > [ivy:resolve]    found com.jcraft#jsch;0.1.38 in maven2
> > [ivy:resolve]    found jline#jline;0.9.94 in default
> > [ivy:resolve]    found net.java.dev.javacc#javacc;4.2 in maven2
> > [ivy:resolve]    found joda-time#joda-time;1.6 in maven2
> > [ivy:resolve]    found com.google.guava#guava;11.0 in maven2
> > [ivy:resolve]    found org.python#jython;2.5.0 in maven2
> > [ivy:resolve]    found rhino#js;1.7R2 in maven2
> > [ivy:resolve]    found org.antlr#antlr;3.4 in maven2
> > [ivy:resolve]    found org.antlr#antlr-runtime;3.4 in maven2
> > [ivy:resolve]    found org.antlr#stringtemplate;3.2.1 in maven2
> > [ivy:resolve]    found antlr#antlr;2.7.7 in maven2
> > [ivy:resolve]    found org.antlr#ST4;4.0.4 in maven2
> > [ivy:resolve]    found org.apache.zookeeper#zookeeper;3.3.3 in maven2
> > [ivy:resolve]    found org.jboss.netty#netty;3.2.2.Final in maven2
> > [ivy:resolve]    found dk.brics.automaton#automaton;1.11-8 in maven2
> > [ivy:resolve]    found org.apache.hbase#hbase;0.90.0 in maven2
> > [ivy:resolve]    found org.vafer#jdeb;0.8 in maven2
> > [ivy:resolve] :: resolution report :: resolve 1323ms :: artifacts dl 47ms
> > [ivy:resolve]    :: evicted modules:
> > [ivy:resolve]    commons-logging#commons-logging;1.0.3 by
> [commons-logging#commons-logging;1.1.1] in [compile]
> > [ivy:resolve]    commons-codec#commons-codec;1.2 by
> [commons-codec#commons-codec;1.4] in [compile]
> > [ivy:resolve]    commons-logging#commons-logging;1.1 by
> [commons-logging#commons-logging;1.1.1] in [compile]
> > [ivy:resolve]    commons-codec#commons-codec;1.3 by
> [commons-codec#commons-codec;1.4] in [compile]
> > [ivy:resolve]    commons-httpclient#commons-httpclient;3.1 by
> [commons-httpclient#commons-httpclient;3.0.1] in [compile]
> > [ivy:resolve]    org.codehaus.jackson#jackson-mapper-asl;1.0.1 by
> [org.codehaus.jackson#jackson-mapper-asl;1.7.3] in [compile]
> > [ivy:resolve]    org.slf4j#slf4j-api;1.5.2 by
> [org.slf4j#slf4j-api;1.6.1] in [compile]
> > [ivy:resolve]    org.apache.mina#mina-core;2.0.0-M4 by
> [org.apache.mina#mina-core;2.0.0-M5] in [compile]
> > [ivy:resolve]    org.apache.ftpserver#ftplet-api;1.0.0-M2 by
> [org.apache.ftpserver#ftplet-api;1.0.0] in [compile]
> > [ivy:resolve]    org.apache.ftpserver#ftpserver-core;1.0.0-M2 by
> [org.apache.ftpserver#ftpserver-core;1.0.0] in [compile]
> > [ivy:resolve]    org.apache.mina#mina-core;2.0.0-M2 by
> [org.apache.mina#mina-core;2.0.0-M5] in [compile]
> > [ivy:resolve]    commons-cli#commons-cli;1.0 by
> [commons-cli#commons-cli;1.2] in [compile]
> > [ivy:resolve]    org.antlr#antlr-runtime;3.3 by
> [org.antlr#antlr-runtime;3.4] in [compile]
> >    ---------------------------------------------------------------------
> >    |                  |            modules            ||   artifacts   |
> >    |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
> >    ---------------------------------------------------------------------
> >    |      compile     |   77  |   0   |   0   |   13  ||   65  |   0   |
> >    ---------------------------------------------------------------------
> > [ivy:retrieve] :: retrieving :: org.apache.pig#Pig
> > [ivy:retrieve]    confs: [compile]
> > [ivy:retrieve]    65 artifacts copied, 0 already retrieved
> (38935kB/171ms)
> > [ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use
> 'ivy.settings.file' instead
> > [ivy:cachepath] :: loading settings :: file = <
> https://builds.apache.org/job/Pig-trunk/ws/trunk/ivy/ivysettings.xml>
> >
> > init:
> >    [mkdir] Created dir: <
> https://builds.apache.org/job/Pig-trunk/ws/trunk/src-gen/org/apache/pig/impl/logicalLayer/parser
> >
> >    [mkdir] Created dir: <
> https://builds.apache.org/job/Pig-trunk/ws/trunk/src-gen/org/apache/pig/tools/pigscript/parser
> >
> >    [mkdir] Created dir: <
> https://builds.apache.org/job/Pig-trunk/ws/trunk/src-gen/org/apache/pig/tools/parameters
> >
> >    [mkdir] Created dir: <
> https://builds.apache.org/job/Pig-trunk/ws/trunk/build/pig-2012-01-17_10-05-52
> >
> >    [mkdir] Created dir: <
> https://builds.apache.org/job/Pig-trunk/ws/trunk/build/classes>
> >    [mkdir] Created dir: <
> https://builds.apache.org/job/Pig-trunk/ws/trunk/build/test/classes>
> >    [mkdir] Created dir: <
> https://builds.apache.org/job/Pig-trunk/ws/trunk/test/org/apache/pig/test/utils/dotGraph/parser
> >
> >    [mkdir] Created dir: <
> https://builds.apache.org/job/Pig-trunk/ws/trunk/src-gen/org/apache/pig/data/parser
> >
> >     [move] Moving 1 file to <
> https://builds.apache.org/job/Pig-trunk/ws/trunk/build/ivy/lib/Pig>
> >
> > cc-compile:
> >   [javacc] Java Compiler Compiler Version 4.2 (Parser Generator)
> >   [javacc] (type "javacc" with no arguments for help)
> >   [javacc] Reading from file <
> https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/tools/pigscript/parser/PigScriptParser.jj>
> . . .
> >   [javacc] File "TokenMgrError.java" does not exist.  Will create one.
> >   [javacc] File "ParseException.java" does not exist.  Will create one.
> >   [javacc] File "Token.java" does not exist.  Will create one.
> >   [javacc] File "JavaCharStream.java" does not exist.  Will create one.
> >   [javacc] Parser generated successfully.
> >   [javacc] Java Compiler Compiler Version 4.2 (Parser Generator)
> >   [javacc] (type "javacc" with no arguments for help)
> >   [javacc] Reading from file <
> https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/tools/parameters/PigFileParser.jj>
> . . .
> >   [javacc] Warning: Lookahead adequacy checking not being performed
> since option LOOKAHEAD is more than 1.  Set option FORCE_LA_CHECK to true
> to force checking.
> >   [javacc] File "TokenMgrError.java" does not exist.  Will create one.
> >   [javacc] File "ParseException.java" does not exist.  Will create one.
> >   [javacc] File "Token.java" does not exist.  Will create one.
> >   [javacc] File "JavaCharStream.java" does not exist.  Will create one.
> >   [javacc] Parser generated with 0 errors and 1 warnings.
> >   [javacc] Java Compiler Compiler Version 4.2 (Parser Generator)
> >   [javacc] (type "javacc" with no arguments for help)
> >   [javacc] Reading from file <
> https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/tools/parameters/ParamLoader.jj>
> . . .
> >   [javacc] File "TokenMgrError.java" is being rebuilt.
> >   [javacc] File "ParseException.java" is being rebuilt.
> >   [javacc] File "Token.java" is being rebuilt.
> >   [javacc] File "JavaCharStream.java" is being rebuilt.
> >   [javacc] Parser generated successfully.
> >   [jjtree] Java Compiler Compiler Version 4.2 (Tree Builder)
> >   [jjtree] (type "jjtree" with no arguments for help)
> >   [jjtree] Reading from file <
> https://builds.apache.org/job/Pig-trunk/ws/trunk/test/org/apache/pig/test/utils/dotGraph/DOTParser.jjt>
> . . .
> >   [jjtree] File "Node.java" does not exist.  Will create one.
> >   [jjtree] File "SimpleNode.java" does not exist.  Will create one.
> >   [jjtree] File "DOTParserTreeConstants.java" does not exist.  Will
> create one.
> >   [jjtree] File "JJTDOTParserState.java" does not exist.  Will create
> one.
> >   [jjtree] Annotated grammar generated successfully in <
> https://builds.apache.org/job/Pig-trunk/ws/trunk/test/org/apache/pig/test/utils/dotGraph/parser/DOTParser.jj
> >
> >   [javacc] Java Compiler Compiler Version 4.2 (Parser Generator)
> >   [javacc] (type "javacc" with no arguments for help)
> >   [javacc] Reading from file <
> https://builds.apache.org/job/Pig-trunk/ws/trunk/test/org/apache/pig/test/utils/dotGraph/parser/DOTParser.jj>
> . . .
> >   [javacc] File "TokenMgrError.java" does not exist.  Will create one.
> >   [javacc] File "ParseException.java" does not exist.  Will create one.
> >   [javacc] File "Token.java" does not exist.  Will create one.
> >   [javacc] File "SimpleCharStream.java" does not exist.  Will create one.
> >   [javacc] Parser generated successfully.
> >
> > prepare:
> >    [mkdir] Created dir: <
> https://builds.apache.org/job/Pig-trunk/ws/trunk/src-gen/org/apache/pig/parser
> >
> >
> > genLexer:
> >
> > genParser:
> >
> > genTreeParser:
> >
> > gen:
> >
> > compile:
> >     [echo] *** Building Main Sources ***
> >     [echo] *** To compile with all warnings enabled, supply
> -Dall.warnings=1 on command line ***
> >     [echo] *** If all.warnings property is supplied,
> compile-sources-all-warnings target will be executed ***
> >     [echo] *** Else, compile-sources (which only warns about
> deprecations) target will be executed ***
> >
> > compile-sources:
> >    [javac] Compiling 689 source files to <
> https://builds.apache.org/job/Pig-trunk/ws/trunk/build/classes>
> >    [javac] <
> https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/TupleFactory.java>:177:
> cannot find symbol
> >    [javac] symbol  : class PIntTuple
> >    [javac] location: class org.apache.pig.data.TupleFactory
> >    [javac]                 return new PIntTuple();
> >    [javac]                            ^
> >    [javac] <
> https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/TupleFactory.java>:179:
> cannot find symbol
> >    [javac] symbol  : class PFloatTuple
> >    [javac] location: class org.apache.pig.data.TupleFactory
> >    [javac]                 return new PFloatTuple();
> >    [javac]                            ^
> >    [javac] <
> https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/TupleFactory.java>:181:
> cannot find symbol
> >    [javac] symbol  : class PLongTuple
> >    [javac] location: class org.apache.pig.data.TupleFactory
> >    [javac]                 return new PLongTuple();
> >    [javac]                            ^
> >    [javac] <
> https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/TupleFactory.java>:183:
> cannot find symbol
> >    [javac] symbol  : class PDoubleTuple
> >    [javac] location: class org.apache.pig.data.TupleFactory
> >    [javac]                 return new PDoubleTuple();
> >    [javac]                            ^
> >    [javac] <
> https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/TupleFactory.java>:185:
> cannot find symbol
> >    [javac] symbol  : class PStringTuple
> >    [javac] location: class org.apache.pig.data.TupleFactory
> >    [javac]                 return new PStringTuple();
> >    [javac]                            ^
> >    [javac] <
> https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/TupleFactory.java>:187:
> cannot find symbol
> >    [javac] symbol  : class PBooleanTuple
> >    [javac] location: class org.apache.pig.data.TupleFactory
> >    [javac]                 return new PBooleanTuple();
> >    [javac]                            ^
> >    [javac] <
> https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/TupleFactory.java>:196:
> cannot find symbol
> >    [javac] symbol  : class PrimitiveTuple
> >    [javac] location: class org.apache.pig.data.TupleFactory
> >    [javac]             return allNumbers ? new PrimitiveTuple(dataTypes)
> : this.newTuple(dataTypes.length);
> >    [javac]                                     ^
> >    [javac] <
> https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/BinInterSedes.java>:125:
> cannot find symbol
> >    [javac] symbol  : class PrimitiveTuple
> >    [javac] location: class org.apache.pig.data.BinInterSedes
> >    [javac]         PrimitiveTuple t = new PrimitiveTuple();
> >    [javac]         ^
> >    [javac] <
> https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/BinInterSedes.java>:125:
> cannot find symbol
> >    [javac] symbol  : class PrimitiveTuple
> >    [javac] location: class org.apache.pig.data.BinInterSedes
> >    [javac]         PrimitiveTuple t = new PrimitiveTuple();
> >    [javac]                                ^
> >    [javac] <
> https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/BinInterSedes.java>:361:
> cannot find symbol
> >    [javac] symbol  : class PIntTuple
> >    [javac] location: class org.apache.pig.data.BinInterSedes
> >    [javac]             Tuple t = new PIntTuple();
> >    [javac]                           ^
> >    [javac] <
> https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/BinInterSedes.java>:365:
> cannot find symbol
> >    [javac] symbol  : class PFloatTuple
> >    [javac] location: class org.apache.pig.data.BinInterSedes
> >    [javac]             t = new PFloatTuple();
> >    [javac]                     ^
> >    [javac] <
> https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/BinInterSedes.java>:369:
> cannot find symbol
> >    [javac] symbol  : class PLongTuple
> >    [javac] location: class org.apache.pig.data.BinInterSedes
> >    [javac]             t = new PLongTuple();
> >    [javac]                     ^
> >    [javac] <
> https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/BinInterSedes.java>:373:
> cannot find symbol
> >    [javac] symbol  : class PDoubleTuple
> >    [javac] location: class org.apache.pig.data.BinInterSedes
> >    [javac]             t = new PDoubleTuple();
> >    [javac]                     ^
> >    [javac] <
> https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/BinInterSedes.java>:377:
> cannot find symbol
> >    [javac] symbol  : class PStringTuple
> >    [javac] location: class org.apache.pig.data.BinInterSedes
> >    [javac]             t = new PStringTuple();
> >    [javac]                     ^
> >    [javac] <
> https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/BinInterSedes.java>:381:
> cannot find symbol
> >    [javac] symbol  : class PBooleanTuple
> >    [javac] location: class org.apache.pig.data.BinInterSedes
> >    [javac]             t = new PBooleanTuple();
> >    [javac]                     ^
> >    [javac] <
> https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/BinInterSedes.java>:577:
> cannot find symbol
> >    [javac] symbol  : class TypeAwareTuple
> >    [javac] location: class org.apache.pig.data.BinInterSedes
> >    [javac]         if (t instanceof TypeAwareTuple) {
> >    [javac]                          ^
> >    [javac] Note: Some input files use or override a deprecated API.
> >    [javac] Note: Recompile with -Xlint:deprecation for details.
> >    [javac] Note: Some input files use unchecked or unsafe operations.
> >    [javac] Note: Recompile with -Xlint:unchecked for details.
> >    [javac] 16 errors
> >
> > BUILD FAILED
> > <https://builds.apache.org/job/Pig-trunk/ws/trunk/build.xml>:441: The
> following error occurred while executing this line:
> > <https://builds.apache.org/job/Pig-trunk/ws/trunk/build.xml>:489:
> Compile failed; see the compiler error output for details.
> >
> > Total time: 24 seconds
> >
> >
> > ======================================================================
> > ======================================================================
> > STORE: saving artifacts
> > ======================================================================
> > ======================================================================
> >
> >
> > mv: cannot stat `build/*.tar.gz': No such file or directory
> > mv: cannot stat `build/*.jar': No such file or directory
> > mv: cannot stat `build/test/findbugs': No such file or directory
> > mv: cannot stat `build/docs/api': No such file or directory
> > Build Failed
> > Build step 'Execute shell' marked build as failure
> > [FINDBUGS] Skipping publisher since build result is FAILURE
> > Recording test results
> > Publishing Javadoc
> > Archiving artifacts
> > Recording fingerprints
> > Publishing Clover coverage report...
> > No Clover report will be published due to a Build Failure
> >
>

Re: Build failed in Jenkins: Pig-trunk #1173

Posted by Dmitriy Ryaboy <dv...@gmail.com>.
Ugh. I forgot to "svn add" the new files. Will do that once  I'm online in a few hours. 

On Jan 17, 2012, at 2:06 AM, Apache Jenkins Server <je...@builds.apache.org> wrote:

> See <https://builds.apache.org/job/Pig-trunk/1173/changes>
> 
> Changes:
> 
> [gates] Remove zero length file.
> 
> [dvryaboy] PIG-2359: Support more efficient Tuples when schemas are known
> 
> ------------------------------------------
> [...truncated 2437 lines...]
> [ivy:resolve]    found net.sf.kosmosfs#kfs;0.3 in default
> [ivy:resolve]    found hsqldb#hsqldb;1.8.0.10 in maven2
> [ivy:resolve]    found org.apache.hadoop#hadoop-test;1.0.0 in maven2
> [ivy:resolve]    found org.apache.ftpserver#ftplet-api;1.0.0 in maven2
> [ivy:resolve]    found org.apache.mina#mina-core;2.0.0-M5 in maven2
> [ivy:resolve]    found org.slf4j#slf4j-api;1.5.2 in maven2
> [ivy:resolve]    found org.apache.ftpserver#ftpserver-core;1.0.0 in maven2
> [ivy:resolve]    found org.apache.ftpserver#ftpserver-deprecated;1.0.0-M2 in maven2
> [ivy:resolve]    found org.apache.httpcomponents#httpclient;4.1 in maven2
> [ivy:resolve]    found org.apache.httpcomponents#httpcore;4.1 in maven2
> [ivy:resolve]    found log4j#log4j;1.2.16 in fs
> [ivy:resolve]    found org.slf4j#slf4j-log4j12;1.6.1 in fs
> [ivy:resolve]    found org.apache.avro#avro;1.5.3 in default
> [ivy:resolve]    found org.codehaus.jackson#jackson-mapper-asl;1.7.3 in maven2
> [ivy:resolve]    found org.codehaus.jackson#jackson-core-asl;1.7.3 in maven2
> [ivy:resolve]    found com.thoughtworks.paranamer#paranamer;2.3 in default
> [ivy:resolve]    found org.xerial.snappy#snappy-java;1.0.3.2 in default
> [ivy:resolve]    found org.slf4j#slf4j-api;1.6.1 in default
> [ivy:resolve]    found com.googlecode.json-simple#json-simple;1.1 in maven2
> [ivy:resolve]    found com.jcraft#jsch;0.1.38 in maven2
> [ivy:resolve]    found jline#jline;0.9.94 in default
> [ivy:resolve]    found net.java.dev.javacc#javacc;4.2 in maven2
> [ivy:resolve]    found joda-time#joda-time;1.6 in maven2
> [ivy:resolve]    found com.google.guava#guava;11.0 in maven2
> [ivy:resolve]    found org.python#jython;2.5.0 in maven2
> [ivy:resolve]    found rhino#js;1.7R2 in maven2
> [ivy:resolve]    found org.antlr#antlr;3.4 in maven2
> [ivy:resolve]    found org.antlr#antlr-runtime;3.4 in maven2
> [ivy:resolve]    found org.antlr#stringtemplate;3.2.1 in maven2
> [ivy:resolve]    found antlr#antlr;2.7.7 in maven2
> [ivy:resolve]    found org.antlr#ST4;4.0.4 in maven2
> [ivy:resolve]    found org.apache.zookeeper#zookeeper;3.3.3 in maven2
> [ivy:resolve]    found org.jboss.netty#netty;3.2.2.Final in maven2
> [ivy:resolve]    found dk.brics.automaton#automaton;1.11-8 in maven2
> [ivy:resolve]    found org.apache.hbase#hbase;0.90.0 in maven2
> [ivy:resolve]    found org.vafer#jdeb;0.8 in maven2
> [ivy:resolve] :: resolution report :: resolve 1323ms :: artifacts dl 47ms
> [ivy:resolve]    :: evicted modules:
> [ivy:resolve]    commons-logging#commons-logging;1.0.3 by [commons-logging#commons-logging;1.1.1] in [compile]
> [ivy:resolve]    commons-codec#commons-codec;1.2 by [commons-codec#commons-codec;1.4] in [compile]
> [ivy:resolve]    commons-logging#commons-logging;1.1 by [commons-logging#commons-logging;1.1.1] in [compile]
> [ivy:resolve]    commons-codec#commons-codec;1.3 by [commons-codec#commons-codec;1.4] in [compile]
> [ivy:resolve]    commons-httpclient#commons-httpclient;3.1 by [commons-httpclient#commons-httpclient;3.0.1] in [compile]
> [ivy:resolve]    org.codehaus.jackson#jackson-mapper-asl;1.0.1 by [org.codehaus.jackson#jackson-mapper-asl;1.7.3] in [compile]
> [ivy:resolve]    org.slf4j#slf4j-api;1.5.2 by [org.slf4j#slf4j-api;1.6.1] in [compile]
> [ivy:resolve]    org.apache.mina#mina-core;2.0.0-M4 by [org.apache.mina#mina-core;2.0.0-M5] in [compile]
> [ivy:resolve]    org.apache.ftpserver#ftplet-api;1.0.0-M2 by [org.apache.ftpserver#ftplet-api;1.0.0] in [compile]
> [ivy:resolve]    org.apache.ftpserver#ftpserver-core;1.0.0-M2 by [org.apache.ftpserver#ftpserver-core;1.0.0] in [compile]
> [ivy:resolve]    org.apache.mina#mina-core;2.0.0-M2 by [org.apache.mina#mina-core;2.0.0-M5] in [compile]
> [ivy:resolve]    commons-cli#commons-cli;1.0 by [commons-cli#commons-cli;1.2] in [compile]
> [ivy:resolve]    org.antlr#antlr-runtime;3.3 by [org.antlr#antlr-runtime;3.4] in [compile]
>    ---------------------------------------------------------------------
>    |                  |            modules            ||   artifacts   |
>    |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
>    ---------------------------------------------------------------------
>    |      compile     |   77  |   0   |   0   |   13  ||   65  |   0   |
>    ---------------------------------------------------------------------
> [ivy:retrieve] :: retrieving :: org.apache.pig#Pig
> [ivy:retrieve]    confs: [compile]
> [ivy:retrieve]    65 artifacts copied, 0 already retrieved (38935kB/171ms)
> [ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
> [ivy:cachepath] :: loading settings :: file = <https://builds.apache.org/job/Pig-trunk/ws/trunk/ivy/ivysettings.xml>
> 
> init:
>    [mkdir] Created dir: <https://builds.apache.org/job/Pig-trunk/ws/trunk/src-gen/org/apache/pig/impl/logicalLayer/parser>
>    [mkdir] Created dir: <https://builds.apache.org/job/Pig-trunk/ws/trunk/src-gen/org/apache/pig/tools/pigscript/parser>
>    [mkdir] Created dir: <https://builds.apache.org/job/Pig-trunk/ws/trunk/src-gen/org/apache/pig/tools/parameters>
>    [mkdir] Created dir: <https://builds.apache.org/job/Pig-trunk/ws/trunk/build/pig-2012-01-17_10-05-52>
>    [mkdir] Created dir: <https://builds.apache.org/job/Pig-trunk/ws/trunk/build/classes>
>    [mkdir] Created dir: <https://builds.apache.org/job/Pig-trunk/ws/trunk/build/test/classes>
>    [mkdir] Created dir: <https://builds.apache.org/job/Pig-trunk/ws/trunk/test/org/apache/pig/test/utils/dotGraph/parser>
>    [mkdir] Created dir: <https://builds.apache.org/job/Pig-trunk/ws/trunk/src-gen/org/apache/pig/data/parser>
>     [move] Moving 1 file to <https://builds.apache.org/job/Pig-trunk/ws/trunk/build/ivy/lib/Pig>
> 
> cc-compile:
>   [javacc] Java Compiler Compiler Version 4.2 (Parser Generator)
>   [javacc] (type "javacc" with no arguments for help)
>   [javacc] Reading from file <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/tools/pigscript/parser/PigScriptParser.jj> . . .
>   [javacc] File "TokenMgrError.java" does not exist.  Will create one.
>   [javacc] File "ParseException.java" does not exist.  Will create one.
>   [javacc] File "Token.java" does not exist.  Will create one.
>   [javacc] File "JavaCharStream.java" does not exist.  Will create one.
>   [javacc] Parser generated successfully.
>   [javacc] Java Compiler Compiler Version 4.2 (Parser Generator)
>   [javacc] (type "javacc" with no arguments for help)
>   [javacc] Reading from file <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/tools/parameters/PigFileParser.jj> . . .
>   [javacc] Warning: Lookahead adequacy checking not being performed since option LOOKAHEAD is more than 1.  Set option FORCE_LA_CHECK to true to force checking.
>   [javacc] File "TokenMgrError.java" does not exist.  Will create one.
>   [javacc] File "ParseException.java" does not exist.  Will create one.
>   [javacc] File "Token.java" does not exist.  Will create one.
>   [javacc] File "JavaCharStream.java" does not exist.  Will create one.
>   [javacc] Parser generated with 0 errors and 1 warnings.
>   [javacc] Java Compiler Compiler Version 4.2 (Parser Generator)
>   [javacc] (type "javacc" with no arguments for help)
>   [javacc] Reading from file <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/tools/parameters/ParamLoader.jj> . . .
>   [javacc] File "TokenMgrError.java" is being rebuilt.
>   [javacc] File "ParseException.java" is being rebuilt.
>   [javacc] File "Token.java" is being rebuilt.
>   [javacc] File "JavaCharStream.java" is being rebuilt.
>   [javacc] Parser generated successfully.
>   [jjtree] Java Compiler Compiler Version 4.2 (Tree Builder)
>   [jjtree] (type "jjtree" with no arguments for help)
>   [jjtree] Reading from file <https://builds.apache.org/job/Pig-trunk/ws/trunk/test/org/apache/pig/test/utils/dotGraph/DOTParser.jjt> . . .
>   [jjtree] File "Node.java" does not exist.  Will create one.
>   [jjtree] File "SimpleNode.java" does not exist.  Will create one.
>   [jjtree] File "DOTParserTreeConstants.java" does not exist.  Will create one.
>   [jjtree] File "JJTDOTParserState.java" does not exist.  Will create one.
>   [jjtree] Annotated grammar generated successfully in <https://builds.apache.org/job/Pig-trunk/ws/trunk/test/org/apache/pig/test/utils/dotGraph/parser/DOTParser.jj>
>   [javacc] Java Compiler Compiler Version 4.2 (Parser Generator)
>   [javacc] (type "javacc" with no arguments for help)
>   [javacc] Reading from file <https://builds.apache.org/job/Pig-trunk/ws/trunk/test/org/apache/pig/test/utils/dotGraph/parser/DOTParser.jj> . . .
>   [javacc] File "TokenMgrError.java" does not exist.  Will create one.
>   [javacc] File "ParseException.java" does not exist.  Will create one.
>   [javacc] File "Token.java" does not exist.  Will create one.
>   [javacc] File "SimpleCharStream.java" does not exist.  Will create one.
>   [javacc] Parser generated successfully.
> 
> prepare:
>    [mkdir] Created dir: <https://builds.apache.org/job/Pig-trunk/ws/trunk/src-gen/org/apache/pig/parser>
> 
> genLexer:
> 
> genParser:
> 
> genTreeParser:
> 
> gen:
> 
> compile:
>     [echo] *** Building Main Sources ***
>     [echo] *** To compile with all warnings enabled, supply -Dall.warnings=1 on command line ***
>     [echo] *** If all.warnings property is supplied, compile-sources-all-warnings target will be executed ***
>     [echo] *** Else, compile-sources (which only warns about deprecations) target will be executed ***
> 
> compile-sources:
>    [javac] Compiling 689 source files to <https://builds.apache.org/job/Pig-trunk/ws/trunk/build/classes>
>    [javac] <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/TupleFactory.java>:177: cannot find symbol
>    [javac] symbol  : class PIntTuple
>    [javac] location: class org.apache.pig.data.TupleFactory
>    [javac]                 return new PIntTuple();
>    [javac]                            ^
>    [javac] <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/TupleFactory.java>:179: cannot find symbol
>    [javac] symbol  : class PFloatTuple
>    [javac] location: class org.apache.pig.data.TupleFactory
>    [javac]                 return new PFloatTuple();
>    [javac]                            ^
>    [javac] <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/TupleFactory.java>:181: cannot find symbol
>    [javac] symbol  : class PLongTuple
>    [javac] location: class org.apache.pig.data.TupleFactory
>    [javac]                 return new PLongTuple();
>    [javac]                            ^
>    [javac] <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/TupleFactory.java>:183: cannot find symbol
>    [javac] symbol  : class PDoubleTuple
>    [javac] location: class org.apache.pig.data.TupleFactory
>    [javac]                 return new PDoubleTuple();
>    [javac]                            ^
>    [javac] <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/TupleFactory.java>:185: cannot find symbol
>    [javac] symbol  : class PStringTuple
>    [javac] location: class org.apache.pig.data.TupleFactory
>    [javac]                 return new PStringTuple();
>    [javac]                            ^
>    [javac] <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/TupleFactory.java>:187: cannot find symbol
>    [javac] symbol  : class PBooleanTuple
>    [javac] location: class org.apache.pig.data.TupleFactory
>    [javac]                 return new PBooleanTuple();
>    [javac]                            ^
>    [javac] <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/TupleFactory.java>:196: cannot find symbol
>    [javac] symbol  : class PrimitiveTuple
>    [javac] location: class org.apache.pig.data.TupleFactory
>    [javac]             return allNumbers ? new PrimitiveTuple(dataTypes) : this.newTuple(dataTypes.length);
>    [javac]                                     ^
>    [javac] <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/BinInterSedes.java>:125: cannot find symbol
>    [javac] symbol  : class PrimitiveTuple
>    [javac] location: class org.apache.pig.data.BinInterSedes
>    [javac]         PrimitiveTuple t = new PrimitiveTuple();
>    [javac]         ^
>    [javac] <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/BinInterSedes.java>:125: cannot find symbol
>    [javac] symbol  : class PrimitiveTuple
>    [javac] location: class org.apache.pig.data.BinInterSedes
>    [javac]         PrimitiveTuple t = new PrimitiveTuple();
>    [javac]                                ^
>    [javac] <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/BinInterSedes.java>:361: cannot find symbol
>    [javac] symbol  : class PIntTuple
>    [javac] location: class org.apache.pig.data.BinInterSedes
>    [javac]             Tuple t = new PIntTuple();
>    [javac]                           ^
>    [javac] <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/BinInterSedes.java>:365: cannot find symbol
>    [javac] symbol  : class PFloatTuple
>    [javac] location: class org.apache.pig.data.BinInterSedes
>    [javac]             t = new PFloatTuple();
>    [javac]                     ^
>    [javac] <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/BinInterSedes.java>:369: cannot find symbol
>    [javac] symbol  : class PLongTuple
>    [javac] location: class org.apache.pig.data.BinInterSedes
>    [javac]             t = new PLongTuple();
>    [javac]                     ^
>    [javac] <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/BinInterSedes.java>:373: cannot find symbol
>    [javac] symbol  : class PDoubleTuple
>    [javac] location: class org.apache.pig.data.BinInterSedes
>    [javac]             t = new PDoubleTuple();
>    [javac]                     ^
>    [javac] <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/BinInterSedes.java>:377: cannot find symbol
>    [javac] symbol  : class PStringTuple
>    [javac] location: class org.apache.pig.data.BinInterSedes
>    [javac]             t = new PStringTuple();
>    [javac]                     ^
>    [javac] <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/BinInterSedes.java>:381: cannot find symbol
>    [javac] symbol  : class PBooleanTuple
>    [javac] location: class org.apache.pig.data.BinInterSedes
>    [javac]             t = new PBooleanTuple();
>    [javac]                     ^
>    [javac] <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/BinInterSedes.java>:577: cannot find symbol
>    [javac] symbol  : class TypeAwareTuple
>    [javac] location: class org.apache.pig.data.BinInterSedes
>    [javac]         if (t instanceof TypeAwareTuple) {
>    [javac]                          ^
>    [javac] Note: Some input files use or override a deprecated API.
>    [javac] Note: Recompile with -Xlint:deprecation for details.
>    [javac] Note: Some input files use unchecked or unsafe operations.
>    [javac] Note: Recompile with -Xlint:unchecked for details.
>    [javac] 16 errors
> 
> BUILD FAILED
> <https://builds.apache.org/job/Pig-trunk/ws/trunk/build.xml>:441: The following error occurred while executing this line:
> <https://builds.apache.org/job/Pig-trunk/ws/trunk/build.xml>:489: Compile failed; see the compiler error output for details.
> 
> Total time: 24 seconds
> 
> 
> ======================================================================
> ======================================================================
> STORE: saving artifacts
> ======================================================================
> ======================================================================
> 
> 
> mv: cannot stat `build/*.tar.gz': No such file or directory
> mv: cannot stat `build/*.jar': No such file or directory
> mv: cannot stat `build/test/findbugs': No such file or directory
> mv: cannot stat `build/docs/api': No such file or directory
> Build Failed
> Build step 'Execute shell' marked build as failure
> [FINDBUGS] Skipping publisher since build result is FAILURE
> Recording test results
> Publishing Javadoc
> Archiving artifacts
> Recording fingerprints
> Publishing Clover coverage report...
> No Clover report will be published due to a Build Failure
> 

Build failed in Jenkins: Pig-trunk #1173

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Pig-trunk/1173/changes>

Changes:

[gates] Remove zero length file.

[dvryaboy] PIG-2359: Support more efficient Tuples when schemas are known

------------------------------------------
[...truncated 2437 lines...]
[ivy:resolve] 	found net.sf.kosmosfs#kfs;0.3 in default
[ivy:resolve] 	found hsqldb#hsqldb;1.8.0.10 in maven2
[ivy:resolve] 	found org.apache.hadoop#hadoop-test;1.0.0 in maven2
[ivy:resolve] 	found org.apache.ftpserver#ftplet-api;1.0.0 in maven2
[ivy:resolve] 	found org.apache.mina#mina-core;2.0.0-M5 in maven2
[ivy:resolve] 	found org.slf4j#slf4j-api;1.5.2 in maven2
[ivy:resolve] 	found org.apache.ftpserver#ftpserver-core;1.0.0 in maven2
[ivy:resolve] 	found org.apache.ftpserver#ftpserver-deprecated;1.0.0-M2 in maven2
[ivy:resolve] 	found org.apache.httpcomponents#httpclient;4.1 in maven2
[ivy:resolve] 	found org.apache.httpcomponents#httpcore;4.1 in maven2
[ivy:resolve] 	found log4j#log4j;1.2.16 in fs
[ivy:resolve] 	found org.slf4j#slf4j-log4j12;1.6.1 in fs
[ivy:resolve] 	found org.apache.avro#avro;1.5.3 in default
[ivy:resolve] 	found org.codehaus.jackson#jackson-mapper-asl;1.7.3 in maven2
[ivy:resolve] 	found org.codehaus.jackson#jackson-core-asl;1.7.3 in maven2
[ivy:resolve] 	found com.thoughtworks.paranamer#paranamer;2.3 in default
[ivy:resolve] 	found org.xerial.snappy#snappy-java;1.0.3.2 in default
[ivy:resolve] 	found org.slf4j#slf4j-api;1.6.1 in default
[ivy:resolve] 	found com.googlecode.json-simple#json-simple;1.1 in maven2
[ivy:resolve] 	found com.jcraft#jsch;0.1.38 in maven2
[ivy:resolve] 	found jline#jline;0.9.94 in default
[ivy:resolve] 	found net.java.dev.javacc#javacc;4.2 in maven2
[ivy:resolve] 	found joda-time#joda-time;1.6 in maven2
[ivy:resolve] 	found com.google.guava#guava;11.0 in maven2
[ivy:resolve] 	found org.python#jython;2.5.0 in maven2
[ivy:resolve] 	found rhino#js;1.7R2 in maven2
[ivy:resolve] 	found org.antlr#antlr;3.4 in maven2
[ivy:resolve] 	found org.antlr#antlr-runtime;3.4 in maven2
[ivy:resolve] 	found org.antlr#stringtemplate;3.2.1 in maven2
[ivy:resolve] 	found antlr#antlr;2.7.7 in maven2
[ivy:resolve] 	found org.antlr#ST4;4.0.4 in maven2
[ivy:resolve] 	found org.apache.zookeeper#zookeeper;3.3.3 in maven2
[ivy:resolve] 	found org.jboss.netty#netty;3.2.2.Final in maven2
[ivy:resolve] 	found dk.brics.automaton#automaton;1.11-8 in maven2
[ivy:resolve] 	found org.apache.hbase#hbase;0.90.0 in maven2
[ivy:resolve] 	found org.vafer#jdeb;0.8 in maven2
[ivy:resolve] :: resolution report :: resolve 1323ms :: artifacts dl 47ms
[ivy:resolve] 	:: evicted modules:
[ivy:resolve] 	commons-logging#commons-logging;1.0.3 by [commons-logging#commons-logging;1.1.1] in [compile]
[ivy:resolve] 	commons-codec#commons-codec;1.2 by [commons-codec#commons-codec;1.4] in [compile]
[ivy:resolve] 	commons-logging#commons-logging;1.1 by [commons-logging#commons-logging;1.1.1] in [compile]
[ivy:resolve] 	commons-codec#commons-codec;1.3 by [commons-codec#commons-codec;1.4] in [compile]
[ivy:resolve] 	commons-httpclient#commons-httpclient;3.1 by [commons-httpclient#commons-httpclient;3.0.1] in [compile]
[ivy:resolve] 	org.codehaus.jackson#jackson-mapper-asl;1.0.1 by [org.codehaus.jackson#jackson-mapper-asl;1.7.3] in [compile]
[ivy:resolve] 	org.slf4j#slf4j-api;1.5.2 by [org.slf4j#slf4j-api;1.6.1] in [compile]
[ivy:resolve] 	org.apache.mina#mina-core;2.0.0-M4 by [org.apache.mina#mina-core;2.0.0-M5] in [compile]
[ivy:resolve] 	org.apache.ftpserver#ftplet-api;1.0.0-M2 by [org.apache.ftpserver#ftplet-api;1.0.0] in [compile]
[ivy:resolve] 	org.apache.ftpserver#ftpserver-core;1.0.0-M2 by [org.apache.ftpserver#ftpserver-core;1.0.0] in [compile]
[ivy:resolve] 	org.apache.mina#mina-core;2.0.0-M2 by [org.apache.mina#mina-core;2.0.0-M5] in [compile]
[ivy:resolve] 	commons-cli#commons-cli;1.0 by [commons-cli#commons-cli;1.2] in [compile]
[ivy:resolve] 	org.antlr#antlr-runtime;3.3 by [org.antlr#antlr-runtime;3.4] in [compile]
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      compile     |   77  |   0   |   0   |   13  ||   65  |   0   |
	---------------------------------------------------------------------
[ivy:retrieve] :: retrieving :: org.apache.pig#Pig
[ivy:retrieve] 	confs: [compile]
[ivy:retrieve] 	65 artifacts copied, 0 already retrieved (38935kB/171ms)
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:cachepath] :: loading settings :: file = <https://builds.apache.org/job/Pig-trunk/ws/trunk/ivy/ivysettings.xml>

init:
    [mkdir] Created dir: <https://builds.apache.org/job/Pig-trunk/ws/trunk/src-gen/org/apache/pig/impl/logicalLayer/parser>
    [mkdir] Created dir: <https://builds.apache.org/job/Pig-trunk/ws/trunk/src-gen/org/apache/pig/tools/pigscript/parser>
    [mkdir] Created dir: <https://builds.apache.org/job/Pig-trunk/ws/trunk/src-gen/org/apache/pig/tools/parameters>
    [mkdir] Created dir: <https://builds.apache.org/job/Pig-trunk/ws/trunk/build/pig-2012-01-17_10-05-52>
    [mkdir] Created dir: <https://builds.apache.org/job/Pig-trunk/ws/trunk/build/classes>
    [mkdir] Created dir: <https://builds.apache.org/job/Pig-trunk/ws/trunk/build/test/classes>
    [mkdir] Created dir: <https://builds.apache.org/job/Pig-trunk/ws/trunk/test/org/apache/pig/test/utils/dotGraph/parser>
    [mkdir] Created dir: <https://builds.apache.org/job/Pig-trunk/ws/trunk/src-gen/org/apache/pig/data/parser>
     [move] Moving 1 file to <https://builds.apache.org/job/Pig-trunk/ws/trunk/build/ivy/lib/Pig>

cc-compile:
   [javacc] Java Compiler Compiler Version 4.2 (Parser Generator)
   [javacc] (type "javacc" with no arguments for help)
   [javacc] Reading from file <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/tools/pigscript/parser/PigScriptParser.jj> . . .
   [javacc] File "TokenMgrError.java" does not exist.  Will create one.
   [javacc] File "ParseException.java" does not exist.  Will create one.
   [javacc] File "Token.java" does not exist.  Will create one.
   [javacc] File "JavaCharStream.java" does not exist.  Will create one.
   [javacc] Parser generated successfully.
   [javacc] Java Compiler Compiler Version 4.2 (Parser Generator)
   [javacc] (type "javacc" with no arguments for help)
   [javacc] Reading from file <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/tools/parameters/PigFileParser.jj> . . .
   [javacc] Warning: Lookahead adequacy checking not being performed since option LOOKAHEAD is more than 1.  Set option FORCE_LA_CHECK to true to force checking.
   [javacc] File "TokenMgrError.java" does not exist.  Will create one.
   [javacc] File "ParseException.java" does not exist.  Will create one.
   [javacc] File "Token.java" does not exist.  Will create one.
   [javacc] File "JavaCharStream.java" does not exist.  Will create one.
   [javacc] Parser generated with 0 errors and 1 warnings.
   [javacc] Java Compiler Compiler Version 4.2 (Parser Generator)
   [javacc] (type "javacc" with no arguments for help)
   [javacc] Reading from file <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/tools/parameters/ParamLoader.jj> . . .
   [javacc] File "TokenMgrError.java" is being rebuilt.
   [javacc] File "ParseException.java" is being rebuilt.
   [javacc] File "Token.java" is being rebuilt.
   [javacc] File "JavaCharStream.java" is being rebuilt.
   [javacc] Parser generated successfully.
   [jjtree] Java Compiler Compiler Version 4.2 (Tree Builder)
   [jjtree] (type "jjtree" with no arguments for help)
   [jjtree] Reading from file <https://builds.apache.org/job/Pig-trunk/ws/trunk/test/org/apache/pig/test/utils/dotGraph/DOTParser.jjt> . . .
   [jjtree] File "Node.java" does not exist.  Will create one.
   [jjtree] File "SimpleNode.java" does not exist.  Will create one.
   [jjtree] File "DOTParserTreeConstants.java" does not exist.  Will create one.
   [jjtree] File "JJTDOTParserState.java" does not exist.  Will create one.
   [jjtree] Annotated grammar generated successfully in <https://builds.apache.org/job/Pig-trunk/ws/trunk/test/org/apache/pig/test/utils/dotGraph/parser/DOTParser.jj>
   [javacc] Java Compiler Compiler Version 4.2 (Parser Generator)
   [javacc] (type "javacc" with no arguments for help)
   [javacc] Reading from file <https://builds.apache.org/job/Pig-trunk/ws/trunk/test/org/apache/pig/test/utils/dotGraph/parser/DOTParser.jj> . . .
   [javacc] File "TokenMgrError.java" does not exist.  Will create one.
   [javacc] File "ParseException.java" does not exist.  Will create one.
   [javacc] File "Token.java" does not exist.  Will create one.
   [javacc] File "SimpleCharStream.java" does not exist.  Will create one.
   [javacc] Parser generated successfully.

prepare:
    [mkdir] Created dir: <https://builds.apache.org/job/Pig-trunk/ws/trunk/src-gen/org/apache/pig/parser>

genLexer:

genParser:

genTreeParser:

gen:

compile:
     [echo] *** Building Main Sources ***
     [echo] *** To compile with all warnings enabled, supply -Dall.warnings=1 on command line ***
     [echo] *** If all.warnings property is supplied, compile-sources-all-warnings target will be executed ***
     [echo] *** Else, compile-sources (which only warns about deprecations) target will be executed ***

compile-sources:
    [javac] Compiling 689 source files to <https://builds.apache.org/job/Pig-trunk/ws/trunk/build/classes>
    [javac] <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/TupleFactory.java>:177: cannot find symbol
    [javac] symbol  : class PIntTuple
    [javac] location: class org.apache.pig.data.TupleFactory
    [javac]                 return new PIntTuple();
    [javac]                            ^
    [javac] <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/TupleFactory.java>:179: cannot find symbol
    [javac] symbol  : class PFloatTuple
    [javac] location: class org.apache.pig.data.TupleFactory
    [javac]                 return new PFloatTuple();
    [javac]                            ^
    [javac] <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/TupleFactory.java>:181: cannot find symbol
    [javac] symbol  : class PLongTuple
    [javac] location: class org.apache.pig.data.TupleFactory
    [javac]                 return new PLongTuple();
    [javac]                            ^
    [javac] <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/TupleFactory.java>:183: cannot find symbol
    [javac] symbol  : class PDoubleTuple
    [javac] location: class org.apache.pig.data.TupleFactory
    [javac]                 return new PDoubleTuple();
    [javac]                            ^
    [javac] <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/TupleFactory.java>:185: cannot find symbol
    [javac] symbol  : class PStringTuple
    [javac] location: class org.apache.pig.data.TupleFactory
    [javac]                 return new PStringTuple();
    [javac]                            ^
    [javac] <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/TupleFactory.java>:187: cannot find symbol
    [javac] symbol  : class PBooleanTuple
    [javac] location: class org.apache.pig.data.TupleFactory
    [javac]                 return new PBooleanTuple();
    [javac]                            ^
    [javac] <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/TupleFactory.java>:196: cannot find symbol
    [javac] symbol  : class PrimitiveTuple
    [javac] location: class org.apache.pig.data.TupleFactory
    [javac]             return allNumbers ? new PrimitiveTuple(dataTypes) : this.newTuple(dataTypes.length);
    [javac]                                     ^
    [javac] <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/BinInterSedes.java>:125: cannot find symbol
    [javac] symbol  : class PrimitiveTuple
    [javac] location: class org.apache.pig.data.BinInterSedes
    [javac]         PrimitiveTuple t = new PrimitiveTuple();
    [javac]         ^
    [javac] <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/BinInterSedes.java>:125: cannot find symbol
    [javac] symbol  : class PrimitiveTuple
    [javac] location: class org.apache.pig.data.BinInterSedes
    [javac]         PrimitiveTuple t = new PrimitiveTuple();
    [javac]                                ^
    [javac] <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/BinInterSedes.java>:361: cannot find symbol
    [javac] symbol  : class PIntTuple
    [javac] location: class org.apache.pig.data.BinInterSedes
    [javac]             Tuple t = new PIntTuple();
    [javac]                           ^
    [javac] <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/BinInterSedes.java>:365: cannot find symbol
    [javac] symbol  : class PFloatTuple
    [javac] location: class org.apache.pig.data.BinInterSedes
    [javac]             t = new PFloatTuple();
    [javac]                     ^
    [javac] <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/BinInterSedes.java>:369: cannot find symbol
    [javac] symbol  : class PLongTuple
    [javac] location: class org.apache.pig.data.BinInterSedes
    [javac]             t = new PLongTuple();
    [javac]                     ^
    [javac] <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/BinInterSedes.java>:373: cannot find symbol
    [javac] symbol  : class PDoubleTuple
    [javac] location: class org.apache.pig.data.BinInterSedes
    [javac]             t = new PDoubleTuple();
    [javac]                     ^
    [javac] <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/BinInterSedes.java>:377: cannot find symbol
    [javac] symbol  : class PStringTuple
    [javac] location: class org.apache.pig.data.BinInterSedes
    [javac]             t = new PStringTuple();
    [javac]                     ^
    [javac] <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/BinInterSedes.java>:381: cannot find symbol
    [javac] symbol  : class PBooleanTuple
    [javac] location: class org.apache.pig.data.BinInterSedes
    [javac]             t = new PBooleanTuple();
    [javac]                     ^
    [javac] <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/org/apache/pig/data/BinInterSedes.java>:577: cannot find symbol
    [javac] symbol  : class TypeAwareTuple
    [javac] location: class org.apache.pig.data.BinInterSedes
    [javac]         if (t instanceof TypeAwareTuple) {
    [javac]                          ^
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.
    [javac] 16 errors

BUILD FAILED
<https://builds.apache.org/job/Pig-trunk/ws/trunk/build.xml>:441: The following error occurred while executing this line:
<https://builds.apache.org/job/Pig-trunk/ws/trunk/build.xml>:489: Compile failed; see the compiler error output for details.

Total time: 24 seconds


======================================================================
======================================================================
STORE: saving artifacts
======================================================================
======================================================================


mv: cannot stat `build/*.tar.gz': No such file or directory
mv: cannot stat `build/*.jar': No such file or directory
mv: cannot stat `build/test/findbugs': No such file or directory
mv: cannot stat `build/docs/api': No such file or directory
Build Failed
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Recording test results
Publishing Javadoc
Archiving artifacts
Recording fingerprints
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure


Build failed in Jenkins: Pig-trunk #1172

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Pig-trunk/1172/changes>

Changes:

[daijy] Preparing for release 0.9.2

------------------------------------------
[...truncated 36539 lines...]
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.FSDataset.shutdown(FSDataset.java:1934)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.DataNode.shutdown(DataNode.java:788)
    [junit] 	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdownDataNodes(MiniDFSCluster.java:566)
    [junit] 	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:550)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsClusters(MiniGenericCluster.java:87)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsAndMrClusters(MiniGenericCluster.java:77)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutDown(MiniGenericCluster.java:68)
    [junit] 	at org.apache.pig.test.TestStore.oneTimeTearDown(TestStore.java:128)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] Shutting down DataNode 2
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
    [junit] 	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
    [junit] 	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
    [junit] 	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:37)
    [junit] 	at org.junit.runners.ParentRunner.run(ParentRunner.java:220)
    [junit] 	at junit.framework.JUnit4TestAdapter.run(JUnit4TestAdapter.java:39)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:420)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:911)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:768)
    [junit] 12/01/15 10:32:57 WARN datanode.FSDatasetAsyncDiskService: AsyncDiskService has already shut down.
    [junit] 12/01/15 10:32:57 INFO mortbay.log: Stopped SelectChannelConnector@localhost:0
    [junit] 12/01/15 10:32:57 INFO ipc.Server: Stopping server on 36775
    [junit] 12/01/15 10:32:57 INFO ipc.Server: IPC Server handler 1 on 36775: exiting
    [junit] 12/01/15 10:32:57 INFO ipc.Server: IPC Server handler 0 on 36775: exiting
    [junit] 12/01/15 10:32:57 INFO ipc.Server: Stopping IPC Server listener on 36775
    [junit] 12/01/15 10:32:57 INFO ipc.Server: IPC Server handler 2 on 36775: exiting
    [junit] 12/01/15 10:32:57 INFO ipc.Server: Stopping IPC Server Responder
    [junit] 12/01/15 10:32:57 INFO metrics.RpcInstrumentation: shut down
    [junit] 12/01/15 10:32:57 WARN datanode.DataNode: DatanodeRegistration(127.0.0.1:36104, storageID=DS-1024380144-67.195.138.20-36104-1326623070315, infoPort=33850, ipcPort=36775):DataXceiveServer:java.nio.channels.AsynchronousCloseException
    [junit] 	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:185)
    [junit] 	at sun.nio.ch.ServerSocketChannelImpl.accept(ServerSocketChannelImpl.java:159)
    [junit] 	at sun.nio.ch.ServerSocketAdaptor.accept(ServerSocketAdaptor.java:84)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.DataXceiverServer.run(DataXceiverServer.java:131)
    [junit] 	at java.lang.Thread.run(Thread.java:662)
    [junit] 
    [junit] 12/01/15 10:32:57 INFO datanode.DataNode: Exiting DataXceiveServer
    [junit] 12/01/15 10:32:57 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 0
    [junit] 12/01/15 10:32:57 INFO datanode.DataBlockScanner: Exiting DataBlockScanner thread.
    [junit] 12/01/15 10:32:57 INFO datanode.DataNode: DatanodeRegistration(127.0.0.1:36104, storageID=DS-1024380144-67.195.138.20-36104-1326623070315, infoPort=33850, ipcPort=36775):Finishing DataNode in: FSDataset{dirpath='<https://builds.apache.org/job/Pig-trunk/ws/trunk/build/test/data/dfs/data/data5/current,/home/jenkins/jenkins-slave/workspace/Pig-trunk/trunk/build/test/data/dfs/data/data6/current'}>
    [junit] 12/01/15 10:32:57 INFO ipc.Server: Stopping server on 36775
    [junit] 12/01/15 10:32:57 INFO metrics.RpcInstrumentation: shut down
    [junit] 12/01/15 10:32:57 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 0
    [junit] 12/01/15 10:32:57 INFO datanode.FSDatasetAsyncDiskService: Shutting down all async disk service threads...
    [junit] 12/01/15 10:32:57 INFO datanode.FSDatasetAsyncDiskService: All async disk service threads have been shut down.
    [junit] 12/01/15 10:32:57 WARN util.MBeans: Hadoop:service=DataNode,name=FSDatasetState-UndefinedStorageId1930573806
    [junit] Shutting down DataNode 1
    [junit] javax.management.InstanceNotFoundException: Hadoop:service=DataNode,name=FSDatasetState-UndefinedStorageId1930573806
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1094)
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.exclusiveUnregisterMBean(DefaultMBeanServerInterceptor.java:415)
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.unregisterMBean(DefaultMBeanServerInterceptor.java:403)
    [junit] 	at com.sun.jmx.mbeanserver.JmxMBeanServer.unregisterMBean(JmxMBeanServer.java:506)
    [junit] 	at org.apache.hadoop.metrics2.util.MBeans.unregister(MBeans.java:71)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.FSDataset.shutdown(FSDataset.java:1934)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.DataNode.shutdown(DataNode.java:788)
    [junit] 	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdownDataNodes(MiniDFSCluster.java:566)
    [junit] 	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:550)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsClusters(MiniGenericCluster.java:87)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsAndMrClusters(MiniGenericCluster.java:77)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutDown(MiniGenericCluster.java:68)
    [junit] 	at org.apache.pig.test.TestStore.oneTimeTearDown(TestStore.java:128)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
    [junit] 	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
    [junit] 	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
    [junit] 	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:37)
    [junit] 	at org.junit.runners.ParentRunner.run(ParentRunner.java:220)
    [junit] 	at junit.framework.JUnit4TestAdapter.run(JUnit4TestAdapter.java:39)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:420)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:911)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:768)
    [junit] 12/01/15 10:32:57 WARN datanode.FSDatasetAsyncDiskService: AsyncDiskService has already shut down.
    [junit] 12/01/15 10:32:57 INFO mortbay.log: Stopped SelectChannelConnector@localhost:0
    [junit] 12/01/15 10:32:57 INFO ipc.Server: Stopping server on 32851
    [junit] 12/01/15 10:32:57 INFO ipc.Server: IPC Server handler 0 on 32851: exiting
    [junit] 12/01/15 10:32:57 INFO metrics.RpcInstrumentation: shut down
    [junit] 12/01/15 10:32:57 INFO ipc.Server: Stopping IPC Server listener on 32851
    [junit] 12/01/15 10:32:57 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 1
    [junit] 12/01/15 10:32:57 INFO ipc.Server: IPC Server handler 2 on 32851: exiting
    [junit] 12/01/15 10:32:57 INFO ipc.Server: IPC Server handler 1 on 32851: exiting
    [junit] 12/01/15 10:32:57 INFO ipc.Server: Stopping IPC Server Responder
    [junit] 12/01/15 10:32:57 WARN datanode.DataNode: DatanodeRegistration(127.0.0.1:56367, storageID=DS-528564730-67.195.138.20-56367-1326623069995, infoPort=54022, ipcPort=32851):DataXceiveServer:java.nio.channels.AsynchronousCloseException
    [junit] 	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:185)
    [junit] 	at sun.nio.ch.ServerSocketChannelImpl.accept(ServerSocketChannelImpl.java:159)
    [junit] 	at sun.nio.ch.ServerSocketAdaptor.accept(ServerSocketAdaptor.java:84)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.DataXceiverServer.run(DataXceiverServer.java:131)
    [junit] 	at java.lang.Thread.run(Thread.java:662)
    [junit] 
    [junit] 12/01/15 10:32:57 INFO datanode.DataNode: Exiting DataXceiveServer
    [junit] 12/01/15 10:32:58 INFO mapred.TaskTracker: Received 'KillJobAction' for job: job_20120115102430715_0012
    [junit] 12/01/15 10:32:58 WARN mapred.TaskTracker: Unknown job job_20120115102430715_0012 being deleted.
    [junit] 12/01/15 10:32:58 INFO datanode.DataBlockScanner: Exiting DataBlockScanner thread.
    [junit] 12/01/15 10:32:58 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 0
    [junit] 12/01/15 10:32:58 INFO datanode.DataNode: DatanodeRegistration(127.0.0.1:56367, storageID=DS-528564730-67.195.138.20-56367-1326623069995, infoPort=54022, ipcPort=32851):Finishing DataNode in: FSDataset{dirpath='<https://builds.apache.org/job/Pig-trunk/ws/trunk/build/test/data/dfs/data/data3/current,/home/jenkins/jenkins-slave/workspace/Pig-trunk/trunk/build/test/data/dfs/data/data4/current'}>
    [junit] 12/01/15 10:32:58 INFO ipc.Server: Stopping server on 32851
    [junit] 12/01/15 10:32:58 INFO metrics.RpcInstrumentation: shut down
    [junit] 12/01/15 10:32:58 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 0
    [junit] 12/01/15 10:32:58 INFO datanode.FSDatasetAsyncDiskService: Shutting down all async disk service threads...
    [junit] 12/01/15 10:32:58 INFO datanode.FSDatasetAsyncDiskService: All async disk service threads have been shut down.
    [junit] 12/01/15 10:32:58 WARN util.MBeans: Hadoop:service=DataNode,name=FSDatasetState-UndefinedStorageId785459596
    [junit] javax.management.InstanceNotFoundException: Hadoop:service=DataNode,name=FSDatasetState-UndefinedStorageId785459596
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1094)
    [junit] Shutting down DataNode 0
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.exclusiveUnregisterMBean(DefaultMBeanServerInterceptor.java:415)
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.unregisterMBean(DefaultMBeanServerInterceptor.java:403)
    [junit] 	at com.sun.jmx.mbeanserver.JmxMBeanServer.unregisterMBean(JmxMBeanServer.java:506)
    [junit] 	at org.apache.hadoop.metrics2.util.MBeans.unregister(MBeans.java:71)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.FSDataset.shutdown(FSDataset.java:1934)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.DataNode.shutdown(DataNode.java:788)
    [junit] 	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdownDataNodes(MiniDFSCluster.java:566)
    [junit] 	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:550)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsClusters(MiniGenericCluster.java:87)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsAndMrClusters(MiniGenericCluster.java:77)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutDown(MiniGenericCluster.java:68)
    [junit] 	at org.apache.pig.test.TestStore.oneTimeTearDown(TestStore.java:128)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
    [junit] 	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
    [junit] 	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
    [junit] 	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:37)
    [junit] 	at org.junit.runners.ParentRunner.run(ParentRunner.java:220)
    [junit] 	at junit.framework.JUnit4TestAdapter.run(JUnit4TestAdapter.java:39)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:420)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:911)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:768)
    [junit] 12/01/15 10:32:58 WARN datanode.FSDatasetAsyncDiskService: AsyncDiskService has already shut down.
    [junit] 12/01/15 10:32:58 INFO mortbay.log: Stopped SelectChannelConnector@localhost:0
    [junit] 12/01/15 10:32:58 INFO ipc.Server: Stopping server on 52707
    [junit] 12/01/15 10:32:58 INFO ipc.Server: IPC Server handler 0 on 52707: exiting
    [junit] 12/01/15 10:32:58 INFO ipc.Server: IPC Server handler 2 on 52707: exiting
    [junit] 12/01/15 10:32:58 INFO ipc.Server: IPC Server handler 1 on 52707: exiting
    [junit] 12/01/15 10:32:58 INFO ipc.Server: Stopping IPC Server listener on 52707
    [junit] 12/01/15 10:32:58 INFO ipc.Server: Stopping IPC Server Responder
    [junit] 12/01/15 10:32:58 INFO metrics.RpcInstrumentation: shut down
    [junit] 12/01/15 10:32:58 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 1
    [junit] 12/01/15 10:32:58 WARN datanode.DataNode: DatanodeRegistration(127.0.0.1:36673, storageID=DS-921518883-67.195.138.20-36673-1326623069638, infoPort=46911, ipcPort=52707):DataXceiveServer:java.nio.channels.AsynchronousCloseException
    [junit] 	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:185)
    [junit] 	at sun.nio.ch.ServerSocketChannelImpl.accept(ServerSocketChannelImpl.java:159)
    [junit] 	at sun.nio.ch.ServerSocketAdaptor.accept(ServerSocketAdaptor.java:84)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.DataXceiverServer.run(DataXceiverServer.java:131)
    [junit] 	at java.lang.Thread.run(Thread.java:662)
    [junit] 
    [junit] 12/01/15 10:32:58 INFO datanode.DataNode: Exiting DataXceiveServer
    [junit] 12/01/15 10:32:59 INFO hdfs.StateChange: BLOCK* ask 127.0.0.1:36104 to delete  blk_2720125863714522882_1095 blk_-1452921298201479617_1102
    [junit] 12/01/15 10:32:59 INFO hdfs.StateChange: BLOCK* ask 127.0.0.1:57465 to delete  blk_2720125863714522882_1095 blk_-3999696910781269051_1102 blk_3735021820553579512_1101
    [junit] 12/01/15 10:32:59 INFO datanode.DataBlockScanner: Exiting DataBlockScanner thread.
    [junit] 12/01/15 10:32:59 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 0
    [junit] 12/01/15 10:32:59 INFO datanode.DataNode: DatanodeRegistration(127.0.0.1:36673, storageID=DS-921518883-67.195.138.20-36673-1326623069638, infoPort=46911, ipcPort=52707):Finishing DataNode in: FSDataset{dirpath='<https://builds.apache.org/job/Pig-trunk/ws/trunk/build/test/data/dfs/data/data1/current,/home/jenkins/jenkins-slave/workspace/Pig-trunk/trunk/build/test/data/dfs/data/data2/current'}>
    [junit] 12/01/15 10:32:59 WARN util.MBeans: Hadoop:service=DataNode,name=DataNodeInfo
    [junit] javax.management.InstanceNotFoundException: Hadoop:service=DataNode,name=DataNodeInfo
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1094)
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.exclusiveUnregisterMBean(DefaultMBeanServerInterceptor.java:415)
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.unregisterMBean(DefaultMBeanServerInterceptor.java:403)
    [junit] 	at com.sun.jmx.mbeanserver.JmxMBeanServer.unregisterMBean(JmxMBeanServer.java:506)
    [junit] 	at org.apache.hadoop.metrics2.util.MBeans.unregister(MBeans.java:71)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.DataNode.unRegisterMXBean(DataNode.java:513)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.DataNode.shutdown(DataNode.java:726)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.DataNode.run(DataNode.java:1442)
    [junit] 	at java.lang.Thread.run(Thread.java:662)
    [junit] 12/01/15 10:32:59 INFO ipc.Server: Stopping server on 52707
    [junit] 12/01/15 10:32:59 INFO metrics.RpcInstrumentation: shut down
    [junit] 12/01/15 10:32:59 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 0
    [junit] 12/01/15 10:32:59 INFO datanode.FSDatasetAsyncDiskService: Shutting down all async disk service threads...
    [junit] 12/01/15 10:32:59 INFO datanode.FSDatasetAsyncDiskService: All async disk service threads have been shut down.
    [junit] 12/01/15 10:32:59 WARN util.MBeans: Hadoop:service=DataNode,name=FSDatasetState-UndefinedStorageId975986872
    [junit] javax.management.InstanceNotFoundException: Hadoop:service=DataNode,name=FSDatasetState-UndefinedStorageId975986872
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1094)
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.exclusiveUnregisterMBean(DefaultMBeanServerInterceptor.java:415)
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.unregisterMBean(DefaultMBeanServerInterceptor.java:403)
    [junit] 	at com.sun.jmx.mbeanserver.JmxMBeanServer.unregisterMBean(JmxMBeanServer.java:506)
    [junit] 	at org.apache.hadoop.metrics2.util.MBeans.unregister(MBeans.java:71)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.FSDataset.shutdown(FSDataset.java:1934)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.DataNode.shutdown(DataNode.java:788)
    [junit] 	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdownDataNodes(MiniDFSCluster.java:566)
    [junit] 	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:550)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsClusters(MiniGenericCluster.java:87)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsAndMrClusters(MiniGenericCluster.java:77)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutDown(MiniGenericCluster.java:68)
    [junit] 	at org.apache.pig.test.TestStore.oneTimeTearDown(TestStore.java:128)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
    [junit] 	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
    [junit] 	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
    [junit] 	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:37)
    [junit] 	at org.junit.runners.ParentRunner.run(ParentRunner.java:220)
    [junit] 	at junit.framework.JUnit4TestAdapter.run(JUnit4TestAdapter.java:39)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:420)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:911)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:768)
    [junit] 12/01/15 10:32:59 WARN datanode.FSDatasetAsyncDiskService: AsyncDiskService has already shut down.
    [junit] 12/01/15 10:32:59 INFO mortbay.log: Stopped SelectChannelConnector@localhost:0
    [junit] 12/01/15 10:32:59 WARN namenode.FSNamesystem: ReplicationMonitor thread received InterruptedException.java.lang.InterruptedException: sleep interrupted
    [junit] 12/01/15 10:32:59 INFO namenode.FSNamesystem: Number of transactions: 502 Total time for transactions(ms): 14Number of transactions batched in Syncs: 158 Number of syncs: 348 SyncTimes(ms): 5023 267 
    [junit] 12/01/15 10:32:59 INFO namenode.DecommissionManager: Interrupted Monitor
    [junit] java.lang.InterruptedException: sleep interrupted
    [junit] 	at java.lang.Thread.sleep(Native Method)
    [junit] 	at org.apache.hadoop.hdfs.server.namenode.DecommissionManager$Monitor.run(DecommissionManager.java:65)
    [junit] 	at java.lang.Thread.run(Thread.java:662)
    [junit] 12/01/15 10:32:59 INFO ipc.Server: Stopping server on 54551
    [junit] 12/01/15 10:32:59 INFO ipc.Server: IPC Server handler 0 on 54551: exiting
    [junit] 12/01/15 10:32:59 INFO ipc.Server: IPC Server handler 1 on 54551: exiting
    [junit] 12/01/15 10:32:59 INFO ipc.Server: IPC Server handler 4 on 54551: exiting
    [junit] 12/01/15 10:32:59 INFO ipc.Server: IPC Server handler 2 on 54551: exiting
    [junit] 12/01/15 10:32:59 INFO ipc.Server: IPC Server handler 3 on 54551: exiting
    [junit] 12/01/15 10:32:59 INFO ipc.Server: IPC Server handler 5 on 54551: exiting
    [junit] 12/01/15 10:32:59 INFO ipc.Server: Stopping IPC Server listener on 54551
    [junit] 12/01/15 10:32:59 INFO ipc.Server: Stopping IPC Server Responder
    [junit] 12/01/15 10:32:59 INFO ipc.Server: IPC Server handler 6 on 54551: exiting
    [junit] 12/01/15 10:32:59 INFO ipc.Server: IPC Server handler 9 on 54551: exiting
    [junit] 12/01/15 10:32:59 INFO ipc.Server: IPC Server handler 8 on 54551: exiting
    [junit] 12/01/15 10:32:59 INFO ipc.Server: IPC Server handler 7 on 54551: exiting
    [junit] 12/01/15 10:32:59 INFO metrics.RpcInstrumentation: shut down
    [junit] Tests run: 17, Failures: 3, Errors: 3, Time elapsed: 504.022 sec
    [junit] Test org.apache.pig.test.TestStore FAILED
    [junit] Running org.apache.pig.test.TestStringUDFs
    [junit] 12/01/15 10:33:00 WARN builtin.SUBSTRING: No logger object provided to UDF: org.apache.pig.builtin.SUBSTRING. java.lang.NullPointerException
    [junit] 12/01/15 10:33:00 WARN builtin.SUBSTRING: No logger object provided to UDF: org.apache.pig.builtin.SUBSTRING. java.lang.StringIndexOutOfBoundsException: String index out of range: -2
    [junit] 12/01/15 10:33:00 WARN builtin.SUBSTRING: No logger object provided to UDF: org.apache.pig.builtin.SUBSTRING. java.lang.StringIndexOutOfBoundsException: String index out of range: -1
    [junit] 12/01/15 10:33:00 WARN builtin.SUBSTRING: No logger object provided to UDF: org.apache.pig.builtin.SUBSTRING. java.lang.StringIndexOutOfBoundsException: String index out of range: -8
    [junit] 12/01/15 10:33:00 WARN builtin.SUBSTRING: No logger object provided to UDF: org.apache.pig.builtin.SUBSTRING. java.lang.StringIndexOutOfBoundsException: String index out of range: -2
    [junit] 12/01/15 10:33:00 WARN builtin.INDEXOF: No logger object provided to UDF: org.apache.pig.builtin.INDEXOF. Failed to process input; error - null
    [junit] 12/01/15 10:33:00 WARN builtin.LAST_INDEX_OF: No logger object provided to UDF: org.apache.pig.builtin.LAST_INDEX_OF. Failed to process input; error - null
    [junit] Tests run: 11, Failures: 0, Errors: 0, Time elapsed: 0.099 sec
   [delete] Deleting directory /tmp/pig_junit_tmp814922942

BUILD FAILED
<https://builds.apache.org/job/Pig-trunk/ws/trunk/build.xml>:774: The following error occurred while executing this line:
<https://builds.apache.org/job/Pig-trunk/ws/trunk/build.xml>:831: Tests failed!

Total time: 21 minutes 54 seconds
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Recording test results
Publishing Javadoc
Archiving artifacts
Recording fingerprints
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure


Build failed in Jenkins: Pig-trunk #1171

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Pig-trunk/1171/changes>

Changes:

[daijy] PIG-2462: getWrappedSplit is incorrectly returning the first split instead of the current split.

[daijy] PIG-2413: e2e test should support testing against two cluster (PIG-2413-4.patch)

------------------------------------------
[...truncated 35749 lines...]
    [junit] 	at com.sun.jmx.mbeanserver.JmxMBeanServer.unregisterMBean(JmxMBeanServer.java:506)
    [junit] 	at org.apache.hadoop.metrics2.util.MBeans.unregister(MBeans.java:71)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.FSDataset.shutdown(FSDataset.java:1934)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.DataNode.shutdown(DataNode.java:788)
    [junit] Shutting down DataNode 2
    [junit] 	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdownDataNodes(MiniDFSCluster.java:566)
    [junit] 	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:550)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsClusters(MiniGenericCluster.java:87)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsAndMrClusters(MiniGenericCluster.java:77)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutDown(MiniGenericCluster.java:68)
    [junit] 	at org.apache.pig.test.TestStore.oneTimeTearDown(TestStore.java:128)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
    [junit] 	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
    [junit] 	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
    [junit] 	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:37)
    [junit] 	at org.junit.runners.ParentRunner.run(ParentRunner.java:220)
    [junit] 	at junit.framework.JUnit4TestAdapter.run(JUnit4TestAdapter.java:39)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:420)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:911)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:768)
    [junit] 12/01/14 10:33:36 WARN datanode.FSDatasetAsyncDiskService: AsyncDiskService has already shut down.
    [junit] 12/01/14 10:33:36 INFO mortbay.log: Stopped SelectChannelConnector@localhost:0
    [junit] 12/01/14 10:33:36 INFO ipc.Server: Stopping server on 34109
    [junit] 12/01/14 10:33:36 INFO ipc.Server: IPC Server handler 0 on 34109: exiting
    [junit] 12/01/14 10:33:36 INFO ipc.Server: IPC Server handler 1 on 34109: exiting
    [junit] 12/01/14 10:33:36 INFO ipc.Server: Stopping IPC Server listener on 34109
    [junit] 12/01/14 10:33:36 INFO ipc.Server: IPC Server handler 2 on 34109: exiting
    [junit] 12/01/14 10:33:36 INFO metrics.RpcInstrumentation: shut down
    [junit] 12/01/14 10:33:36 INFO ipc.Server: Stopping IPC Server Responder
    [junit] 12/01/14 10:33:36 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 1
    [junit] 12/01/14 10:33:36 WARN datanode.DataNode: DatanodeRegistration(127.0.0.1:33193, storageID=DS-1991953918-67.195.138.20-33193-1326536714960, infoPort=40016, ipcPort=34109):DataXceiveServer:java.nio.channels.AsynchronousCloseException
    [junit] 	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:185)
    [junit] 	at sun.nio.ch.ServerSocketChannelImpl.accept(ServerSocketChannelImpl.java:159)
    [junit] 	at sun.nio.ch.ServerSocketAdaptor.accept(ServerSocketAdaptor.java:84)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.DataXceiverServer.run(DataXceiverServer.java:131)
    [junit] 	at java.lang.Thread.run(Thread.java:662)
    [junit] 
    [junit] 12/01/14 10:33:36 INFO datanode.DataNode: Exiting DataXceiveServer
    [junit] 12/01/14 10:33:37 INFO datanode.DataBlockScanner: Exiting DataBlockScanner thread.
    [junit] 12/01/14 10:33:37 INFO hdfs.StateChange: BLOCK* ask 127.0.0.1:48595 to delete  blk_5467575080486654681_1101 blk_2549828112211598481_1102 blk_-5464651711109568623_1095
    [junit] 12/01/14 10:33:37 INFO hdfs.StateChange: BLOCK* ask 127.0.0.1:33193 to delete  blk_5467575080486654681_1101 blk_2549828112211598481_1102 blk_8563567947340606181_1102 blk_-5464651711109568623_1095
    [junit] 12/01/14 10:33:37 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 0
    [junit] 12/01/14 10:33:37 INFO datanode.DataNode: DatanodeRegistration(127.0.0.1:33193, storageID=DS-1991953918-67.195.138.20-33193-1326536714960, infoPort=40016, ipcPort=34109):Finishing DataNode in: FSDataset{dirpath='<https://builds.apache.org/job/Pig-trunk/ws/trunk/build/test/data/dfs/data/data5/current,/home/jenkins/jenkins-slave/workspace/Pig-trunk/trunk/build/test/data/dfs/data/data6/current'}>
    [junit] 12/01/14 10:33:37 INFO ipc.Server: Stopping server on 34109
    [junit] 12/01/14 10:33:37 INFO metrics.RpcInstrumentation: shut down
    [junit] 12/01/14 10:33:37 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 0
    [junit] 12/01/14 10:33:37 INFO datanode.FSDatasetAsyncDiskService: Shutting down all async disk service threads...
    [junit] 12/01/14 10:33:37 INFO datanode.FSDatasetAsyncDiskService: All async disk service threads have been shut down.
    [junit] 12/01/14 10:33:37 WARN util.MBeans: Hadoop:service=DataNode,name=FSDatasetState-UndefinedStorageId1208395903
    [junit] javax.management.InstanceNotFoundException: Hadoop:service=DataNode,name=FSDatasetState-UndefinedStorageId1208395903
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1094)
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.exclusiveUnregisterMBean(DefaultMBeanServerInterceptor.java:415)
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.unregisterMBean(DefaultMBeanServerInterceptor.java:403)
    [junit] 	at com.sun.jmx.mbeanserver.JmxMBeanServer.unregisterMBean(JmxMBeanServer.java:506)
    [junit] 	at org.apache.hadoop.metrics2.util.MBeans.unregister(MBeans.java:71)
    [junit] Shutting down DataNode 1
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.FSDataset.shutdown(FSDataset.java:1934)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.DataNode.shutdown(DataNode.java:788)
    [junit] 	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdownDataNodes(MiniDFSCluster.java:566)
    [junit] 	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:550)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsClusters(MiniGenericCluster.java:87)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsAndMrClusters(MiniGenericCluster.java:77)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutDown(MiniGenericCluster.java:68)
    [junit] 	at org.apache.pig.test.TestStore.oneTimeTearDown(TestStore.java:128)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
    [junit] 	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
    [junit] 	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
    [junit] 	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:37)
    [junit] 	at org.junit.runners.ParentRunner.run(ParentRunner.java:220)
    [junit] 	at junit.framework.JUnit4TestAdapter.run(JUnit4TestAdapter.java:39)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:420)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:911)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:768)
    [junit] 12/01/14 10:33:37 WARN datanode.FSDatasetAsyncDiskService: AsyncDiskService has already shut down.
    [junit] 12/01/14 10:33:37 INFO mortbay.log: Stopped SelectChannelConnector@localhost:0
    [junit] 12/01/14 10:33:37 INFO ipc.Server: Stopping server on 44256
    [junit] 12/01/14 10:33:37 INFO ipc.Server: IPC Server handler 0 on 44256: exiting
    [junit] 12/01/14 10:33:37 INFO ipc.Server: Stopping IPC Server listener on 44256
    [junit] 12/01/14 10:33:37 INFO metrics.RpcInstrumentation: shut down
    [junit] 12/01/14 10:33:37 INFO ipc.Server: IPC Server handler 2 on 44256: exiting
    [junit] 12/01/14 10:33:37 INFO ipc.Server: IPC Server handler 1 on 44256: exiting
    [junit] 12/01/14 10:33:37 INFO ipc.Server: Stopping IPC Server Responder
    [junit] 12/01/14 10:33:37 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 1
    [junit] 12/01/14 10:33:37 WARN datanode.DataNode: DatanodeRegistration(127.0.0.1:48595, storageID=DS-883320220-67.195.138.20-48595-1326536714637, infoPort=33333, ipcPort=44256):DataXceiveServer:java.nio.channels.AsynchronousCloseException
    [junit] 	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:185)
    [junit] 	at sun.nio.ch.ServerSocketChannelImpl.accept(ServerSocketChannelImpl.java:159)
    [junit] 	at sun.nio.ch.ServerSocketAdaptor.accept(ServerSocketAdaptor.java:84)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.DataXceiverServer.run(DataXceiverServer.java:131)
    [junit] 	at java.lang.Thread.run(Thread.java:662)
    [junit] 
    [junit] 12/01/14 10:33:37 INFO datanode.DataNode: Exiting DataXceiveServer
    [junit] 12/01/14 10:33:38 INFO datanode.DataBlockScanner: Exiting DataBlockScanner thread.
    [junit] 12/01/14 10:33:38 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 0
    [junit] 12/01/14 10:33:38 INFO datanode.DataNode: DatanodeRegistration(127.0.0.1:48595, storageID=DS-883320220-67.195.138.20-48595-1326536714637, infoPort=33333, ipcPort=44256):Finishing DataNode in: FSDataset{dirpath='<https://builds.apache.org/job/Pig-trunk/ws/trunk/build/test/data/dfs/data/data3/current,/home/jenkins/jenkins-slave/workspace/Pig-trunk/trunk/build/test/data/dfs/data/data4/current'}>
    [junit] 12/01/14 10:33:38 INFO ipc.Server: Stopping server on 44256
    [junit] 12/01/14 10:33:38 INFO metrics.RpcInstrumentation: shut down
    [junit] 12/01/14 10:33:38 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 0
    [junit] 12/01/14 10:33:38 INFO datanode.FSDatasetAsyncDiskService: Shutting down all async disk service threads...
    [junit] 12/01/14 10:33:38 INFO datanode.FSDatasetAsyncDiskService: All async disk service threads have been shut down.
    [junit] 12/01/14 10:33:38 WARN util.MBeans: Hadoop:service=DataNode,name=FSDatasetState-UndefinedStorageId1171305594
    [junit] javax.management.InstanceNotFoundException: Hadoop:service=DataNode,name=FSDatasetState-UndefinedStorageId1171305594
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1094)
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.exclusiveUnregisterMBean(DefaultMBeanServerInterceptor.java:415)
    [junit] Shutting down DataNode 0
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.unregisterMBean(DefaultMBeanServerInterceptor.java:403)
    [junit] 	at com.sun.jmx.mbeanserver.JmxMBeanServer.unregisterMBean(JmxMBeanServer.java:506)
    [junit] 	at org.apache.hadoop.metrics2.util.MBeans.unregister(MBeans.java:71)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.FSDataset.shutdown(FSDataset.java:1934)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.DataNode.shutdown(DataNode.java:788)
    [junit] 	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdownDataNodes(MiniDFSCluster.java:566)
    [junit] 	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:550)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsClusters(MiniGenericCluster.java:87)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsAndMrClusters(MiniGenericCluster.java:77)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutDown(MiniGenericCluster.java:68)
    [junit] 	at org.apache.pig.test.TestStore.oneTimeTearDown(TestStore.java:128)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
    [junit] 	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
    [junit] 	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
    [junit] 	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:37)
    [junit] 	at org.junit.runners.ParentRunner.run(ParentRunner.java:220)
    [junit] 	at junit.framework.JUnit4TestAdapter.run(JUnit4TestAdapter.java:39)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:420)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:911)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:768)
    [junit] 12/01/14 10:33:38 WARN datanode.FSDatasetAsyncDiskService: AsyncDiskService has already shut down.
    [junit] 12/01/14 10:33:38 INFO mortbay.log: Stopped SelectChannelConnector@localhost:0
    [junit] 12/01/14 10:33:38 INFO ipc.Server: Stopping server on 40033
    [junit] 12/01/14 10:33:38 INFO ipc.Server: IPC Server handler 0 on 40033: exiting
    [junit] 12/01/14 10:33:38 INFO ipc.Server: IPC Server handler 1 on 40033: exiting
    [junit] 12/01/14 10:33:38 INFO ipc.Server: IPC Server handler 2 on 40033: exiting
    [junit] 12/01/14 10:33:38 INFO ipc.Server: Stopping IPC Server listener on 40033
    [junit] 12/01/14 10:33:38 INFO ipc.Server: Stopping IPC Server Responder
    [junit] 12/01/14 10:33:38 INFO metrics.RpcInstrumentation: shut down
    [junit] 12/01/14 10:33:38 WARN datanode.DataNode: DatanodeRegistration(127.0.0.1:41388, storageID=DS-1370482241-67.195.138.20-41388-1326536714285, infoPort=51923, ipcPort=40033):DataXceiveServer:java.nio.channels.AsynchronousCloseException
    [junit] 	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:185)
    [junit] 	at sun.nio.ch.ServerSocketChannelImpl.accept(ServerSocketChannelImpl.java:159)
    [junit] 	at sun.nio.ch.ServerSocketAdaptor.accept(ServerSocketAdaptor.java:84)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.DataXceiverServer.run(DataXceiverServer.java:131)
    [junit] 	at java.lang.Thread.run(Thread.java:662)
    [junit] 
    [junit] 12/01/14 10:33:38 INFO datanode.DataNode: Exiting DataXceiveServer
    [junit] 12/01/14 10:33:38 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 0
    [junit] 12/01/14 10:33:38 INFO datanode.DataBlockScanner: Exiting DataBlockScanner thread.
    [junit] 12/01/14 10:33:38 INFO datanode.DataNode: DatanodeRegistration(127.0.0.1:41388, storageID=DS-1370482241-67.195.138.20-41388-1326536714285, infoPort=51923, ipcPort=40033):Finishing DataNode in: FSDataset{dirpath='<https://builds.apache.org/job/Pig-trunk/ws/trunk/build/test/data/dfs/data/data1/current,/home/jenkins/jenkins-slave/workspace/Pig-trunk/trunk/build/test/data/dfs/data/data2/current'}>
    [junit] 12/01/14 10:33:38 WARN util.MBeans: Hadoop:service=DataNode,name=DataNodeInfo
    [junit] javax.management.InstanceNotFoundException: Hadoop:service=DataNode,name=DataNodeInfo
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1094)
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.exclusiveUnregisterMBean(DefaultMBeanServerInterceptor.java:415)
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.unregisterMBean(DefaultMBeanServerInterceptor.java:403)
    [junit] 	at com.sun.jmx.mbeanserver.JmxMBeanServer.unregisterMBean(JmxMBeanServer.java:506)
    [junit] 	at org.apache.hadoop.metrics2.util.MBeans.unregister(MBeans.java:71)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.DataNode.unRegisterMXBean(DataNode.java:513)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.DataNode.shutdown(DataNode.java:726)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.DataNode.run(DataNode.java:1442)
    [junit] 	at java.lang.Thread.run(Thread.java:662)
    [junit] 12/01/14 10:33:38 INFO ipc.Server: Stopping server on 40033
    [junit] 12/01/14 10:33:38 INFO metrics.RpcInstrumentation: shut down
    [junit] 12/01/14 10:33:38 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 0
    [junit] 12/01/14 10:33:38 INFO datanode.FSDatasetAsyncDiskService: Shutting down all async disk service threads...
    [junit] 12/01/14 10:33:38 INFO datanode.FSDatasetAsyncDiskService: All async disk service threads have been shut down.
    [junit] 12/01/14 10:33:38 WARN util.MBeans: Hadoop:service=DataNode,name=FSDatasetState-UndefinedStorageId1955043466
    [junit] javax.management.InstanceNotFoundException: Hadoop:service=DataNode,name=FSDatasetState-UndefinedStorageId1955043466
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1094)
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.exclusiveUnregisterMBean(DefaultMBeanServerInterceptor.java:415)
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.unregisterMBean(DefaultMBeanServerInterceptor.java:403)
    [junit] 	at com.sun.jmx.mbeanserver.JmxMBeanServer.unregisterMBean(JmxMBeanServer.java:506)
    [junit] 	at org.apache.hadoop.metrics2.util.MBeans.unregister(MBeans.java:71)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.FSDataset.shutdown(FSDataset.java:1934)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.DataNode.shutdown(DataNode.java:788)
    [junit] 	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdownDataNodes(MiniDFSCluster.java:566)
    [junit] 	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:550)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsClusters(MiniGenericCluster.java:87)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsAndMrClusters(MiniGenericCluster.java:77)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutDown(MiniGenericCluster.java:68)
    [junit] 	at org.apache.pig.test.TestStore.oneTimeTearDown(TestStore.java:128)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
    [junit] 	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
    [junit] 	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
    [junit] 	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:37)
    [junit] 	at org.junit.runners.ParentRunner.run(ParentRunner.java:220)
    [junit] 	at junit.framework.JUnit4TestAdapter.run(JUnit4TestAdapter.java:39)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:420)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:911)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:768)
    [junit] 12/01/14 10:33:38 WARN datanode.FSDatasetAsyncDiskService: AsyncDiskService has already shut down.
    [junit] 12/01/14 10:33:38 INFO mortbay.log: Stopped SelectChannelConnector@localhost:0
    [junit] 12/01/14 10:33:38 WARN namenode.FSNamesystem: ReplicationMonitor thread received InterruptedException.java.lang.InterruptedException: sleep interrupted
    [junit] 12/01/14 10:33:38 INFO namenode.FSNamesystem: Number of transactions: 502 Total time for transactions(ms): 11Number of transactions batched in Syncs: 159 Number of syncs: 348 SyncTimes(ms): 5042 351 
    [junit] 12/01/14 10:33:38 INFO namenode.DecommissionManager: Interrupted Monitor
    [junit] java.lang.InterruptedException: sleep interrupted
    [junit] 	at java.lang.Thread.sleep(Native Method)
    [junit] 	at org.apache.hadoop.hdfs.server.namenode.DecommissionManager$Monitor.run(DecommissionManager.java:65)
    [junit] 	at java.lang.Thread.run(Thread.java:662)
    [junit] 12/01/14 10:33:38 INFO ipc.Server: Stopping server on 34534
    [junit] 12/01/14 10:33:38 INFO ipc.Server: IPC Server handler 0 on 34534: exiting
    [junit] 12/01/14 10:33:38 INFO ipc.Server: IPC Server handler 1 on 34534: exiting
    [junit] 12/01/14 10:33:38 INFO ipc.Server: IPC Server handler 2 on 34534: exiting
    [junit] 12/01/14 10:33:38 INFO ipc.Server: IPC Server handler 4 on 34534: exiting
    [junit] 12/01/14 10:33:38 INFO ipc.Server: IPC Server handler 3 on 34534: exiting
    [junit] 12/01/14 10:33:38 INFO ipc.Server: IPC Server handler 5 on 34534: exiting
    [junit] 12/01/14 10:33:38 INFO ipc.Server: Stopping IPC Server listener on 34534
    [junit] 12/01/14 10:33:38 INFO ipc.Server: IPC Server handler 9 on 34534: exiting
    [junit] 12/01/14 10:33:38 INFO metrics.RpcInstrumentation: shut down
    [junit] 12/01/14 10:33:38 INFO ipc.Server: IPC Server handler 7 on 34534: exiting
    [junit] 12/01/14 10:33:38 INFO ipc.Server: IPC Server handler 6 on 34534: exiting
    [junit] 12/01/14 10:33:38 INFO ipc.Server: Stopping IPC Server Responder
    [junit] 12/01/14 10:33:38 INFO ipc.Server: IPC Server handler 8 on 34534: exiting
    [junit] Tests run: 17, Failures: 3, Errors: 3, Time elapsed: 498.449 sec
    [junit] Test org.apache.pig.test.TestStore FAILED
    [junit] Running org.apache.pig.test.TestStringUDFs
    [junit] 12/01/14 10:33:39 WARN builtin.SUBSTRING: No logger object provided to UDF: org.apache.pig.builtin.SUBSTRING. java.lang.NullPointerException
    [junit] 12/01/14 10:33:39 WARN builtin.SUBSTRING: No logger object provided to UDF: org.apache.pig.builtin.SUBSTRING. java.lang.StringIndexOutOfBoundsException: String index out of range: -2
    [junit] 12/01/14 10:33:39 WARN builtin.SUBSTRING: No logger object provided to UDF: org.apache.pig.builtin.SUBSTRING. java.lang.StringIndexOutOfBoundsException: String index out of range: -1
    [junit] 12/01/14 10:33:39 WARN builtin.SUBSTRING: No logger object provided to UDF: org.apache.pig.builtin.SUBSTRING. java.lang.StringIndexOutOfBoundsException: String index out of range: -8
    [junit] 12/01/14 10:33:39 WARN builtin.SUBSTRING: No logger object provided to UDF: org.apache.pig.builtin.SUBSTRING. java.lang.StringIndexOutOfBoundsException: String index out of range: -2
    [junit] 12/01/14 10:33:39 WARN builtin.INDEXOF: No logger object provided to UDF: org.apache.pig.builtin.INDEXOF. Failed to process input; error - null
    [junit] 12/01/14 10:33:39 WARN builtin.LAST_INDEX_OF: No logger object provided to UDF: org.apache.pig.builtin.LAST_INDEX_OF. Failed to process input; error - null
    [junit] Tests run: 11, Failures: 0, Errors: 0, Time elapsed: 0.1 sec
   [delete] Deleting directory /tmp/pig_junit_tmp1438306823

BUILD FAILED
<https://builds.apache.org/job/Pig-trunk/ws/trunk/build.xml>:774: The following error occurred while executing this line:
<https://builds.apache.org/job/Pig-trunk/ws/trunk/build.xml>:831: Tests failed!

Total time: 22 minutes 1 second
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Recording test results
Publishing Javadoc
Archiving artifacts
Recording fingerprints
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure


Build failed in Jenkins: Pig-trunk #1170

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Pig-trunk/1170/changes>

Changes:

[daijy] PIG-2472: piggybank unit tests write directly to /tmp (PIG-2472-1.patch)

[daijy] PIG-2472: piggybank unit tests write directly to /tmp

[daijy] PIG-2282: Automatically update Eclipse .classpath file when new libs are added to the classpath through Ivy

[daijy] Pig Requirements Hadoop

------------------------------------------
[...truncated 36077 lines...]
    [junit] 	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:550)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsClusters(MiniGenericCluster.java:87)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsAndMrClusters(MiniGenericCluster.java:77)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutDown(MiniGenericCluster.java:68)
    [junit] 	at org.apache.pig.test.TestStore.oneTimeTearDown(TestStore.java:128)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
    [junit] 	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
    [junit] 	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
    [junit] 	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:37)
    [junit] 	at org.junit.runners.ParentRunner.run(ParentRunner.java:220)
    [junit] 	at junit.framework.JUnit4TestAdapter.run(JUnit4TestAdapter.java:39)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:420)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:911)
    [junit] Shutting down DataNode 2
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:768)
    [junit] 12/01/13 10:33:10 WARN datanode.FSDatasetAsyncDiskService: AsyncDiskService has already shut down.
    [junit] 12/01/13 10:33:10 INFO mortbay.log: Stopped SelectChannelConnector@localhost:0
    [junit] 12/01/13 10:33:10 INFO ipc.Server: Stopping server on 58160
    [junit] 12/01/13 10:33:10 INFO ipc.Server: IPC Server handler 0 on 58160: exiting
    [junit] 12/01/13 10:33:10 INFO ipc.Server: Stopping IPC Server Responder
    [junit] 12/01/13 10:33:10 INFO ipc.Server: IPC Server handler 2 on 58160: exiting
    [junit] 12/01/13 10:33:10 INFO metrics.RpcInstrumentation: shut down
    [junit] 12/01/13 10:33:10 INFO ipc.Server: IPC Server handler 1 on 58160: exiting
    [junit] 12/01/13 10:33:10 INFO ipc.Server: Stopping IPC Server listener on 58160
    [junit] 12/01/13 10:33:10 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 1
    [junit] 12/01/13 10:33:10 WARN datanode.DataNode: DatanodeRegistration(127.0.0.1:45319, storageID=DS-751181898-67.195.138.20-45319-1326450290025, infoPort=48476, ipcPort=58160):DataXceiveServer:java.nio.channels.AsynchronousCloseException
    [junit] 	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:185)
    [junit] 	at sun.nio.ch.ServerSocketChannelImpl.accept(ServerSocketChannelImpl.java:159)
    [junit] 	at sun.nio.ch.ServerSocketAdaptor.accept(ServerSocketAdaptor.java:84)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.DataXceiverServer.run(DataXceiverServer.java:131)
    [junit] 	at java.lang.Thread.run(Thread.java:662)
    [junit] 
    [junit] 12/01/13 10:33:10 INFO datanode.DataNode: Exiting DataXceiveServer
    [junit] 12/01/13 10:33:10 INFO datanode.DataBlockScanner: Exiting DataBlockScanner thread.
    [junit] 12/01/13 10:33:10 INFO datanode.DataNode: Scheduling block blk_4818926255301785799_1102 file build/test/data/dfs/data/data4/current/blk_4818926255301785799 for deletion
    [junit] 12/01/13 10:33:10 INFO datanode.DataNode: Deleted block blk_4818926255301785799_1102 at file build/test/data/dfs/data/data4/current/blk_4818926255301785799
    [junit] 12/01/13 10:33:10 INFO mapred.TaskTracker: Received 'KillJobAction' for job: job_20120113102450403_0012
    [junit] 12/01/13 10:33:10 WARN mapred.TaskTracker: Unknown job job_20120113102450403_0012 being deleted.
    [junit] 12/01/13 10:33:11 INFO datanode.DataNode: DatanodeRegistration(127.0.0.1:45319, storageID=DS-751181898-67.195.138.20-45319-1326450290025, infoPort=48476, ipcPort=58160):Finishing DataNode in: FSDataset{dirpath='<https://builds.apache.org/job/Pig-trunk/ws/trunk/build/test/data/dfs/data/data5/current,/home/jenkins/jenkins-slave/workspace/Pig-trunk/trunk/build/test/data/dfs/data/data6/current'}>
    [junit] 12/01/13 10:33:11 INFO ipc.Server: Stopping server on 58160
    [junit] 12/01/13 10:33:11 INFO metrics.RpcInstrumentation: shut down
    [junit] 12/01/13 10:33:11 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 0
    [junit] 12/01/13 10:33:11 INFO datanode.FSDatasetAsyncDiskService: Shutting down all async disk service threads...
    [junit] 12/01/13 10:33:11 INFO datanode.FSDatasetAsyncDiskService: All async disk service threads have been shut down.
    [junit] 12/01/13 10:33:11 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 0
    [junit] 12/01/13 10:33:11 WARN util.MBeans: Hadoop:service=DataNode,name=FSDatasetState-UndefinedStorageId1554907185
    [junit] javax.management.InstanceNotFoundException: Hadoop:service=DataNode,name=FSDatasetState-UndefinedStorageId1554907185
    [junit] Shutting down DataNode 1
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1094)
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.exclusiveUnregisterMBean(DefaultMBeanServerInterceptor.java:415)
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.unregisterMBean(DefaultMBeanServerInterceptor.java:403)
    [junit] 	at com.sun.jmx.mbeanserver.JmxMBeanServer.unregisterMBean(JmxMBeanServer.java:506)
    [junit] 	at org.apache.hadoop.metrics2.util.MBeans.unregister(MBeans.java:71)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.FSDataset.shutdown(FSDataset.java:1934)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.DataNode.shutdown(DataNode.java:788)
    [junit] 	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdownDataNodes(MiniDFSCluster.java:566)
    [junit] 	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:550)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsClusters(MiniGenericCluster.java:87)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsAndMrClusters(MiniGenericCluster.java:77)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutDown(MiniGenericCluster.java:68)
    [junit] 	at org.apache.pig.test.TestStore.oneTimeTearDown(TestStore.java:128)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
    [junit] 	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
    [junit] 	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
    [junit] 	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:37)
    [junit] 	at org.junit.runners.ParentRunner.run(ParentRunner.java:220)
    [junit] 	at junit.framework.JUnit4TestAdapter.run(JUnit4TestAdapter.java:39)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:420)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:911)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:768)
    [junit] 12/01/13 10:33:11 WARN datanode.FSDatasetAsyncDiskService: AsyncDiskService has already shut down.
    [junit] 12/01/13 10:33:11 INFO mortbay.log: Stopped SelectChannelConnector@localhost:0
    [junit] 12/01/13 10:33:11 INFO ipc.Server: Stopping server on 58031
    [junit] 12/01/13 10:33:11 INFO ipc.Server: IPC Server handler 0 on 58031: exiting
    [junit] 12/01/13 10:33:11 INFO ipc.Server: Stopping IPC Server listener on 58031
    [junit] 12/01/13 10:33:11 INFO ipc.Server: Stopping IPC Server Responder
    [junit] 12/01/13 10:33:11 INFO ipc.Server: IPC Server handler 1 on 58031: exiting
    [junit] 12/01/13 10:33:11 INFO metrics.RpcInstrumentation: shut down
    [junit] 12/01/13 10:33:11 INFO ipc.Server: IPC Server handler 2 on 58031: exiting
    [junit] 12/01/13 10:33:11 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 1
    [junit] 12/01/13 10:33:11 WARN datanode.DataNode: DatanodeRegistration(127.0.0.1:33765, storageID=DS-444219714-67.195.138.20-33765-1326450289700, infoPort=57266, ipcPort=58031):DataXceiveServer:java.nio.channels.AsynchronousCloseException
    [junit] 	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:185)
    [junit] 	at sun.nio.ch.ServerSocketChannelImpl.accept(ServerSocketChannelImpl.java:159)
    [junit] 	at sun.nio.ch.ServerSocketAdaptor.accept(ServerSocketAdaptor.java:84)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.DataXceiverServer.run(DataXceiverServer.java:131)
    [junit] 	at java.lang.Thread.run(Thread.java:662)
    [junit] 
    [junit] 12/01/13 10:33:11 INFO datanode.DataNode: Exiting DataXceiveServer
    [junit] 12/01/13 10:33:11 INFO datanode.DataBlockScanner: Exiting DataBlockScanner thread.
    [junit] 12/01/13 10:33:12 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 0
    [junit] 12/01/13 10:33:12 INFO datanode.DataNode: DatanodeRegistration(127.0.0.1:33765, storageID=DS-444219714-67.195.138.20-33765-1326450289700, infoPort=57266, ipcPort=58031):Finishing DataNode in: FSDataset{dirpath='<https://builds.apache.org/job/Pig-trunk/ws/trunk/build/test/data/dfs/data/data3/current,/home/jenkins/jenkins-slave/workspace/Pig-trunk/trunk/build/test/data/dfs/data/data4/current'}>
    [junit] 12/01/13 10:33:12 INFO ipc.Server: Stopping server on 58031
    [junit] 12/01/13 10:33:12 INFO metrics.RpcInstrumentation: shut down
    [junit] 12/01/13 10:33:12 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 0
    [junit] 12/01/13 10:33:12 INFO datanode.FSDatasetAsyncDiskService: Shutting down all async disk service threads...
    [junit] 12/01/13 10:33:12 INFO datanode.FSDatasetAsyncDiskService: All async disk service threads have been shut down.
    [junit] 12/01/13 10:33:12 WARN util.MBeans: Hadoop:service=DataNode,name=FSDatasetState-UndefinedStorageId705996848
    [junit] Shutting down DataNode 0
    [junit] javax.management.InstanceNotFoundException: Hadoop:service=DataNode,name=FSDatasetState-UndefinedStorageId705996848
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1094)
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.exclusiveUnregisterMBean(DefaultMBeanServerInterceptor.java:415)
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.unregisterMBean(DefaultMBeanServerInterceptor.java:403)
    [junit] 	at com.sun.jmx.mbeanserver.JmxMBeanServer.unregisterMBean(JmxMBeanServer.java:506)
    [junit] 	at org.apache.hadoop.metrics2.util.MBeans.unregister(MBeans.java:71)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.FSDataset.shutdown(FSDataset.java:1934)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.DataNode.shutdown(DataNode.java:788)
    [junit] 	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdownDataNodes(MiniDFSCluster.java:566)
    [junit] 	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:550)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsClusters(MiniGenericCluster.java:87)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsAndMrClusters(MiniGenericCluster.java:77)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutDown(MiniGenericCluster.java:68)
    [junit] 	at org.apache.pig.test.TestStore.oneTimeTearDown(TestStore.java:128)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
    [junit] 	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
    [junit] 	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
    [junit] 	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:37)
    [junit] 	at org.junit.runners.ParentRunner.run(ParentRunner.java:220)
    [junit] 	at junit.framework.JUnit4TestAdapter.run(JUnit4TestAdapter.java:39)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:420)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:911)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:768)
    [junit] 12/01/13 10:33:12 WARN datanode.FSDatasetAsyncDiskService: AsyncDiskService has already shut down.
    [junit] 12/01/13 10:33:12 INFO mortbay.log: Stopped SelectChannelConnector@localhost:0
    [junit] 12/01/13 10:33:12 INFO ipc.Server: Stopping server on 53782
    [junit] 12/01/13 10:33:12 INFO ipc.Server: IPC Server handler 0 on 53782: exiting
    [junit] 12/01/13 10:33:12 INFO ipc.Server: Stopping IPC Server listener on 53782
    [junit] 12/01/13 10:33:12 INFO ipc.Server: IPC Server handler 2 on 53782: exiting
    [junit] 12/01/13 10:33:12 INFO ipc.Server: Stopping IPC Server Responder
    [junit] 12/01/13 10:33:12 INFO ipc.Server: IPC Server handler 1 on 53782: exiting
    [junit] 12/01/13 10:33:12 INFO metrics.RpcInstrumentation: shut down
    [junit] 12/01/13 10:33:12 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 1
    [junit] 12/01/13 10:33:12 WARN datanode.DataNode: DatanodeRegistration(127.0.0.1:48370, storageID=DS-1535564748-67.195.138.20-48370-1326450289352, infoPort=47233, ipcPort=53782):DataXceiveServer:java.nio.channels.AsynchronousCloseException
    [junit] 	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:185)
    [junit] 	at sun.nio.ch.ServerSocketChannelImpl.accept(ServerSocketChannelImpl.java:159)
    [junit] 	at sun.nio.ch.ServerSocketAdaptor.accept(ServerSocketAdaptor.java:84)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.DataXceiverServer.run(DataXceiverServer.java:131)
    [junit] 	at java.lang.Thread.run(Thread.java:662)
    [junit] 
    [junit] 12/01/13 10:33:12 INFO datanode.DataNode: Exiting DataXceiveServer
    [junit] 12/01/13 10:33:12 INFO hdfs.StateChange: BLOCK* ask 127.0.0.1:45319 to delete  blk_4171896279233163244_1102
    [junit] 12/01/13 10:33:12 INFO hdfs.StateChange: BLOCK* ask 127.0.0.1:58176 to delete  blk_5224348605107641995_1095 blk_4171896279233163244_1102 blk_6617870039053190997_1101 blk_4818926255301785799_1102
    [junit] 12/01/13 10:33:12 INFO datanode.DataBlockScanner: Exiting DataBlockScanner thread.
    [junit] 12/01/13 10:33:13 INFO datanode.DataNode: DatanodeRegistration(127.0.0.1:48370, storageID=DS-1535564748-67.195.138.20-48370-1326450289352, infoPort=47233, ipcPort=53782):Finishing DataNode in: FSDataset{dirpath='<https://builds.apache.org/job/Pig-trunk/ws/trunk/build/test/data/dfs/data/data1/current,/home/jenkins/jenkins-slave/workspace/Pig-trunk/trunk/build/test/data/dfs/data/data2/current'}>
    [junit] 12/01/13 10:33:13 WARN util.MBeans: Hadoop:service=DataNode,name=DataNodeInfo
    [junit] javax.management.InstanceNotFoundException: Hadoop:service=DataNode,name=DataNodeInfo
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1094)
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.exclusiveUnregisterMBean(DefaultMBeanServerInterceptor.java:415)
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.unregisterMBean(DefaultMBeanServerInterceptor.java:403)
    [junit] 	at com.sun.jmx.mbeanserver.JmxMBeanServer.unregisterMBean(JmxMBeanServer.java:506)
    [junit] 	at org.apache.hadoop.metrics2.util.MBeans.unregister(MBeans.java:71)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.DataNode.unRegisterMXBean(DataNode.java:513)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.DataNode.shutdown(DataNode.java:726)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.DataNode.run(DataNode.java:1442)
    [junit] 	at java.lang.Thread.run(Thread.java:662)
    [junit] 12/01/13 10:33:13 INFO ipc.Server: Stopping server on 53782
    [junit] 12/01/13 10:33:13 INFO metrics.RpcInstrumentation: shut down
    [junit] 12/01/13 10:33:13 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 0
    [junit] 12/01/13 10:33:13 INFO datanode.FSDatasetAsyncDiskService: Shutting down all async disk service threads...
    [junit] 12/01/13 10:33:13 INFO datanode.FSDatasetAsyncDiskService: All async disk service threads have been shut down.
    [junit] 12/01/13 10:33:13 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 0
    [junit] 12/01/13 10:33:13 WARN util.MBeans: Hadoop:service=DataNode,name=FSDatasetState-UndefinedStorageId-209017410
    [junit] javax.management.InstanceNotFoundException: Hadoop:service=DataNode,name=FSDatasetState-UndefinedStorageId-209017410
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1094)
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.exclusiveUnregisterMBean(DefaultMBeanServerInterceptor.java:415)
    [junit] 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.unregisterMBean(DefaultMBeanServerInterceptor.java:403)
    [junit] 	at com.sun.jmx.mbeanserver.JmxMBeanServer.unregisterMBean(JmxMBeanServer.java:506)
    [junit] 	at org.apache.hadoop.metrics2.util.MBeans.unregister(MBeans.java:71)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.FSDataset.shutdown(FSDataset.java:1934)
    [junit] 	at org.apache.hadoop.hdfs.server.datanode.DataNode.shutdown(DataNode.java:788)
    [junit] 	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdownDataNodes(MiniDFSCluster.java:566)
    [junit] 	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:550)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsClusters(MiniGenericCluster.java:87)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsAndMrClusters(MiniGenericCluster.java:77)
    [junit] 	at org.apache.pig.test.MiniGenericCluster.shutDown(MiniGenericCluster.java:68)
    [junit] 	at org.apache.pig.test.TestStore.oneTimeTearDown(TestStore.java:128)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
    [junit] 	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
    [junit] 	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
    [junit] 	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:37)
    [junit] 	at org.junit.runners.ParentRunner.run(ParentRunner.java:220)
    [junit] 	at junit.framework.JUnit4TestAdapter.run(JUnit4TestAdapter.java:39)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:420)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:911)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:768)
    [junit] 12/01/13 10:33:13 WARN datanode.FSDatasetAsyncDiskService: AsyncDiskService has already shut down.
    [junit] 12/01/13 10:33:13 INFO mortbay.log: Stopped SelectChannelConnector@localhost:0
    [junit] 12/01/13 10:33:13 WARN namenode.FSNamesystem: ReplicationMonitor thread received InterruptedException.java.lang.InterruptedException: sleep interrupted
    [junit] 12/01/13 10:33:13 INFO namenode.DecommissionManager: Interrupted Monitor
    [junit] java.lang.InterruptedException: sleep interrupted
    [junit] 	at java.lang.Thread.sleep(Native Method)
    [junit] 	at org.apache.hadoop.hdfs.server.namenode.DecommissionManager$Monitor.run(DecommissionManager.java:65)
    [junit] 	at java.lang.Thread.run(Thread.java:662)
    [junit] 12/01/13 10:33:13 INFO namenode.FSNamesystem: Number of transactions: 502 Total time for transactions(ms): 9Number of transactions batched in Syncs: 157 Number of syncs: 348 SyncTimes(ms): 3628 406 
    [junit] 12/01/13 10:33:13 INFO ipc.Server: Stopping server on 33935
    [junit] 12/01/13 10:33:13 INFO ipc.Server: IPC Server handler 0 on 33935: exiting
    [junit] 12/01/13 10:33:13 INFO ipc.Server: IPC Server handler 1 on 33935: exiting
    [junit] 12/01/13 10:33:13 INFO ipc.Server: IPC Server handler 3 on 33935: exiting
    [junit] 12/01/13 10:33:13 INFO ipc.Server: IPC Server handler 4 on 33935: exiting
    [junit] 12/01/13 10:33:13 INFO ipc.Server: IPC Server handler 2 on 33935: exiting
    [junit] 12/01/13 10:33:13 INFO ipc.Server: IPC Server handler 5 on 33935: exiting
    [junit] 12/01/13 10:33:13 INFO ipc.Server: IPC Server handler 6 on 33935: exiting
    [junit] 12/01/13 10:33:13 INFO ipc.Server: IPC Server handler 7 on 33935: exiting
    [junit] 12/01/13 10:33:13 INFO ipc.Server: IPC Server handler 8 on 33935: exiting
    [junit] 12/01/13 10:33:13 INFO ipc.Server: IPC Server handler 9 on 33935: exiting
    [junit] 12/01/13 10:33:13 INFO ipc.Server: Stopping IPC Server listener on 33935
    [junit] 12/01/13 10:33:13 INFO metrics.RpcInstrumentation: shut down
    [junit] 12/01/13 10:33:13 INFO ipc.Server: Stopping IPC Server Responder
    [junit] Tests run: 17, Failures: 3, Errors: 3, Time elapsed: 498.066 sec
    [junit] Test org.apache.pig.test.TestStore FAILED
    [junit] Running org.apache.pig.test.TestStringUDFs
    [junit] 12/01/13 10:33:14 WARN builtin.SUBSTRING: No logger object provided to UDF: org.apache.pig.builtin.SUBSTRING. java.lang.NullPointerException
    [junit] 12/01/13 10:33:14 WARN builtin.SUBSTRING: No logger object provided to UDF: org.apache.pig.builtin.SUBSTRING. java.lang.StringIndexOutOfBoundsException: String index out of range: -2
    [junit] 12/01/13 10:33:14 WARN builtin.SUBSTRING: No logger object provided to UDF: org.apache.pig.builtin.SUBSTRING. java.lang.StringIndexOutOfBoundsException: String index out of range: -1
    [junit] 12/01/13 10:33:14 WARN builtin.SUBSTRING: No logger object provided to UDF: org.apache.pig.builtin.SUBSTRING. java.lang.StringIndexOutOfBoundsException: String index out of range: -8
    [junit] 12/01/13 10:33:14 WARN builtin.SUBSTRING: No logger object provided to UDF: org.apache.pig.builtin.SUBSTRING. java.lang.StringIndexOutOfBoundsException: String index out of range: -2
    [junit] 12/01/13 10:33:14 WARN builtin.INDEXOF: No logger object provided to UDF: org.apache.pig.builtin.INDEXOF. Failed to process input; error - null
    [junit] 12/01/13 10:33:14 WARN builtin.LAST_INDEX_OF: No logger object provided to UDF: org.apache.pig.builtin.LAST_INDEX_OF. Failed to process input; error - null
    [junit] Tests run: 11, Failures: 0, Errors: 0, Time elapsed: 0.101 sec
   [delete] Deleting directory /tmp/pig_junit_tmp573304202

BUILD FAILED
<https://builds.apache.org/job/Pig-trunk/ws/trunk/build.xml>:774: The following error occurred while executing this line:
<https://builds.apache.org/job/Pig-trunk/ws/trunk/build.xml>:831: Tests failed!

Total time: 22 minutes 1 second
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Recording test results
Publishing Javadoc
Archiving artifacts
Recording fingerprints
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure