Posted to dev@geode.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2016/06/10 13:36:08 UTC
Build failed in Jenkins: Geode-spark-connector #25
See <https://builds.apache.org/job/Geode-spark-connector/25/changes>
Changes:
[klund] GEODE-837: update tests from JUnit3 to JUnit4
[klund] GEODE-837: delete temporary script
[klund] GEODE-837: update eclipse formatter to match intellij formatter
[ukohlmeyer] GEODE-1377: Refactoring as per review comments
[ukohlmeyer] GEODE-1377: Updating JavaDocs to point to the correct property
[ukohlmeyer] GEODE-1377: Renaming of DistributedSystemConfigProperties to
[klund] GEODE-837: add JUnit4 category
[klund] GEODE-1516: update Eclipse and IntelliJ handling of imports
[klund] GEODE-1416: rename profiles to be Apache Geode
[upthewaterspout] Adding etc/eclipseOrganizeImports.importorder to rat excludes
------------------------------------------
[...truncated 1557 lines...]
[info] Resolving org.apache.hadoop#hadoop-yarn-server-nodemanager;2.2.0 ...
[info] Resolving org.apache.hadoop#hadoop-yarn-server;2.2.0 ...
[info] Resolving org.apache.spark#spark-sql_2.10;1.3.0 ...
[info] Resolving org.apache.spark#spark-catalyst_2.10;1.3.0 ...
[info] Resolving org.scala-lang#scala-compiler;2.10.4 ...
[info] Resolving org.scalamacros#quasiquotes_2.10;2.0.1 ...
[info] Resolving com.twitter#parquet-column;1.6.0rc3 ...
[info] Resolving com.twitter#parquet-common;1.6.0rc3 ...
[info] Resolving com.twitter#parquet-encoding;1.6.0rc3 ...
[info] Resolving com.twitter#parquet-generator;1.6.0rc3 ...
[info] Resolving commons-codec#commons-codec;1.5 ...
[info] Resolving com.twitter#parquet-hadoop;1.6.0rc3 ...
[info] Resolving com.twitter#parquet-format;2.2.0-rc1 ...
[info] Resolving com.twitter#parquet-jackson;1.6.0rc3 ...
[info] Resolving org.codehaus.jackson#jackson-mapper-asl;1.9.11 ...
[info] Resolving org.codehaus.jackson#jackson-core-asl;1.9.11 ...
[info] Resolving org.jodd#jodd-core;3.6.3 ...
[info] Resolving commons-net#commons-net;3.1 ...
[info] Resolving org.scoverage#scalac-scoverage-runtime_2.10;1.0.4 ...
[info] Resolving org.scoverage#scalac-scoverage-plugin_2.10;1.0.4 ...
[info] Resolving org.scala-lang#jline;2.10.4 ...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[warn] there were 7 feature warning(s); re-run with -feature for details
[warn] one warning found
[warn] Note: Some input files use unchecked or unsafe operations.
[warn] Note: Recompile with -Xlint:unchecked for details.
[info] Packaging /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/scala-2.10/geode-spark-connector_2.10-0.5.0.jar ...
[info] Compiling 1 Scala source and 5 Java sources to /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-demos/basic-demos/target/scala-2.10/classes...
[info] Done packaging.
[info] Packaging /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-demos/basic-demos/target/scala-2.10/basic-demos_2.10-0.5.0.jar ...
[info] Done packaging.
[success] Total time: 67 s, completed Jun 10, 2016 1:35:28 PM
[info] Compiling 11 Scala sources and 1 Java source to /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/scala-2.10/test-classes...
[warn] there were 2 feature warning(s); re-run with -feature for details
[warn] one warning found
[info 2016/06/10 13:35:53.015 UTC <pool-7-thread-13-ScalaTest-running-StructStreamingResultSenderAndCollectorTest> tid=0x107] StructStreamingResultSender: 10 rows, type=(java.lang.Integer, java.lang.String), type.size=159, data.size=151, row.avg.size=15.1
[info 2016/06/10 13:35:53.757 UTC <pool-7-thread-13-ScalaTest-running-StructStreamingResultSenderAndCollectorTest> tid=0x107] StructStreamingResultSender: 10 rows, type=(java.lang.Integer, java.lang.String), type.size=0, data.size=151, row.avg.size=15.1
[info 2016/06/10 13:35:53.761 UTC <pool-7-thread-13-ScalaTest-running-StructStreamingResultSenderAndCollectorTest> tid=0x107] StructStreamingResultSender: 0 rows, type=null, type.size=0, data.size=0, row.avg.size=NaN
[info 2016/06/10 13:35:53.791 UTC <pool-7-thread-13-ScalaTest-running-StructStreamingResultSenderAndCollectorTest> tid=0x107] StructStreamingResultSender: 10000 rows, type=(java.lang.Integer), type.size=131, data.size=60015, row.avg.size=6.0
[info 2016/06/10 13:35:53.862 UTC <ForkJoinPool-1-worker-43> tid=0x116] sender2: 150 rows, type=(java.lang.Integer), type.size=131, data.size=901, row.avg.size=6.0
[info 2016/06/10 13:35:53.863 UTC <ForkJoinPool-1-worker-29> tid=0x115] sender1: 150 rows, type=(java.lang.Integer), type.size=131, data.size=901, row.avg.size=6.0
[info 2016/06/10 13:35:53.879 UTC <ForkJoinPool-1-worker-43> tid=0x116] sender1: 500 rows, type=(java.lang.Integer), type.size=131, data.size=3001, row.avg.size=6.0
[info 2016/06/10 13:35:53.927 UTC <pool-7-thread-13-ScalaTest-running-StructStreamingResultSenderAndCollectorTest> tid=0x107] StructStreamingResultSender: 1000 rows, type=(java.lang.String, java.lang.String), type.size=159, data.size=23792, row.avg.size=23.8
[info] StructStreamingResultSenderAndCollectorTest:
[info] - transfer simple data
[info] - transfer simple data with no type info
[info] - transfer data with 0 row
[info] - transfer data with 10K rows
[info] - transfer data with 10K rows with 2 sender
[info] - transfer data with 10K rows with 2 sender with error
[info] - transfer data with Exception
[info] - transfer string pair data with 200 rows
[info] - DataSerializer usage
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/jenkins/.ivy2/cache/org.apache.logging.log4j/log4j-slf4j-impl/jars/log4j-slf4j-impl-2.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/jenkins/.m2/repository/org/slf4j/slf4j-log4j12/1.7.10/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
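The SLF4J warning above is harmless for this run but shows two bindings on the test classpath: `log4j-slf4j-impl-2.5` (from the Ivy cache) and `slf4j-log4j12-1.7.10` (from the local Maven repository), with SLF4J arbitrarily choosing `Log4jLoggerFactory`. A minimal sbt sketch of the usual remedy is to exclude the unwanted binding from whichever dependency drags it in transitively; `"some.group" % "lib-pulling-log4j12" % "1.0"` below is a placeholder, not an actual dependency of this project, and the real culprit should first be identified (e.g. with the sbt-dependency-graph plugin's `dependencyTree` task):

```scala
// build.sbt fragment (sketch): keep log4j-slf4j-impl as the sole SLF4J
// binding by excluding slf4j-log4j12 from the transitive dependency
// that introduces it.
libraryDependencies += ("some.group" % "lib-pulling-log4j12" % "1.0")
  .exclude("org.slf4j", "slf4j-log4j12")
```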
[info 2016/06/10 13:36:00.506 UTC <pool-7-thread-13-ScalaTest-running-GeodeRDDFunctionsTest> tid=0x107] Save RDD id=0 to region test
[info 2016/06/10 13:36:00.558 UTC <pool-7-thread-13-ScalaTest-running-GeodeRDDFunctionsTest> tid=0x107] Save RDD id=0 to region test
[info 2016/06/10 13:36:00.603 UTC <pool-7-thread-13-ScalaTest-running-GeodeRDDFunctionsTest> tid=0x107] Save RDD id=0 to region test
[info 2016/06/10 13:36:00.642 UTC <pool-7-thread-13-ScalaTest-running-GeodeRDDFunctionsTest> tid=0x107] Save RDD id=0 to region test
[info] GeodeRDDFunctionsTest:
[info] - test PairRDDFunction Implicit
[info] - test RDDFunction Implicit
[info] - test GeodePairRDDWriter
[info] - test GeodeNonPairRDDWriter
[info] - test PairRDDFunctions.saveToGeode
[info] - test PairRDDFunctions.saveToGeode w/ opConf
[info] - test RDDFunctions.saveToGeode
[info] - test RDDFunctions.saveToGeode w/ opConf
r.type=java.lang.String r=List(/obj_obj_region)
[info] QueryParserTest:
[info] - select * from /r1
[info] - select c2 from /r1
[info] - select key, value from /r1.entries
[info] - select c1, c2 from /r1 where col1 > 100 and col2 <= 120 or c3 = 2
[info] - select * from /r1/r2 where c1 >= 200
[info] - import io.pivotal select c1, c2, c3 from /r1/r2, /r3/r4 where c1 <= 15 and c2 = 100
[info] - SELECT distinct f1, f2 FROM /r1/r2 WHere f = 100
[info] - IMPORT io.pivotal.geode IMPORT com.mypackage SELECT key,value FROM /root/sub.entries WHERE status = 'active' ORDER BY id desc
[info] - select distinct p.ID, p.status from /region p where p.ID > 5 order by p.status
[info] - SELECT DISTINCT * FROM /QueryRegion1 r1, /QueryRegion2 r2 WHERE r1.ID = r2.ID
[info] - SELECT id, "type", positions, status FROM /obj_obj_region WHERE status = 'active'
[info] - SELECT r.id, r."type", r.positions, r.status FROM /obj_obj_region r, r.positions.values f WHERE r.status = 'active' and f.secId = 'MSFT'
[info] GeodeConnectionConfTest:
[info] - apply(SparkConf) w/ GeodeLocator property and empty geodeProps
[info] - apply(SparkConf) w/ GeodeLocator property and geode properties
[info] - apply(SparkConf) w/o GeodeLocator property
[info] - apply(SparkConf) w/ invalid GeodeLocator property
[info] - apply(locatorStr, geodeProps) w/ valid locatorStr and non geodeProps
[info] - apply(locatorStr, geodeProps) w/ valid locatorStr and non-empty geodeProps
[info] - apply(locatorStr, geodeProps) w/ invalid locatorStr
[info] - constructor w/ empty (host,port) pairs
[info] - getConnection() normal
[info] - getConnection() failure
[info] DefaultGeodeConnectionManagerTest:
[info] - DefaultGeodeConnectionFactory get/closeConnection
[info] - DefaultGeodeConnectionFactory newConnection(...) throws RuntimeException
[info] - DefaultGeodeConnectionFactory close() w/ non-exist connection
[warn 2016/06/10 13:36:01.883 UTC <pool-7-thread-13-ScalaTest-running-GeodeRDDPartitionerTest> tid=0x107] Invalid preferred partitioner name dummy.
[info] GeodeRDDPartitionerTest:
[info] - default partitioned region partitioner
[info] - default replicated region partitioner
[info] - GeodeRDDPartitioner.apply method
[info] - OnePartitionPartitioner
[info] - ServerSplitsPartitioner.doPartitions(): n=1 & no empty bucket
[info] - ServerSplitsPartitioner.doPartitions(): n=1 & 1 empty bucket
[info] - ServerSplitsPartitioner.doPartitions(): n=1 & 2 empty bucket
[info] - ServerSplitsPartitioner.doPartitions(): n=1 & 5 empty bucket
[info] - ServerSplitsPartitioner.doPartitions(): n=1, 4 empty-bucket, non-continuous IDs
[info] - ServerSplitsPartitioner.doPartitions(): n=2, no empty buckets, 3 servers have 1, 2, and 3 buckets
[info] - ServerSplitsPartitioner.doPartitions(): n=3, no empty buckets, 4 servers have 0, 2, 3, and 4 buckets
[info] - ServerSplitsPartitioner.partitions(): metadata = None
[info] - ServerSplitsPartitioner.partitions(): replicated region
[info] - ServerSplitsPartitioner.partitions(): partitioned region w/o data
[info] - ServerSplitsPartitioner.partitions(): partitioned region w/ some data
[info 2016/06/10 13:36:02.002 UTC <pool-7-thread-13-ScalaTest-running-GeodeFunctionDeployerTest> tid=0x107] Invalid jar file:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/file:/somemissingjarfilethatdoesnot.exist
[info 2016/06/10 13:36:02.004 UTC <pool-7-thread-13-ScalaTest-running-GeodeFunctionDeployerTest> tid=0x107] Invalid jar file:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/file:/somemissingjarfilethatdoesnot.exist
[info 2016/06/10 13:36:02.005 UTC <pool-7-thread-13-ScalaTest-running-GeodeFunctionDeployerTest> tid=0x107] Invalid jar file:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/file:/somemissingjarfilethatdoesnot.exist
[info] GeodeFunctionDeployerTest:
[info] - jmx url creation
[info] - missing jar file
[info] - deploy with missing jar
[info] - successful mocked deploy
host name: hemera
canonical host name: hemera.apache.org
canonical host name 2: hemera.apache.org
[info] LocatorHelperTest:
[info] - locatorStr2HostPortPair hostname w/o domain
[info] - locatorStr2HostPortPair hostname w/ domain
[info] - locatorStr2HostPortPair w/ invalid host name
[info] - locatorStr2HostPortPair w/ valid port
[info] - locatorStr2HostPortPair w/ invalid port
[info] - parseLocatorsString with valid locator(s)
[info] - parseLocatorsString with invalid locator(s)
[info] - pickPreferredGeodeServers: shared servers and one gf-server per host
[info] - pickPreferredGeodeServers: shared servers, one gf-server per host, un-sorted list
[info] - pickPreferredGeodeServers: shared servers and two gf-server per host
[info] - pickPreferredGeodeServers: shared servers, two gf-server per host, un-sorted server list
[info] - pickPreferredGeodeServers: no shared servers and one gf-server per host
[info] - pickPreferredGeodeServers: no shared servers, one gf-server per host, and less gf-server
[info] - pickPreferredGeodeServers: ad-hoc
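The LocatorHelperTest names above outline the locator-string contract the connector relies on: each locator is written in Geode's `host[port]` form, and `parseLocatorsString` accepts a comma-separated list of such entries, rejecting malformed hosts and out-of-range ports. A minimal standalone sketch of that parsing, assuming `host[port]` is the accepted syntax; the names `hostPortPair` and `parseLocators` are illustrative, not the connector's actual API:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Optional;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class LocatorParsing {
    // Geode-style locator entry: hostname (letters, digits, dots, hyphens)
    // followed by a bracketed port number.
    private static final Pattern LOCATOR =
        Pattern.compile("([\\w.-]+)\\[(\\d{1,5})\\]");

    /** Parse one "host[port]" entry; empty if malformed or port > 65535. */
    static Optional<String[]> hostPortPair(String s) {
        Matcher m = LOCATOR.matcher(s.trim());
        if (!m.matches()) return Optional.empty();
        int port = Integer.parseInt(m.group(2));
        if (port > 65535) return Optional.empty();
        return Optional.of(new String[] { m.group(1), m.group(2) });
    }

    /** Parse "host1[port1],host2[port2],...", keeping only valid entries. */
    static List<String[]> parseLocators(String s) {
        List<String[]> out = new ArrayList<>();
        for (String part : s.split(",")) {
            hostPortPair(part).ifPresent(out::add);
        }
        return out;
    }
}
```

For example, `parseLocators("hemera.apache.org[10334]")` yields a single host/port pair, while an entry with a five-digit port above 65535 is dropped.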
[info] ConnectorImplicitsTest:
[info] - implicit map2Properties
[info] - Test Implicit SparkContext Conversion
[info] - Test Implicit SQLContext Conversion
[info 2016/06/10 13:36:02.355 UTC <pool-7-thread-13-ScalaTest-running-GeodeRegionRDDTest> tid=0x107] RDD id=0 region=test conn=, env=Map()
[info 2016/06/10 13:36:02.376 UTC <pool-7-thread-13-ScalaTest-running-GeodeRegionRDDTest> tid=0x107] RDD id=0 region=test conn=, env=Map(preferred.partitioner -> OnePartition)
[info 2016/06/10 13:36:02.396 UTC <pool-7-thread-13-ScalaTest-running-GeodeRegionRDDTest> tid=0x107] RDD id=0 region=test conn=, env=Map()
[info 2016/06/10 13:36:02.435 UTC <pool-7-thread-13-ScalaTest-running-GeodeRegionRDDTest> tid=0x107] RDD id=0 region=test conn=, env=Map()
[info] GeodeRegionRDDTest:
[info] - create GeodeRDD with non-existing region
[info] - getPartitions with non-existing region
[info] - getPartitions with replicated region and not preferred env
[info] - getPartitions with replicated region and preferred OnePartitionPartitioner
[info] - getPartitions with partitioned region and not preferred env
[info] - GeodeRDD.compute() method
[info 2016/06/10 13:36:02.708 UTC <pool-7-thread-13> tid=0x107] Save DStream region=testregion conn=
[info 2016/06/10 13:36:02.798 UTC <pool-7-thread-13> tid=0x107] Save RDD id=0 to region testregion
[info 2016/06/10 13:36:02.839 UTC <pool-7-thread-13> tid=0x107] Save DStream region=testregion conn=
[info 2016/06/10 13:36:02.909 UTC <pool-7-thread-13> tid=0x107] Save RDD id=0 to region testregion
[info] JavaAPITest:
[info] - testSparkContextFunction
[info] - testJavaPairDStreamFunctions
[info] - testSQLContextFunction
[info] - testJavaSparkContextFunctions
[info] - testJavaRDDFunctions
[info] - testJavaDStreamFunctions
[info] - testJavaPairDStreamFunctionsWithTuple2DStream
[info] - testJavaPairRDDFunctions
[info 2016/06/10 13:36:02.949 UTC <pool-7-thread-13-ScalaTest-running-GeodeDStreamFunctionsTest> tid=0x107] Save DStream region=test conn=
[info 2016/06/10 13:36:02.965 UTC <pool-7-thread-13-ScalaTest-running-GeodeDStreamFunctionsTest> tid=0x107] Save DStream region=test conn=
[info] GeodeDStreamFunctionsTest:
[info] - test GeodePairDStreamFunctions Implicit
[info] - test GeodeDStreamFunctions Implicit
[info] - test GeodePairDStreamFunctions.saveToGeode()
[info] - test GeodeDStreamFunctions.saveToGeode()
[info] ScalaTest
[info] Run completed in 19 seconds, 652 milliseconds.
[info] Total number of tests run: 96
[info] Suites: completed 12, aborted 0
[info] Tests: succeeded 96, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[info] Passed: Total 96, Failed 0, Errors 0, Passed 96
[success] Total time: 34 s, completed Jun 10, 2016 1:36:03 PM
[info] Compiling 9 Scala sources and 4 Java sources to /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/scala-2.10/it-classes...
[error] /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/src/it/java/ittest/io/pivotal/geode/spark/connector/JavaApiIntegrationTest.java:20: error: cannot find symbol
[error] import com.gemstone.gemfire.distributed.DistributedSystemConfigProperties;
[error]                                         ^
[error]   symbol:   class DistributedSystemConfigProperties
[error]   location: package com.gemstone.gemfire.distributed
[error] 1 error
[error] (geode-spark-connector/it:compile) javac returned nonzero exit code
[error] Total time: 11 s, completed Jun 10, 2016 1:36:13 PM
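The failing import lines up with the GEODE-1377 commits listed at the top of this build, which renamed `DistributedSystemConfigProperties` (the commit message is truncated before the new name); the integration test under `src/it` was evidently not updated in the same change, and `it:compile` only runs on this CI job. The shape of the fix would be a one-line import change in `JavaApiIntegrationTest.java`; the replacement identifier below is a placeholder, since the new class name is not recoverable from this log:

```
--- a/geode-spark-connector/src/it/java/ittest/io/pivotal/geode/spark/connector/JavaApiIntegrationTest.java
+++ b/geode-spark-connector/src/it/java/ittest/io/pivotal/geode/spark/connector/JavaApiIntegrationTest.java
-import com.gemstone.gemfire.distributed.DistributedSystemConfigProperties;
+import com.gemstone.gemfire.distributed.<renamed class from GEODE-1377>;
```

Any references to the class's constants elsewhere in the integration tests would need the same rename.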
Build step 'Execute shell' marked build as failure
Recording test results
Skipped archiving because build is not successful
Jenkins build is back to normal : Geode-spark-connector #28
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Geode-spark-connector/28/changes>
Build failed in Jenkins: Geode-spark-connector #27
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Geode-spark-connector/27/>
------------------------------------------
[...truncated 474 lines...]
Download https://repo1.maven.org/maven2/org/hibernate/hibernate-annotations/3.5.5-Final/hibernate-annotations-3.5.5-Final.pom
Download https://repo1.maven.org/maven2/org/hibernate/hibernate-parent/3.5.5-Final/hibernate-parent-3.5.5-Final.pom
Download https://repo1.maven.org/maven2/org/eclipse/persistence/javax.persistence/2.0.0/javax.persistence-2.0.0.pom
Download https://repo1.maven.org/maven2/org/hibernate/hibernate-core/3.5.5-Final/hibernate-core-3.5.5-Final.pom
Download https://repo1.maven.org/maven2/org/hibernate/hibernate-annotations/3.5.5-Final/hibernate-annotations-3.5.5-Final.jar
Download https://repo1.maven.org/maven2/org/eclipse/persistence/javax.persistence/2.0.0/javax.persistence-2.0.0.jar
Download https://repo1.maven.org/maven2/org/hibernate/hibernate-core/3.5.5-Final/hibernate-core-3.5.5-Final.jar
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
:extensions/geode-modules-hibernate:processResources UP-TO-DATE
:extensions/geode-modules-hibernate:classes
:extensions/geode-modules-hibernate:jar
:extensions/geode-modules-hibernate:javadoc
:extensions/geode-modules-hibernate:javadocJar
:extensions/geode-modules-hibernate:sourcesJar
:extensions/geode-modules-hibernate:signArchives SKIPPED
:extensions/geode-modules-hibernate:assemble
:extensions/geode-modules-assembly:distHibernate
:extensions/geode-modules:javadocJar
:extensions/geode-modules:sourcesJar
:extensions/geode-modules:signArchives SKIPPED
:extensions/geode-modules:assemble
:extensions/geode-modules-tomcat7:compileJava
Download https://repo1.maven.org/maven2/org/apache/tomcat/tomcat-catalina/7.0.30/tomcat-catalina-7.0.30.pom
Download https://repo1.maven.org/maven2/org/apache/tomcat/tomcat-coyote/7.0.30/tomcat-coyote-7.0.30.pom
Download https://repo1.maven.org/maven2/org/apache/tomcat/tomcat-juli/7.0.30/tomcat-juli-7.0.30.pom
Download https://repo1.maven.org/maven2/org/apache/tomcat/tomcat-api/7.0.30/tomcat-api-7.0.30.pom
Download https://repo1.maven.org/maven2/org/apache/tomcat/tomcat-util/7.0.30/tomcat-util-7.0.30.pom
Download https://repo1.maven.org/maven2/org/apache/tomcat/tomcat-catalina/7.0.30/tomcat-catalina-7.0.30.jar
Download https://repo1.maven.org/maven2/org/apache/tomcat/tomcat-coyote/7.0.30/tomcat-coyote-7.0.30.jar
Download https://repo1.maven.org/maven2/org/apache/tomcat/tomcat-juli/7.0.30/tomcat-juli-7.0.30.jar
Download https://repo1.maven.org/maven2/org/apache/tomcat/tomcat-api/7.0.30/tomcat-api-7.0.30.jar
Download https://repo1.maven.org/maven2/org/apache/tomcat/tomcat-util/7.0.30/tomcat-util-7.0.30.jar
:extensions/geode-modules-tomcat7:processResources UP-TO-DATE
:extensions/geode-modules-tomcat7:classes
:extensions/geode-modules-tomcat7:jar
:extensions/geode-modules-tomcat7:javadoc
:extensions/geode-modules-tomcat7:javadocJar
:extensions/geode-modules-tomcat7:sourcesJar
:extensions/geode-modules-tomcat7:signArchives SKIPPED
:extensions/geode-modules-tomcat7:assemble
:extensions/geode-modules-assembly:distTcServer
:extensions/geode-modules-assembly:distTcServer30
:extensions/geode-modules-assembly:distTomcat
:extensions/geode-modules-assembly:dist
:geode-assembly:compileJava UP-TO-DATE
:geode-assembly:processResources UP-TO-DATE
:geode-assembly:classes UP-TO-DATE
:geode-assembly:defaultCacheConfig
:geode-assembly:defaultDistributionConfig
:geode-assembly:depsJar
Download https://repo1.maven.org/maven2/org/apache/lucene/lucene-analyzers-common/6.0.0/lucene-analyzers-common-6.0.0.pom
Download https://repo1.maven.org/maven2/org/apache/lucene/lucene-parent/6.0.0/lucene-parent-6.0.0.pom
Download https://repo1.maven.org/maven2/org/apache/lucene/lucene-solr-grandparent/6.0.0/lucene-solr-grandparent-6.0.0.pom
Download https://repo1.maven.org/maven2/org/apache/lucene/lucene-core/6.0.0/lucene-core-6.0.0.pom
Download https://repo1.maven.org/maven2/org/apache/lucene/lucene-queries/6.0.0/lucene-queries-6.0.0.pom
Download https://repo1.maven.org/maven2/org/apache/lucene/lucene-queryparser/6.0.0/lucene-queryparser-6.0.0.pom
Download https://repo1.maven.org/maven2/org/apache/lucene/lucene-analyzers-common/6.0.0/lucene-analyzers-common-6.0.0.jar
Download https://repo1.maven.org/maven2/org/apache/lucene/lucene-core/6.0.0/lucene-core-6.0.0.jar
Download https://repo1.maven.org/maven2/org/apache/lucene/lucene-queries/6.0.0/lucene-queries-6.0.0.jar
Download https://repo1.maven.org/maven2/org/apache/lucene/lucene-queryparser/6.0.0/lucene-queryparser-6.0.0.jar
Download https://repo1.maven.org/maven2/org/springframework/spring-aspects/4.2.4.RELEASE/spring-aspects-4.2.4.RELEASE.pom
Download https://repo1.maven.org/maven2/org/springframework/spring-oxm/4.2.4.RELEASE/spring-oxm-4.2.4.RELEASE.pom
Download https://repo1.maven.org/maven2/commons-fileupload/commons-fileupload/1.3.1/commons-fileupload-1.3.1.pom
Download https://repo1.maven.org/maven2/org/springframework/spring-aop/4.2.4.RELEASE/spring-aop-4.2.4.RELEASE.pom
Download https://repo1.maven.org/maven2/org/springframework/spring-aspects/4.2.4.RELEASE/spring-aspects-4.2.4.RELEASE.jar
Download https://repo1.maven.org/maven2/org/springframework/spring-oxm/4.2.4.RELEASE/spring-oxm-4.2.4.RELEASE.jar
Download https://repo1.maven.org/maven2/commons-fileupload/commons-fileupload/1.3.1/commons-fileupload-1.3.1.jar
Download https://repo1.maven.org/maven2/org/springframework/spring-aop/4.2.4.RELEASE/spring-aop-4.2.4.RELEASE.jar
Download https://repo1.maven.org/maven2/com/fasterxml/classmate/0.9.0/classmate-0.9.0.pom
Download https://repo1.maven.org/maven2/com/fasterxml/jackson/module/jackson-module-scala_2.10/2.1.5/jackson-module-scala_2.10-2.1.5.pom
Download https://repo1.maven.org/maven2/com/mangofactory/swagger-springmvc/0.8.2/swagger-springmvc-0.8.2.pom
Download https://repo1.maven.org/maven2/com/wordnik/swagger-annotations/1.3.2/swagger-annotations-1.3.2.pom
Download https://repo1.maven.org/maven2/com/wordnik/swagger-project_2.10/1.3.2/swagger-project_2.10-1.3.2.pom
Download https://repo1.maven.org/maven2/org/json4s/json4s-ast_2.10/3.2.4/json4s-ast_2.10-3.2.4.pom
Download https://repo1.maven.org/maven2/org/springframework/hateoas/spring-hateoas/0.16.0.RELEASE/spring-hateoas-0.16.0.RELEASE.pom
Download https://repo1.maven.org/maven2/com/wordnik/swagger-core_2.10/1.3.2/swagger-core_2.10-1.3.2.pom
Download https://repo1.maven.org/maven2/org/springframework/spring-aop/3.2.9.RELEASE/spring-aop-3.2.9.RELEASE.pom
Download https://repo1.maven.org/maven2/org/springframework/spring-beans/3.2.9.RELEASE/spring-beans-3.2.9.RELEASE.pom
Download https://repo1.maven.org/maven2/org/json4s/json4s-ext_2.10/3.2.4/json4s-ext_2.10-3.2.4.pom
Download https://repo1.maven.org/maven2/org/json4s/json4s-native_2.10/3.2.4/json4s-native_2.10-3.2.4.pom
Download https://repo1.maven.org/maven2/org/json4s/json4s-jackson_2.10/3.2.4/json4s-jackson_2.10-3.2.4.pom
Download https://repo1.maven.org/maven2/org/json4s/json4s-core_2.10/3.2.4/json4s-core_2.10-3.2.4.pom
Download https://repo1.maven.org/maven2/com/thoughtworks/paranamer/paranamer/2.5.2/paranamer-2.5.2.pom
Download https://repo1.maven.org/maven2/com/thoughtworks/paranamer/paranamer-parent/2.5.2/paranamer-parent-2.5.2.pom
Download https://repo1.maven.org/maven2/com/fasterxml/classmate/0.9.0/classmate-0.9.0.jar
Download https://repo1.maven.org/maven2/com/fasterxml/jackson/module/jackson-module-scala_2.10/2.1.5/jackson-module-scala_2.10-2.1.5.jar
Download https://repo1.maven.org/maven2/com/mangofactory/swagger-springmvc/0.8.2/swagger-springmvc-0.8.2.jar
Download https://repo1.maven.org/maven2/com/wordnik/swagger-annotations/1.3.2/swagger-annotations-1.3.2.jar
Download https://repo1.maven.org/maven2/org/json4s/json4s-ast_2.10/3.2.4/json4s-ast_2.10-3.2.4.jar
Download https://repo1.maven.org/maven2/org/scala-lang/scala-library/2.10.0/scala-library-2.10.0.jar
Download https://repo1.maven.org/maven2/org/scala-lang/scala-reflect/2.10.0/scala-reflect-2.10.0.jar
Download https://repo1.maven.org/maven2/org/springframework/hateoas/spring-hateoas/0.16.0.RELEASE/spring-hateoas-0.16.0.RELEASE.jar
Download https://repo1.maven.org/maven2/com/wordnik/swagger-core_2.10/1.3.2/swagger-core_2.10-1.3.2.jar
Download https://repo1.maven.org/maven2/org/json4s/json4s-ext_2.10/3.2.4/json4s-ext_2.10-3.2.4.jar
Download https://repo1.maven.org/maven2/org/json4s/json4s-native_2.10/3.2.4/json4s-native_2.10-3.2.4.jar
Download https://repo1.maven.org/maven2/org/json4s/json4s-jackson_2.10/3.2.4/json4s-jackson_2.10-3.2.4.jar
Download https://repo1.maven.org/maven2/org/json4s/json4s-core_2.10/3.2.4/json4s-core_2.10-3.2.4.jar
Download https://repo1.maven.org/maven2/com/thoughtworks/paranamer/paranamer/2.5.2/paranamer-2.5.2.jar
Download https://repo1.maven.org/maven2/commons-beanutils/commons-beanutils/1.8.0/commons-beanutils-1.8.0.pom
Download https://repo1.maven.org/maven2/commons-collections/commons-collections/3.2.2/commons-collections-3.2.2.pom
Download https://repo1.maven.org/maven2/org/apache/commons/commons-parent/39/commons-parent-39.pom
Download https://repo1.maven.org/maven2/commons-digester/commons-digester/2.1/commons-digester-2.1.pom
Download https://repo1.maven.org/maven2/org/springframework/ldap/spring-ldap-core/1.3.2.RELEASE/spring-ldap-core-1.3.2.RELEASE.pom
Download https://repo1.maven.org/maven2/org/springframework/security/spring-security-config/3.1.7.RELEASE/spring-security-config-3.1.7.RELEASE.pom
Download https://repo1.maven.org/maven2/org/springframework/security/spring-security-ldap/3.1.7.RELEASE/spring-security-ldap-3.1.7.RELEASE.pom
Download https://repo1.maven.org/maven2/org/springframework/security/spring-security-web/3.1.7.RELEASE/spring-security-web-3.1.7.RELEASE.pom
Download https://repo1.maven.org/maven2/org/springframework/spring-tx/4.2.4.RELEASE/spring-tx-4.2.4.RELEASE.pom
Download https://repo1.maven.org/maven2/org/springframework/security/spring-security-core/3.1.7.RELEASE/spring-security-core-3.1.7.RELEASE.pom
Download https://repo1.maven.org/maven2/org/springframework/spring-aop/3.0.7.RELEASE/spring-aop-3.0.7.RELEASE.pom
Download https://repo1.maven.org/maven2/org/springframework/spring-parent/3.0.7.RELEASE/spring-parent-3.0.7.RELEASE.pom
Download https://repo1.maven.org/maven2/org/springframework/spring-asm/3.0.7.RELEASE/spring-asm-3.0.7.RELEASE.pom
Download https://repo1.maven.org/maven2/commons-collections/commons-collections/3.2.2/commons-collections-3.2.2.jar
Download https://repo1.maven.org/maven2/commons-digester/commons-digester/2.1/commons-digester-2.1.jar
Download https://repo1.maven.org/maven2/org/springframework/ldap/spring-ldap-core/1.3.2.RELEASE/spring-ldap-core-1.3.2.RELEASE.jar
Download https://repo1.maven.org/maven2/org/springframework/security/spring-security-config/3.1.7.RELEASE/spring-security-config-3.1.7.RELEASE.jar
Download https://repo1.maven.org/maven2/org/springframework/security/spring-security-ldap/3.1.7.RELEASE/spring-security-ldap-3.1.7.RELEASE.jar
Download https://repo1.maven.org/maven2/org/springframework/security/spring-security-web/3.1.7.RELEASE/spring-security-web-3.1.7.RELEASE.jar
Download https://repo1.maven.org/maven2/org/springframework/spring-tx/4.2.4.RELEASE/spring-tx-4.2.4.RELEASE.jar
Download https://repo1.maven.org/maven2/org/springframework/security/spring-security-core/3.1.7.RELEASE/spring-security-core-3.1.7.RELEASE.jar
:extensions/geode-modules-assembly:compileJava UP-TO-DATE
:extensions/geode-modules-assembly:processResources UP-TO-DATE
:extensions/geode-modules-assembly:classes UP-TO-DATE
:geode-cq:compileJava
Note: <https://builds.apache.org/job/Geode-spark-connector/ws/geode-cq/src/main/java/com/gemstone/gemfire/cache/query/internal/cq/CqQueryImpl.java> uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
:geode-cq:processResources
:geode-cq:classes
:geode-junit:compileJava
:geode-junit:processResources UP-TO-DATE
:geode-junit:classes
:geode-lucene:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
:geode-lucene:processResources
:geode-lucene:classes
:geode-pulse:compileJava
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
:geode-pulse:copyGemFireVersionFile
:geode-pulse:processResources
:geode-pulse:classes
:geode-rebalancer:compileJava
Download https://repo1.maven.org/maven2/org/quartz-scheduler/quartz/2.2.1/quartz-2.2.1.pom
Download https://repo1.maven.org/maven2/org/quartz-scheduler/quartz-parent/2.2.1/quartz-parent-2.2.1.pom
Download https://repo1.maven.org/maven2/org/quartz-scheduler/quartz/2.2.1/quartz-2.2.1.jar
:geode-rebalancer:processResources UP-TO-DATE
:geode-rebalancer:classes
:geode-wan:compileJava
Note: <https://builds.apache.org/job/Geode-spark-connector/ws/geode-wan/src/main/java/com/gemstone/gemfire/internal/cache/wan/GatewayReceiverImpl.java> uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: <https://builds.apache.org/job/Geode-spark-connector/ws/geode-wan/src/main/java/com/gemstone/gemfire/cache/client/internal/GatewaySenderBatchOp.java> uses unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
:geode-wan:processResources
:geode-wan:classes
:geode-web:compileJava UP-TO-DATE
:geode-web:processResources UP-TO-DATE
:geode-web:classes UP-TO-DATE
:geode-web-api:compileJava
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
:geode-web-api:processResources UP-TO-DATE
:geode-web-api:classes
:geode-assembly:docs FAILED
:geode-assembly:gfshDepsJar
:geode-common:javadocJar
:geode-common:sourcesJar
:geode-common:signArchives SKIPPED
:geode-core:javadocJar
:geode-core:raJar
:geode-core:jcaJar
:geode-core:sourcesJar
:geode-core:signArchives SKIPPED
:geode-core:webJar
:geode-cq:jar
:geode-cq:javadoc
:geode-cq:javadocJar
:geode-cq:sourcesJar
:geode-cq:signArchives SKIPPED
:geode-joptsimple:javadocJar
:geode-joptsimple:sourcesJar
:geode-joptsimple:signArchives SKIPPED
:geode-json:javadocJar
:geode-json:sourcesJar
:geode-json:signArchives SKIPPED
:geode-lucene:jar
:geode-lucene:javadoc
:geode-lucene:javadocJar
:geode-lucene:sourcesJar
:geode-lucene:signArchives SKIPPED
:geode-pulse:javadoc
:geode-pulse:javadocJar
:geode-pulse:sourcesJar
:geode-pulse:war
:geode-pulse:signArchives SKIPPED
:geode-wan:jar
:geode-wan:javadoc
:geode-wan:javadocJar
:geode-wan:sourcesJar
:geode-wan:signArchives SKIPPED
:geode-web:javadoc UP-TO-DATE
:geode-web:javadocJar
:geode-web:sourcesJar
:geode-web:war
:geode-web:signArchives SKIPPED
:geode-web-api:jar
:geode-web-api:javadoc
Error occurred during initialization of VM
java.lang.OutOfMemoryError: unable to create new native thread
:geode-web-api:javadoc FAILED
:geode-web-api:sourcesJar
:geode-web-api:war
FAILURE: Build completed with 2 failures.
1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':geode-assembly:docs'.
> unable to create new native thread
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output.
==============================================================================
2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':geode-web-api:javadoc'.
> Javadoc generation failed. Generated Javadoc options file (useful for troubleshooting): '<https://builds.apache.org/job/Geode-spark-connector/ws/geode-web-api/build/tmp/javadoc/javadoc.options>'
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output.
==============================================================================
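[Editorial aside, not part of the original log: both failures above report "unable to create new native thread", which on a shared Linux build agent usually indicates an exhausted per-user process limit or too many live threads, not heap exhaustion. A hedged diagnostic sketch one could run on the agent; the commands only inspect limits and counts, and nothing here comes from this build:]

```shell
# "unable to create new native thread" is typically a ulimit/stack issue,
# not a heap issue. Inspect the limits that govern thread creation:
ulimit -u   # max user processes; every Java thread counts against this on Linux
ulimit -s   # default thread stack size (kB); smaller stacks permit more threads
# Rough count of threads already running under the current user:
ps -eLf | awk -v u="$(id -un)" '$1 == u' | wc -l
```

[If the limit is the culprit, raising it in the agent's limits configuration, or lowering the forked JVM's `-Xss`, are the usual remedies.]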
BUILD FAILED
Total time: 4 mins 45.07 secs
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Recording test results
ERROR: Step 'Publish JUnit test result report' failed: No test report files were found. Configuration error?
Skipped archiving because build is not successful
Build failed in Jenkins: Geode-spark-connector #26
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Geode-spark-connector/26/changes>
Changes:
[huynhja] GEODE-1377: Fixing spark build issue due to rename of
[bschuchardt] GEODE-1527: Locator javadoc mentions the old "gemfire" command
[bschuchardt] GEODE-1528: CacheFactory javadocs point to the wrong place for
[bschuchardt] Minor javadoc fixes
------------------------------------------
[...truncated 268 lines...]
:geode-core:compileJcaJava
:geode-core:processJcaResources UP-TO-DATE
:geode-core:jcaClasses
:geode-core:jar
:extensions/geode-modules:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
:extensions/geode-modules:processResources
:extensions/geode-modules:classes
:extensions/geode-modules:jar
:extensions/geode-modules-session-internal:compileJava
Note: <https://builds.apache.org/job/Geode-spark-connector/ws/extensions/geode-modules-session-internal/src/main/java/com/gemstone/gemfire/modules/session/internal/filter/GemfireHttpSession.java> uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
:extensions/geode-modules-session-internal:processResources UP-TO-DATE
:extensions/geode-modules-session-internal:classes
:extensions/geode-modules-session-internal:jar
:extensions/geode-modules-session:compileJava
Note: <https://builds.apache.org/job/Geode-spark-connector/ws/extensions/geode-modules-session/src/main/java/com/gemstone/gemfire/modules/session/installer/JarClassLoader.java> uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
:extensions/geode-modules-session:processResources UP-TO-DATE
:extensions/geode-modules-session:classes
:extensions/geode-modules-session:jar
:geode-common:javadoc
:geode-joptsimple:javadoc
:geode-json:javadoc
:geode-core:javadoc
:extensions/geode-modules:javadoc
:extensions/geode-modules-session-internal:javadoc
:extensions/geode-modules-session:javadoc
:extensions/geode-modules-session:javadocJar
:extensions/geode-modules-session:sourcesJar
:extensions/geode-modules-session:signArchives SKIPPED
:extensions/geode-modules-session:assemble
:extensions/geode-modules-assembly:distAppServer
:extensions/geode-modules-hibernate:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
:extensions/geode-modules-hibernate:processResources UP-TO-DATE
:extensions/geode-modules-hibernate:classes
:extensions/geode-modules-hibernate:jar
:extensions/geode-modules-hibernate:javadoc
:extensions/geode-modules-hibernate:javadocJar
:extensions/geode-modules-hibernate:sourcesJar
:extensions/geode-modules-hibernate:signArchives SKIPPED
:extensions/geode-modules-hibernate:assemble
:extensions/geode-modules-assembly:distHibernate
:extensions/geode-modules:javadocJar
:extensions/geode-modules:sourcesJar
:extensions/geode-modules:signArchives SKIPPED
:extensions/geode-modules:assemble
:extensions/geode-modules-tomcat7:compileJava
:extensions/geode-modules-tomcat7:processResources UP-TO-DATE
:extensions/geode-modules-tomcat7:classes
:extensions/geode-modules-tomcat7:jar
:extensions/geode-modules-tomcat7:javadoc
:extensions/geode-modules-tomcat7:javadocJar
:extensions/geode-modules-tomcat7:sourcesJar
:extensions/geode-modules-tomcat7:signArchives SKIPPED
:extensions/geode-modules-tomcat7:assemble
:extensions/geode-modules-assembly:distTcServer
:extensions/geode-modules-assembly:distTcServer30
:extensions/geode-modules-assembly:distTomcat
:extensions/geode-modules-assembly:dist
:geode-assembly:compileJava UP-TO-DATE
:geode-assembly:processResources UP-TO-DATE
:geode-assembly:classes UP-TO-DATE
:geode-assembly:defaultCacheConfig
:geode-assembly:defaultDistributionConfig
:geode-assembly:depsJar
:extensions/geode-modules-assembly:compileJava UP-TO-DATE
:extensions/geode-modules-assembly:processResources UP-TO-DATE
:extensions/geode-modules-assembly:classes UP-TO-DATE
:geode-cq:compileJava
Note: <https://builds.apache.org/job/Geode-spark-connector/ws/geode-cq/src/main/java/com/gemstone/gemfire/cache/query/internal/cq/CqQueryImpl.java> uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
:geode-cq:processResources
:geode-cq:classes
:geode-junit:compileJava
:geode-junit:processResources UP-TO-DATE
:geode-junit:classes
:geode-lucene:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
:geode-lucene:processResources
:geode-lucene:classes
:geode-pulse:compileJava
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
:geode-pulse:copyGemFireVersionFile
:geode-pulse:processResources
:geode-pulse:classes
:geode-rebalancer:compileJava
:geode-rebalancer:processResources UP-TO-DATE
:geode-rebalancer:classes
:geode-wan:compileJava
Note: <https://builds.apache.org/job/Geode-spark-connector/ws/geode-wan/src/main/java/com/gemstone/gemfire/internal/cache/wan/GatewayReceiverImpl.java> uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: <https://builds.apache.org/job/Geode-spark-connector/ws/geode-wan/src/main/java/com/gemstone/gemfire/cache/client/internal/GatewaySenderBatchOp.java> uses unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
:geode-wan:processResources
:geode-wan:classes
:geode-web:compileJava UP-TO-DATE
:geode-web:processResources UP-TO-DATE
:geode-web:classes UP-TO-DATE
:geode-web-api:compileJava
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
:geode-web-api:processResources UP-TO-DATE
:geode-web-api:classes
:geode-assembly:docs
:geode-assembly:gfshDepsJar
:geode-common:javadocJar
:geode-common:sourcesJar
:geode-common:signArchives SKIPPED
:geode-core:javadocJar
:geode-core:raJar
:geode-core:jcaJar
:geode-core:sourcesJar
:geode-core:signArchives SKIPPED
:geode-core:webJar
:geode-cq:jar
:geode-cq:javadoc
:geode-cq:javadocJar
:geode-cq:sourcesJar
:geode-cq:signArchives SKIPPED
:geode-joptsimple:javadocJar
:geode-joptsimple:sourcesJar
:geode-joptsimple:signArchives SKIPPED
:geode-json:javadocJar
:geode-json:sourcesJar
:geode-json:signArchives SKIPPED
:geode-lucene:jar
:geode-lucene:javadoc
:geode-lucene:javadocJar
:geode-lucene:sourcesJar
:geode-lucene:signArchives SKIPPED
:geode-pulse:javadoc
:geode-pulse:javadocJar
:geode-pulse:sourcesJar
:geode-pulse:war
:geode-pulse:signArchives SKIPPED
:geode-wan:jar
:geode-wan:javadoc
:geode-wan:javadocJar
:geode-wan:sourcesJar
:geode-wan:signArchives SKIPPED
:geode-web:javadoc UP-TO-DATE
:geode-web:javadocJar
:geode-web:sourcesJar
:geode-web:war
:geode-web:signArchives SKIPPED
:geode-web-api:jar
:geode-web-api:javadoc
:geode-web-api:javadocJar
:geode-web-api:sourcesJar
:geode-web-api:war
:geode-web-api:signArchives SKIPPED
:geode-assembly:installDist
BUILD SUCCESSFUL
Total time: 3 mins 22.716 secs
Build step 'Invoke Gradle script' changed build result to SUCCESS
[Geode-spark-connector] $ /bin/bash -xe /tmp/hudson5044730421889903876.sh
+ cd geode-spark-connector
+ ./sbt clean package test it:test
[info] Loading project definition from <https://builds.apache.org/job/Geode-spark-connector/ws/geode-spark-connector/project>
[info] Set current project to Geode Connector for Apache Spark (in build <https://builds.apache.org/job/Geode-spark-connector/ws/geode-spark-connector/>)
Could not create file <https://builds.apache.org/job/Geode-spark-connector/ws/geode-spark-connector/target/streams/$global/ivyConfiguration/$global/streams/out>
java.io.IOException: No such file or directory
at sbt.ErrorHandling$.translate(ErrorHandling.scala:10)
at sbt.IO$.touch(IO.scala:143)
at sbt.std.Streams$$anon$3$$anon$2.make(Streams.scala:129)
at sbt.std.Streams$$anon$3$$anon$2.text(Streams.scala:113)
at sbt.std.Streams$$anon$3$$anon$2.log(Streams.scala:124)
at sbt.std.TaskStreams$class.log(Streams.scala:56)
at sbt.std.Streams$$anon$3$$anon$2.log$lzycompute(Streams.scala:102)
at sbt.std.Streams$$anon$3$$anon$2.log(Streams.scala:102)
at sbt.Classpaths$$anonfun$mkIvyConfiguration$1.apply(Defaults.scala:1438)
at sbt.Classpaths$$anonfun$mkIvyConfiguration$1.apply(Defaults.scala:1437)
at scala.Function10$$anonfun$tupled$1.apply(Function10.scala:35)
at scala.Function10$$anonfun$tupled$1.apply(Function10.scala:34)
at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
at sbt.std.Transform$$anon$4.work(System.scala:63)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
at sbt.Execute.work(Execute.scala:235)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: No such file or directory
at java.io.UnixFileSystem.createFileExclusively(Native Method)
at java.io.File.createNewFile(File.java:1012)
at sbt.IO$$anonfun$1.apply$mcZ$sp(IO.scala:143)
at sbt.IO$$anonfun$1.apply(IO.scala:143)
at sbt.IO$$anonfun$1.apply(IO.scala:143)
at sbt.ErrorHandling$.translate(ErrorHandling.scala:10)
at sbt.IO$.touch(IO.scala:143)
at sbt.std.Streams$$anon$3$$anon$2.make(Streams.scala:129)
at sbt.std.Streams$$anon$3$$anon$2.text(Streams.scala:113)
at sbt.std.Streams$$anon$3$$anon$2.log(Streams.scala:124)
at sbt.std.TaskStreams$class.log(Streams.scala:56)
at sbt.std.Streams$$anon$3$$anon$2.log$lzycompute(Streams.scala:102)
at sbt.std.Streams$$anon$3$$anon$2.log(Streams.scala:102)
at sbt.Classpaths$$anonfun$mkIvyConfiguration$1.apply(Defaults.scala:1438)
at sbt.Classpaths$$anonfun$mkIvyConfiguration$1.apply(Defaults.scala:1437)
at scala.Function10$$anonfun$tupled$1.apply(Function10.scala:35)
at scala.Function10$$anonfun$tupled$1.apply(Function10.scala:34)
at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
at sbt.std.Transform$$anon$4.work(System.scala:63)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
at sbt.Execute.work(Execute.scala:235)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
[error] (root/*:ivyConfiguration) Could not create file <https://builds.apache.org/job/Geode-spark-connector/ws/geode-spark-connector/target/streams/$global/ivyConfiguration/$global/streams/out>: java.io.IOException: No such file or directory
[error] Total time: 0 s, completed Jun 11, 2016 1:25:46 PM
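[Editorial aside, not part of the original log: the stack trace shows `sbt.IO.touch` failing in `java.io.UnixFileSystem.createFileExclusively`, which happens when a parent directory under `target/streams` is missing when the task-stream file is created, e.g. after an interrupted or concurrent `clean`. A minimal shell sketch of that failure mode; the paths are illustrative and assumed, not taken from this build:]

```shell
# touch(1), like File.createNewFile behind sbt.IO.touch, does not create
# missing parent directories, so it fails with "No such file or directory":
dir=$(mktemp -d)
if touch "$dir/streams/out" 2>/dev/null; then
  echo "unexpected: touch succeeded"
else
  echo "touch failed: parent directory missing"
fi
# Creating the parent first (what a healthy sbt run does) repairs it:
mkdir -p "$dir/streams"
touch "$dir/streams/out" && echo "touch ok after mkdir -p"
rm -rf "$dir"
```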
Build step 'Execute shell' marked build as failure
Recording test results
ERROR: Step 'Publish JUnit test result report' failed: No test report files were found. Configuration error?
Skipped archiving because build is not successful