Posted to dev@geode.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2016/06/02 13:24:50 UTC
Build failed in Jenkins: Geode-spark-connector #23
See <https://builds.apache.org/job/Geode-spark-connector/23/changes>
Changes:
[jdeppe] GEODE-1455: Add SecurityTest JUnit category to outstanding gfsh / JMX
[jdeppe] GEODE-1454: Have "region" attribute, in JSONAuthorization json file be
[huynhja] GEODE-1316: Changing @since tags to @GemFire or @Geode
[eshu] GEODE-1400: An inflight transaction op could arrive later than a client
[upthewaterspout] GEODE-11: Adding a tool to dump the lucene index files
[upthewaterspout] GEODE-11 Adding stats for the lucene file system.
[upthewaterspout] GEODE-11 - Fixing failure in FileSystemJUnitTest
[huynhja] GEODE-11: Resolved compile warning for LuceneQueriesIntegrationTest
[upthewaterspout] GEODE-11: Adding a method to LuceneIndexImpl to dump indexes
[ukohlmeyer] GEODE-1377: Initial move of system properties from private to public
[ukohlmeyer] GEODE-1377: Initial move of system properties from private to public
[ukohlmeyer] GEODE-1377: Initial move of system properties from private to public
[ukohlmeyer] GEODE-1377: Initial move of system properties from private to public
[ukohlmeyer] GEODE-1377: Initial move of system properties from private to public
[ukohlmeyer] GEODE-1377: Initial move of system properties from private to public
[ukohlmeyer] GEODE-1377: Initial move of system properties from private to public
[ukohlmeyer] GEODE-1377: Initial move of system properties from private to public
[ukohlmeyer] GEODE-1377: Initial move of system properties from private to public
[ukohlmeyer] GEODE-1377: Fixed Test
[ukohlmeyer] GEODE-1377: Renaming SystemConfigurationProperties to
[ukohlmeyer] GEODE-1377: Fixed missing import
------------------------------------------
[...truncated 1563 lines...]
[info] Resolving net.sf.py4j#py4j;0.8.2.1 ...
[info] Resolving org.apache.hadoop#hadoop-yarn-server-nodemanager;2.2.0 ...
[info] Resolving org.apache.hadoop#hadoop-yarn-server;2.2.0 ...
[info] Resolving org.apache.spark#spark-sql_2.10;1.3.0 ...
[info] Resolving org.apache.spark#spark-catalyst_2.10;1.3.0 ...
[info] Resolving org.scala-lang#scala-compiler;2.10.4 ...
[info] Resolving org.scalamacros#quasiquotes_2.10;2.0.1 ...
[info] Resolving com.twitter#parquet-column;1.6.0rc3 ...
[info] Resolving com.twitter#parquet-common;1.6.0rc3 ...
[info] Resolving com.twitter#parquet-encoding;1.6.0rc3 ...
[info] Resolving com.twitter#parquet-generator;1.6.0rc3 ...
[info] Resolving commons-codec#commons-codec;1.5 ...
[info] Resolving com.twitter#parquet-hadoop;1.6.0rc3 ...
[info] Resolving com.twitter#parquet-format;2.2.0-rc1 ...
[info] Resolving com.twitter#parquet-jackson;1.6.0rc3 ...
[info] Resolving org.codehaus.jackson#jackson-mapper-asl;1.9.11 ...
[info] Resolving org.codehaus.jackson#jackson-core-asl;1.9.11 ...
[info] Resolving org.jodd#jodd-core;3.6.3 ...
[info] Resolving commons-net#commons-net;3.1 ...
[warn] there were 7 feature warning(s); re-run with -feature for details
[warn] one warning found
[info] Resolving org.scoverage#scalac-scoverage-runtime_2.10;1.0.4 ...
[info] Resolving org.scoverage#scalac-scoverage-plugin_2.10;1.0.4 ...
[warn] Note: Some input files use unchecked or unsafe operations.
[warn] Note: Recompile with -Xlint:unchecked for details.
[info] Packaging <https://builds.apache.org/job/Geode-spark-connector/23/artifact/geode-spark-connector/geode-spark-connector/target/scala-2.10/geode-spark-connector_2.10-0.5.0.jar> ...
[info] Done packaging.
[info] Resolving org.scala-lang#jline;2.10.4 ...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[info] Compiling 1 Scala source and 5 Java sources to <https://builds.apache.org/job/Geode-spark-connector/ws/geode-spark-connector/geode-spark-demos/basic-demos/target/scala-2.10/classes...>
[info] Packaging <https://builds.apache.org/job/Geode-spark-connector/23/artifact/geode-spark-connector/geode-spark-demos/basic-demos/target/scala-2.10/basic-demos_2.10-0.5.0.jar> ...
[info] Done packaging.
[success] Total time: 52 s, completed Jun 2, 2016 1:24:32 PM
[info] Compiling 11 Scala sources and 1 Java source to <https://builds.apache.org/job/Geode-spark-connector/ws/geode-spark-connector/geode-spark-connector/target/scala-2.10/test-classes...>
[warn] there were 2 feature warning(s); re-run with -feature for details
[warn] one warning found
[info] GeodeConnectionConfTest:
[info] - apply(SparkConf) w/ GeodeLocator property and empty geodeProps
[info] - apply(SparkConf) w/ GeodeLocator property and geode properties
[info] - apply(SparkConf) w/o GeodeLocator property
[info] - apply(SparkConf) w/ invalid GeodeLocator property
[info] - apply(locatorStr, geodeProps) w/ valid locatorStr and non geodeProps
[info] - apply(locatorStr, geodeProps) w/ valid locatorStr and non-empty geodeProps
[info] - apply(locatorStr, geodeProps) w/ invalid locatorStr
[info] - constructor w/ empty (host,port) pairs
[info] - getConnection() normal
[info] - getConnection() failure
r.type=java.lang.String r=List(/obj_obj_region)
[info] QueryParserTest:
[info] - select * from /r1
[info] - select c2 from /r1
[info] - select key, value from /r1.entries
[info] - select c1, c2 from /r1 where col1 > 100 and col2 <= 120 or c3 = 2
[info] - select * from /r1/r2 where c1 >= 200
[info] - import io.pivotal select c1, c2, c3 from /r1/r2, /r3/r4 where c1 <= 15 and c2 = 100
[info] - SELECT distinct f1, f2 FROM /r1/r2 WHere f = 100
[info] - IMPORT io.pivotal.geode IMPORT com.mypackage SELECT key,value FROM /root/sub.entries WHERE status = 'active' ORDER BY id desc
[info] - select distinct p.ID, p.status from /region p where p.ID > 5 order by p.status
[info] - SELECT DISTINCT * FROM /QueryRegion1 r1, /QueryRegion2 r2 WHERE r1.ID = r2.ID
[info] - SELECT id, "type", positions, status FROM /obj_obj_region WHERE status = 'active'
[info] - SELECT r.id, r."type", r.positions, r.status FROM /obj_obj_region r, r.positions.values f WHERE r.status = 'active' and f.secId = 'MSFT'
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/jenkins/.ivy2/cache/org.apache.logging.log4j/log4j-slf4j-impl/jars/log4j-slf4j-impl-2.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/jenkins/.m2/repository/org/slf4j/slf4j-log4j12/1.7.10/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
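The SLF4J notices above mean two binding jars reached the test classpath at once (log4j-slf4j-impl from the ivy cache and slf4j-log4j12 from the local Maven repository), and SLF4J picked the first one it found. A common remedy in an sbt build is to exclude the unwanted binding; the sketch below is illustrative only, and the assumption that slf4j-log4j12 arrives transitively via the Spark dependency is not confirmed by this log:

```scala
// build.sbt fragment (sketch, not this project's actual build definition):
// exclude the competing slf4j-log4j12 binding so only log4j-slf4j-impl
// backs SLF4J at test time.
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.3.0" excludeAll
  ExclusionRule(organization = "org.slf4j", name = "slf4j-log4j12")
```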
[info 2016/06/02 13:24:44.830 UTC <pool-7-thread-12-ScalaTest-running-GeodeRegionRDDTest> tid=0xea] RDD id=0 region=test conn=, env=Map()
[info 2016/06/02 13:24:44.869 UTC <pool-7-thread-12-ScalaTest-running-GeodeRegionRDDTest> tid=0xea] RDD id=0 region=test conn=, env=Map(preferred.partitioner -> OnePartition)
[info 2016/06/02 13:24:44.894 UTC <pool-7-thread-12-ScalaTest-running-GeodeRegionRDDTest> tid=0xea] RDD id=0 region=test conn=, env=Map()
[info 2016/06/02 13:24:44.930 UTC <pool-7-thread-12-ScalaTest-running-GeodeRegionRDDTest> tid=0xea] RDD id=0 region=test conn=, env=Map()
[info] GeodeRegionRDDTest:
[info] - create GeodeRDD with non-existing region
[info] - getPartitions with non-existing region
[info] - getPartitions with replicated region and not preferred env
[info] - getPartitions with replicated region and preferred OnePartitionPartitioner
[info] - getPartitions with partitioned region and not preferred env
[info] - GeodeRDD.compute() method
[warn 2016/06/02 13:24:44.959 UTC <pool-7-thread-12-ScalaTest-running-GeodeRDDPartitionerTest> tid=0xea] Invalid preferred partitioner name dummy.
[info] GeodeRDDPartitionerTest:
[info] - default partitioned region partitioner
[info] - default replicated region partitioner
[info] - GeodeRDDPartitioner.apply method
[info] - OnePartitionPartitioner
[info] - ServerSplitsPartitioner.doPartitions(): n=1 & no empty bucket
[info] - ServerSplitsPartitioner.doPartitions(): n=1 & 1 empty bucket
[info] - ServerSplitsPartitioner.doPartitions(): n=1 & 2 empty bucket
[info] - ServerSplitsPartitioner.doPartitions(): n=1 & 5 empty bucket
[info] - ServerSplitsPartitioner.doPartitions(): n=1, 4 empty-bucket, non-continuous IDs
[info] - ServerSplitsPartitioner.doPartitions(): n=2, no empty buckets, 3 servers have 1, 2, and 3 buckets
[info] - ServerSplitsPartitioner.doPartitions(): n=3, no empty buckets, 4 servers have 0, 2, 3, and 4 buckets
[info] - ServerSplitsPartitioner.partitions(): metadata = None
[info] - ServerSplitsPartitioner.partitions(): replicated region
[info] - ServerSplitsPartitioner.partitions(): partitioned region w/o data
[info] - ServerSplitsPartitioner.partitions(): partitioned region w/ some data
host name: pietas.apache.org
canonical host name: pietas.apache.org
canonical host name 2: pietas.apache.org
[info] LocatorHelperTest:
[info] - locatorStr2HostPortPair hostname w/o domain
[info] - locatorStr2HostPortPair hostname w/ domain
[info] - locatorStr2HostPortPair w/ invalid host name
[info] - locatorStr2HostPortPair w/ valid port
[info] - locatorStr2HostPortPair w/ invalid port
[info] - parseLocatorsString with valid locator(s)
[info] - parseLocatorsString with invalid locator(s)
[info] - pickPreferredGeodeServers: shared servers and one gf-server per host
[info] - pickPreferredGeodeServers: shared servers, one gf-server per host, un-sorted list
[info] - pickPreferredGeodeServers: shared servers and two gf-server per host
[info] - pickPreferredGeodeServers: shared servers, two gf-server per host, un-sorted server list
[info] - pickPreferredGeodeServers: no shared servers and one gf-server per host
[info] - pickPreferredGeodeServers: no shared servers, one gf-server per host, and less gf-server
[info] - pickPreferredGeodeServers: ad-hoc
[info 2016/06/02 13:24:45.200 UTC <pool-7-thread-12-ScalaTest-running-GeodeRDDFunctionsTest> tid=0xea] Save RDD id=0 to region test
[info 2016/06/02 13:24:45.236 UTC <pool-7-thread-12-ScalaTest-running-GeodeRDDFunctionsTest> tid=0xea] Save RDD id=0 to region test
[info 2016/06/02 13:24:45.268 UTC <pool-7-thread-12-ScalaTest-running-GeodeRDDFunctionsTest> tid=0xea] Save RDD id=0 to region test
[info 2016/06/02 13:24:45.296 UTC <pool-7-thread-12-ScalaTest-running-GeodeRDDFunctionsTest> tid=0xea] Save RDD id=0 to region test
[info] GeodeRDDFunctionsTest:
[info] - test PairRDDFunction Implicit
[info] - test RDDFunction Implicit
[info] - test GeodePairRDDWriter
[info] - test GeodeNonPairRDDWriter
[info] - test PairRDDFunctions.saveToGeode
[info] - test PairRDDFunctions.saveToGeode w/ opConf
[info] - test RDDFunctions.saveToGeode
[info] - test RDDFunctions.saveToGeode w/ opConf
[info 2016/06/02 13:24:45.486 UTC <pool-7-thread-12> tid=0xea] Save DStream region=testregion conn=
[info 2016/06/02 13:24:45.660 UTC <pool-7-thread-12> tid=0xea] Save RDD id=0 to region testregion
[info 2016/06/02 13:24:45.687 UTC <pool-7-thread-12> tid=0xea] Save DStream region=testregion conn=
[info 2016/06/02 13:24:45.738 UTC <pool-7-thread-12> tid=0xea] Save RDD id=0 to region testregion
[info] JavaAPITest:
[info] - testSparkContextFunction
[info] - testJavaPairDStreamFunctions
[info] - testSQLContextFunction
[info] - testJavaSparkContextFunctions
[info] - testJavaRDDFunctions
[info] - testJavaDStreamFunctions
[info] - testJavaPairDStreamFunctionsWithTuple2DStream
[info] - testJavaPairRDDFunctions
[info] ConnectorImplicitsTest:
[info] - implicit map2Properties
[info] - Test Implicit SparkContext Conversion
[info] - Test Implicit SQLContext Conversion
[info 2016/06/02 13:24:45.796 UTC <pool-7-thread-12-ScalaTest-running-GeodeFunctionDeployerTest> tid=0xea] Invalid jar <https://builds.apache.org/job/Geode-spark-connector/ws/geode-spark-connector/file>:/somemissingjarfilethatdoesnot.exist
[info 2016/06/02 13:24:45.797 UTC <pool-7-thread-12-ScalaTest-running-GeodeFunctionDeployerTest> tid=0xea] Invalid jar <https://builds.apache.org/job/Geode-spark-connector/ws/geode-spark-connector/file>:/somemissingjarfilethatdoesnot.exist
[info 2016/06/02 13:24:45.798 UTC <pool-7-thread-12-ScalaTest-running-GeodeFunctionDeployerTest> tid=0xea] Invalid jar <https://builds.apache.org/job/Geode-spark-connector/ws/geode-spark-connector/file>:/somemissingjarfilethatdoesnot.exist
[info] GeodeFunctionDeployerTest:
[info] - jmx url creation
[info] - missing jar file
[info] - deploy with missing jar
[info] - successful mocked deploy
[info 2016/06/02 13:24:45.860 UTC <pool-7-thread-12-ScalaTest-running-GeodeDStreamFunctionsTest> tid=0xea] Save DStream region=test conn=
[info 2016/06/02 13:24:45.873 UTC <pool-7-thread-12-ScalaTest-running-GeodeDStreamFunctionsTest> tid=0xea] Save DStream region=test conn=
[info] GeodeDStreamFunctionsTest:
[info] - test GeodePairDStreamFunctions Implicit
[info] - test GeodeDStreamFunctions Implicit
[info] - test GeodePairDStreamFunctions.saveToGeode()
[info] - test GeodeDStreamFunctions.saveToGeode()
[info] DefaultGeodeConnectionManagerTest:
[info] - DefaultGeodeConnectionFactory get/closeConnection
[info] - DefaultGeodeConnectionFactory newConnection(...) throws RuntimeException
[info] - DefaultGeodeConnectionFactory close() w/ non-exist connection
[info 2016/06/02 13:24:45.998 UTC <pool-7-thread-12-ScalaTest-running-StructStreamingResultSenderAndCollectorTest> tid=0xea] StructStreamingResultSender: 10 rows, type=(java.lang.Integer, java.lang.String), type.size=159, data.size=151, row.avg.size=15.1
[info 2016/06/02 13:24:46.525 UTC <pool-7-thread-12-ScalaTest-running-StructStreamingResultSenderAndCollectorTest> tid=0xea] StructStreamingResultSender: 10 rows, type=(java.lang.Integer, java.lang.String), type.size=0, data.size=151, row.avg.size=15.1
[info 2016/06/02 13:24:46.528 UTC <pool-7-thread-12-ScalaTest-running-StructStreamingResultSenderAndCollectorTest> tid=0xea] StructStreamingResultSender: 0 rows, type=null, type.size=0, data.size=0, row.avg.size=NaN
[info 2016/06/02 13:24:46.554 UTC <pool-7-thread-12-ScalaTest-running-StructStreamingResultSenderAndCollectorTest> tid=0xea] StructStreamingResultSender: 10000 rows, type=(java.lang.Integer), type.size=131, data.size=60015, row.avg.size=6.0
[info 2016/06/02 13:24:46.610 UTC <ForkJoinPool-1-worker-29> tid=0xf8] sender1: 150 rows, type=(java.lang.Integer), type.size=131, data.size=901, row.avg.size=6.0
[info 2016/06/02 13:24:46.612 UTC <ForkJoinPool-1-worker-11> tid=0xf9] sender2: 150 rows, type=(java.lang.Integer), type.size=131, data.size=901, row.avg.size=6.0
[info 2016/06/02 13:24:46.627 UTC <ForkJoinPool-1-worker-11> tid=0xf9] sender1: 500 rows, type=(java.lang.Integer), type.size=131, data.size=3001, row.avg.size=6.0
[info 2016/06/02 13:24:46.677 UTC <pool-7-thread-12-ScalaTest-running-StructStreamingResultSenderAndCollectorTest> tid=0xea] StructStreamingResultSender: 1000 rows, type=(java.lang.String, java.lang.String), type.size=159, data.size=23792, row.avg.size=23.8
[info] StructStreamingResultSenderAndCollectorTest:
[info] - transfer simple data
[info] - transfer simple data with no type info
[info] - transfer data with 0 row
[info] - transfer data with 10K rows
[info] - transfer data with 10K rows with 2 sender
[info] - transfer data with 10K rows with 2 sender with error
[info] - transfer data with Exception
[info] - transfer string pair data with 200 rows
[info] - DataSerializer usage
[info] ScalaTest
[info] Run completed in 3 seconds, 843 milliseconds.
[info] Total number of tests run: 96
[info] Suites: completed 12, aborted 0
[info] Tests: succeeded 96, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[info] Passed: Total 96, Failed 0, Errors 0, Passed 96
[success] Total time: 14 s, completed Jun 2, 2016 1:24:46 PM
[info] Compiling 9 Scala sources and 4 Java sources to <https://builds.apache.org/job/Geode-spark-connector/ws/geode-spark-connector/geode-spark-connector/target/scala-2.10/it-classes...>
[error] <https://builds.apache.org/job/Geode-spark-connector/ws/geode-spark-connector/geode-spark-connector/src/it/java/ittest/io/pivotal/geode/spark/connector/JavaApiIntegrationTest.java>:61: error: cannot find symbol
[error]     settings.setProperty(CACHE_XML_FILE, "src/it/resources/test-retrieve-regions.xml");
[error]     ^
[error]   symbol:   variable CACHE_XML_FILE
[error]   location: class JavaApiIntegrationTest
[error] 1 error
[error] (geode-spark-connector/it:compile) javac returned nonzero exit code
[error] Total time: 8 s, completed Jun 2, 2016 1:24:54 PM
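The `cannot find symbol` failure lines up with the GEODE-1377 commits in this build's changelist, which moved and renamed the classes holding system-property constants: the statically imported `CACHE_XML_FILE` constant that `JavaApiIntegrationTest` relied on no longer resolves from its old location. The sketch below only illustrates the pattern involved; the class name and the `"cache-xml-file"` property string are stand-in assumptions, not Geode's actual post-refactor API.

```java
import java.util.Properties;

// Sketch of the pattern behind the failing line 61. In the real source tree
// CACHE_XML_FILE was a statically imported constant whose defining class the
// GEODE-1377 refactoring relocated; CachePropsSketch is a hypothetical stand-in.
public class CachePropsSketch {
    // Assumed property-name value, for illustration only.
    static final String CACHE_XML_FILE = "cache-xml-file";

    public static void main(String[] args) {
        Properties settings = new Properties();
        // The statement the integration test could no longer compile:
        settings.setProperty(CACHE_XML_FILE, "src/it/resources/test-retrieve-regions.xml");
        System.out.println(settings.getProperty(CACHE_XML_FILE));
        // prints: src/it/resources/test-retrieve-regions.xml
    }
}
```

Updating the test's static import to the constant's new home (as the later GEODE-1377 "Fixed missing import" commit suggests) is the kind of fix this error calls for, which matches build #24 returning to normal below.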
Build step 'Execute shell' marked build as failure
Recording test results
Skipped archiving because build is not successful
Jenkins build is back to normal : Geode-spark-connector #24
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Geode-spark-connector/24/changes>