Posted to dev@geode.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2016/06/02 13:24:50 UTC

Build failed in Jenkins: Geode-spark-connector #23

See <https://builds.apache.org/job/Geode-spark-connector/23/changes>

Changes:

[jdeppe] GEODE-1455: Add SecurityTest JUnit category to outstanding gfsh / JMX

[jdeppe] GEODE-1454: Have "region" attribute, in JSONAuthorization json file be

[huynhja] GEODE-1316: Changing @since tags to @GemFire or @Geode

[eshu] GEODE-1400: An inflight transaction op could arrive later than a client

[upthewaterspout] GEODE-11: Adding a tool to dump the lucene index files

[upthewaterspout] GEODE-11 Adding stats for the lucene file system.

[upthewaterspout] GEODE-11 - Fixing failure in FileSystemJUnitTest

[huynhja] GEODE-11: Resolved compile warning for LuceneQueriesIntegrationTest

[upthewaterspout] GEODE-11: Adding a method to LuceneIndexImpl to dump indexes

[ukohlmeyer] GEODE-1377: Initial move of system properties from private to public

[ukohlmeyer] GEODE-1377: Initial move of system properties from private to public

[ukohlmeyer] GEODE-1377: Initial move of system properties from private to public

[ukohlmeyer] GEODE-1377: Initial move of system properties from private to public

[ukohlmeyer] GEODE-1377: Initial move of system properties from private to public

[ukohlmeyer] GEODE-1377: Initial move of system properties from private to public

[ukohlmeyer] GEODE-1377: Initial move of system properties from private to public

[ukohlmeyer] GEODE-1377: Initial move of system properties from private to public

[ukohlmeyer] GEODE-1377: Initial move of system properties from private to public

[ukohlmeyer] GEODE-1377: Fixed Test

[ukohlmeyer] GEODE-1377: Renaming SystemConfigurationProperties to

[ukohlmeyer] GEODE-1377: Fixed missing import

------------------------------------------
[...truncated 1563 lines...]

[info] Resolving net.sf.py4j#py4j;0.8.2.1 ...
[info] Resolving org.apache.hadoop#hadoop-yarn-server-nodemanager;2.2.0 ...
[info] Resolving org.apache.hadoop#hadoop-yarn-server-nodemanager;2.2.0 ...
[info] Resolving org.apache.hadoop#hadoop-yarn-server;2.2.0 ...
[info] Resolving org.apache.spark#spark-sql_2.10;1.3.0 ...
[info] Resolving org.apache.spark#spark-catalyst_2.10;1.3.0 ...
[info] Resolving org.scala-lang#scala-compiler;2.10.4 ...
[info] Resolving org.scalamacros#quasiquotes_2.10;2.0.1 ...
[info] Resolving com.twitter#parquet-column;1.6.0rc3 ...
[info] Resolving com.twitter#parquet-common;1.6.0rc3 ...
[info] Resolving com.twitter#parquet-encoding;1.6.0rc3 ...
[info] Resolving com.twitter#parquet-generator;1.6.0rc3 ...
[info] Resolving commons-codec#commons-codec;1.5 ...
[info] Resolving com.twitter#parquet-hadoop;1.6.0rc3 ...
[info] Resolving com.twitter#parquet-format;2.2.0-rc1 ...
[info] Resolving com.twitter#parquet-jackson;1.6.0rc3 ...
[info] Resolving org.codehaus.jackson#jackson-mapper-asl;1.9.11 ...
[info] Resolving org.codehaus.jackson#jackson-core-asl;1.9.11 ...
[info] Resolving org.jodd#jodd-core;3.6.3 ...
[info] Resolving commons-net#commons-net;3.1 ...
[warn] there were 7 feature warning(s); re-run with -feature for details
[warn] one warning found
[info] Resolving org.scoverage#scalac-scoverage-runtime_2.10;1.0.4 ...
[info] Resolving org.scoverage#scalac-scoverage-plugin_2.10;1.0.4 ...
[warn] Note: Some input files use unchecked or unsafe operations.
[warn] Note: Recompile with -Xlint:unchecked for details.
[info] Packaging <https://builds.apache.org/job/Geode-spark-connector/23/artifact/geode-spark-connector/geode-spark-connector/target/scala-2.10/geode-spark-connector_2.10-0.5.0.jar> ...
[info] Done packaging.
[info] Resolving org.scala-lang#jline;2.10.4 ...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[info] Compiling 1 Scala source and 5 Java sources to <https://builds.apache.org/job/Geode-spark-connector/ws/geode-spark-connector/geode-spark-demos/basic-demos/target/scala-2.10/classes...>
[info] Packaging <https://builds.apache.org/job/Geode-spark-connector/23/artifact/geode-spark-connector/geode-spark-demos/basic-demos/target/scala-2.10/basic-demos_2.10-0.5.0.jar> ...
[info] Done packaging.
[success] Total time: 52 s, completed Jun 2, 2016 1:24:32 PM
[info] Compiling 11 Scala sources and 1 Java source to <https://builds.apache.org/job/Geode-spark-connector/ws/geode-spark-connector/geode-spark-connector/target/scala-2.10/test-classes...>
[warn] there were 2 feature warning(s); re-run with -feature for details
[warn] one warning found
[info] GeodeConnectionConfTest:
[info] - apply(SparkConf) w/ GeodeLocator property and empty geodeProps
[info] - apply(SparkConf) w/ GeodeLocator property and geode properties
[info] - apply(SparkConf) w/o GeodeLocator property
[info] - apply(SparkConf) w/ invalid GeodeLocator property
[info] - apply(locatorStr, geodeProps) w/ valid locatorStr and non geodeProps
[info] - apply(locatorStr, geodeProps) w/ valid locatorStr and non-empty geodeProps
[info] - apply(locatorStr, geodeProps) w/ invalid locatorStr
[info] - constructor w/ empty (host,port) pairs
[info] - getConnection() normal
[info] - getConnection() failure
r.type=java.lang.String r=List(/obj_obj_region)
[info] QueryParserTest:
[info] - select * from /r1
[info] - select c2 from /r1
[info] - select key, value from /r1.entries
[info] - select c1, c2 from /r1 where col1 > 100 and col2 <= 120 or c3 = 2
[info] - select * from /r1/r2 where c1 >= 200
[info] - import io.pivotal select c1, c2, c3 from /r1/r2, /r3/r4 where c1 <= 15 and c2 = 100
[info] - SELECT distinct f1, f2 FROM /r1/r2 WHere f = 100
[info] - IMPORT io.pivotal.geode IMPORT com.mypackage SELECT key,value FROM /root/sub.entries WHERE status = 'active' ORDER BY id desc
[info] - select distinct p.ID, p.status from /region p where p.ID > 5 order by p.status
[info] - SELECT DISTINCT * FROM /QueryRegion1 r1,  /QueryRegion2 r2 WHERE r1.ID = r2.ID
[info] - SELECT id, "type", positions, status FROM /obj_obj_region WHERE status = 'active'
[info] - SELECT r.id, r."type", r.positions, r.status FROM /obj_obj_region r, r.positions.values f WHERE r.status = 'active' and f.secId = 'MSFT'
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/jenkins/.ivy2/cache/org.apache.logging.log4j/log4j-slf4j-impl/jars/log4j-slf4j-impl-2.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/jenkins/.m2/repository/org/slf4j/slf4j-log4j12/1.7.10/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
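[editor's note: the SLF4J lines above report two StaticLoggerBinder classes on the classpath (log4j-slf4j-impl from the Ivy cache and slf4j-log4j12 from the local Maven repository); SLF4J then picks one arbitrarily. A small diagnostic, not part of this build, can list every binding the classloader sees so the offending dependency can be excluded. The class name Slf4jBindingCheck is ours; the resource path is the one SLF4J itself looks up.]

```java
import java.net.URL;
import java.util.Enumeration;

public class Slf4jBindingCheck {
    public static void main(String[] args) throws Exception {
        // Enumerate every StaticLoggerBinder.class visible on the classpath.
        // More than one entry reproduces the "multiple bindings" warning above.
        Enumeration<URL> bindings = Slf4jBindingCheck.class.getClassLoader()
                .getResources("org/slf4j/impl/StaticLoggerBinder.class");
        int count = 0;
        while (bindings.hasMoreElements()) {
            System.out.println("binding: " + bindings.nextElement());
            count++;
        }
        System.out.println(count + " StaticLoggerBinder(s) on classpath");
    }
}
```

[once the duplicate jar is identified, an sbt `excludeDependencies` or Maven `<exclusion>` on that artifact silences the warning.]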
[info 2016/06/02 13:24:44.830 UTC <pool-7-thread-12-ScalaTest-running-GeodeRegionRDDTest> tid=0xea] RDD id=0 region=test conn=, env=Map()
[info 2016/06/02 13:24:44.869 UTC <pool-7-thread-12-ScalaTest-running-GeodeRegionRDDTest> tid=0xea] RDD id=0 region=test conn=, env=Map(preferred.partitioner -> OnePartition)
[info 2016/06/02 13:24:44.894 UTC <pool-7-thread-12-ScalaTest-running-GeodeRegionRDDTest> tid=0xea] RDD id=0 region=test conn=, env=Map()
[info 2016/06/02 13:24:44.930 UTC <pool-7-thread-12-ScalaTest-running-GeodeRegionRDDTest> tid=0xea] RDD id=0 region=test conn=, env=Map()
[info] GeodeRegionRDDTest:
[info] - create GeodeRDD with non-existing region
[info] - getPartitions with non-existing region
[info] - getPartitions with replicated region and not preferred env
[info] - getPartitions with replicated region and preferred OnePartitionPartitioner
[info] - getPartitions with partitioned region and not preferred env
[info] - GeodeRDD.compute() method
[warn 2016/06/02 13:24:44.959 UTC <pool-7-thread-12-ScalaTest-running-GeodeRDDPartitionerTest> tid=0xea] Invalid preferred partitioner name dummy.
[info] GeodeRDDPartitionerTest:
[info] - default partitioned region partitioner
[info] - default replicated region partitioner
[info] - GeodeRDDPartitioner.apply method
[info] - OnePartitionPartitioner
[info] - ServerSplitsPartitioner.doPartitions(): n=1 & no empty bucket
[info] - ServerSplitsPartitioner.doPartitions(): n=1 & 1 empty bucket
[info] - ServerSplitsPartitioner.doPartitions(): n=1 & 2 empty bucket
[info] - ServerSplitsPartitioner.doPartitions(): n=1 & 5 empty bucket
[info] - ServerSplitsPartitioner.doPartitions(): n=1, 4 empty-bucket, non-continuous IDs
[info] - ServerSplitsPartitioner.doPartitions(): n=2, no empty buckets, 3 servers have 1, 2, and 3 buckets
[info] - ServerSplitsPartitioner.doPartitions(): n=3, no empty buckets, 4 servers have 0, 2, 3, and 4 buckets
[info] - ServerSplitsPartitioner.partitions(): metadata = None 
[info] - ServerSplitsPartitioner.partitions(): replicated region 
[info] - ServerSplitsPartitioner.partitions(): partitioned region w/o data 
[info] - ServerSplitsPartitioner.partitions(): partitioned region w/ some data 
host name: pietas.apache.org
canonical host name: pietas.apache.org
canonical host name 2: pietas.apache.org
[info] LocatorHelperTest:
[info] - locatorStr2HostPortPair hostname w/o domain
[info] - locatorStr2HostPortPair hostname w/ domain
[info] - locatorStr2HostPortPair w/ invalid host name
[info] - locatorStr2HostPortPair w/ valid port
[info] - locatorStr2HostPortPair w/ invalid port
[info] - parseLocatorsString with valid locator(s)
[info] - parseLocatorsString with invalid locator(s)
[info] - pickPreferredGeodeServers: shared servers and one gf-server per host
[info] - pickPreferredGeodeServers: shared servers, one gf-server per host, un-sorted list
[info] - pickPreferredGeodeServers: shared servers and two gf-server per host
[info] - pickPreferredGeodeServers: shared servers, two gf-server per host, un-sorted server list
[info] - pickPreferredGeodeServers: no shared servers and one gf-server per host
[info] - pickPreferredGeodeServers: no shared servers, one gf-server per host, and less gf-server
[info] - pickPreferredGeodeServers: ad-hoc
[info 2016/06/02 13:24:45.200 UTC <pool-7-thread-12-ScalaTest-running-GeodeRDDFunctionsTest> tid=0xea] Save RDD id=0 to region test
[info 2016/06/02 13:24:45.236 UTC <pool-7-thread-12-ScalaTest-running-GeodeRDDFunctionsTest> tid=0xea] Save RDD id=0 to region test
[info 2016/06/02 13:24:45.268 UTC <pool-7-thread-12-ScalaTest-running-GeodeRDDFunctionsTest> tid=0xea] Save RDD id=0 to region test
[info 2016/06/02 13:24:45.296 UTC <pool-7-thread-12-ScalaTest-running-GeodeRDDFunctionsTest> tid=0xea] Save RDD id=0 to region test
[info] GeodeRDDFunctionsTest:
[info] - test PairRDDFunction Implicit
[info] - test RDDFunction Implicit
[info] - test GeodePairRDDWriter
[info] - test GeodeNonPairRDDWriter
[info] - test PairRDDFunctions.saveToGeode
[info] - test PairRDDFunctions.saveToGeode w/ opConf
[info] - test RDDFunctions.saveToGeode
[info] - test RDDFunctions.saveToGeode w/ opConf
[info 2016/06/02 13:24:45.486 UTC <pool-7-thread-12> tid=0xea] Save DStream region=testregion conn=
[info 2016/06/02 13:24:45.660 UTC <pool-7-thread-12> tid=0xea] Save RDD id=0 to region testregion
[info 2016/06/02 13:24:45.687 UTC <pool-7-thread-12> tid=0xea] Save DStream region=testregion conn=
[info 2016/06/02 13:24:45.738 UTC <pool-7-thread-12> tid=0xea] Save RDD id=0 to region testregion
[info] JavaAPITest:
[info] - testSparkContextFunction
[info] - testJavaPairDStreamFunctions
[info] - testSQLContextFunction
[info] - testJavaSparkContextFunctions
[info] - testJavaRDDFunctions
[info] - testJavaDStreamFunctions
[info] - testJavaPairDStreamFunctionsWithTuple2DStream
[info] - testJavaPairRDDFunctions
[info] ConnectorImplicitsTest:
[info] - implicit map2Properties
[info] - Test Implicit SparkContext Conversion
[info] - Test Implicit SQLContext Conversion
[info 2016/06/02 13:24:45.796 UTC <pool-7-thread-12-ScalaTest-running-GeodeFunctionDeployerTest> tid=0xea] Invalid jar file:/somemissingjarfilethatdoesnot.exist
[info 2016/06/02 13:24:45.797 UTC <pool-7-thread-12-ScalaTest-running-GeodeFunctionDeployerTest> tid=0xea] Invalid jar file:/somemissingjarfilethatdoesnot.exist
[info 2016/06/02 13:24:45.798 UTC <pool-7-thread-12-ScalaTest-running-GeodeFunctionDeployerTest> tid=0xea] Invalid jar file:/somemissingjarfilethatdoesnot.exist
[info] GeodeFunctionDeployerTest:
[info] - jmx url creation
[info] - missing jar file
[info] - deploy with missing jar
[info] - successful mocked deploy
[info 2016/06/02 13:24:45.860 UTC <pool-7-thread-12-ScalaTest-running-GeodeDStreamFunctionsTest> tid=0xea] Save DStream region=test conn=
[info 2016/06/02 13:24:45.873 UTC <pool-7-thread-12-ScalaTest-running-GeodeDStreamFunctionsTest> tid=0xea] Save DStream region=test conn=
[info] GeodeDStreamFunctionsTest:
[info] - test GeodePairDStreamFunctions Implicit
[info] - test GeodeDStreamFunctions Implicit
[info] - test GeodePairDStreamFunctions.saveToGeode()
[info] - test GeodeDStreamFunctions.saveToGeode()
[info] DefaultGeodeConnectionManagerTest:
[info] - DefaultGeodeConnectionFactory get/closeConnection
[info] - DefaultGeodeConnectionFactory newConnection(...) throws RuntimeException
[info] - DefaultGeodeConnectionFactory close() w/ non-exist connection
[info 2016/06/02 13:24:45.998 UTC <pool-7-thread-12-ScalaTest-running-StructStreamingResultSenderAndCollectorTest> tid=0xea] StructStreamingResultSender: 10 rows, type=(java.lang.Integer, java.lang.String), type.size=159, data.size=151, row.avg.size=15.1
[info 2016/06/02 13:24:46.525 UTC <pool-7-thread-12-ScalaTest-running-StructStreamingResultSenderAndCollectorTest> tid=0xea] StructStreamingResultSender: 10 rows, type=(java.lang.Integer, java.lang.String), type.size=0, data.size=151, row.avg.size=15.1
[info 2016/06/02 13:24:46.528 UTC <pool-7-thread-12-ScalaTest-running-StructStreamingResultSenderAndCollectorTest> tid=0xea] StructStreamingResultSender: 0 rows, type=null, type.size=0, data.size=0, row.avg.size=NaN
[info 2016/06/02 13:24:46.554 UTC <pool-7-thread-12-ScalaTest-running-StructStreamingResultSenderAndCollectorTest> tid=0xea] StructStreamingResultSender: 10000 rows, type=(java.lang.Integer), type.size=131, data.size=60015, row.avg.size=6.0
[info 2016/06/02 13:24:46.610 UTC <ForkJoinPool-1-worker-29> tid=0xf8] sender1: 150 rows, type=(java.lang.Integer), type.size=131, data.size=901, row.avg.size=6.0
[info 2016/06/02 13:24:46.612 UTC <ForkJoinPool-1-worker-11> tid=0xf9] sender2: 150 rows, type=(java.lang.Integer), type.size=131, data.size=901, row.avg.size=6.0
[info 2016/06/02 13:24:46.627 UTC <ForkJoinPool-1-worker-11> tid=0xf9] sender1: 500 rows, type=(java.lang.Integer), type.size=131, data.size=3001, row.avg.size=6.0
[info 2016/06/02 13:24:46.677 UTC <pool-7-thread-12-ScalaTest-running-StructStreamingResultSenderAndCollectorTest> tid=0xea] StructStreamingResultSender: 1000 rows, type=(java.lang.String, java.lang.String), type.size=159, data.size=23792, row.avg.size=23.8
[info] StructStreamingResultSenderAndCollectorTest:
[info] - transfer simple data
[info] - transfer simple data with no type info
[info] - transfer data with 0 row
[info] - transfer data with 10K rows
[info] - transfer data with 10K rows with 2 sender
[info] - transfer data with 10K rows with 2 sender with error
[info] - transfer data with Exception
[info] - transfer string pair data with 200 rows
[info] - DataSerializer usage
[info] ScalaTest
[info] Run completed in 3 seconds, 843 milliseconds.
[info] Total number of tests run: 96
[info] Suites: completed 12, aborted 0
[info] Tests: succeeded 96, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[info] Passed: Total 96, Failed 0, Errors 0, Passed 96
[success] Total time: 14 s, completed Jun 2, 2016 1:24:46 PM
[info] Compiling 9 Scala sources and 4 Java sources to <https://builds.apache.org/job/Geode-spark-connector/ws/geode-spark-connector/geode-spark-connector/target/scala-2.10/it-classes...>
[error] <https://builds.apache.org/job/Geode-spark-connector/ws/geode-spark-connector/geode-spark-connector/src/it/java/ittest/io/pivotal/geode/spark/connector/JavaApiIntegrationTest.java>:61: error: cannot find symbol
[error]     settings.setProperty(CACHE_XML_FILE, "src/it/resources/test-retrieve-regions.xml");
[error]                          ^
[error]   symbol:   variable CACHE_XML_FILE
[error]   location: class JavaApiIntegrationTest
[error] 1 error
[error] (geode-spark-connector/it:compile) javac returned nonzero exit code
[error] Total time: 8 s, completed Jun 2, 2016 1:24:54 PM
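[editor's note: the compile failure above is consistent with the GEODE-1377 commits in this build's changelog, which moved system-property constants between classes, so `CACHE_XML_FILE` no longer resolves in `JavaApiIntegrationTest`. The durable fix is to update the import to the constant's new home; as a sketch only, the literal property key can stand in, assuming "cache-xml-file" is the value the old constant carried:]

```java
import java.util.Properties;

public class CacheXmlWorkaround {
    public static void main(String[] args) {
        Properties settings = new Properties();
        // Literal key in place of the relocated CACHE_XML_FILE constant
        // (assumed to resolve to "cache-xml-file").
        settings.setProperty("cache-xml-file", "src/it/resources/test-retrieve-regions.xml");
        System.out.println(settings.getProperty("cache-xml-file"));
        // prints src/it/resources/test-retrieve-regions.xml
    }
}
```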
Build step 'Execute shell' marked build as failure
Recording test results
Skipped archiving because build is not successful

Jenkins build is back to normal : Geode-spark-connector #24

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Geode-spark-connector/24/changes>