Posted to commits@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2018/03/07 19:17:15 UTC

Build failed in Jenkins: beam_PostRelease_NightlySnapshot #108

See <https://builds.apache.org/job/beam_PostRelease_NightlySnapshot/108/display/redirect?page=changes>

Changes:

[yifanzou] BEAM-3339 Mobile gaming automation for Java nightly snapshot

------------------------------------------
[...truncated 2.41 MB...]
	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:469)
	at org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient.modifyAckDeadline(PubsubJsonClient.java:233)
	at org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource$PubsubReader.nackBatch(PubsubUnboundedSource.java:661)
	at org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource$PubsubCheckpoint.nackAll(PubsubUnboundedSource.java:355)
	at org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource$PubsubSource.createReader(PubsubUnboundedSource.java:1157)
	at org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource$PubsubSource.createReader(PubsubUnboundedSource.java:1096)
	at org.apache.beam.runners.spark.io.MicrobatchSource$ReaderLoader.call(MicrobatchSource.java:312)
	at org.apache.beam.runners.spark.io.MicrobatchSource$ReaderLoader.call(MicrobatchSource.java:299)
	at org.apache.beam.runners.spark.repackaged.com.google.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4904)
	at org.apache.beam.runners.spark.repackaged.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3628)
	at org.apache.beam.runners.spark.repackaged.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2336)
	at org.apache.beam.runners.spark.repackaged.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2295)
	at org.apache.beam.runners.spark.repackaged.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2208)
	at org.apache.beam.runners.spark.repackaged.com.google.common.cache.LocalCache.get(LocalCache.java:4053)
	at org.apache.beam.runners.spark.repackaged.com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4899)
	at org.apache.beam.runners.spark.io.MicrobatchSource.getOrCreateReader(MicrobatchSource.java:131)
	at org.apache.beam.runners.spark.stateful.StateSpecFunctions$1.apply(StateSpecFunctions.java:154)
	at org.apache.beam.runners.spark.stateful.StateSpecFunctions$1.apply(StateSpecFunctions.java:105)
	at org.apache.spark.streaming.StateSpec$$anonfun$1.apply(StateSpec.scala:181)
	at org.apache.spark.streaming.StateSpec$$anonfun$1.apply(StateSpec.scala:180)
	at org.apache.spark.streaming.rdd.MapWithStateRDDRecord$$anonfun$updateRecordWithData$1.apply(MapWithStateRDD.scala:57)
	at org.apache.spark.streaming.rdd.MapWithStateRDDRecord$$anonfun$updateRecordWithData$1.apply(MapWithStateRDD.scala:55)
	at scala.collection.Iterator$class.foreach(Iterator.scala:893)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
	at org.apache.spark.streaming.rdd.MapWithStateRDDRecord$.updateRecordWithData(MapWithStateRDD.scala:55)
	at org.apache.spark.streaming.rdd.MapWithStateRDD.compute(MapWithStateRDD.scala:159)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
	at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:336)
	at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:334)
	at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1038)
	at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1029)
	at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:969)
	at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1029)
	at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:760)
	at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:334)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:285)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
	at org.apache.spark.scheduler.Task.run(Task.scala:108)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:338)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

Mar 07, 2018 7:17:08 PM org.apache.spark.internal.Logging$class logInfo
INFO: Removed broadcast_0_WATERMARKS on 127.0.0.1:41354 in memory (size: 72.0 B, free: 1804.7 MB)
Mar 07, 2018 7:17:08 PM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_0_WATERMARKS stored as values in memory (estimated size 296.0 B, free 1804.0 MB)
Mar 07, 2018 7:17:08 PM org.apache.spark.internal.Logging$class logInfo
INFO: Added broadcast_0_WATERMARKS in memory on 127.0.0.1:41354 (size: 296.0 B, free: 1804.7 MB)
Mar 07, 2018 7:17:08 PM org.apache.beam.runners.spark.util.GlobalWatermarkHolder writeRemoteWatermarkBlock
INFO: Put new watermark block: {0=SparkWatermarks{lowWatermark=-290308-12-21T19:59:05.225Z, highWatermark=2018-03-07T19:16:42.000Z, synchronizedProcessingTime=2018-03-07T19:16:31.500Z}}
Mar 07, 2018 7:17:08 PM org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener onBatchCompleted
INFO: Batch with timestamp: 1520450191500 has completed, watermarks have been updated.
Mar 07, 2018 7:17:08 PM org.apache.beam.sdk.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler handleResponse
WARNING: Request failed with code 400, performed 0 retries due to IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP framework says request can be retried, (caller responsible for retrying): https://pubsub.googleapis.com/v1/projects/apache-beam-testing/subscriptions/java_mobile_gaming_topic_beam_6861378004759243261:modifyAckDeadline
Mar 07, 2018 7:17:08 PM org.apache.spark.internal.Logging$class logInfo
INFO: ResultStage 21 (DStream at SparkUnboundedSource.java:172) failed in 0.124 s due to Stage cancelled because SparkContext was shut down
Mar 07, 2018 7:17:08 PM org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource$PubsubSource createReader
SEVERE: Pubsub projects/apache-beam-testing/subscriptions/java_mobile_gaming_topic_beam_6861378004759243261 cannot have 999 lost messages NACKed, ignoring: {}
com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request
{
  "code" : 400,
  "errors" : [ {
    "domain" : "global",
    "message" : "You have passed a subscription that does not belong to the given ack ID (resource=projects/apache-beam-testing/subscriptions/java_mobile_gaming_topic_beam_6861378004759243261).",
    "reason" : "badRequest"
  } ],
  "message" : "You have passed a subscription that does not belong to the given ack ID (resource=projects/apache-beam-testing/subscriptions/java_mobile_gaming_topic_beam_6861378004759243261).",
  "status" : "INVALID_ARGUMENT"
}
	at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146)
	at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
	at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:321)
	at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1065)
	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:419)
	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:352)
	at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:469)
	at org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient.modifyAckDeadline(PubsubJsonClient.java:233)
	at org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource$PubsubReader.nackBatch(PubsubUnboundedSource.java:661)
	at org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource$PubsubCheckpoint.nackAll(PubsubUnboundedSource.java:355)
	at org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource$PubsubSource.createReader(PubsubUnboundedSource.java:1157)
	at org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource$PubsubSource.createReader(PubsubUnboundedSource.java:1096)
	at org.apache.beam.runners.spark.io.MicrobatchSource$ReaderLoader.call(MicrobatchSource.java:312)
	at org.apache.beam.runners.spark.io.MicrobatchSource$ReaderLoader.call(MicrobatchSource.java:299)
	at org.apache.beam.runners.spark.repackaged.com.google.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4904)
	at org.apache.beam.runners.spark.repackaged.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3628)
	at org.apache.beam.runners.spark.repackaged.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2336)
	at org.apache.beam.runners.spark.repackaged.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2295)
	at org.apache.beam.runners.spark.repackaged.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2208)
	at org.apache.beam.runners.spark.repackaged.com.google.common.cache.LocalCache.get(LocalCache.java:4053)
	at org.apache.beam.runners.spark.repackaged.com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4899)
	at org.apache.beam.runners.spark.io.MicrobatchSource.getOrCreateReader(MicrobatchSource.java:131)
	at org.apache.beam.runners.spark.stateful.StateSpecFunctions$1.apply(StateSpecFunctions.java:154)
	at org.apache.beam.runners.spark.stateful.StateSpecFunctions$1.apply(StateSpecFunctions.java:105)
	at org.apache.spark.streaming.StateSpec$$anonfun$1.apply(StateSpec.scala:181)
	at org.apache.spark.streaming.StateSpec$$anonfun$1.apply(StateSpec.scala:180)
	at org.apache.spark.streaming.rdd.MapWithStateRDDRecord$$anonfun$updateRecordWithData$1.apply(MapWithStateRDD.scala:57)
	at org.apache.spark.streaming.rdd.MapWithStateRDDRecord$$anonfun$updateRecordWithData$1.apply(MapWithStateRDD.scala:55)
	at scala.collection.Iterator$class.foreach(Iterator.scala:893)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
	at org.apache.spark.streaming.rdd.MapWithStateRDDRecord$.updateRecordWithData(MapWithStateRDD.scala:55)
	at org.apache.spark.streaming.rdd.MapWithStateRDD.compute(MapWithStateRDD.scala:159)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
	at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:336)
	at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:334)
	at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1038)
	at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1029)
	at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:969)
	at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1029)
	at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:760)
	at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:334)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:285)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
	at org.apache.spark.scheduler.Task.run(Task.scala:108)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:338)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

Mar 07, 2018 7:17:08 PM org.apache.spark.internal.Logging$class logInfo
INFO: MapOutputTrackerMasterEndpoint stopped!
Mar 07, 2018 7:17:08 PM org.apache.spark.internal.Logging$class logInfo
INFO: MemoryStore cleared
Mar 07, 2018 7:17:08 PM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManager stopped
Mar 07, 2018 7:17:08 PM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManagerMaster stopped
Mar 07, 2018 7:17:08 PM org.apache.spark.internal.Logging$class logInfo
INFO: OutputCommitCoordinator stopped!
Mar 07, 2018 7:17:08 PM org.apache.spark.internal.Logging$class logInfo
INFO: Successfully stopped SparkContext
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occured while executing the Java class. Failed to wait the pipeline until finish: org.apache.beam.runners.spark.SparkPipelineResult$StreamingMode@448f4d14 -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException

***********************************************************
***********************************************************
*************************Tear Down*************************
The Pub/Sub topic has been deleted: projects/apache-beam-testing/topics/leaderboard-jenkins-0307191622-c69f6082
The Pub/Sub subscription has been deleted: projects/apache-beam-testing/subscriptions/leaderboard-jenkins-0307191622-c69f6082
***********************************************************
***********************************************************
[ERROR] Failed command
:runners:spark:runMobileGamingJavaSpark FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':runners:spark:runMobileGamingJavaSpark'.
> Process 'command '/usr/local/asfpackages/java/jdk1.8.0_152/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 3m 44s
2 actionable tasks: 2 executed
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Not sending mail to unregistered user yifanzou@yifanzou-linuxworkstation.sea.corp.google.com

Jenkins build is back to normal : beam_PostRelease_NightlySnapshot #112

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostRelease_NightlySnapshot/112/display/redirect>


Build failed in Jenkins: beam_PostRelease_NightlySnapshot #111

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostRelease_NightlySnapshot/111/display/redirect>

------------------------------------------
[...truncated 580.85 KB...]
Mar 08, 2018 1:00:52 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Mar 08, 2018 1:00:52 AM com.datatorrent.stram.StramLocalCluster$LocalStreamingContainerLauncher run
INFO: Started container container-6
Mar 08, 2018 1:00:52 AM com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-5 msg: [container-5] Entering heartbeat loop..
Mar 08, 2018 1:00:52 AM com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-1 msg: [container-1] Entering heartbeat loop..
Mar 08, 2018 1:00:52 AM com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-2 msg: [container-2] Entering heartbeat loop..
Mar 08, 2018 1:00:52 AM com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-3 msg: [container-3] Entering heartbeat loop..
Mar 08, 2018 1:00:52 AM com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-4 msg: [container-4] Entering heartbeat loop..
Mar 08, 2018 1:00:52 AM com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-6 msg: [container-6] Entering heartbeat loop..
Mar 08, 2018 1:00:52 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Mar 08, 2018 1:00:52 AM com.datatorrent.stram.StramLocalCluster$LocalStreamingContainerLauncher run
INFO: Started container container-7
Mar 08, 2018 1:00:52 AM com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-7 msg: [container-7] Entering heartbeat loop..
Mar 08, 2018 1:00:52 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Mar 08, 2018 1:00:52 AM com.datatorrent.stram.StramLocalCluster$LocalStreamingContainerLauncher run
INFO: Started container container-8
Mar 08, 2018 1:00:52 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Mar 08, 2018 1:00:52 AM com.datatorrent.stram.StramLocalCluster$LocalStreamingContainerLauncher run
INFO: Started container container-9
Mar 08, 2018 1:00:52 AM com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-8 msg: [container-8] Entering heartbeat loop..
Mar 08, 2018 1:00:52 AM com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-9 msg: [container-9] Entering heartbeat loop..
Mar 08, 2018 1:00:53 AM com.datatorrent.stram.engine.StreamingContainer heartbeatLoop
INFO: Waiting for pending request.
Mar 08, 2018 1:00:53 AM com.datatorrent.stram.engine.StreamingContainer heartbeatLoop
INFO: Waiting for pending request.
Mar 08, 2018 1:00:53 AM com.datatorrent.stram.engine.StreamingContainer heartbeatLoop
INFO: Waiting for pending request.
Mar 08, 2018 1:00:53 AM com.datatorrent.stram.engine.StreamingContainer processHeartbeatResponse
INFO: Deploy request: [OperatorDeployInfo[id=1,name=PubsubIO.Read/PubsubUnboundedSource/Read(PubsubSource),type=INPUT,checkpoint={ffffffffffffffff, 0, 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=stream9,bufferServer=localhost]]]]
Mar 08, 2018 1:00:53 AM com.datatorrent.stram.engine.StreamingContainer processHeartbeatResponse
INFO: Deploy request: [OperatorDeployInfo[id=7,name=CalculateUserScores/ExtractUserScore/Combine.perKey(SumInteger)/GroupByKey,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=stream6,sourceNodeId=6,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=stream29,bufferServer=localhost]]]]
Mar 08, 2018 1:00:53 AM com.datatorrent.stram.engine.StreamingContainer processHeartbeatResponse
INFO: Deploy request: [OperatorDeployInfo[id=19,name=CalculateTeamScores/LeaderboardTeamFixedWindows/Window.Assign,type=OIO,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=inputPort,streamId=stream18,sourceNodeId=4,sourcePortName=output,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=outputPort,streamId=stream15,bufferServer=<null>]]], OperatorDeployInfo[id=6,name=CalculateUserScores/ExtractUserScore/MapElements/Map/ParMultiDo(Anonymous),type=OIO,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=stream23,sourceNodeId=5,sourcePortName=outputPort,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=stream6,bufferServer=localhost]]], OperatorDeployInfo[id=20,name=CalculateTeamScores/ExtractTeamScore/MapElements/Map/ParMultiDo(Anonymous),type=OIO,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=stream15,sourceNodeId=19,sourcePortName=outputPort,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=stream8,bufferServer=localhost]]], OperatorDeployInfo[id=2,name=PubsubIO.Read/PubsubUnboundedSource/PubsubUnboundedSource.Stats/ParMultiDo(Stats),type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=stream9,sourceNodeId=1,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=stream0,bufferServer=<null>]]], OperatorDeployInfo[id=4,name=ParseGameEvent/ParMultiDo(ParseEvent),type=OIO,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=stream5,sourceNodeId=3,sourcePortName=output,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=stream18,bufferServer=<null>]]], OperatorDeployInfo[id=3,name=PubsubIO.Read/MapElements/Map/ParMultiDo(Anonymous),type=OIO,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=stream0,sourceNodeId=2,sourcePortName=output,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=stream5,bufferServer=<null>]]], OperatorDeployInfo[id=5,name=CalculateUserScores/LeaderboardUserGlobalWindow/Window.Assign,type=OIO,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=inputPort,streamId=stream18,sourceNodeId=4,sourcePortName=output,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=outputPort,streamId=stream23,bufferServer=<null>]]]]
Mar 08, 2018 1:00:53 AM com.datatorrent.stram.engine.StreamingContainer processHeartbeatResponse
INFO: Deploy request: [OperatorDeployInfo[id=38,name=WriteUserScoreSums/BigQueryIO.Write/StreamingInserts/StreamingWriteTables/StreamingWrite,type=OIO,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=stream10,sourceNodeId=37,sourcePortName=outputPort,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[]], OperatorDeployInfo[id=33,name=WriteUserScoreSums/BigQueryIO.Write/StreamingInserts/StreamingWriteTables/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous),type=OIO,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=stream24,sourceNodeId=18,sourcePortName=output,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=stream32,bufferServer=<null>]]], OperatorDeployInfo[id=18,name=WriteUserScoreSums/BigQueryIO.Write/StreamingInserts/StreamingWriteTables/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous),type=OIO,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=stream16,sourceNodeId=17,sourcePortName=output,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=stream24,bufferServer=<null>]]], OperatorDeployInfo[id=37,name=WriteUserScoreSums/BigQueryIO.Write/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign,type=OIO,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=inputPort,streamId=stream32,sourceNodeId=33,sourcePortName=output,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=outputPort,streamId=stream10,bufferServer=<null>]]], OperatorDeployInfo[id=17,name=WriteUserScoreSums/BigQueryIO.Write/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable/ParMultiDo(Anonymous),type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=stream7,sourceNodeId=16,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=stream16,bufferServer=<null>]]]]
Mar 08, 2018 1:00:53 AM com.datatorrent.stram.engine.StreamingContainer processHeartbeatResponse
INFO: Deploy request: [OperatorDeployInfo[id=29,name=WriteTeamScoreSums/BigQueryIO.Write/StreamingInserts/StreamingWriteTables/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous),type=OIO,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=stream33,sourceNodeId=28,sourcePortName=outputPort,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=stream31,bufferServer=localhost]]], OperatorDeployInfo[id=27,name=WriteTeamScoreSums/BigQueryIO.Write/StreamingInserts/StreamingWriteTables/TagWithUniqueIds/ParMultiDo(TagWithUniqueIds),type=OIO,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=stream3,sourceNodeId=26,sourcePortName=output,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=stream21,bufferServer=<null>]]], OperatorDeployInfo[id=26,name=WriteTeamScoreSums/BigQueryIO.Write/StreamingInserts/StreamingWriteTables/ShardTableWrites/ParMultiDo(GenerateShardedTable),type=OIO,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=stream35,sourceNodeId=25,sourcePortName=output,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=stream3,bufferServer=<null>]]], OperatorDeployInfo[id=25,name=WriteTeamScoreSums/BigQueryIO.Write/StreamingInserts/CreateTables/ParDo(CreateTables)/ParMultiDo(CreateTables),type=OIO,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=stream4,sourceNodeId=24,sourcePortName=output,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=stream35,bufferServer=<null>]]], OperatorDeployInfo[id=24,name=WriteTeamScoreSums/BigQueryIO.Write/PrepareWrite/ParDo(Anonymous)/ParMultiDo(Anonymous),type=OIO,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=stream19,sourceNodeId=23,sourcePortName=output,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=stream4,bufferServer=<null>]]], OperatorDeployInfo[id=22,name=CalculateTeamScores/ExtractTeamScore/Combine.perKey(SumInteger)/Combine.GroupedValues/ParDo(Anonymous)/ParMultiDo(Anonymous),type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=stream25,sourceNodeId=21,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=stream26,bufferServer=<null>]]], OperatorDeployInfo[id=28,name=WriteTeamScoreSums/BigQueryIO.Write/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign,type=OIO,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=inputPort,streamId=stream21,sourceNodeId=27,sourcePortName=output,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=outputPort,streamId=stream33,bufferServer=<null>]]], OperatorDeployInfo[id=23,name=WriteTeamScoreSums/ConvertToRow/ParMultiDo(BuildRow),type=OIO,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=stream26,sourceNodeId=22,sourcePortName=output,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=stream19,bufferServer=<null>]]]]
Mar 08, 2018 1:00:53 AM com.datatorrent.stram.engine.StreamingContainer processHeartbeatResponse
INFO: Deploy request: [OperatorDeployInfo[id=30,name=WriteTeamScoreSums/BigQueryIO.Write/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=stream31,sourceNodeId=29,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=stream17,bufferServer=localhost]]]]
Mar 08, 2018 1:00:53 AM com.datatorrent.stram.engine.StreamingContainer processHeartbeatResponse
INFO: Deploy request: [OperatorDeployInfo[id=21,name=CalculateTeamScores/ExtractTeamScore/Combine.perKey(SumInteger)/GroupByKey,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=stream8,sourceNodeId=20,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=stream25,bufferServer=localhost]]]]
Mar 08, 2018 1:00:53 AM com.datatorrent.stram.engine.WindowGenerator activate
INFO: Catching up from 1520470851500 to 1520470853648
Mar 08, 2018 1:00:53 AM com.datatorrent.bufferserver.server.Server$UnidentifiedClient onMessage
INFO: Received publisher request: PublishRequestTuple{version=1.0, identifier=30.output.30, windowId=ffffffffffffffff}
Mar 08, 2018 1:00:53 AM com.datatorrent.bufferserver.server.Server$UnidentifiedClient onMessage
INFO: Received publisher request: PublishRequestTuple{version=1.0, identifier=21.output.21, windowId=ffffffffffffffff}
Mar 08, 2018 1:00:53 AM com.datatorrent.bufferserver.server.Server$UnidentifiedClient onMessage
INFO: Received publisher request: PublishRequestTuple{version=1.0, identifier=1.output.1, windowId=ffffffffffffffff}
Mar 08, 2018 1:00:53 AM com.datatorrent.bufferserver.server.Server$UnidentifiedClient onMessage
INFO: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://localhost:45403/29.output.29, windowId=ffffffffffffffff, type=stream31/30.input, upstreamIdentifier=29.output.29, mask=0, partitions=null, bufferSize=1024}
Mar 08, 2018 1:00:53 AM com.datatorrent.bufferserver.server.Server$UnidentifiedClient onMessage
INFO: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://localhost:45403/6.output.7, windowId=ffffffffffffffff, type=stream6/7.input, upstreamIdentifier=6.output.7, mask=0, partitions=null, bufferSize=1024}
Mar 08, 2018 1:00:53 AM com.datatorrent.bufferserver.server.Server$UnidentifiedClient onMessage
INFO: Received publisher request: PublishRequestTuple{version=1.0, identifier=7.output.8, windowId=ffffffffffffffff}
Mar 08, 2018 1:00:53 AM com.datatorrent.bufferserver.server.Server$UnidentifiedClient onMessage
INFO: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://localhost:45403/20.output.20, windowId=ffffffffffffffff, type=stream8/21.input, upstreamIdentifier=20.output.20, mask=0, partitions=null, bufferSize=1024}
Mar 08, 2018 1:00:53 AM com.datatorrent.bufferserver.server.Server$UnidentifiedClient onMessage
INFO: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://localhost:45403/16.output.17, windowId=ffffffffffffffff, type=stream7/17.input, upstreamIdentifier=16.output.17, mask=0, partitions=null, bufferSize=1024}
Mar 08, 2018 1:00:53 AM com.datatorrent.bufferserver.server.Server$UnidentifiedClient onMessage
INFO: Received publisher request: PublishRequestTuple{version=1.0, identifier=20.output.20, windowId=ffffffffffffffff}
Mar 08, 2018 1:00:53 AM com.datatorrent.bufferserver.server.Server$UnidentifiedClient onMessage
INFO: Received publisher request: PublishRequestTuple{version=1.0, identifier=6.output.7, windowId=ffffffffffffffff}
Mar 08, 2018 1:00:53 AM com.datatorrent.bufferserver.server.Server$UnidentifiedClient onMessage
INFO: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://localhost:45403/1.output.1, windowId=ffffffffffffffff, type=stream9/2.input, upstreamIdentifier=1.output.1, mask=0, partitions=null, bufferSize=1024}
Mar 08, 2018 1:00:53 AM com.datatorrent.bufferserver.server.Server$UnidentifiedClient onMessage
INFO: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://localhost:45403/21.output.21, windowId=ffffffffffffffff, type=stream25/22.input, upstreamIdentifier=21.output.21, mask=0, partitions=null, bufferSize=1024}
Mar 08, 2018 1:00:53 AM com.datatorrent.bufferserver.server.Server$UnidentifiedClient onMessage
INFO: Received publisher request: PublishRequestTuple{version=1.0, identifier=29.output.29, windowId=ffffffffffffffff}
Mar 08, 2018 1:00:53 AM com.datatorrent.stram.engine.StreamingContainer processHeartbeatResponse
INFO: Deploy request: [OperatorDeployInfo[id=34,name=WriteTeamScoreSums/BigQueryIO.Write/StreamingInserts/StreamingWriteTables/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous),type=OIO,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=stream22,sourceNodeId=32,sourcePortName=output,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=stream34,bufferServer=<null>]]], OperatorDeployInfo[id=32,name=WriteTeamScoreSums/BigQueryIO.Write/StreamingInserts/StreamingWriteTables/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous),type=OIO,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=stream11,sourceNodeId=31,sourcePortName=output,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=stream22,bufferServer=<null>]]], OperatorDeployInfo[id=35,name=WriteTeamScoreSums/BigQueryIO.Write/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign,type=OIO,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=inputPort,streamId=stream34,sourceNodeId=34,sourcePortName=output,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=outputPort,streamId=stream12,bufferServer=<null>]]], OperatorDeployInfo[id=31,name=WriteTeamScoreSums/BigQueryIO.Write/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable/ParMultiDo(Anonymous),type=GENERIC,checkpoint={ffffffffffffffff, 0, 
0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=stream17,sourceNodeId=30,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=stream11,bufferServer=<null>]]], OperatorDeployInfo[id=36,name=WriteTeamScoreSums/BigQueryIO.Write/StreamingInserts/StreamingWriteTables/StreamingWrite,type=OIO,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=stream12,sourceNodeId=35,sourcePortName=outputPort,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[]]]
Mar 08, 2018 1:00:53 AM com.datatorrent.stram.engine.StreamingContainer processHeartbeatResponse
INFO: Deploy request: [OperatorDeployInfo[id=16,name=WriteUserScoreSums/BigQueryIO.Write/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=stream30,sourceNodeId=15,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=stream7,bufferServer=localhost]]]]
Mar 08, 2018 1:00:53 AM com.datatorrent.stram.engine.StreamingContainer processHeartbeatResponse
INFO: Deploy request: [OperatorDeployInfo[id=9,name=WriteUserScoreSums/ConvertToRow/ParMultiDo(BuildRow),type=OIO,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=stream14,sourceNodeId=8,sourcePortName=output,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=stream20,bufferServer=<null>]]], OperatorDeployInfo[id=13,name=WriteUserScoreSums/BigQueryIO.Write/StreamingInserts/StreamingWriteTables/TagWithUniqueIds/ParMultiDo(TagWithUniqueIds),type=OIO,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=stream2,sourceNodeId=12,sourcePortName=output,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=stream1,bufferServer=<null>]]], OperatorDeployInfo[id=14,name=WriteUserScoreSums/BigQueryIO.Write/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign,type=OIO,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=inputPort,streamId=stream1,sourceNodeId=13,sourcePortName=output,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=outputPort,streamId=stream28,bufferServer=<null>]]], OperatorDeployInfo[id=10,name=WriteUserScoreSums/BigQueryIO.Write/PrepareWrite/ParDo(Anonymous)/ParMultiDo(Anonymous),type=OIO,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=stream20,sourceNodeId=9,sourcePortName=output,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=stream27,bufferServer=<null>]]], 
OperatorDeployInfo[id=12,name=WriteUserScoreSums/BigQueryIO.Write/StreamingInserts/StreamingWriteTables/ShardTableWrites/ParMultiDo(GenerateShardedTable),type=OIO,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=stream13,sourceNodeId=11,sourcePortName=output,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=stream2,bufferServer=<null>]]], OperatorDeployInfo[id=11,name=WriteUserScoreSums/BigQueryIO.Write/StreamingInserts/CreateTables/ParDo(CreateTables)/ParMultiDo(CreateTables),type=OIO,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=stream27,sourceNodeId=10,sourcePortName=output,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=stream13,bufferServer=<null>]]], OperatorDeployInfo[id=8,name=CalculateUserScores/ExtractUserScore/Combine.perKey(SumInteger)/Combine.GroupedValues/ParDo(Anonymous)/ParMultiDo(Anonymous),type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=stream29,sourceNodeId=7,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=stream14,bufferServer=<null>]]], OperatorDeployInfo[id=15,name=WriteUserScoreSums/BigQueryIO.Write/StreamingInserts/StreamingWriteTables/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous),type=OIO,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=stream28,sourceNodeId=14,sourcePortName=outputPort,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=stream30,bufferServer=localhost]]]]
Mar 08, 2018 1:00:53 AM com.datatorrent.bufferserver.server.Server$UnidentifiedClient onMessage
INFO: Received publisher request: PublishRequestTuple{version=1.0, identifier=16.output.17, windowId=ffffffffffffffff}
Mar 08, 2018 1:00:54 AM com.datatorrent.bufferserver.server.Server$UnidentifiedClient onMessage
INFO: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://localhost:45403/15.output.16, windowId=ffffffffffffffff, type=stream30/16.input, upstreamIdentifier=15.output.16, mask=0, partitions=null, bufferSize=1024}
Mar 08, 2018 1:00:54 AM com.datatorrent.bufferserver.server.Server$UnidentifiedClient onMessage
INFO: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://localhost:45403/30.output.30, windowId=ffffffffffffffff, type=stream17/31.input, upstreamIdentifier=30.output.30, mask=0, partitions=null, bufferSize=1024}
Mar 08, 2018 1:00:54 AM com.datatorrent.bufferserver.server.Server$UnidentifiedClient onMessage
INFO: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://localhost:45403/7.output.8, windowId=ffffffffffffffff, type=stream29/8.input, upstreamIdentifier=7.output.8, mask=0, partitions=null, bufferSize=1024}
Mar 08, 2018 1:00:54 AM com.datatorrent.bufferserver.server.Server$UnidentifiedClient onMessage
INFO: Received publisher request: PublishRequestTuple{version=1.0, identifier=15.output.16, windowId=ffffffffffffffff}
Mar 08, 2018 1:00:54 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
[...previous 2 lines repeated 37 more times...]
Mar 08, 2018 1:00:56 AM org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource createRandomSubscription
WARNING: Created subscription projects/apache-beam-testing/subscriptions/java_mobile_gaming_topic_beam_-3999549547719927286 to topic projects/apache-beam-testing/topics/java_mobile_gaming_topic. Note this subscription WILL NOT be deleted when the pipeline terminates
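The warning above means every nightly run leaks a randomly named subscription on the shared test topic. A cleanup job could identify such leaked subscriptions by their naming pattern before deleting them; the pattern and helper below are an illustrative sketch, not part of the Beam codebase or test infrastructure.

```python
import re

# Subscriptions created by PubsubUnboundedSource.createRandomSubscription are
# named after the topic plus a random numeric suffix, e.g.
# java_mobile_gaming_topic_beam_-3999549547719927286. This regex is an
# assumption based on the log line above, not a documented contract.
LEAKED = re.compile(r"^java_mobile_gaming_topic_beam_-?\d+$")

def is_leaked_subscription(name: str) -> bool:
    """Return True if a subscription id matches the test's random-suffix pattern."""
    return bool(LEAKED.match(name))
```

A cleanup script would list subscription ids on the topic, filter with this predicate, and delete the matches (e.g. via `gcloud pubsub subscriptions delete`).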
Mar 08, 2018 1:00:58 AM org.apache.beam.sdk.metrics.MetricsEnvironment getCurrentContainer
WARNING: Reporting metrics are not supported in the current execution environment.
...........................................................................................................................Introducing a parse error.
............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................DELAY(512685, 1)
{timestamp_ms=1520470634000}
late data for: user5_AuburnKangaroo,AuburnKangaroo,10,1520470634000,2018-03-07 17:05:47.019
Mar 08, 2018 1:05:59 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl createTable
INFO: Trying to create BigQuery table: apache-beam-testing:beam_postrelease_mobile_gaming.leaderboard_ApexRunner_team
.......................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................DELAY(331029, 1)
{timestamp_ms=1520471116000}
late data for: Robot-19,AmethystCaneToad,16,1520471116000,2018-03-07 17:10:47.217
bq query SELECT table_id FROM beam_postrelease_mobile_gaming.__TABLES_SUMMARY__

Waiting on bqjob_r73edc787e3cec324_00000162032d8047_1 ... (0s) Current status: RUNNING
Waiting on bqjob_r73edc787e3cec324_00000162032d8047_1 ... (0s) Current status: DONE
+-----------------------------------+
|             table_id              |
+-----------------------------------+
| gamestats_DataflowRunner_sessions |
| gamestats_DataflowRunner_team     |
| leaderboard_ApexRunner_team       |
| leaderboard_DataflowRunner_team   |
| leaderboard_DataflowRunner_user   |
| leaderboard_DirectRunner_team     |
| leaderboard_DirectRunner_user     |
+-----------------------------------+
Cannot find leaderboard_ApexRunner_user in
Waiting on bqjob_r73edc787e3cec324_00000162032d8047_1 ... (0s) Current status: RUNNING
Waiting on bqjob_r73edc787e3cec324_00000162032d8047_1 ... (0s) Current status: DONE
+-----------------------------------+
|             table_id              |
+-----------------------------------+
| gamestats_DataflowRunner_sessions |
| gamestats_DataflowRunner_team     |
| leaderboard_ApexRunner_team       |
| leaderboard_DataflowRunner_team   |
| leaderboard_DataflowRunner_user   |
| leaderboard_DirectRunner_team     |
| leaderboard_DirectRunner_user     |
+-----------------------------------+
[ERROR] Cannot find expected text
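The failing verification step re-runs `bq query` against `__TABLES_SUMMARY__` and reports "[ERROR] Cannot find expected text" once the `leaderboard_ApexRunner_user` table has not appeared in time. A minimal sketch of that poll-with-timeout logic is below; `list_tables` is a hypothetical stand-in for the real `bq` invocation, and the timeouts are illustrative, not the values the test harness actually uses.

```python
import time

def wait_for_table(list_tables, expected, timeout_s=300, interval_s=10, sleep=time.sleep):
    """Poll list_tables() until `expected` appears or the timeout elapses.

    Returns True when the table shows up, False on timeout (the False path
    corresponds to the "[ERROR] Cannot find expected text" failure above).
    """
    deadline = time.monotonic() + timeout_s
    while True:
        if expected in list_tables():
            return True
        if time.monotonic() >= deadline:
            return False
        sleep(interval_s)
```

Injecting `list_tables` and `sleep` keeps the loop testable without touching BigQuery; in the harness the real query would be shelled out per iteration.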
:runners:apex:runMobileGamingJavaApex FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:flink:runMobileGamingJavaFlinkLocal'.
> Process 'command '/usr/local/asfpackages/java/jdk1.8.0_152/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:apex:runMobileGamingJavaApex'.
> Process 'command '/usr/local/asfpackages/java/jdk1.8.0_152/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

BUILD FAILED in 17m 42s
3 actionable tasks: 3 executed
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Not sending mail to unregistered user yifanzou@yifanzou-linuxworkstation.sea.corp.google.com


Build failed in Jenkins: beam_PostRelease_NightlySnapshot #110

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostRelease_NightlySnapshot/110/display/redirect>

------------------------------------------
[...truncated 620.87 KB...]
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (3s) Current status: PENDING
[...status line repeated roughly once per second through (54s), still PENDING...]
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (55s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (56s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (57s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (58s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (59s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (60s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (61s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (62s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (63s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (64s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (65s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (66s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (67s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (68s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (69s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (70s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (71s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (72s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (73s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (74s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (75s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (76s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (77s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (78s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (79s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (80s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (81s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (82s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (83s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (84s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (85s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (86s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (87s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (88s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (89s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (90s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (91s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (92s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (93s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (94s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (95s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (96s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (97s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (98s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (99s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (100s) Current status: PENDING
                                                                                        
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (101s) Current status: PENDING
                                                                                        
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (102s) Current status: PENDING
                                                                                        
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (103s) Current status: PENDING
                                                                                        
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (104s) Current status: PENDING
                                                                                        
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (105s) Current status: PENDING
                                                                                        
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (106s) Current status: PENDING
                                                                                        
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (107s) Current status: PENDING
                                                                                        
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (108s) Current status: PENDING
                                                                                        
Waiting on bqjob_r3711e10e729a3783_00000162030fb832_1 ... (108s) Current status: DONE   
+------------------------+
|          user          |
+------------------------+
| user6_ApricotDingo     |
| user11_BarnRedMarmot   |
| user1_RubyCaneToad     |
| user7_BisqueEmu        |
| user1_BarnRedMarmot    |
| Robot-5                |
| user7_BananaQuokka     |
| user6_AmethystMarmot   |
| user2_AmaranthCockatoo |
| user8_BarnRedMarmot    |
+------------------------+
Verified Amaranth

**************************************
* Test: SUCCEED: LeaderBoard successfully run on Apex.
**************************************

[SUCCESS]
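The "Verified Amaranth" line above is the post-release script confirming that the BigQuery query results contain a user on the expected team color. A minimal sketch of that kind of check (class and method names here are hypothetical, not the actual verification code):

```java
import java.util.Arrays;
import java.util.List;

// Hypothetical sketch: verify that at least one user name returned by the
// leaderboard query contains the expected team color token (e.g. "Amaranth").
public class LeaderboardVerifier {

    // Returns true if any user string contains the expected color token.
    public static boolean containsTeamColor(List<String> users, String color) {
        return users.stream().anyMatch(u -> u.contains(color));
    }

    public static void main(String[] args) {
        List<String> users = Arrays.asList(
                "user6_ApricotDingo",
                "user2_AmaranthCockatoo",
                "Robot-5");
        // The check passes because user2_AmaranthCockatoo matches.
        System.out.println(containsTeamColor(users, "Amaranth")); // prints true
    }
}
```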

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':runners:flink:runMobileGamingJavaFlinkLocal'.
> Process 'command '/usr/local/asfpackages/java/jdk1.8.0_152/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 19m 27s
3 actionable tasks: 3 executed
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Not sending mail to unregistered user yifanzou@yifanzou-linuxworkstation.sea.corp.google.com

Build failed in Jenkins: beam_PostRelease_NightlySnapshot #109

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostRelease_NightlySnapshot/109/display/redirect>

------------------------------------------
[...truncated 2.55 MB...]
	at org.apache.spark.scheduler.Task.run(Task.scala:108)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:338)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

Driver stacktrace:
	at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1517)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1505)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1504)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
	at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1504)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:814)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:814)
	at scala.Option.foreach(Option.scala:257)
	at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:814)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1732)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1687)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1676)
	at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
	at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:630)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2029)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2050)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2069)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2094)
	at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:936)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)
	at org.apache.spark.rdd.RDD.collect(RDD.scala:935)
	at org.apache.spark.api.java.JavaRDDLike$class.collect(JavaRDDLike.scala:361)
	at org.apache.spark.api.java.AbstractJavaRDDLike.collect(JavaRDDLike.scala:45)
	at org.apache.beam.runners.spark.io.SparkUnboundedSource$ReadReportDStream.compute(SparkUnboundedSource.java:202)
	at org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1$$anonfun$1$$anonfun$apply$7.apply(DStream.scala:342)
	at org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1$$anonfun$1$$anonfun$apply$7.apply(DStream.scala:342)
	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
	at org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1$$anonfun$1.apply(DStream.scala:341)
	at org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1$$anonfun$1.apply(DStream.scala:341)
	at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:416)
	at org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1.apply(DStream.scala:336)
	at org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1.apply(DStream.scala:334)
	at scala.Option.orElse(Option.scala:289)
	at org.apache.spark.streaming.dstream.DStream.getOrCompute(DStream.scala:331)
	at org.apache.spark.streaming.dstream.DStream.generateJob(DStream.scala:432)
	at org.apache.spark.streaming.DStreamGraph$$anonfun$1.apply(DStreamGraph.scala:122)
	at org.apache.spark.streaming.DStreamGraph$$anonfun$1.apply(DStreamGraph.scala:121)
	at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
	at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
	at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)
	at scala.collection.AbstractTraversable.flatMap(Traversable.scala:104)
	at org.apache.spark.streaming.DStreamGraph.generateJobs(DStreamGraph.scala:121)
	at org.apache.spark.streaming.scheduler.JobGenerator$$anonfun$3.apply(JobGenerator.scala:249)
	at org.apache.spark.streaming.scheduler.JobGenerator$$anonfun$3.apply(JobGenerator.scala:247)
	at scala.util.Try$.apply(Try.scala:192)
	at org.apache.spark.streaming.scheduler.JobGenerator.generateJobs(JobGenerator.scala:247)
	at org.apache.spark.streaming.scheduler.JobGenerator.org$apache$spark$streaming$scheduler$JobGenerator$$processEvent(JobGenerator.scala:183)
	at org.apache.spark.streaming.scheduler.JobGenerator$$anon$1.onReceive(JobGenerator.scala:89)
	at org.apache.spark.streaming.scheduler.JobGenerator$$anon$1.onReceive(JobGenerator.scala:88)
	at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
Caused by: java.lang.NullPointerException
	at org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource$PubsubReader.ackBatch(PubsubUnboundedSource.java:651)
	at org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource$PubsubCheckpoint.finalizeCheckpoint(PubsubUnboundedSource.java:313)
	at org.apache.beam.runners.spark.io.MicrobatchSource$Reader.finalizeCheckpoint(MicrobatchSource.java:261)
	at org.apache.beam.runners.spark.io.MicrobatchSource$Reader.advanceWithBackoff(MicrobatchSource.java:246)
	at org.apache.beam.runners.spark.io.MicrobatchSource$Reader.advance(MicrobatchSource.java:236)
	at org.apache.beam.runners.spark.stateful.StateSpecFunctions$1.apply(StateSpecFunctions.java:176)
	at org.apache.beam.runners.spark.stateful.StateSpecFunctions$1.apply(StateSpecFunctions.java:105)
	at org.apache.spark.streaming.StateSpec$$anonfun$1.apply(StateSpec.scala:181)
	at org.apache.spark.streaming.StateSpec$$anonfun$1.apply(StateSpec.scala:180)
	at org.apache.spark.streaming.rdd.MapWithStateRDDRecord$$anonfun$updateRecordWithData$1.apply(MapWithStateRDD.scala:57)
	at org.apache.spark.streaming.rdd.MapWithStateRDDRecord$$anonfun$updateRecordWithData$1.apply(MapWithStateRDD.scala:55)
	at scala.collection.Iterator$class.foreach(Iterator.scala:893)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
	at org.apache.spark.streaming.rdd.MapWithStateRDDRecord$.updateRecordWithData(MapWithStateRDD.scala:55)
	at org.apache.spark.streaming.rdd.MapWithStateRDD.compute(MapWithStateRDD.scala:159)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
	at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:336)
	at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:334)
	at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1038)
	at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1029)
	at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:969)
	at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1029)
	at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:760)
	at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:334)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:285)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
	at org.apache.spark.scheduler.Task.run(Task.scala:108)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:338)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
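The root cause above is a NullPointerException in ackBatch while a Pub/Sub checkpoint is being finalized, which suggests the checkpoint outlived the reader state (its ack-ID batch) it depends on. A generic defensive pattern for that situation is sketched below; this is NOT Beam's actual fix, and all names here are hypothetical:

```java
import java.util.List;

// Hypothetical sketch of a null-safe checkpoint finalization guard.
// If finalization can race with the reader releasing its ack-ID batch,
// a defensive check avoids the crash at the cost of leaving those
// messages unacked, so they would simply be redelivered later.
public class SafeCheckpoint {
    private final List<String> ackIds; // may be null once released

    public SafeCheckpoint(List<String> ackIds) {
        this.ackIds = ackIds;
    }

    // Returns the number of messages acked; 0 when there is nothing to ack.
    public int finalizeCheckpoint(AckClient client) {
        if (ackIds == null || ackIds.isEmpty()) {
            return 0; // nothing to ack: the batch was already released
        }
        client.ackBatch(ackIds);
        return ackIds.size();
    }

    // Minimal stand-in for the Pub/Sub acknowledge call.
    public interface AckClient {
        void ackBatch(List<String> ackIds);
    }
}
```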

Mar 07, 2018 10:33:58 PM org.apache.spark.internal.Logging$class logInfo
INFO: Executor killed task 7.0 in stage 23.0 (TID 65), reason: stage cancelled
Mar 07, 2018 10:33:58 PM org.apache.beam.runners.spark.io.SourceDStream computeReadMaxRecords
INFO: Max records per batch has not been limited by either configuration or the rate controller, and will remain unlimited for the current batch (9223372036854775807).
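The value 9223372036854775807 in the log line above is Long.MAX_VALUE, used as a sentinel meaning "no per-batch record cap". A one-line check of that reading (hypothetical helper, not Beam code):

```java
// The "unlimited" batch size logged above is simply Long.MAX_VALUE,
// a sentinel meaning no per-batch record cap was configured.
public class BatchLimit {
    public static boolean isUnlimited(long maxRecordsPerBatch) {
        return maxRecordsPerBatch == Long.MAX_VALUE;
    }

    public static void main(String[] args) {
        System.out.println(isUnlimited(9223372036854775807L)); // prints true
    }
}
```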
Mar 07, 2018 10:33:58 PM org.apache.spark.internal.Logging$class logInfo
INFO: Stopped generation timer
Mar 07, 2018 10:33:58 PM org.apache.spark.internal.Logging$class logWarning
WARNING: Lost task 7.0 in stage 23.0 (TID 65, localhost, executor driver): TaskKilled (stage cancelled)
Mar 07, 2018 10:33:58 PM org.apache.spark.internal.Logging$class logInfo
INFO: Waiting for jobs to be processed and checkpoints to be written
Mar 07, 2018 10:33:58 PM org.apache.spark.internal.Logging$class logWarning
WARNING: Timed out while stopping the job generator (timeout = 5000)
Mar 07, 2018 10:33:58 PM org.apache.spark.internal.Logging$class logInfo
INFO: Waited for jobs to be processed and checkpoints to be written
Mar 07, 2018 10:33:58 PM org.apache.spark.internal.Logging$class logInfo
INFO: Starting job: DStream at SparkUnboundedSource.java:172
Mar 07, 2018 10:33:58 PM org.apache.spark.internal.Logging$class logInfo
INFO: CheckpointWriter executor terminated? true, waited for 0 ms.
Mar 07, 2018 10:33:58 PM org.apache.spark.internal.Logging$class logInfo
INFO: Stopped JobGenerator
Mar 07, 2018 10:33:58 PM org.apache.spark.internal.Logging$class logInfo
INFO: Stopped JobScheduler
Mar 07, 2018 10:33:58 PM org.spark_project.jetty.server.handler.ContextHandler doStop
INFO: Stopped o.s.j.s.ServletContextHandler@4556dc5{/streaming,null,UNAVAILABLE,@Spark}
Mar 07, 2018 10:33:58 PM org.spark_project.jetty.server.handler.ContextHandler doStop
INFO: Stopped o.s.j.s.ServletContextHandler@6431ede4{/streaming/batch,null,UNAVAILABLE,@Spark}
Mar 07, 2018 10:33:58 PM org.spark_project.jetty.server.handler.ContextHandler doStop
INFO: Stopped o.s.j.s.ServletContextHandler@3ce4194e{/static/streaming,null,UNAVAILABLE,@Spark}
Mar 07, 2018 10:33:58 PM org.apache.spark.internal.Logging$class logInfo
INFO: StreamingContext stopped successfully
Mar 07, 2018 10:33:58 PM org.spark_project.jetty.server.AbstractConnector doStop
INFO: Stopped Spark@4d9a0556{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
Mar 07, 2018 10:33:58 PM org.apache.spark.internal.Logging$class logInfo
INFO: Stopped Spark web UI at http://127.0.0.1:4040
Mar 07, 2018 10:33:58 PM org.apache.beam.runners.spark.stateful.StateSpecFunctions$1 apply
INFO: Source id 0_5 spent 1279 millis on reading.
Mar 07, 2018 10:33:58 PM org.apache.spark.internal.Logging$class logInfo
INFO: Block rdd_27_5 stored as values in memory (estimated size 985.8 KB, free 1794.5 MB)
Mar 07, 2018 10:33:58 PM org.apache.spark.internal.Logging$class logInfo
INFO: Added rdd_27_5 in memory on 127.0.0.1:36700 (size: 985.8 KB, free: 1794.6 MB)
Mar 07, 2018 10:33:58 PM org.apache.spark.internal.Logging$class logWarning
WARNING: Putting block rdd_31_5 failed due to an exception
Mar 07, 2018 10:33:58 PM org.apache.spark.internal.Logging$class logWarning
WARNING: Block rdd_31_5 could not be removed as it was not found on disk or in memory
Mar 07, 2018 10:33:58 PM org.apache.spark.internal.Logging$class logWarning
WARNING: Putting block rdd_35_5 failed due to an exception
Mar 07, 2018 10:33:58 PM org.apache.spark.internal.Logging$class logWarning
WARNING: Block rdd_35_5 could not be removed as it was not found on disk or in memory
Mar 07, 2018 10:33:58 PM org.apache.spark.internal.Logging$class logInfo
INFO: Executor killed task 5.0 in stage 23.0 (TID 63), reason: stage cancelled
Mar 07, 2018 10:33:58 PM org.apache.spark.internal.Logging$class logWarning
WARNING: Lost task 5.0 in stage 23.0 (TID 63, localhost, executor driver): TaskKilled (stage cancelled)
Mar 07, 2018 10:33:58 PM org.apache.beam.runners.spark.stateful.StateSpecFunctions$1 apply
INFO: Source id 0_6 spent 1327 millis on reading.
Mar 07, 2018 10:33:58 PM org.apache.spark.internal.Logging$class logInfo
INFO: Block rdd_19_6 stored as values in memory (estimated size 678.8 KB, free 1793.8 MB)
Mar 07, 2018 10:33:58 PM org.apache.spark.internal.Logging$class logInfo
INFO: Added rdd_19_6 in memory on 127.0.0.1:36700 (size: 678.8 KB, free: 1793.9 MB)
Mar 07, 2018 10:33:58 PM org.apache.spark.internal.Logging$class logWarning
WARNING: Putting block rdd_23_6 failed due to an exception
Mar 07, 2018 10:33:58 PM org.apache.spark.internal.Logging$class logWarning
WARNING: Block rdd_23_6 could not be removed as it was not found on disk or in memory
Mar 07, 2018 10:33:58 PM org.apache.spark.internal.Logging$class logWarning
WARNING: Putting block rdd_27_6 failed due to an exception
Mar 07, 2018 10:33:58 PM org.apache.spark.internal.Logging$class logWarning
WARNING: Block rdd_27_6 could not be removed as it was not found on disk or in memory
Mar 07, 2018 10:33:58 PM org.apache.spark.internal.Logging$class logWarning
WARNING: Putting block rdd_31_6 failed due to an exception
Mar 07, 2018 10:33:58 PM org.apache.spark.internal.Logging$class logWarning
WARNING: Block rdd_31_6 could not be removed as it was not found on disk or in memory
Mar 07, 2018 10:33:58 PM org.apache.spark.internal.Logging$class logWarning
WARNING: Putting block rdd_35_6 failed due to an exception
Mar 07, 2018 10:33:58 PM org.apache.spark.internal.Logging$class logWarning
WARNING: Block rdd_35_6 could not be removed as it was not found on disk or in memory
Mar 07, 2018 10:33:58 PM org.apache.spark.internal.Logging$class logInfo
INFO: Executor killed task 6.0 in stage 23.0 (TID 64), reason: stage cancelled
Mar 07, 2018 10:33:58 PM org.apache.spark.internal.Logging$class logWarning
WARNING: Lost task 6.0 in stage 23.0 (TID 64, localhost, executor driver): TaskKilled (stage cancelled)
Mar 07, 2018 10:33:59 PM org.apache.beam.runners.spark.stateful.StateSpecFunctions$1 apply
INFO: Source id 0_3 spent 1259 millis on reading.
Mar 07, 2018 10:33:59 PM org.apache.spark.internal.Logging$class logInfo
INFO: Block rdd_31_3 stored as values in memory (estimated size 1062.0 KB, free 1792.8 MB)
Mar 07, 2018 10:33:59 PM org.apache.spark.internal.Logging$class logInfo
INFO: Added rdd_31_3 in memory on 127.0.0.1:36700 (size: 1062.0 KB, free: 1792.9 MB)
Mar 07, 2018 10:33:59 PM org.apache.spark.internal.Logging$class logWarning
WARNING: Putting block rdd_35_3 failed due to an exception
Mar 07, 2018 10:33:59 PM org.apache.spark.internal.Logging$class logWarning
WARNING: Block rdd_35_3 could not be removed as it was not found on disk or in memory
Mar 07, 2018 10:33:59 PM org.apache.spark.internal.Logging$class logInfo
INFO: Executor killed task 3.0 in stage 23.0 (TID 61), reason: stage cancelled
Mar 07, 2018 10:33:59 PM org.apache.spark.internal.Logging$class logWarning
WARNING: Lost task 3.0 in stage 23.0 (TID 61, localhost, executor driver): TaskKilled (stage cancelled)
Mar 07, 2018 10:33:59 PM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 23.0, whose tasks have all completed, from pool 
Mar 07, 2018 10:33:59 PM org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource createRandomSubscription
WARNING: Created subscription projects/apache-beam-testing/subscriptions/java_mobile_gaming_topic_beam_-5493834867768394301 to topic projects/apache-beam-testing/topics/java_mobile_gaming_topic. Note this subscription WILL NOT be deleted when the pipeline terminates
Mar 07, 2018 10:33:59 PM org.apache.spark.internal.Logging$class logInfo
INFO: Got job 8 (DStream at SparkUnboundedSource.java:172) with 16 output partitions
Mar 07, 2018 10:33:59 PM org.apache.spark.internal.Logging$class logInfo
INFO: Final stage: ResultStage 26 (DStream at SparkUnboundedSource.java:172)
Mar 07, 2018 10:33:59 PM org.apache.spark.internal.Logging$class logInfo
INFO: Parents of final stage: List(ShuffleMapStage 24, ShuffleMapStage 25)
Mar 07, 2018 10:33:59 PM org.apache.spark.internal.Logging$class logInfo
INFO: MapOutputTrackerMasterEndpoint stopped!
Mar 07, 2018 10:33:59 PM org.apache.spark.internal.Logging$class logInfo
INFO: MemoryStore cleared
Mar 07, 2018 10:33:59 PM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManager stopped
Mar 07, 2018 10:33:59 PM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManagerMaster stopped
Mar 07, 2018 10:33:59 PM org.apache.spark.internal.Logging$class logInfo
INFO: OutputCommitCoordinator stopped!
Mar 07, 2018 10:33:59 PM org.apache.spark.internal.Logging$class logInfo
INFO: Successfully stopped SparkContext
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occurred while executing the Java class. Failed to wait the pipeline until finish: org.apache.beam.runners.spark.SparkPipelineResult$StreamingMode@6b9a692f -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException

***********************************************************
***********************************************************
*************************Tear Down*************************
The Pub/Sub topic has been deleted: projects/apache-beam-testing/topics/leaderboard-jenkins-0307223259-1726341a
The Pub/Sub subscription has been deleted: projects/apache-beam-testing/subscriptions/leaderboard-jenkins-0307223259-1726341a
***********************************************************
***********************************************************
[ERROR] Failed command
:runners:spark:runMobileGamingJavaSpark FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':runners:spark:runMobileGamingJavaSpark'.
> Process 'command '/usr/local/asfpackages/java/jdk1.8.0_152/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 4m 7s
2 actionable tasks: 2 executed
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Not sending mail to unregistered user yifanzou@yifanzou-linuxworkstation.sea.corp.google.com