Posted to commits@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2018/03/06 11:09:13 UTC

Build failed in Jenkins: beam_PostRelease_NightlySnapshot #88

See <https://builds.apache.org/job/beam_PostRelease_NightlySnapshot/88/display/redirect?page=changes>

Changes:

[ehudm] Don't cache pubsub subscription prematurely.

[ehudm] Add Python lint check for calls to unittest.main.

[github] Fixing formatting bug in filebasedsink.py.

[github] Fix lint issue.

[mariagh] Add TestClock to test

[daniel.o.programmer] [BEAM-3126] Fixing incorrect function call in bundle processor.

[samuel.waggoner] [BEAM-3777] allow UDAFs to be indirect subclasses of CombineFn

------------------------------------------
[...truncated 3.60 MB...]
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
	at org.apache.spark.scheduler.Task.run(Task.scala:108)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:338)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

Mar 06, 2018 11:06:09 AM org.apache.spark.internal.Logging$class logError
SEVERE: Task 1 in stage 0.0 failed 1 times; aborting job
Mar 06, 2018 11:06:09 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 0.0, whose tasks have all completed, from pool 
Mar 06, 2018 11:06:09 AM org.apache.spark.internal.Logging$class logInfo
INFO: Lost task 2.0 in stage 0.0 (TID 2) on localhost, executor driver: java.lang.NoSuchMethodError (org.apache.beam.runners.core.DoFnRunners.simpleRunner(Lorg/apache/beam/sdk/options/PipelineOptions;Lorg/apache/beam/sdk/transforms/DoFn;Lorg/apache/beam/runners/core/SideInputReader;Lorg/apache/beam/runners/core/DoFnRunners$OutputManager;Lorg/apache/beam/sdk/values/TupleTag;Ljava/util/List;Lorg/apache/beam/runners/core/StepContext;Lorg/apache/beam/sdk/values/WindowingStrategy;)Lorg/apache/beam/runners/core/DoFnRunner;) [duplicate 1]
Mar 06, 2018 11:06:09 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 0.0, whose tasks have all completed, from pool 
Mar 06, 2018 11:06:09 AM org.apache.spark.internal.Logging$class logInfo
INFO: Lost task 0.0 in stage 0.0 (TID 0) on localhost, executor driver: java.lang.NoSuchMethodError (org.apache.beam.runners.core.DoFnRunners.simpleRunner(Lorg/apache/beam/sdk/options/PipelineOptions;Lorg/apache/beam/sdk/transforms/DoFn;Lorg/apache/beam/runners/core/SideInputReader;Lorg/apache/beam/runners/core/DoFnRunners$OutputManager;Lorg/apache/beam/sdk/values/TupleTag;Ljava/util/List;Lorg/apache/beam/runners/core/StepContext;Lorg/apache/beam/sdk/values/WindowingStrategy;)Lorg/apache/beam/runners/core/DoFnRunner;) [duplicate 2]
Mar 06, 2018 11:06:09 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 0.0, whose tasks have all completed, from pool 
Mar 06, 2018 11:06:09 AM org.apache.spark.internal.Logging$class logInfo
INFO: Lost task 3.0 in stage 0.0 (TID 3) on localhost, executor driver: java.lang.NoSuchMethodError (org.apache.beam.runners.core.DoFnRunners.simpleRunner(Lorg/apache/beam/sdk/options/PipelineOptions;Lorg/apache/beam/sdk/transforms/DoFn;Lorg/apache/beam/runners/core/SideInputReader;Lorg/apache/beam/runners/core/DoFnRunners$OutputManager;Lorg/apache/beam/sdk/values/TupleTag;Ljava/util/List;Lorg/apache/beam/runners/core/StepContext;Lorg/apache/beam/sdk/values/WindowingStrategy;)Lorg/apache/beam/runners/core/DoFnRunner;) [duplicate 3]
Mar 06, 2018 11:06:09 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 0.0, whose tasks have all completed, from pool 
Mar 06, 2018 11:06:09 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cancelling stage 0
Mar 06, 2018 11:06:09 AM org.apache.spark.internal.Logging$class logInfo
INFO: ShuffleMapStage 0 (mapToPair at GroupCombineFunctions.java:184) failed in 2.561 s due to Job aborted due to stage failure: Task 1 in stage 0.0 failed 1 times, most recent failure: Lost task 1.0 in stage 0.0 (TID 1, localhost, executor driver): java.lang.NoSuchMethodError: org.apache.beam.runners.core.DoFnRunners.simpleRunner(Lorg/apache/beam/sdk/options/PipelineOptions;Lorg/apache/beam/sdk/transforms/DoFn;Lorg/apache/beam/runners/core/SideInputReader;Lorg/apache/beam/runners/core/DoFnRunners$OutputManager;Lorg/apache/beam/sdk/values/TupleTag;Ljava/util/List;Lorg/apache/beam/runners/core/StepContext;Lorg/apache/beam/sdk/values/WindowingStrategy;)Lorg/apache/beam/runners/core/DoFnRunner;
	at org.apache.beam.runners.spark.translation.MultiDoFnFunction.call(MultiDoFnFunction.java:137)
	at org.apache.beam.runners.spark.translation.MultiDoFnFunction.call(MultiDoFnFunction.java:58)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$7$1.apply(JavaRDDLike.scala:186)
	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$7$1.apply(JavaRDDLike.scala:186)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:797)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:797)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
	at org.apache.spark.scheduler.Task.run(Task.scala:108)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:338)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

Driver stacktrace:
Mar 06, 2018 11:06:09 AM org.apache.spark.internal.Logging$class logInfo
INFO: Job 0 failed: collect at BoundedDataset.java:87, took 2.914829 s
Mar 06, 2018 11:06:09 AM org.spark_project.jetty.server.AbstractConnector doStop
INFO: Stopped Spark@7e442deb{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
Mar 06, 2018 11:06:09 AM org.apache.spark.internal.Logging$class logInfo
INFO: Stopped Spark web UI at http://127.0.0.1:4040
Mar 06, 2018 11:06:09 AM org.apache.spark.internal.Logging$class logInfo
INFO: MapOutputTrackerMasterEndpoint stopped!
Mar 06, 2018 11:06:09 AM org.apache.spark.internal.Logging$class logInfo
INFO: MemoryStore cleared
Mar 06, 2018 11:06:09 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManager stopped
Mar 06, 2018 11:06:09 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManagerMaster stopped
Mar 06, 2018 11:06:09 AM org.apache.spark.internal.Logging$class logInfo
INFO: OutputCommitCoordinator stopped!
Mar 06, 2018 11:06:09 AM org.apache.spark.internal.Logging$class logInfo
INFO: Successfully stopped SparkContext
[WARNING] 
org.apache.beam.sdk.Pipeline$PipelineExecutionException: java.lang.NoSuchMethodError: org.apache.beam.runners.core.DoFnRunners.simpleRunner(Lorg/apache/beam/sdk/options/PipelineOptions;Lorg/apache/beam/sdk/transforms/DoFn;Lorg/apache/beam/runners/core/SideInputReader;Lorg/apache/beam/runners/core/DoFnRunners$OutputManager;Lorg/apache/beam/sdk/values/TupleTag;Ljava/util/List;Lorg/apache/beam/runners/core/StepContext;Lorg/apache/beam/sdk/values/WindowingStrategy;)Lorg/apache/beam/runners/core/DoFnRunner;
    at org.apache.beam.runners.spark.SparkPipelineResult.beamExceptionFrom (SparkPipelineResult.java:68)
    at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish (SparkPipelineResult.java:99)
    at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish (SparkPipelineResult.java:87)
    at org.apache.beam.examples.WordCount.main (WordCount.java:187)
    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke (Method.java:498)
    at org.codehaus.mojo.exec.ExecJavaMojo$1.run (ExecJavaMojo.java:282)
    at java.lang.Thread.run (Thread.java:748)
Caused by: java.lang.NoSuchMethodError: org.apache.beam.runners.core.DoFnRunners.simpleRunner(Lorg/apache/beam/sdk/options/PipelineOptions;Lorg/apache/beam/sdk/transforms/DoFn;Lorg/apache/beam/runners/core/SideInputReader;Lorg/apache/beam/runners/core/DoFnRunners$OutputManager;Lorg/apache/beam/sdk/values/TupleTag;Ljava/util/List;Lorg/apache/beam/runners/core/StepContext;Lorg/apache/beam/sdk/values/WindowingStrategy;)Lorg/apache/beam/runners/core/DoFnRunner;
    at org.apache.beam.runners.spark.translation.MultiDoFnFunction.call (MultiDoFnFunction.java:137)
    at org.apache.beam.runners.spark.translation.MultiDoFnFunction.call (MultiDoFnFunction.java:58)
    at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$7$1.apply (JavaRDDLike.scala:186)
    at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$7$1.apply (JavaRDDLike.scala:186)
    at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply (RDD.scala:797)
    at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply (RDD.scala:797)
    at org.apache.spark.rdd.MapPartitionsRDD.compute (MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint (RDD.scala:323)
    at org.apache.spark.rdd.RDD.iterator (RDD.scala:287)
    at org.apache.spark.rdd.MapPartitionsRDD.compute (MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint (RDD.scala:323)
    at org.apache.spark.rdd.RDD.iterator (RDD.scala:287)
    at org.apache.spark.rdd.MapPartitionsRDD.compute (MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint (RDD.scala:323)
    at org.apache.spark.rdd.RDD.iterator (RDD.scala:287)
    at org.apache.spark.rdd.MapPartitionsRDD.compute (MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint (RDD.scala:323)
    at org.apache.spark.rdd.RDD.iterator (RDD.scala:287)
    at org.apache.spark.rdd.MapPartitionsRDD.compute (MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint (RDD.scala:323)
    at org.apache.spark.rdd.RDD.iterator (RDD.scala:287)
    at org.apache.spark.rdd.MapPartitionsRDD.compute (MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint (RDD.scala:323)
    at org.apache.spark.rdd.RDD.iterator (RDD.scala:287)
    at org.apache.spark.rdd.MapPartitionsRDD.compute (MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint (RDD.scala:323)
    at org.apache.spark.rdd.RDD.iterator (RDD.scala:287)
    at org.apache.spark.rdd.MapPartitionsRDD.compute (MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint (RDD.scala:323)
    at org.apache.spark.rdd.RDD.iterator (RDD.scala:287)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask (ShuffleMapTask.scala:96)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask (ShuffleMapTask.scala:53)
    at org.apache.spark.scheduler.Task.run (Task.scala:108)
    at org.apache.spark.executor.Executor$TaskRunner.run (Executor.scala:338)
    at java.util.concurrent.ThreadPoolExecutor.runWorker (ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run (ThreadPoolExecutor.java:624)
    at java.lang.Thread.run (Thread.java:748)
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:45 min
[INFO] Finished at: 2018-03-06T11:06:09Z
[INFO] Final Memory: 89M/1207M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occured while executing the Java class. java.lang.NoSuchMethodError: org.apache.beam.runners.core.DoFnRunners.simpleRunner(Lorg/apache/beam/sdk/options/PipelineOptions;Lorg/apache/beam/sdk/transforms/DoFn;Lorg/apache/beam/runners/core/SideInputReader;Lorg/apache/beam/runners/core/DoFnRunners$OutputManager;Lorg/apache/beam/sdk/values/TupleTag;Ljava/util/List;Lorg/apache/beam/runners/core/StepContext;Lorg/apache/beam/sdk/values/WindowingStrategy;)Lorg/apache/beam/runners/core/DoFnRunner; -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] Failed command
:runners:spark:runQuickstartJavaSpark FAILED
Mar 06, 2018 11:06:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-06T11:06:22.185Z: (d0918e729e906fcd): Executing operation WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Close
Mar 06, 2018 11:06:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-06T11:06:22.249Z: (d0918e729e906395): Executing operation WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Create
Mar 06, 2018 11:06:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-06T11:06:22.386Z: (d0918e729e906a90): Executing operation WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Read+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Extract+MapElements/Map+WriteCounts/WriteFiles/RewindowIntoGlobal/Window.Assign+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles+WriteCounts/WriteFiles/GatherTempFileResults/View.AsList/ParDo(ToIsmRecordForGlobalWindow)+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Write
Mar 06, 2018 11:06:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-06T11:06:35.942Z: (f72b707b5e07eff): Executing operation WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Close
Mar 06, 2018 11:06:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-06T11:06:36.022Z: (d0918e729e9060f6): Executing operation WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Read+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/GroupByWindow+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum+WriteCounts/WriteFiles/GatherTempFileResults/View.AsList/ParDo(ToIsmRecordForGlobalWindow)
Mar 06, 2018 11:06:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-06T11:06:38.115Z: (d0918e729e906394): Executing operation s12-u31
Mar 06, 2018 11:06:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-06T11:06:38.331Z: (f72b707b5e070aa): Executing operation WriteCounts/WriteFiles/GatherTempFileResults/View.AsList/CreateDataflowView
Mar 06, 2018 11:06:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-06T11:06:38.505Z: (45f02a316e2e3a4a): Executing operation WriteCounts/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource)+WriteCounts/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)+WriteCounts/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map+WriteCounts/WriteFiles/FinalizeTempFileBundles/Finalize
Mar 06, 2018 11:06:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-06T11:06:40.292Z: (6ef79b57b2e29c72): Cleaning up.
Mar 06, 2018 11:06:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-06T11:06:40.372Z: (6ef79b57b2e29b28): Stopping worker pool...
Mar 06, 2018 11:09:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-06T11:08:58.123Z: (9b13f9a2b6d6b331): Autoscaling: Resized worker pool from 1 to 0.
Mar 06, 2018 11:09:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-06T11:08:58.141Z: (9b13f9a2b6d6be77): Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
Mar 06, 2018 11:09:06 AM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
INFO: Job 2018-03-06_03_04_13-1941969681040247338 finished with status DONE.
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 06:15 min
[INFO] Finished at: 2018-03-06T11:09:06Z
[INFO] Final Memory: 78M/1288M
[INFO] ------------------------------------------------------------------------
gsutil cat gs://temp-storage-for-release-validation-tests/quickstart/count* | grep Montague:
Montague: 47
Verified Montague: 47
gsutil rm gs://temp-storage-for-release-validation-tests/quickstart/count*
Removing gs://temp-storage-for-release-validation-tests/quickstart/counts-00000-of-00003...
/ [1 objects]                                                                   
Removing gs://temp-storage-for-release-validation-tests/quickstart/counts-00001-of-00003...
/ [2 objects]                                                                   
Removing gs://temp-storage-for-release-validation-tests/quickstart/counts-00002-of-00003...
/ [3 objects]                                                                   
Operation completed over 3 objects.                                              
[SUCCESS]

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:flink:runQuickstartJavaFlinkLocal'.
> Process 'command '/usr/local/asfpackages/java/jdk1.8.0_152/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:apex:runQuickstartJavaApex'.
> Process 'command '/usr/local/asfpackages/java/jdk1.8.0_152/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:runQuickstartJavaSpark'.
> Process 'command '/usr/local/asfpackages/java/jdk1.8.0_152/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

BUILD FAILED in 7m 14s
6 actionable tasks: 6 executed
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Not sending mail to unregistered user daniel.o.programmer@gmail.com
Not sending mail to unregistered user samuel.waggoner@healthsparq.com
Not sending mail to unregistered user github@alasdairhodge.co.uk
Not sending mail to unregistered user ehudm@google.com
Not sending mail to unregistered user mariagh@mariagh.svl.corp.google.com

Jenkins build is back to normal : beam_PostRelease_NightlySnapshot #89

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostRelease_NightlySnapshot/89/display/redirect>