Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2020/11/03 11:07:06 UTC

Build failed in Jenkins: beam_PostRelease_NightlySnapshot #1159

See <https://ci-beam.apache.org/job/beam_PostRelease_NightlySnapshot/1159/display/redirect?page=changes>

Changes:

[Robert Bradshaw] [BEAM-3736] Construct empty global combine values on workers when

[Robert Bradshaw] lint

[Robert Bradshaw] Order stages according to data edges as well as must-follows.

[samuelw] [BEAM-11143] Ensure that AfterWatermarkStateMachine clears late trigger

[piotr.szuberski] [BEAM-11160] Fix HadoopFormatIOIT

[noreply] [BEAM-5939] - Deduplicate constants (#13142)

[noreply] [BEAM-11154] Check coder proto to avoid registering same coder under

[noreply] [BEAM-9444] Use GCP BOM to set package versions (#13075)

[noreply] [BEAM-11052] Memoize to_pcollection (#13066)

[noreply] [BEAM-9547] Auto-populate any unimplemented methods/properties with

[noreply] [BEAM-10892] Add Proto support to Kafka Table Provider (#12838)


------------------------------------------
[...truncated 3.77 MB...]
Nov 03, 2020 11:05:44 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-8284009269397768056-tmpdir/word-count-beam/.temp-beam-477e6c35-6811-4318-af52-728c2fb80cb4/c5419d22-8a97-451d-8792-906049129384
Nov 03, 2020 11:05:44 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-8284009269397768056-tmpdir/word-count-beam/.temp-beam-477e6c35-6811-4318-af52-728c2fb80cb4/b832a800-5f0b-445d-8f34-358e62cde412
Nov 03, 2020 11:05:44 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
WARNING: Failed to match temporary files under: [/tmp/groovy-generated-8284009269397768056-tmpdir/word-count-beam/.temp-beam-477e6c35-6811-4318-af52-728c2fb80cb4/].
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 5.0 (TID 24). 12857 bytes result sent to driver
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 5.0 (TID 24) in 113 ms on localhost (executor driver) (4/4)
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 5.0, whose tasks have all completed, from pool 
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: ShuffleMapStage 5 (repartition at GroupCombineFunctions.java:191) finished in 0.126 s
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: looking for newly runnable stages
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: running: Set()
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: waiting: Set(ResultStage 6)
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: failed: Set()
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting ResultStage 6 (WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous).output MapPartitionsRDD[113] at values at TransformTranslator.java:431), which has no missing parents
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_7 stored as values in memory (estimated size 16.0 KB, free 13.5 GB)
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_7_piece0 stored as bytes in memory (estimated size 7.3 KB, free 13.5 GB)
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added broadcast_7_piece0 in memory on localhost:36551 (size: 7.3 KB, free: 13.5 GB)
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: Created broadcast 7 from broadcast at DAGScheduler.scala:1184
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting 4 missing tasks from ResultStage 6 (WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous).output MapPartitionsRDD[113] at values at TransformTranslator.java:431) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: Adding task set 6.0 with 4 tasks
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 0.0 in stage 6.0 (TID 28, localhost, executor driver, partition 0, NODE_LOCAL, 7938 bytes)
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 1.0 in stage 6.0 (TID 29, localhost, executor driver, partition 1, NODE_LOCAL, 7938 bytes)
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 2.0 in stage 6.0 (TID 30, localhost, executor driver, partition 2, NODE_LOCAL, 7938 bytes)
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 3.0 in stage 6.0 (TID 31, localhost, executor driver, partition 3, NODE_LOCAL, 7938 bytes)
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 1.0 in stage 6.0 (TID 29)
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 0.0 in stage 6.0 (TID 28)
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 3.0 in stage 6.0 (TID 31)
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 2.0 in stage 6.0 (TID 30)
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 0 ms
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 0 ms
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 6.0 (TID 29). 6496 bytes result sent to driver
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 6.0 (TID 31). 6496 bytes result sent to driver
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 6.0 (TID 30). 6496 bytes result sent to driver
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 6.0 (TID 28). 6496 bytes result sent to driver
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 6.0 (TID 29) in 16 ms on localhost (executor driver) (1/4)
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 6.0 (TID 31) in 16 ms on localhost (executor driver) (2/4)
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 6.0 (TID 30) in 17 ms on localhost (executor driver) (3/4)
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 6.0 (TID 28) in 20 ms on localhost (executor driver) (4/4)
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 6.0, whose tasks have all completed, from pool 
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: ResultStage 6 (foreach at BoundedDataset.java:125) finished in 0.031 s
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: Job 1 finished: foreach at BoundedDataset.java:125, took 0.210779 s
Nov 03, 2020 11:05:44 AM org.apache.beam.runners.spark.SparkRunner lambda$run$1
INFO: Batch pipeline execution complete.
Nov 03, 2020 11:05:44 AM org.spark_project.jetty.server.AbstractConnector doStop
INFO: Stopped Spark@3c31338e{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: Stopped Spark web UI at http://localhost:4040
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: MapOutputTrackerMasterEndpoint stopped!
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: MemoryStore cleared
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManager stopped
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManagerMaster stopped
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: OutputCommitCoordinator stopped!
Nov 03, 2020 11:05:44 AM org.apache.spark.internal.Logging$class logInfo
INFO: Successfully stopped SparkContext
grep Foundation counts*
counts-00000-of-00004:Foundation: 1
Verified Foundation: 1
[SUCCESS]
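For context: each runQuickstartJava* task in this log runs the Beam WordCount example over the generated word-count-beam project's pom.xml and then verifies the sharded counts-* output by grepping for the line "Foundation: 1", as seen just above. A condensed Java sketch of the counting core, along the lines of Beam's MinimalWordCount example rather than the exact class the quickstart executes:

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.TextIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Count;
    import org.apache.beam.sdk.transforms.Filter;
    import org.apache.beam.sdk.transforms.FlatMapElements;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.transforms.SerializableFunction;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.TypeDescriptors;

    public class MinimalWordCount {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        p.apply(TextIO.read().from("pom.xml"))                 // the quickstart counts words in its own pom.xml
            .apply(FlatMapElements.into(TypeDescriptors.strings())
                .via((String line) -> Arrays.asList(line.split("[^\\p{L}]+"))))  // split lines into words
            .apply(Filter.by((SerializableFunction<String, Boolean>) word -> !word.isEmpty()))
            .apply(Count.perElement())                          // KV<word, count>
            .apply(MapElements.into(TypeDescriptors.strings())
                .via((KV<String, Long> kv) -> kv.getKey() + ": " + kv.getValue()))
            .apply(TextIO.write().to("counts"));                // writes counts-0000N-of-0000M shards

        p.run().waitUntilFinish();
      }
    }

The shard files (counts-00000-of-00004 and so on) then contain lines such as "Foundation: 1", which is exactly what the grep above checks.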

> Task :runners:twister2:runQuickstartJavaTwister2
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/tmp/groovy-generated-1383260911692779435-tmpdir/.m2/repository/org/slf4j/slf4j-jdk14/1.7.30/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/tmp/groovy-generated-1383260911692779435-tmpdir/.m2/repository/org/slf4j/slf4j-log4j12/1.7.25/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]
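The SLF4J warning above means two logging bindings (slf4j-jdk14 and slf4j-log4j12) ended up on the quickstart's classpath and SLF4J picked one of them, here the java.util.logging binding; the usual cleanup is to exclude one of the binding jars in the POM. A minimal Java check, not part of the quickstart, that prints which binding actually won at runtime:

    import org.slf4j.LoggerFactory;

    public class Slf4jBindingCheck {
      public static void main(String[] args) {
        // With slf4j-jdk14 selected, this prints org.slf4j.impl.JDK14LoggerFactory,
        // matching the "Actual binding" line in the log above.
        System.out.println(LoggerFactory.getILoggerFactory().getClass().getName());
      }
    }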
Nov 03, 2020 11:06:48 AM org.apache.beam.runners.twister2.Twister2Runner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 376 files. Enable logging at DEBUG level to see which files will be staged.
Nov 03, 2020 11:06:48 AM org.apache.beam.runners.twister2.Twister2Runner run
INFO: Translating pipeline to Twister2 program.
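The filesToStage message above is informational: because the option was left unset, the runner stages the jars it finds on the classpath. A sketch of setting the option explicitly through PortablePipelineOptions; the jar path is a placeholder, and whether the Twister2 runner reads this particular options interface is an assumption, so treat it as illustrative only:

    import java.util.Arrays;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.options.PortablePipelineOptions;

    public class StageFilesExplicitly {
      public static void main(String[] args) {
        // Parse --filesToStage (and any other flags) from the command line ...
        PortablePipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(PortablePipelineOptions.class);

        // ... or set the staged jars programmatically instead of relying on the
        // classpath default described in the log line above. Placeholder path.
        options.setFilesToStage(Arrays.asList("/path/to/pipeline-bundle.jar"));

        System.out.println("Will stage: " + options.getFilesToStage());
      }
    }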

> Task :runners:direct-java:runQuickstartJavaDirect
Nov 03, 2020 11:06:56 AM org.apache.beam.sdk.io.FileBasedSource getEstimatedSizeBytes
INFO: Filepattern pom.xml matched 1 files with total size 16322
Nov 03, 2020 11:06:56 AM org.apache.beam.sdk.io.FileBasedSource split
INFO: Splitting filepattern pom.xml into bundles of size 816 took 1 ms and produced 1 files and 20 bundles
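Those two numbers are consistent: a 16322-byte file split at a desired bundle size of 816 bytes yields 16322 / 816 ≈ 20 bundles, which matches the bundle count reported. (Roughly, the direct runner chooses the desired bundle size so the estimated input splits into a small number of bundles for parallelism; the exact heuristic is runner-internal.)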
Nov 03, 2020 11:07:01 AM org.apache.beam.sdk.io.WriteFiles$WriteShardsIntoTempFilesFn processElement
INFO: Opening writer 29e58f78-0ddd-4009-99bc-c3bd6592f0f7 for window org.apache.beam.sdk.transforms.windowing.GlobalWindow@766b190a pane PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0} destination null
Nov 03, 2020 11:07:01 AM org.apache.beam.sdk.io.WriteFiles$WriteShardsIntoTempFilesFn processElement
INFO: Opening writer e66b8348-4a70-42a7-9343-859deff29856 for window org.apache.beam.sdk.transforms.windowing.GlobalWindow@766b190a pane PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0} destination null
Nov 03, 2020 11:07:01 AM org.apache.beam.sdk.io.FileBasedSink$Writer close
INFO: Successfully wrote temporary file /tmp/groovy-generated-8570956175421357458-tmpdir/word-count-beam/.temp-beam-aff28084-de72-42bf-b396-c5c1d51d7802/e66b8348-4a70-42a7-9343-859deff29856
Nov 03, 2020 11:07:01 AM org.apache.beam.sdk.io.FileBasedSink$Writer close
INFO: Successfully wrote temporary file /tmp/groovy-generated-8570956175421357458-tmpdir/word-count-beam/.temp-beam-aff28084-de72-42bf-b396-c5c1d51d7802/29e58f78-0ddd-4009-99bc-c3bd6592f0f7
Nov 03, 2020 11:07:01 AM org.apache.beam.sdk.io.WriteFiles$WriteShardsIntoTempFilesFn processElement
INFO: Opening writer 40078fcf-95e8-430e-95b7-71ded4e24742 for window org.apache.beam.sdk.transforms.windowing.GlobalWindow@766b190a pane PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0} destination null
Nov 03, 2020 11:07:01 AM org.apache.beam.sdk.io.WriteFiles$WriteShardsIntoTempFilesFn processElement
INFO: Opening writer 9a8cc3b5-164d-4e5a-9ab0-a8f13b192c74 for window org.apache.beam.sdk.transforms.windowing.GlobalWindow@766b190a pane PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0} destination null
Nov 03, 2020 11:07:01 AM org.apache.beam.sdk.io.FileBasedSink$Writer close
INFO: Successfully wrote temporary file /tmp/groovy-generated-8570956175421357458-tmpdir/word-count-beam/.temp-beam-aff28084-de72-42bf-b396-c5c1d51d7802/40078fcf-95e8-430e-95b7-71ded4e24742
Nov 03, 2020 11:07:01 AM org.apache.beam.sdk.io.FileBasedSink$Writer close
INFO: Successfully wrote temporary file /tmp/groovy-generated-8570956175421357458-tmpdir/word-count-beam/.temp-beam-aff28084-de72-42bf-b396-c5c1d51d7802/9a8cc3b5-164d-4e5a-9ab0-a8f13b192c74
Nov 03, 2020 11:07:01 AM org.apache.beam.sdk.io.WriteFiles$FinalizeTempFileBundles$FinalizeFn process
INFO: Finalizing 4 file results
Nov 03, 2020 11:07:01 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation createMissingEmptyShards
INFO: Finalizing for destination null num shards 4.
Nov 03, 2020 11:07:01 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-8570956175421357458-tmpdir/word-count-beam/.temp-beam-aff28084-de72-42bf-b396-c5c1d51d7802/e66b8348-4a70-42a7-9343-859deff29856, shard=1, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@766b190a, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-8570956175421357458-tmpdir/word-count-beam/counts-00001-of-00004
Nov 03, 2020 11:07:01 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-8570956175421357458-tmpdir/word-count-beam/.temp-beam-aff28084-de72-42bf-b396-c5c1d51d7802/40078fcf-95e8-430e-95b7-71ded4e24742, shard=3, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@766b190a, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-8570956175421357458-tmpdir/word-count-beam/counts-00003-of-00004
Nov 03, 2020 11:07:01 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-8570956175421357458-tmpdir/word-count-beam/.temp-beam-aff28084-de72-42bf-b396-c5c1d51d7802/9a8cc3b5-164d-4e5a-9ab0-a8f13b192c74, shard=2, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@766b190a, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-8570956175421357458-tmpdir/word-count-beam/counts-00002-of-00004
Nov 03, 2020 11:07:01 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-8570956175421357458-tmpdir/word-count-beam/.temp-beam-aff28084-de72-42bf-b396-c5c1d51d7802/29e58f78-0ddd-4009-99bc-c3bd6592f0f7, shard=0, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@766b190a, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-8570956175421357458-tmpdir/word-count-beam/counts-00000-of-00004
Nov 03, 2020 11:07:01 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-8570956175421357458-tmpdir/word-count-beam/.temp-beam-aff28084-de72-42bf-b396-c5c1d51d7802/9a8cc3b5-164d-4e5a-9ab0-a8f13b192c74
Nov 03, 2020 11:07:01 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-8570956175421357458-tmpdir/word-count-beam/.temp-beam-aff28084-de72-42bf-b396-c5c1d51d7802/e66b8348-4a70-42a7-9343-859deff29856
Nov 03, 2020 11:07:01 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-8570956175421357458-tmpdir/word-count-beam/.temp-beam-aff28084-de72-42bf-b396-c5c1d51d7802/40078fcf-95e8-430e-95b7-71ded4e24742
Nov 03, 2020 11:07:01 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-8570956175421357458-tmpdir/word-count-beam/.temp-beam-aff28084-de72-42bf-b396-c5c1d51d7802/29e58f78-0ddd-4009-99bc-c3bd6592f0f7
Nov 03, 2020 11:07:01 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
WARNING: Failed to match temporary files under: [/tmp/groovy-generated-8570956175421357458-tmpdir/word-count-beam/.temp-beam-aff28084-de72-42bf-b396-c5c1d51d7802/].
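The WriteFiles/FileBasedSink lines above show the two-phase write the quickstart relies on: each bundle first writes to a uniquely named file under .temp-beam-<uuid>/, FinalizeTempFileBundles then copies every temp file to its final sharded name (counts-00000-of-00004 through counts-00003-of-00004), and the temp files are removed afterwards. The trailing "Failed to match temporary files" warning appears to be the final cleanup sweep finding nothing left to match rather than a real failure. A small Java sketch of a sharded TextIO write that produces the same 0000N-of-00004 naming; the quickstart itself most likely leaves the shard count to the runner, so withNumShards(4) is used here only to make the naming explicit:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.TextIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Create;

    public class ShardedWriteSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Writing with an explicit shard count yields counts-00000-of-00004 ...
        // counts-00003-of-00004, the same final names seen in the log above.
        p.apply(Create.of("Foundation: 1", "Apache: 2"))
            .apply(TextIO.write().to("counts").withNumShards(4));

        p.run().waitUntilFinish();
      }
    }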
grep Foundation counts*
counts-00003-of-00004:Foundation: 1
Verified Foundation: 1

> Task :runners:google-cloud-dataflow-java:runQuickstartJavaDataflow
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occured while executing the Java class. Failed to construct instance from factory method DataflowRunner#fromOptions(interface org.apache.beam.sdk.options.PipelineOptions): InvocationTargetException: com.google.common.cache.CacheBuilder.expireAfterWrite(Ljava/time/Duration;)Lcom/google/common/cache/CacheBuilder; -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
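The method descriptor in the error, com.google.common.cache.CacheBuilder.expireAfterWrite(Ljava/time/Duration;)Lcom/google/common/cache/CacheBuilder;, has the classic shape of a Guava version conflict: the java.time.Duration overload of expireAfterWrite was only added in Guava 25.0, so code compiled against a newer Guava fails with a NoSuchMethodError when an older Guava wins on the runtime classpath. This is plausibly fallout from the "[BEAM-9444] Use GCP BOM to set package versions" change in this build's changelist, though that is an inference, not something the log states. A minimal standalone Java sketch of the symptom, assuming a project compiled against Guava 25+ but run against an older Guava:

    import com.google.common.cache.Cache;
    import com.google.common.cache.CacheBuilder;
    import java.time.Duration;
    import java.util.concurrent.TimeUnit;

    public class CacheBuilderOverloadCheck {
      public static void main(String[] args) {
        // Compiles against Guava >= 25.0, which added the java.time.Duration overload;
        // with an older Guava on the runtime classpath this call throws
        // NoSuchMethodError for exactly the descriptor quoted in the Maven error.
        Cache<String, String> withDuration = CacheBuilder.newBuilder()
            .expireAfterWrite(Duration.ofMinutes(10))
            .build();

        // The (long, TimeUnit) overload has been present since early Guava releases
        // and keeps working regardless of which version is resolved.
        Cache<String, String> withTimeUnit = CacheBuilder.newBuilder()
            .expireAfterWrite(10, TimeUnit.MINUTES)
            .build();

        System.out.println(withDuration.size() + " " + withTimeUnit.size());
      }
    }

A quick way to see which Guava the generated quickstart project actually resolves is mvn dependency:tree -Dincludes=com.google.guava; the concrete fix is project-specific.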

> Task :runners:direct-java:runQuickstartJavaDirect
[SUCCESS]

> Task :runners:google-cloud-dataflow-java:runQuickstartJavaDataflow FAILED
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occured while executing the Java class. Failed to construct instance from factory method DataflowRunner#fromOptions(interface org.apache.beam.sdk.options.PipelineOptions): InvocationTargetException: com.google.common.cache.CacheBuilder.expireAfterWrite(Ljava/time/Duration;)Lcom/google/common/cache/CacheBuilder; -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] Failed command

> Task :runners:twister2:runQuickstartJavaTwister2
Nov 03, 2020 11:07:03 AM org.apache.beam.runners.twister2.Twister2Runner run
WARNING: Twister2 Local Mode currently only supports single worker
Nov 03, 2020 11:07:03 AM edu.iu.dsc.tws.rsched.core.ResourceAllocator loadConfig
INFO: Loaded configuration with twister2_home: /tmp and configuration: /tmp/conf/ and cluster: standalone
grep Foundation counts*
Foundation: 1
Verified Foundation: 1
[SUCCESS]

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:runMobileGamingJavaDataflow'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:direct-java:runMobileGamingJavaDirect'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:runQuickstartJavaDataflow'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 47s
8 actionable tasks: 7 executed, 1 from cache

Publishing build scan...
https://gradle.com/s/dxs6je7iwpoec

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostRelease_NightlySnapshot #1175

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostRelease_NightlySnapshot/1175/display/redirect>




Build failed in Jenkins: beam_PostRelease_NightlySnapshot #1174

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostRelease_NightlySnapshot/1174/display/redirect>

Changes:


------------------------------------------
[...truncated 4.07 MB...]
Waiting on bqjob_r1d34545c038563e0_00000175bf3bf716_1 ... (0s) Current status: RUNNING
                                                                                      
Waiting on bqjob_r1d34545c038563e0_00000175bf3bf716_1 ... (0s) Current status: DONE   
+-----------------------------------+
|             table_id              |
+-----------------------------------+
| hourly_team_score_python_dataflow |
| hourly_team_score_python_direct   |
| leaderboard_DataflowRunner_team   |
| leaderboard_DirectRunner_team     |
| leaderboard_DirectRunner_user     |
+-----------------------------------+
Waiting for pipeline to produce more results...
bq query SELECT table_id FROM beam_postrelease_mobile_gaming.__TABLES_SUMMARY__

Waiting on bqjob_r672418f47b4df3eb_00000175bf3cf280_1 ... (0s) Current status: RUNNING
                                                                                      
Waiting on bqjob_r672418f47b4df3eb_00000175bf3cf280_1 ... (0s) Current status: DONE   
+-----------------------------------+
|             table_id              |
+-----------------------------------+
| hourly_team_score_python_dataflow |
| hourly_team_score_python_direct   |
| leaderboard_DataflowRunner_team   |
| leaderboard_DataflowRunner_user   |
| leaderboard_DirectRunner_team     |
| leaderboard_DirectRunner_user     |
+-----------------------------------+
bq query --batch "SELECT user FROM [apache-beam-testing:beam_postrelease_mobile_gaming.leaderboard_DataflowRunner_user] LIMIT 10"

Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (0s) Current status: PENDING
                                                                                      
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (1s) Current status: PENDING
                                                                                      
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (2s) Current status: PENDING
                                                                                      
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (3s) Current status: PENDING
                                                                                      
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (4s) Current status: PENDING
                                                                                      
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (5s) Current status: PENDING
                                                                                      
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (6s) Current status: PENDING
                                                                                      
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (7s) Current status: PENDING
                                                                                      
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (8s) Current status: PENDING
                                                                                      
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (9s) Current status: PENDING
                                                                                      
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (10s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (11s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (12s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (13s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (14s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (15s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (16s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (17s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (18s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (19s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (20s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (21s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (22s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (23s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (24s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (25s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (26s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (27s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (28s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (29s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (30s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (31s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (32s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (33s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (34s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (35s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (36s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (37s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (38s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (39s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (40s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (41s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (42s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (43s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (44s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (45s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (46s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (47s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (48s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (49s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (51s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (52s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (53s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (54s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (55s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (56s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (57s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (58s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (59s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (60s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (61s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (62s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (63s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (64s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (65s) Current status: PENDING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (66s) Current status: RUNNING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (67s) Current status: RUNNING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (68s) Current status: RUNNING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (69s) Current status: RUNNING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (70s) Current status: RUNNING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (71s) Current status: RUNNING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (72s) Current status: RUNNING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (73s) Current status: RUNNING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (74s) Current status: RUNNING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (75s) Current status: RUNNING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (76s) Current status: RUNNING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (77s) Current status: RUNNING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (78s) Current status: RUNNING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (79s) Current status: RUNNING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (80s) Current status: RUNNING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (81s) Current status: RUNNING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (82s) Current status: RUNNING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (83s) Current status: RUNNING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (84s) Current status: RUNNING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (85s) Current status: RUNNING
                                                                                       
Waiting on bqjob_r3a028e264732c9b1_00000175bf3d0144_1 ... (85s) Current status: DONE   
+---------------------------+
|           user            |
+---------------------------+
| user0_ApricotCaneToad     |
| user18_BisqueKoala        |
| user11_AppleGreenPlatypus |
| user7_AmaranthWallaby     |
| user0_BeigeEmu            |
| user5_AmberKoala          |
| user9_BeigeQuokka         |
| user3_MagentaCaneToad     |
| user1_AsparagusNumbat     |
| user0_RubyPlatypus        |
+---------------------------+
Verified Magenta
gcloud dataflow jobs cancel $(gcloud dataflow jobs list | grep leaderboard-validation-1605230466697-474 | grep Running | cut -d' ' -f1)
Cancelled job [2020-11-12_17_21_26-3786844111740669615]

**************************************
* SUCCESS: LeaderBoard successfully run on DataflowRunner with Streaming Engine
**************************************

[SUCCESS]

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':runners:twister2:runQuickstartJavaTwister2'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.7/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 48m 23s
8 actionable tasks: 7 executed, 1 from cache

Publishing build scan...
https://gradle.com/s/75l2hcrbn3bzi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostRelease_NightlySnapshot #1173

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostRelease_NightlySnapshot/1173/display/redirect>

Changes:


------------------------------------------
[...truncated 558.04 KB...]
| leaderboard_DataflowRunner_team   |
| leaderboard_DataflowRunner_user   |
| leaderboard_DirectRunner_team     |
+-----------------------------------+
Waiting for pipeline to produce more results...
Nov 12, 2020 11:59:47 PM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl createTable
INFO: Trying to create BigQuery table: apache-beam-testing:beam_postrelease_mobile_gaming.leaderboard_DirectRunner_user
bq query SELECT table_id FROM beam_postrelease_mobile_gaming.__TABLES_SUMMARY__

Waiting on bqjob_r72e9e3a8d33fea62_00000175bee7926e_1 ... (0s) Current status: RUNNING
                                                                                      
Waiting on bqjob_r72e9e3a8d33fea62_00000175bee7926e_1 ... (0s) Current status: DONE   
+-----------------------------------+
|             table_id              |
+-----------------------------------+
| hourly_team_score_python_dataflow |
| hourly_team_score_python_direct   |
| leaderboard_DataflowRunner_team   |
| leaderboard_DataflowRunner_user   |
| leaderboard_DirectRunner_team     |
| leaderboard_DirectRunner_user     |
+-----------------------------------+
bq query --batch "SELECT user FROM [apache-beam-testing:beam_postrelease_mobile_gaming.leaderboard_DirectRunner_user] LIMIT 10"

Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (0s) Current status: PENDING
                                                                                      
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (1s) Current status: PENDING
                                                                                      
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (2s) Current status: PENDING
                                                                                      
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (3s) Current status: PENDING
                                                                                      
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (4s) Current status: PENDING
                                                                                      
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (5s) Current status: PENDING
                                                                                      
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (6s) Current status: PENDING
                                                                                      
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (7s) Current status: PENDING
                                                                                      
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (8s) Current status: PENDING
                                                                                      
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (9s) Current status: PENDING
                                                                                      
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (11s) Current status: PENDING
                                                                                       
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (12s) Current status: PENDING
                                                                                       
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (13s) Current status: PENDING
                                                                                       
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (14s) Current status: PENDING
                                                                                       
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (15s) Current status: PENDING
                                                                                       
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (16s) Current status: PENDING
                                                                                       
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (17s) Current status: PENDING
                                                                                       
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (18s) Current status: PENDING
                                                                                       
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (19s) Current status: PENDING
                                                                                       
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (20s) Current status: PENDING
                                                                                       
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (21s) Current status: PENDING
                                                                                       
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (22s) Current status: PENDING
                                                                                       
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (23s) Current status: PENDING
                                                                                       
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (24s) Current status: PENDING
                                                                                       
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (25s) Current status: PENDING
                                                                                       
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (26s) Current status: PENDING
                                                                                       
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (27s) Current status: PENDING
                                                                                       
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (28s) Current status: PENDING
                                                                                       
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (29s) Current status: PENDING
                                                                                       
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (30s) Current status: PENDING
                                                                                       
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (31s) Current status: PENDING
                                                                                       
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (32s) Current status: PENDING
                                                                                       
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (33s) Current status: PENDING
                                                                                       
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (34s) Current status: PENDING
                                                                                       
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (35s) Current status: PENDING
                                                                                       
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (36s) Current status: PENDING
                                                                                       
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (37s) Current status: PENDING
                                                                                       
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (38s) Current status: PENDING
                                                                                       
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (39s) Current status: PENDING
                                                                                       
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (40s) Current status: PENDING
                                                                                       
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (41s) Current status: PENDING
                                                                                       
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (42s) Current status: PENDING
                                                                                       
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (43s) Current status: PENDING
                                                                                       
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (44s) Current status: PENDING
                                                                                       
Nov 13, 2020 12:01:39 AM org.apache.beam.sdk.io.gcp.bigquery.StreamingWriteFn logPercentiles
INFO: Total number of streaming insert requests: 2, P99: 220ms, P90: 216ms, P50: 120ms
Nov 13, 2020 12:01:39 AM org.apache.beam.sdk.io.gcp.bigquery.StreamingWriteFn logPercentiles
INFO: Total number of streaming insert requests: 2, P99: 220ms, P90: 216ms, P50: 200ms
Nov 13, 2020 12:01:39 AM org.apache.beam.sdk.io.gcp.bigquery.StreamingWriteFn logPercentiles
INFO: Total number of streaming insert requests: 2, P99: 200ms, P90: 196ms, P50: 120ms
Nov 13, 2020 12:01:39 AM org.apache.beam.sdk.io.gcp.bigquery.StreamingWriteFn logPercentiles
INFO: Total number of streaming insert requests: 2, P99: 220ms, P90: 216ms, P50: 140ms
Nov 13, 2020 12:01:39 AM org.apache.beam.sdk.io.gcp.bigquery.StreamingWriteFn logPercentiles
INFO: Total number of streaming insert requests: 2, P99: 280ms, P90: 276ms, P50: 220ms
Nov 13, 2020 12:01:39 AM org.apache.beam.sdk.io.gcp.bigquery.StreamingWriteFn logPercentiles
INFO: Total number of streaming insert requests: 2, P99: 280ms, P90: 276ms, P50: 220ms
Nov 13, 2020 12:01:40 AM org.apache.beam.sdk.io.gcp.bigquery.StreamingWriteFn logPercentiles
INFO: Total number of streaming insert requests: 2, P99: 340ms, P90: 336ms, P50: 200ms
Nov 13, 2020 12:01:40 AM org.apache.beam.sdk.io.gcp.bigquery.StreamingWriteFn logPercentiles
INFO: Total number of streaming insert requests: 3, P99: 359ms, P90: 354ms, P50: 190ms
Nov 13, 2020 12:01:40 AM org.apache.beam.sdk.io.gcp.bigquery.StreamingWriteFn logPercentiles
INFO: Total number of streaming insert requests: 2, P99: 360ms, P90: 356ms, P50: 200ms
Nov 13, 2020 12:01:40 AM org.apache.beam.sdk.io.gcp.bigquery.StreamingWriteFn logPercentiles
INFO: Total number of streaming insert requests: 2, P99: 380ms, P90: 376ms, P50: 260ms
Nov 13, 2020 12:01:40 AM org.apache.beam.sdk.io.gcp.bigquery.StreamingWriteFn logPercentiles
INFO: Total number of streaming insert requests: 2, P99: 400ms, P90: 396ms, P50: 240ms
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (45s) Current status: PENDING
                                                                                       
Nov 13, 2020 12:01:40 AM org.apache.beam.sdk.io.gcp.bigquery.StreamingWriteFn logPercentiles
INFO: Total number of streaming insert requests: 2, P99: 200ms, P90: 196ms, P50: 80ms
Nov 13, 2020 12:01:40 AM org.apache.beam.sdk.io.gcp.bigquery.StreamingWriteFn logPercentiles
INFO: Total number of streaming insert requests: 2, P99: 240ms, P90: 236ms, P50: 100ms
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (46s) Current status: PENDING
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (47s) Current status: PENDING
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (48s) Current status: PENDING
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (49s) Current status: PENDING
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (50s) Current status: PENDING
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (51s) Current status: PENDING
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (52s) Current status: PENDING
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (53s) Current status: PENDING
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (54s) Current status: PENDING
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (55s) Current status: PENDING
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (56s) Current status: PENDING
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (57s) Current status: PENDING
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (58s) Current status: PENDING
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (59s) Current status: PENDING
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (60s) Current status: PENDING
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (61s) Current status: PENDING
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (62s) Current status: PENDING
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (63s) Current status: PENDING
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (64s) Current status: PENDING
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (65s) Current status: PENDING
Waiting on bqjob_r54151ed77e720398_00000175bee7a1d4_1 ... (65s) Current status: DONE
+---------------------------+
|           user            |
+---------------------------+
| user14_AmethystKookaburra |
| user6_AsparagusNumbat     |
| user5_BisqueKoala         |
| user9_AsparagusKangaroo   |
| user3_AmethystKookaburra  |
| user1_BeigeQuokka         |
| user12_AsparagusNumbat    |
| user4_BisqueKoala         |
| user12_FuchsiaNumbat      |
| user14_AsparagusNumbat    |
+---------------------------+
Verified Amethyst

**************************************
* SUCCESS: LeaderBoard successfully run on DirectRunner.
**************************************

[SUCCESS]

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:runQuickstartJavaSpark'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:flink:1.10:runQuickstartJavaFlinkLocal'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:runMobileGamingJavaDataflow'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.7/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 20m 22s
8 actionable tasks: 7 executed, 1 from cache

Publishing build scan...
https://gradle.com/s/ie77c5htqhvfc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostRelease_NightlySnapshot #1172

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostRelease_NightlySnapshot/1172/display/redirect?page=changes>

Changes:

[daniel.o.programmer] [BEAM-11074] Adjust release candidate email template to encourage

[Boyuan Zhang] Add sdf initiated checkpoint support to portable Flink.

[Kyle Weaver] [BEAM-10925] Create wrapper for user function definitions.

[noreply] [BEAM-9561] Run pandas doctests in parallel. (#13286)

[noreply] [BEAM-11074] build_release_candidate usability improvements. (#13290)


------------------------------------------
[...truncated 3.78 MB...]
INFO: Cleaned accumulator 25
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 32
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 23
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 34
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 54
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 71
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 86
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 20
Nov 12, 2020 11:05:46 AM org.apache.beam.sdk.io.WriteFiles$FinalizeTempFileBundles$FinalizeFn process
INFO: Finalizing 4 file results
Nov 12, 2020 11:05:46 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation createMissingEmptyShards
INFO: Finalizing for destination null num shards 4.
Nov 12, 2020 11:05:46 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-2113933755489822076-tmpdir/word-count-beam/.temp-beam-b3330cb1-b3d6-4d9b-911d-90956c81e484/2fd43a32-bd91-4f1f-8938-bcf53a4490ba, shard=0, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@67a75e8, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-2113933755489822076-tmpdir/word-count-beam/counts-00000-of-00004
Nov 12, 2020 11:05:46 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-2113933755489822076-tmpdir/word-count-beam/.temp-beam-b3330cb1-b3d6-4d9b-911d-90956c81e484/23aa92fc-929f-4748-8725-d4064e869b36, shard=1, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@67a75e8, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-2113933755489822076-tmpdir/word-count-beam/counts-00001-of-00004
Nov 12, 2020 11:05:46 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-2113933755489822076-tmpdir/word-count-beam/.temp-beam-b3330cb1-b3d6-4d9b-911d-90956c81e484/3b7aa285-d564-4276-934c-1759141c2b78, shard=2, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@67a75e8, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-2113933755489822076-tmpdir/word-count-beam/counts-00002-of-00004
Nov 12, 2020 11:05:46 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-2113933755489822076-tmpdir/word-count-beam/.temp-beam-b3330cb1-b3d6-4d9b-911d-90956c81e484/c767adcf-6a88-4ccf-8e49-dc8eb7ffbd64, shard=3, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@67a75e8, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-2113933755489822076-tmpdir/word-count-beam/counts-00003-of-00004
Nov 12, 2020 11:05:46 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-2113933755489822076-tmpdir/word-count-beam/.temp-beam-b3330cb1-b3d6-4d9b-911d-90956c81e484/2fd43a32-bd91-4f1f-8938-bcf53a4490ba
Nov 12, 2020 11:05:46 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-2113933755489822076-tmpdir/word-count-beam/.temp-beam-b3330cb1-b3d6-4d9b-911d-90956c81e484/23aa92fc-929f-4748-8725-d4064e869b36
Nov 12, 2020 11:05:46 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-2113933755489822076-tmpdir/word-count-beam/.temp-beam-b3330cb1-b3d6-4d9b-911d-90956c81e484/3b7aa285-d564-4276-934c-1759141c2b78
Nov 12, 2020 11:05:46 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-2113933755489822076-tmpdir/word-count-beam/.temp-beam-b3330cb1-b3d6-4d9b-911d-90956c81e484/c767adcf-6a88-4ccf-8e49-dc8eb7ffbd64
Nov 12, 2020 11:05:46 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
WARNING: Failed to match temporary files under: [/tmp/groovy-generated-2113933755489822076-tmpdir/word-count-beam/.temp-beam-b3330cb1-b3d6-4d9b-911d-90956c81e484/].
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 5.0 (TID 24). 12857 bytes result sent to driver
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 5.0 (TID 24) in 112 ms on localhost (executor driver) (4/4)
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 5.0, whose tasks have all completed, from pool 
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: ShuffleMapStage 5 (repartition at GroupCombineFunctions.java:191) finished in 0.121 s
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: looking for newly runnable stages
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: running: Set()
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: waiting: Set(ResultStage 6)
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: failed: Set()
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting ResultStage 6 (WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous).output MapPartitionsRDD[113] at values at TransformTranslator.java:434), which has no missing parents
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_7 stored as values in memory (estimated size 16.0 KB, free 13.5 GB)
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_7_piece0 stored as bytes in memory (estimated size 7.3 KB, free 13.5 GB)
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added broadcast_7_piece0 in memory on localhost:41167 (size: 7.3 KB, free: 13.5 GB)
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Created broadcast 7 from broadcast at DAGScheduler.scala:1184
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting 4 missing tasks from ResultStage 6 (WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous).output MapPartitionsRDD[113] at values at TransformTranslator.java:434) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Adding task set 6.0 with 4 tasks
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 0.0 in stage 6.0 (TID 28, localhost, executor driver, partition 0, NODE_LOCAL, 7938 bytes)
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 1.0 in stage 6.0 (TID 29, localhost, executor driver, partition 1, NODE_LOCAL, 7938 bytes)
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 2.0 in stage 6.0 (TID 30, localhost, executor driver, partition 2, NODE_LOCAL, 7938 bytes)
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 3.0 in stage 6.0 (TID 31, localhost, executor driver, partition 3, NODE_LOCAL, 7938 bytes)
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 0.0 in stage 6.0 (TID 28)
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 1.0 in stage 6.0 (TID 29)
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 2.0 in stage 6.0 (TID 30)
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 3.0 in stage 6.0 (TID 31)
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 0 ms
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 0 ms
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 6.0 (TID 30). 6496 bytes result sent to driver
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 6.0 (TID 31). 6496 bytes result sent to driver
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 6.0 (TID 29). 6496 bytes result sent to driver
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 6.0 (TID 28). 6496 bytes result sent to driver
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 6.0 (TID 30) in 15 ms on localhost (executor driver) (1/4)
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 6.0 (TID 29) in 17 ms on localhost (executor driver) (2/4)
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 6.0 (TID 31) in 16 ms on localhost (executor driver) (3/4)
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 6.0 (TID 28) in 18 ms on localhost (executor driver) (4/4)
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 6.0, whose tasks have all completed, from pool 
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: ResultStage 6 (foreach at BoundedDataset.java:127) finished in 0.029 s
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Job 1 finished: foreach at BoundedDataset.java:127, took 0.223724 s
Nov 12, 2020 11:05:46 AM org.apache.beam.runners.spark.SparkRunner lambda$run$1
INFO: Batch pipeline execution complete.
Nov 12, 2020 11:05:46 AM org.spark_project.jetty.server.AbstractConnector doStop
INFO: Stopped Spark@6081a434{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Stopped Spark web UI at http://localhost:4040
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: MapOutputTrackerMasterEndpoint stopped!
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: MemoryStore cleared
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManager stopped
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManagerMaster stopped
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: OutputCommitCoordinator stopped!
Nov 12, 2020 11:05:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Successfully stopped SparkContext
grep Foundation counts*
counts-00000-of-00004:Foundation: 1
Verified Foundation: 1
[SUCCESS]

> Task :runners:twister2:runQuickstartJavaTwister2
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/tmp/groovy-generated-8199630579545554063-tmpdir/.m2/repository/org/slf4j/slf4j-jdk14/1.7.30/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/tmp/groovy-generated-8199630579545554063-tmpdir/.m2/repository/org/slf4j/slf4j-log4j12/1.7.25/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]
Nov 12, 2020 11:06:51 AM org.apache.beam.runners.twister2.Twister2Runner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 376 files. Enable logging at DEBUG level to see which files will be staged.
Nov 12, 2020 11:06:52 AM org.apache.beam.runners.twister2.Twister2Runner run
INFO: Translating pipeline to Twister2 program.

> Task :runners:direct-java:runMobileGamingJavaDirect
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occured while executing the Java class. java.lang.NoSuchMethodError: com.google.common.cache.CacheBuilder.expireAfterWrite(Ljava/time/Duration;)Lcom/google/common/cache/CacheBuilder; -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occured while executing the Java class. java.lang.NoSuchMethodError: com.google.common.cache.CacheBuilder.expireAfterWrite(Ljava/time/Duration;)Lcom/google/common/cache/CacheBuilder; -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] Failed command

> Task :runners:direct-java:runMobileGamingJavaDirect FAILED
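The NoSuchMethodError above usually means an older Guava (earlier than 25.0, where the java.time.Duration overloads on CacheBuilder were added) is winning on the quickstart's runtime classpath over the version the Beam snapshot was built against. A minimal sketch of the call site that fails, assuming Guava 25.0+ at compile time; the class name GuavaDurationCheck is illustrative and not part of the word-count-beam project:

    import com.google.common.cache.Cache;
    import com.google.common.cache.CacheBuilder;

    import java.time.Duration;

    public class GuavaDurationCheck {
      public static void main(String[] args) {
        // The java.time.Duration overload the mobile-gaming run trips over; with a
        // pre-25.0 Guava on the runtime classpath this exact call throws the
        // NoSuchMethodError reported by exec-maven-plugin above.
        Cache<String, String> cache =
            CacheBuilder.newBuilder()
                .expireAfterWrite(Duration.ofMinutes(5))
                .build();
        cache.put("probe", "ok");
        System.out.println("expireAfterWrite(Duration) resolved, cached value: " + cache.getIfPresent("probe"));
      }
    }

Compiled against a newer Guava but run against an older one, the expireAfterWrite(Duration) line is exactly where the error above surfaces.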

> Task :runners:twister2:runQuickstartJavaTwister2
Nov 12, 2020 11:07:04 AM org.apache.beam.runners.twister2.Twister2Runner run
WARNING: Twister2 Local Mode currently only supports single worker
Nov 12, 2020 11:07:04 AM edu.iu.dsc.tws.rsched.core.ResourceAllocator loadConfig
INFO: Loaded configuration with twister2_home: /tmp and configuration: /tmp/conf/ and cluster: standalone
grep Foundation counts*
Foundation: 1
Verified Foundation: 1
[SUCCESS]

> Task :runners:google-cloud-dataflow-java:runMobileGamingJavaDataflow
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occured while executing the Java class. Failed to construct instance from factory method DataflowRunner#fromOptions(interface org.apache.beam.sdk.options.PipelineOptions): InvocationTargetException: com.google.common.cache.CacheBuilder.expireAfterWrite(Ljava/time/Duration;)Lcom/google/common/cache/CacheBuilder; -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occured while executing the Java class. Failed to construct instance from factory method DataflowRunner#fromOptions(interface org.apache.beam.sdk.options.PipelineOptions): InvocationTargetException: com.google.common.cache.CacheBuilder.expireAfterWrite(Ljava/time/Duration;)Lcom/google/common/cache/CacheBuilder; -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] Failed command

> Task :runners:google-cloud-dataflow-java:runMobileGamingJavaDataflow FAILED
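Here the same missing Guava method shows up indirectly: DataflowRunner#fromOptions fails with an InvocationTargetException whose cause is the expireAfterWrite(Ljava/time/Duration;) lookup. When this kind of conflict is suspected, a small classpath probe can show which Guava jar is actually loaded. The sketch below is illustrative only (GuavaOriginProbe is not part of the quickstart) and assumes it is run with the same classpath as the failing exec-maven-plugin invocation:

    import com.google.common.cache.CacheBuilder;

    import java.security.CodeSource;

    public class GuavaOriginProbe {
      public static void main(String[] args) throws Exception {
        // Which jar did CacheBuilder actually load from? A stale guava-*.jar here is
        // the usual explanation for the errors above.
        CodeSource source = CacheBuilder.class.getProtectionDomain().getCodeSource();
        System.out.println(source == null ? "unknown (bootstrap?)" : source.getLocation().toString());

        // Throws NoSuchMethodException if that jar predates the Duration overloads.
        CacheBuilder.class.getMethod("expireAfterWrite", java.time.Duration.class);
        System.out.println("expireAfterWrite(java.time.Duration) is present");
      }
    }

If the jar printed is older than expected, forcing a newer com.google.guava:guava in the generated word-count-beam pom would be the natural next step, though that is only a guess from this log.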

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:runQuickstartJavaDataflow'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:direct-java:runMobileGamingJavaDirect'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:runMobileGamingJavaDataflow'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.7/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 10s
8 actionable tasks: 7 executed, 1 from cache

Publishing build scan...
https://gradle.com/s/4o3twkk2any4y

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostRelease_NightlySnapshot #1171

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostRelease_NightlySnapshot/1171/display/redirect>

Changes:


------------------------------------------
[...truncated 3.77 MB...]
INFO: Started 0 remote fetches in 1 ms
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 5.0 (TID 25). 5149 bytes result sent to driver
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 5.0 (TID 27). 5149 bytes result sent to driver
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 5.0 (TID 26). 5149 bytes result sent to driver
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 5.0 (TID 25) in 47 ms on localhost (executor driver) (1/4)
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 5.0 (TID 26) in 47 ms on localhost (executor driver) (2/4)
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 5.0 (TID 27) in 47 ms on localhost (executor driver) (3/4)
Nov 12, 2020 2:37:42 AM org.apache.beam.sdk.io.WriteFiles$FinalizeTempFileBundles$FinalizeFn process
INFO: Finalizing 4 file results
Nov 12, 2020 2:37:42 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation createMissingEmptyShards
INFO: Finalizing for destination null num shards 4.
Nov 12, 2020 2:37:42 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-5651750721862638029-tmpdir/word-count-beam/.temp-beam-be58160a-1454-4bbc-89f0-3a271b373d56/483b02d0-7b82-482f-a3ac-926bbded16a5, shard=0, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@4cf622ee, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-5651750721862638029-tmpdir/word-count-beam/counts-00000-of-00004
Nov 12, 2020 2:37:42 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-5651750721862638029-tmpdir/word-count-beam/.temp-beam-be58160a-1454-4bbc-89f0-3a271b373d56/1b4762fd-a183-463e-b6b0-43f1bc659d4d, shard=1, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@4cf622ee, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-5651750721862638029-tmpdir/word-count-beam/counts-00001-of-00004
Nov 12, 2020 2:37:42 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-5651750721862638029-tmpdir/word-count-beam/.temp-beam-be58160a-1454-4bbc-89f0-3a271b373d56/2ffbe669-f29a-4235-931a-402f513415c6, shard=2, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@4cf622ee, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-5651750721862638029-tmpdir/word-count-beam/counts-00002-of-00004
Nov 12, 2020 2:37:42 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-5651750721862638029-tmpdir/word-count-beam/.temp-beam-be58160a-1454-4bbc-89f0-3a271b373d56/672f251c-7b3b-4285-ba76-12f16d6514e0, shard=3, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@4cf622ee, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-5651750721862638029-tmpdir/word-count-beam/counts-00003-of-00004
Nov 12, 2020 2:37:42 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-5651750721862638029-tmpdir/word-count-beam/.temp-beam-be58160a-1454-4bbc-89f0-3a271b373d56/2ffbe669-f29a-4235-931a-402f513415c6
Nov 12, 2020 2:37:42 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-5651750721862638029-tmpdir/word-count-beam/.temp-beam-be58160a-1454-4bbc-89f0-3a271b373d56/483b02d0-7b82-482f-a3ac-926bbded16a5
Nov 12, 2020 2:37:42 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-5651750721862638029-tmpdir/word-count-beam/.temp-beam-be58160a-1454-4bbc-89f0-3a271b373d56/672f251c-7b3b-4285-ba76-12f16d6514e0
Nov 12, 2020 2:37:42 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-5651750721862638029-tmpdir/word-count-beam/.temp-beam-be58160a-1454-4bbc-89f0-3a271b373d56/1b4762fd-a183-463e-b6b0-43f1bc659d4d
Nov 12, 2020 2:37:42 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
WARNING: Failed to match temporary files under: [/tmp/groovy-generated-5651750721862638029-tmpdir/word-count-beam/.temp-beam-be58160a-1454-4bbc-89f0-3a271b373d56/].
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 5.0 (TID 24). 12857 bytes result sent to driver
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 5.0 (TID 24) in 130 ms on localhost (executor driver) (4/4)
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 5.0, whose tasks have all completed, from pool 
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: ShuffleMapStage 5 (repartition at GroupCombineFunctions.java:191) finished in 0.145 s
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: looking for newly runnable stages
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: running: Set()
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: waiting: Set(ResultStage 6)
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: failed: Set()
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting ResultStage 6 (WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous).output MapPartitionsRDD[113] at values at TransformTranslator.java:434), which has no missing parents
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_7 stored as values in memory (estimated size 16.0 KB, free 13.5 GB)
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_7_piece0 stored as bytes in memory (estimated size 7.3 KB, free 13.5 GB)
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added broadcast_7_piece0 in memory on localhost:33075 (size: 7.3 KB, free: 13.5 GB)
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Created broadcast 7 from broadcast at DAGScheduler.scala:1184
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting 4 missing tasks from ResultStage 6 (WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous).output MapPartitionsRDD[113] at values at TransformTranslator.java:434) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Adding task set 6.0 with 4 tasks
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 0.0 in stage 6.0 (TID 28, localhost, executor driver, partition 0, NODE_LOCAL, 7938 bytes)
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 1.0 in stage 6.0 (TID 29, localhost, executor driver, partition 1, NODE_LOCAL, 7938 bytes)
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 2.0 in stage 6.0 (TID 30, localhost, executor driver, partition 2, NODE_LOCAL, 7938 bytes)
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 3.0 in stage 6.0 (TID 31, localhost, executor driver, partition 3, NODE_LOCAL, 7938 bytes)
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 3.0 in stage 6.0 (TID 31)
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 0.0 in stage 6.0 (TID 28)
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 1.0 in stage 6.0 (TID 29)
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 2.0 in stage 6.0 (TID 30)
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 0 ms
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 6.0 (TID 30). 6453 bytes result sent to driver
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 6.0 (TID 29). 6453 bytes result sent to driver
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 6.0 (TID 31). 6453 bytes result sent to driver
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 6.0 (TID 28). 6453 bytes result sent to driver
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 6.0 (TID 30) in 18 ms on localhost (executor driver) (1/4)
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 6.0 (TID 29) in 20 ms on localhost (executor driver) (2/4)
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 6.0 (TID 31) in 18 ms on localhost (executor driver) (3/4)
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 6.0 (TID 28) in 22 ms on localhost (executor driver) (4/4)
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 6.0, whose tasks have all completed, from pool 
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: ResultStage 6 (foreach at BoundedDataset.java:127) finished in 0.035 s
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Job 1 finished: foreach at BoundedDataset.java:127, took 0.245401 s
Nov 12, 2020 2:37:42 AM org.apache.beam.runners.spark.SparkRunner lambda$run$1
INFO: Batch pipeline execution complete.
Nov 12, 2020 2:37:42 AM org.spark_project.jetty.server.AbstractConnector doStop
INFO: Stopped Spark@3d8fae82{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Stopped Spark web UI at http://localhost:4040
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: MapOutputTrackerMasterEndpoint stopped!
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: MemoryStore cleared
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManager stopped
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManagerMaster stopped
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: OutputCommitCoordinator stopped!
Nov 12, 2020 2:37:42 AM org.apache.spark.internal.Logging$class logInfo
INFO: Successfully stopped SparkContext
grep Foundation counts*
counts-00000-of-00004:Foundation: 1
Verified Foundation: 1
[SUCCESS]

> Task :runners:twister2:runQuickstartJavaTwister2
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/tmp/groovy-generated-8536797268774615331-tmpdir/.m2/repository/org/slf4j/slf4j-jdk14/1.7.30/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/tmp/groovy-generated-8536797268774615331-tmpdir/.m2/repository/org/slf4j/slf4j-log4j12/1.7.25/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]
Nov 12, 2020 2:38:49 AM org.apache.beam.runners.twister2.Twister2Runner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 376 files. Enable logging at DEBUG level to see which files will be staged.
Nov 12, 2020 2:38:50 AM org.apache.beam.runners.twister2.Twister2Runner run
INFO: Translating pipeline to Twister2 program.

> Task :runners:google-cloud-dataflow-java:runMobileGamingJavaDataflow
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occured while executing the Java class. Failed to construct instance from factory method DataflowRunner#fromOptions(interface org.apache.beam.sdk.options.PipelineOptions): InvocationTargetException: com.google.common.cache.CacheBuilder.expireAfterWrite(Ljava/time/Duration;)Lcom/google/common/cache/CacheBuilder; -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occured while executing the Java class. Failed to construct instance from factory method DataflowRunner#fromOptions(interface org.apache.beam.sdk.options.PipelineOptions): InvocationTargetException: com.google.common.cache.CacheBuilder.expireAfterWrite(Ljava/time/Duration;)Lcom/google/common/cache/CacheBuilder; -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] Failed command

> Task :runners:google-cloud-dataflow-java:runMobileGamingJavaDataflow FAILED

> Task :runners:twister2:runQuickstartJavaTwister2
Nov 12, 2020 2:39:03 AM org.apache.beam.runners.twister2.Twister2Runner run
WARNING: Twister2 Local Mode currently only supports single worker
Nov 12, 2020 2:39:03 AM edu.iu.dsc.tws.rsched.core.ResourceAllocator loadConfig
INFO: Loaded configuration with twister2_home: /tmp and configuration: /tmp/conf/ and cluster: standalone

> Task :runners:direct-java:runMobileGamingJavaDirect
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occured while executing the Java class. java.lang.NoSuchMethodError: com.google.common.cache.CacheBuilder.expireAfterWrite(Ljava/time/Duration;)Lcom/google/common/cache/CacheBuilder; -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occured while executing the Java class. java.lang.NoSuchMethodError: com.google.common.cache.CacheBuilder.expireAfterWrite(Ljava/time/Duration;)Lcom/google/common/cache/CacheBuilder; -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] Failed command

> Task :runners:direct-java:runMobileGamingJavaDirect FAILED

> Task :runners:twister2:runQuickstartJavaTwister2
grep Foundation counts*
Foundation: 1
Verified Foundation: 1
[SUCCESS]

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:runQuickstartJavaDataflow'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:runMobileGamingJavaDataflow'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:direct-java:runMobileGamingJavaDirect'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.7/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 21s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/xp62htas2yzv4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostRelease_NightlySnapshot #1170

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostRelease_NightlySnapshot/1170/display/redirect?page=changes>

Changes:

[samuelw] [BEAM-11216] Fix ReaderCache to not resume from a cached reader for work

[Kenneth Knowles] Exclude more KuduIO from checker since it crashes under Java 11 which

[noreply] Bump Gradle to 6.7 (#13148)


------------------------------------------
[...truncated 3.78 MB...]
INFO: Started 0 remote fetches in 0 ms
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 2 ms
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 5.0 (TID 25). 5149 bytes result sent to driver
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 5.0 (TID 27). 5149 bytes result sent to driver
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 5.0 (TID 26). 5149 bytes result sent to driver
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 5.0 (TID 27) in 47 ms on localhost (executor driver) (1/4)
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 5.0 (TID 26) in 47 ms on localhost (executor driver) (2/4)
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 5.0 (TID 25) in 49 ms on localhost (executor driver) (3/4)
Nov 11, 2020 10:32:42 PM org.apache.beam.sdk.io.WriteFiles$FinalizeTempFileBundles$FinalizeFn process
INFO: Finalizing 4 file results
Nov 11, 2020 10:32:42 PM org.apache.beam.sdk.io.FileBasedSink$WriteOperation createMissingEmptyShards
INFO: Finalizing for destination null num shards 4.
Nov 11, 2020 10:32:42 PM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-93640529022458297-tmpdir/word-count-beam/.temp-beam-43ec8b90-8eef-45bb-85d3-641f8abf0f0b/dc78937e-acf2-462d-8b4c-22164a3a4da8, shard=0, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@476a0959, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-93640529022458297-tmpdir/word-count-beam/counts-00000-of-00004
Nov 11, 2020 10:32:42 PM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-93640529022458297-tmpdir/word-count-beam/.temp-beam-43ec8b90-8eef-45bb-85d3-641f8abf0f0b/a2df2cc5-4115-4dc2-b5e2-6956059c8a03, shard=1, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@476a0959, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-93640529022458297-tmpdir/word-count-beam/counts-00001-of-00004
Nov 11, 2020 10:32:42 PM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-93640529022458297-tmpdir/word-count-beam/.temp-beam-43ec8b90-8eef-45bb-85d3-641f8abf0f0b/c9470c5c-c240-46a0-8e27-bfd764d176c3, shard=2, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@476a0959, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-93640529022458297-tmpdir/word-count-beam/counts-00002-of-00004
Nov 11, 2020 10:32:42 PM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-93640529022458297-tmpdir/word-count-beam/.temp-beam-43ec8b90-8eef-45bb-85d3-641f8abf0f0b/07ea331f-5abb-49c1-80b8-5703a3c9e886, shard=3, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@476a0959, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-93640529022458297-tmpdir/word-count-beam/counts-00003-of-00004
Nov 11, 2020 10:32:42 PM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-93640529022458297-tmpdir/word-count-beam/.temp-beam-43ec8b90-8eef-45bb-85d3-641f8abf0f0b/c9470c5c-c240-46a0-8e27-bfd764d176c3
Nov 11, 2020 10:32:42 PM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-93640529022458297-tmpdir/word-count-beam/.temp-beam-43ec8b90-8eef-45bb-85d3-641f8abf0f0b/dc78937e-acf2-462d-8b4c-22164a3a4da8
Nov 11, 2020 10:32:42 PM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-93640529022458297-tmpdir/word-count-beam/.temp-beam-43ec8b90-8eef-45bb-85d3-641f8abf0f0b/07ea331f-5abb-49c1-80b8-5703a3c9e886
Nov 11, 2020 10:32:42 PM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-93640529022458297-tmpdir/word-count-beam/.temp-beam-43ec8b90-8eef-45bb-85d3-641f8abf0f0b/a2df2cc5-4115-4dc2-b5e2-6956059c8a03
Nov 11, 2020 10:32:42 PM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
WARNING: Failed to match temporary files under: [/tmp/groovy-generated-93640529022458297-tmpdir/word-count-beam/.temp-beam-43ec8b90-8eef-45bb-85d3-641f8abf0f0b/].
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 5.0 (TID 24). 12857 bytes result sent to driver
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 5.0 (TID 24) in 118 ms on localhost (executor driver) (4/4)
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 5.0, whose tasks have all completed, from pool 
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: ShuffleMapStage 5 (repartition at GroupCombineFunctions.java:191) finished in 0.131 s
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: looking for newly runnable stages
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: running: Set()
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: waiting: Set(ResultStage 6)
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: failed: Set()
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting ResultStage 6 (WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous).output MapPartitionsRDD[113] at values at TransformTranslator.java:434), which has no missing parents
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_7 stored as values in memory (estimated size 16.0 KB, free 13.5 GB)
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_7_piece0 stored as bytes in memory (estimated size 7.3 KB, free 13.5 GB)
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Added broadcast_7_piece0 in memory on localhost:34267 (size: 7.3 KB, free: 13.5 GB)
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Created broadcast 7 from broadcast at DAGScheduler.scala:1184
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting 4 missing tasks from ResultStage 6 (WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous).output MapPartitionsRDD[113] at values at TransformTranslator.java:434) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Adding task set 6.0 with 4 tasks
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 0.0 in stage 6.0 (TID 28, localhost, executor driver, partition 0, NODE_LOCAL, 7938 bytes)
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 1.0 in stage 6.0 (TID 29, localhost, executor driver, partition 1, NODE_LOCAL, 7938 bytes)
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 2.0 in stage 6.0 (TID 30, localhost, executor driver, partition 2, NODE_LOCAL, 7938 bytes)
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 3.0 in stage 6.0 (TID 31, localhost, executor driver, partition 3, NODE_LOCAL, 7938 bytes)
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 2.0 in stage 6.0 (TID 30)
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 1.0 in stage 6.0 (TID 29)
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 0.0 in stage 6.0 (TID 28)
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 3.0 in stage 6.0 (TID 31)
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 2 ms
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 6.0 (TID 31). 6453 bytes result sent to driver
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 6.0 (TID 28). 6453 bytes result sent to driver
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 6.0 (TID 29). 6453 bytes result sent to driver
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 6.0 (TID 30). 6496 bytes result sent to driver
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 6.0 (TID 31) in 16 ms on localhost (executor driver) (1/4)
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 6.0 (TID 29) in 18 ms on localhost (executor driver) (2/4)
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 6.0 (TID 28) in 19 ms on localhost (executor driver) (3/4)
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 6.0 (TID 30) in 18 ms on localhost (executor driver) (4/4)
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 6.0, whose tasks have all completed, from pool 
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: ResultStage 6 (foreach at BoundedDataset.java:127) finished in 0.030 s
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Job 1 finished: foreach at BoundedDataset.java:127, took 0.216734 s
Nov 11, 2020 10:32:42 PM org.apache.beam.runners.spark.SparkRunner lambda$run$1
INFO: Batch pipeline execution complete.
Nov 11, 2020 10:32:42 PM org.spark_project.jetty.server.AbstractConnector doStop
INFO: Stopped Spark@2803aed{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Stopped Spark web UI at http://localhost:4040
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: MapOutputTrackerMasterEndpoint stopped!
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: MemoryStore cleared
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManager stopped
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManagerMaster stopped
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: OutputCommitCoordinator stopped!
Nov 11, 2020 10:32:42 PM org.apache.spark.internal.Logging$class logInfo
INFO: Successfully stopped SparkContext
grep Foundation counts*
counts-00000-of-00004:Foundation: 1
Verified Foundation: 1
[SUCCESS]

> Task :runners:twister2:runQuickstartJavaTwister2
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/tmp/groovy-generated-8437882512566309331-tmpdir/.m2/repository/org/slf4j/slf4j-jdk14/1.7.30/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/tmp/groovy-generated-8437882512566309331-tmpdir/.m2/repository/org/slf4j/slf4j-log4j12/1.7.25/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]
Nov 11, 2020 10:33:54 PM org.apache.beam.runners.twister2.Twister2Runner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 376 files. Enable logging at DEBUG level to see which files will be staged.
Nov 11, 2020 10:33:55 PM org.apache.beam.runners.twister2.Twister2Runner run
INFO: Translating pipeline to Twister2 program.

> Task :runners:google-cloud-dataflow-java:runQuickstartJavaDataflow
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occured while executing the Java class. Failed to construct instance from factory method DataflowRunner#fromOptions(interface org.apache.beam.sdk.options.PipelineOptions): InvocationTargetException: com.google.common.cache.CacheBuilder.expireAfterWrite(Ljava/time/Duration;)Lcom/google/common/cache/CacheBuilder; -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occured while executing the Java class. Failed to construct instance from factory method DataflowRunner#fromOptions(interface org.apache.beam.sdk.options.PipelineOptions): InvocationTargetException: com.google.common.cache.CacheBuilder.expireAfterWrite(Ljava/time/Duration;)Lcom/google/common/cache/CacheBuilder; -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] Failed command

> Task :runners:google-cloud-dataflow-java:runQuickstartJavaDataflow FAILED

> Task :runners:direct-java:runMobileGamingJavaDirect
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occured while executing the Java class. java.lang.NoSuchMethodError: com.google.common.cache.CacheBuilder.expireAfterWrite(Ljava/time/Duration;)Lcom/google/common/cache/CacheBuilder; -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occured while executing the Java class. java.lang.NoSuchMethodError: com.google.common.cache.CacheBuilder.expireAfterWrite(Ljava/time/Duration;)Lcom/google/common/cache/CacheBuilder; -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] Failed command

> Task :runners:direct-java:runMobileGamingJavaDirect FAILED
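
The Dataflow and direct-runner failures above reduce to the same root error: java.lang.NoSuchMethodError for com.google.common.cache.CacheBuilder.expireAfterWrite(Ljava/time/Duration;). The java.time.Duration overload of expireAfterWrite only exists in newer Guava releases (added around Guava 25.0), so code compiled against a recent Guava fails at runtime when an older Guava wins on the quickstart's classpath. A minimal sketch of the incompatibility, assuming a newer Guava at compile time and an older one at runtime (class name and version numbers are illustrative):

import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;
import java.time.Duration;
import java.util.concurrent.TimeUnit;

public class GuavaCacheCompatSketch {
  public static void main(String[] args) {
    // Compiled against a recent Guava: the java.time.Duration overload resolves.
    // With an older Guava on the runtime classpath, this call fails with
    // java.lang.NoSuchMethodError:
    //   com.google.common.cache.CacheBuilder.expireAfterWrite(Ljava/time/Duration;)...
    Cache<String, String> cache =
        CacheBuilder.newBuilder().expireAfterWrite(Duration.ofMinutes(10)).build();

    // The (long, TimeUnit) overload is present in both old and new Guava versions.
    Cache<String, String> legacyStyleCache =
        CacheBuilder.newBuilder().expireAfterWrite(10, TimeUnit.MINUTES).build();

    cache.put("k", "v");
    legacyStyleCache.put("k", "v");
    System.out.println(cache.getIfPresent("k") + " " + legacyStyleCache.getIfPresent("k"));
  }
}

Aligning on a single Guava version across the quickstart's dependency tree (for example through dependencyManagement or a BOM) is the usual way this kind of mismatch gets resolved.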

> Task :runners:twister2:runQuickstartJavaTwister2
Nov 11, 2020 10:34:11 PM org.apache.beam.runners.twister2.Twister2Runner run
WARNING: Twister2 Local Mode currently only supports single worker
Nov 11, 2020 10:34:12 PM edu.iu.dsc.tws.rsched.core.ResourceAllocator loadConfig
INFO: Loaded configuration with twister2_home: /tmp and configuration: /tmp/conf/ and cluster: standalone
grep Foundation counts*
Foundation: 1
Verified Foundation: 1
[SUCCESS]

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:runMobileGamingJavaDataflow'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:runQuickstartJavaDataflow'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:direct-java:runMobileGamingJavaDirect'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.7/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 13s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/b5crwkquvckxk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostRelease_NightlySnapshot #1169

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostRelease_NightlySnapshot/1169/display/redirect?page=changes>

Changes:

[Maximilian Michels] Refactor checkpointing configuration code

[Robert Bradshaw] [BEAM-9547] stubs for non-implemented IO.

[kwu] BEAM-11194: Add timer family id support in KeyedTimerData

[Maximilian Michels] [BEAM-9855] Provide an option to configure the Flink state backend

[Robert Bradshaw] Allow not/won't implement helpers to be used for functions as well as

[frank] [BEAM-9804] Allow user configuration of BigQuery temporary dataset

[noreply] [BEAM-9980] Don't hardcode Python version in loadtests and make it

[noreply] [BEAM-8451] annotate python only sections (#11706)

[Kyle Weaver] [BEAM-9855] Fix merge conflict between #13116 and #13240.

[noreply] [BEAM-11033] Identify Dataflow metrics for portable job path based on


------------------------------------------
[...truncated 3.79 MB...]
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 5.0 (TID 24). 12857 bytes result sent to driver
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 5.0 (TID 24) in 128 ms on localhost (executor driver) (4/4)
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 5.0, whose tasks have all completed, from pool 
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: ShuffleMapStage 5 (repartition at GroupCombineFunctions.java:191) finished in 0.138 s
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: looking for newly runnable stages
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: running: Set()
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: waiting: Set(ResultStage 6)
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: failed: Set()
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting ResultStage 6 (WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous).output MapPartitionsRDD[113] at values at TransformTranslator.java:434), which has no missing parents
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_7 stored as values in memory (estimated size 16.0 KB, free 13.5 GB)
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_7_piece0 stored as bytes in memory (estimated size 7.3 KB, free 13.5 GB)
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added broadcast_7_piece0 in memory on localhost:39665 (size: 7.3 KB, free: 13.5 GB)
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: Created broadcast 7 from broadcast at DAGScheduler.scala:1184
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting 4 missing tasks from ResultStage 6 (WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous).output MapPartitionsRDD[113] at values at TransformTranslator.java:434) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: Adding task set 6.0 with 4 tasks
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 0.0 in stage 6.0 (TID 28, localhost, executor driver, partition 0, NODE_LOCAL, 7938 bytes)
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 1.0 in stage 6.0 (TID 29, localhost, executor driver, partition 1, NODE_LOCAL, 7938 bytes)
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 2.0 in stage 6.0 (TID 30, localhost, executor driver, partition 2, NODE_LOCAL, 7938 bytes)
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 3.0 in stage 6.0 (TID 31, localhost, executor driver, partition 3, NODE_LOCAL, 7938 bytes)
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 1.0 in stage 6.0 (TID 29)
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 0.0 in stage 6.0 (TID 28)
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 2.0 in stage 6.0 (TID 30)
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 3.0 in stage 6.0 (TID 31)
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 0 ms
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 0 ms
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 0 ms
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 6.0 (TID 29). 6496 bytes result sent to driver
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 6.0 (TID 30). 6496 bytes result sent to driver
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 6.0 (TID 28). 6496 bytes result sent to driver
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 6.0 (TID 31). 6496 bytes result sent to driver
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 6.0 (TID 31) in 17 ms on localhost (executor driver) (1/4)
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 6.0 (TID 28) in 19 ms on localhost (executor driver) (2/4)
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 6.0 (TID 30) in 19 ms on localhost (executor driver) (3/4)
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 6.0 (TID 29) in 19 ms on localhost (executor driver) (4/4)
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 6.0, whose tasks have all completed, from pool 
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: ResultStage 6 (foreach at BoundedDataset.java:127) finished in 0.033 s
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: Job 1 finished: foreach at BoundedDataset.java:127, took 0.226097 s
Nov 11, 2020 11:06:00 AM org.apache.beam.runners.spark.SparkRunner lambda$run$1
INFO: Batch pipeline execution complete.
Nov 11, 2020 11:06:00 AM org.spark_project.jetty.server.AbstractConnector doStop
INFO: Stopped Spark@36d7a04f{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: Stopped Spark web UI at http://localhost:4040
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: MapOutputTrackerMasterEndpoint stopped!
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: MemoryStore cleared
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManager stopped
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManagerMaster stopped
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: OutputCommitCoordinator stopped!
Nov 11, 2020 11:06:00 AM org.apache.spark.internal.Logging$class logInfo
INFO: Successfully stopped SparkContext
grep Foundation counts*
counts-00000-of-00004:Foundation: 1
Verified Foundation: 1
[SUCCESS]

> Task :runners:twister2:runQuickstartJavaTwister2
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/tmp/groovy-generated-6041893170543183190-tmpdir/.m2/repository/org/slf4j/slf4j-jdk14/1.7.30/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/tmp/groovy-generated-6041893170543183190-tmpdir/.m2/repository/org/slf4j/slf4j-log4j12/1.7.25/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]
Nov 11, 2020 11:07:06 AM org.apache.beam.runners.twister2.Twister2Runner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 376 files. Enable logging at DEBUG level to see which files will be staged.
Nov 11, 2020 11:07:07 AM org.apache.beam.runners.twister2.Twister2Runner run
INFO: Translating pipeline to Twister2 program.

> Task :runners:google-cloud-dataflow-java:runMobileGamingJavaDataflow
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occured while executing the Java class. Failed to construct instance from factory method DataflowRunner#fromOptions(interface org.apache.beam.sdk.options.PipelineOptions): InvocationTargetException: com.google.common.cache.CacheBuilder.expireAfterWrite(Ljava/time/Duration;)Lcom/google/common/cache/CacheBuilder; -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occured while executing the Java class. Failed to construct instance from factory method DataflowRunner#fromOptions(interface org.apache.beam.sdk.options.PipelineOptions): InvocationTargetException: com.google.common.cache.CacheBuilder.expireAfterWrite(Ljava/time/Duration;)Lcom/google/common/cache/CacheBuilder; -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] Failed command

> Task :runners:google-cloud-dataflow-java:runMobileGamingJavaDataflow FAILED

> Task :runners:direct-java:runQuickstartJavaDirect
Nov 11, 2020 11:07:15 AM org.apache.beam.sdk.io.FileBasedSource getEstimatedSizeBytes
INFO: Filepattern pom.xml matched 1 files with total size 16323
Nov 11, 2020 11:07:15 AM org.apache.beam.sdk.io.FileBasedSource split
INFO: Splitting filepattern pom.xml into bundles of size 816 took 1 ms and produced 1 files and 20 bundles
Nov 11, 2020 11:07:19 AM org.apache.beam.sdk.io.WriteFiles$WriteShardsIntoTempFilesFn processElement
INFO: Opening writer 1edfd206-8c43-422e-9fa4-59a43e7c34ff for window org.apache.beam.sdk.transforms.windowing.GlobalWindow@4ad366b5 pane PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0} destination null
Nov 11, 2020 11:07:19 AM org.apache.beam.sdk.io.WriteFiles$WriteShardsIntoTempFilesFn processElement
INFO: Opening writer b0f86c00-35b3-4940-a959-131b285ce300 for window org.apache.beam.sdk.transforms.windowing.GlobalWindow@4ad366b5 pane PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0} destination null
Nov 11, 2020 11:07:19 AM org.apache.beam.sdk.io.WriteFiles$WriteShardsIntoTempFilesFn processElement
INFO: Opening writer 18ebef82-e391-4014-9837-228f08219dfa for window org.apache.beam.sdk.transforms.windowing.GlobalWindow@4ad366b5 pane PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0} destination null
Nov 11, 2020 11:07:19 AM org.apache.beam.sdk.io.WriteFiles$WriteShardsIntoTempFilesFn processElement
INFO: Opening writer 8542bc90-7c86-45be-b275-cc670c1ca311 for window org.apache.beam.sdk.transforms.windowing.GlobalWindow@4ad366b5 pane PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0} destination null
Nov 11, 2020 11:07:19 AM org.apache.beam.sdk.io.WriteFiles$WriteShardsIntoTempFilesFn processElement
INFO: Opening writer c2968ab1-45d2-4ec1-91eb-ae61b5318fcf for window org.apache.beam.sdk.transforms.windowing.GlobalWindow@4ad366b5 pane PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0} destination null
Nov 11, 2020 11:07:19 AM org.apache.beam.sdk.io.FileBasedSink$Writer close
INFO: Successfully wrote temporary file /tmp/groovy-generated-1653613067624275701-tmpdir/word-count-beam/.temp-beam-bb194005-d5c2-4c71-ae3d-8d02000c9c90/1edfd206-8c43-422e-9fa4-59a43e7c34ff
Nov 11, 2020 11:07:19 AM org.apache.beam.sdk.io.FileBasedSink$Writer close
INFO: Successfully wrote temporary file /tmp/groovy-generated-1653613067624275701-tmpdir/word-count-beam/.temp-beam-bb194005-d5c2-4c71-ae3d-8d02000c9c90/b0f86c00-35b3-4940-a959-131b285ce300
Nov 11, 2020 11:07:19 AM org.apache.beam.sdk.io.FileBasedSink$Writer close
INFO: Successfully wrote temporary file /tmp/groovy-generated-1653613067624275701-tmpdir/word-count-beam/.temp-beam-bb194005-d5c2-4c71-ae3d-8d02000c9c90/18ebef82-e391-4014-9837-228f08219dfa
Nov 11, 2020 11:07:19 AM org.apache.beam.sdk.io.FileBasedSink$Writer close
INFO: Successfully wrote temporary file /tmp/groovy-generated-1653613067624275701-tmpdir/word-count-beam/.temp-beam-bb194005-d5c2-4c71-ae3d-8d02000c9c90/c2968ab1-45d2-4ec1-91eb-ae61b5318fcf
Nov 11, 2020 11:07:19 AM org.apache.beam.sdk.io.FileBasedSink$Writer close
INFO: Successfully wrote temporary file /tmp/groovy-generated-1653613067624275701-tmpdir/word-count-beam/.temp-beam-bb194005-d5c2-4c71-ae3d-8d02000c9c90/8542bc90-7c86-45be-b275-cc670c1ca311
Nov 11, 2020 11:07:19 AM org.apache.beam.sdk.io.WriteFiles$FinalizeTempFileBundles$FinalizeFn process
INFO: Finalizing 5 file results
Nov 11, 2020 11:07:19 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation createMissingEmptyShards
INFO: Finalizing for destination null num shards 5.
Nov 11, 2020 11:07:19 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-1653613067624275701-tmpdir/word-count-beam/.temp-beam-bb194005-d5c2-4c71-ae3d-8d02000c9c90/1edfd206-8c43-422e-9fa4-59a43e7c34ff, shard=4, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@4ad366b5, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-1653613067624275701-tmpdir/word-count-beam/counts-00004-of-00005
Nov 11, 2020 11:07:19 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-1653613067624275701-tmpdir/word-count-beam/.temp-beam-bb194005-d5c2-4c71-ae3d-8d02000c9c90/18ebef82-e391-4014-9837-228f08219dfa, shard=2, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@4ad366b5, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-1653613067624275701-tmpdir/word-count-beam/counts-00002-of-00005
Nov 11, 2020 11:07:19 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-1653613067624275701-tmpdir/word-count-beam/.temp-beam-bb194005-d5c2-4c71-ae3d-8d02000c9c90/c2968ab1-45d2-4ec1-91eb-ae61b5318fcf, shard=1, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@4ad366b5, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-1653613067624275701-tmpdir/word-count-beam/counts-00001-of-00005
Nov 11, 2020 11:07:19 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-1653613067624275701-tmpdir/word-count-beam/.temp-beam-bb194005-d5c2-4c71-ae3d-8d02000c9c90/8542bc90-7c86-45be-b275-cc670c1ca311, shard=3, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@4ad366b5, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-1653613067624275701-tmpdir/word-count-beam/counts-00003-of-00005
Nov 11, 2020 11:07:19 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-1653613067624275701-tmpdir/word-count-beam/.temp-beam-bb194005-d5c2-4c71-ae3d-8d02000c9c90/b0f86c00-35b3-4940-a959-131b285ce300, shard=0, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@4ad366b5, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-1653613067624275701-tmpdir/word-count-beam/counts-00000-of-00005
Nov 11, 2020 11:07:19 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-1653613067624275701-tmpdir/word-count-beam/.temp-beam-bb194005-d5c2-4c71-ae3d-8d02000c9c90/b0f86c00-35b3-4940-a959-131b285ce300
Nov 11, 2020 11:07:19 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-1653613067624275701-tmpdir/word-count-beam/.temp-beam-bb194005-d5c2-4c71-ae3d-8d02000c9c90/1edfd206-8c43-422e-9fa4-59a43e7c34ff
Nov 11, 2020 11:07:19 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-1653613067624275701-tmpdir/word-count-beam/.temp-beam-bb194005-d5c2-4c71-ae3d-8d02000c9c90/8542bc90-7c86-45be-b275-cc670c1ca311
Nov 11, 2020 11:07:19 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-1653613067624275701-tmpdir/word-count-beam/.temp-beam-bb194005-d5c2-4c71-ae3d-8d02000c9c90/c2968ab1-45d2-4ec1-91eb-ae61b5318fcf
Nov 11, 2020 11:07:19 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-1653613067624275701-tmpdir/word-count-beam/.temp-beam-bb194005-d5c2-4c71-ae3d-8d02000c9c90/18ebef82-e391-4014-9837-228f08219dfa
Nov 11, 2020 11:07:19 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
WARNING: Failed to match temporary files under: [/tmp/groovy-generated-1653613067624275701-tmpdir/word-count-beam/.temp-beam-bb194005-d5c2-4c71-ae3d-8d02000c9c90/].
grep Foundation counts*
counts-00004-of-00005:Foundation: 1
Verified Foundation: 1
[SUCCESS]

> Task :runners:twister2:runQuickstartJavaTwister2
Nov 11, 2020 11:07:22 AM org.apache.beam.runners.twister2.Twister2Runner run
WARNING: Twister2 Local Mode currently only supports single worker
Nov 11, 2020 11:07:22 AM edu.iu.dsc.tws.rsched.core.ResourceAllocator loadConfig
INFO: Loaded configuration with twister2_home: /tmp and configuration: /tmp/conf/ and cluster: standalone
grep Foundation counts*
Foundation: 1
Verified Foundation: 1
[SUCCESS]

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:direct-java:runMobileGamingJavaDirect'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:runQuickstartJavaDataflow'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:runMobileGamingJavaDataflow'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 7s
8 actionable tasks: 7 executed, 1 from cache

Publishing build scan...
https://gradle.com/s/4lbvr6qdz2exm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostRelease_NightlySnapshot #1168

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostRelease_NightlySnapshot/1168/display/redirect>

Changes:


------------------------------------------
[...truncated 3.77 MB...]
Nov 11, 2020 1:45:15 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-2581139715719757996-tmpdir/word-count-beam/.temp-beam-cdc59aa7-1cf8-47bc-8214-39cb464d2338/49a6d58e-e48e-4347-ab99-83c06b8bcdb9
Nov 11, 2020 1:45:15 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-2581139715719757996-tmpdir/word-count-beam/.temp-beam-cdc59aa7-1cf8-47bc-8214-39cb464d2338/660459f6-f24d-41ee-b0bc-6ab6cc56ac73
Nov 11, 2020 1:45:15 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-2581139715719757996-tmpdir/word-count-beam/.temp-beam-cdc59aa7-1cf8-47bc-8214-39cb464d2338/b7c536ef-4223-4872-a1c9-d7a7494e438d
Nov 11, 2020 1:45:15 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
WARNING: Failed to match temporary files under: [/tmp/groovy-generated-2581139715719757996-tmpdir/word-count-beam/.temp-beam-cdc59aa7-1cf8-47bc-8214-39cb464d2338/].
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 5.0 (TID 24). 12857 bytes result sent to driver
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 5.0 (TID 24) in 121 ms on localhost (executor driver) (4/4)
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 5.0, whose tasks have all completed, from pool 
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: ShuffleMapStage 5 (repartition at GroupCombineFunctions.java:191) finished in 0.133 s
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: looking for newly runnable stages
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: running: Set()
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: waiting: Set(ResultStage 6)
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: failed: Set()
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting ResultStage 6 (WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous).output MapPartitionsRDD[113] at values at TransformTranslator.java:434), which has no missing parents
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_7 stored as values in memory (estimated size 16.0 KB, free 13.5 GB)
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_7_piece0 stored as bytes in memory (estimated size 7.3 KB, free 13.5 GB)
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added broadcast_7_piece0 in memory on localhost:36271 (size: 7.3 KB, free: 13.5 GB)
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: Created broadcast 7 from broadcast at DAGScheduler.scala:1184
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting 4 missing tasks from ResultStage 6 (WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous).output MapPartitionsRDD[113] at values at TransformTranslator.java:434) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: Adding task set 6.0 with 4 tasks
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 0.0 in stage 6.0 (TID 28, localhost, executor driver, partition 0, NODE_LOCAL, 7938 bytes)
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 1.0 in stage 6.0 (TID 29, localhost, executor driver, partition 1, NODE_LOCAL, 7938 bytes)
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 2.0 in stage 6.0 (TID 30, localhost, executor driver, partition 2, NODE_LOCAL, 7938 bytes)
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 3.0 in stage 6.0 (TID 31, localhost, executor driver, partition 3, NODE_LOCAL, 7938 bytes)
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 3.0 in stage 6.0 (TID 31)
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 2.0 in stage 6.0 (TID 30)
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 1.0 in stage 6.0 (TID 29)
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 0.0 in stage 6.0 (TID 28)
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 0 ms
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 2 ms
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 2 ms
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 6.0 (TID 28). 6496 bytes result sent to driver
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 6.0 (TID 30). 6496 bytes result sent to driver
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 6.0 (TID 29). 6496 bytes result sent to driver
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 6.0 (TID 31). 6496 bytes result sent to driver
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 6.0 (TID 28) in 20 ms on localhost (executor driver) (1/4)
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 6.0 (TID 29) in 19 ms on localhost (executor driver) (2/4)
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 6.0 (TID 30) in 18 ms on localhost (executor driver) (3/4)
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 6.0 (TID 31) in 19 ms on localhost (executor driver) (4/4)
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 6.0, whose tasks have all completed, from pool 
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: ResultStage 6 (foreach at BoundedDataset.java:127) finished in 0.030 s
Nov 11, 2020 1:45:15 AM org.apache.spark.internal.Logging$class logInfo
INFO: Job 1 finished: foreach at BoundedDataset.java:127, took 0.224158 s
Nov 11, 2020 1:45:15 AM org.apache.beam.runners.spark.SparkRunner lambda$run$1
INFO: Batch pipeline execution complete.
Nov 11, 2020 1:45:16 AM org.spark_project.jetty.server.AbstractConnector doStop
INFO: Stopped Spark@52de714c{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
Nov 11, 2020 1:45:16 AM org.apache.spark.internal.Logging$class logInfo
INFO: Stopped Spark web UI at http://localhost:4040
Nov 11, 2020 1:45:16 AM org.apache.spark.internal.Logging$class logInfo
INFO: MapOutputTrackerMasterEndpoint stopped!
Nov 11, 2020 1:45:16 AM org.apache.spark.internal.Logging$class logInfo
INFO: MemoryStore cleared
Nov 11, 2020 1:45:16 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManager stopped
Nov 11, 2020 1:45:16 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManagerMaster stopped
Nov 11, 2020 1:45:16 AM org.apache.spark.internal.Logging$class logInfo
INFO: OutputCommitCoordinator stopped!
Nov 11, 2020 1:45:16 AM org.apache.spark.internal.Logging$class logInfo
INFO: Successfully stopped SparkContext
grep Foundation counts*
counts-00000-of-00004:Foundation: 1
Verified Foundation: 1
[SUCCESS]

> Task :runners:twister2:runQuickstartJavaTwister2
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/tmp/groovy-generated-7559959951317682333-tmpdir/.m2/repository/org/slf4j/slf4j-jdk14/1.7.30/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/tmp/groovy-generated-7559959951317682333-tmpdir/.m2/repository/org/slf4j/slf4j-log4j12/1.7.25/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]
Nov 11, 2020 1:46:23 AM org.apache.beam.runners.twister2.Twister2Runner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 376 files. Enable logging at DEBUG level to see which files will be staged.
Nov 11, 2020 1:46:24 AM org.apache.beam.runners.twister2.Twister2Runner run
INFO: Translating pipeline to Twister2 program.

> Task :runners:google-cloud-dataflow-java:runMobileGamingJavaDataflow
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occured while executing the Java class. Failed to construct instance from factory method DataflowRunner#fromOptions(interface org.apache.beam.sdk.options.PipelineOptions): InvocationTargetException: com.google.common.cache.CacheBuilder.expireAfterWrite(Ljava/time/Duration;)Lcom/google/common/cache/CacheBuilder; -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occured while executing the Java class. Failed to construct instance from factory method DataflowRunner#fromOptions(interface org.apache.beam.sdk.options.PipelineOptions): InvocationTargetException: com.google.common.cache.CacheBuilder.expireAfterWrite(Ljava/time/Duration;)Lcom/google/common/cache/CacheBuilder; -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] Failed command

> Task :runners:google-cloud-dataflow-java:runMobileGamingJavaDataflow FAILED

> Task :runners:twister2:runQuickstartJavaTwister2
Nov 11, 2020 1:46:36 AM org.apache.beam.runners.twister2.Twister2Runner run
WARNING: Twister2 Local Mode currently only supports single worker
Nov 11, 2020 1:46:36 AM edu.iu.dsc.tws.rsched.core.ResourceAllocator loadConfig
INFO: Loaded configuration with twister2_home: /tmp and configuration: /tmp/conf/ and cluster: standalone
grep Foundation counts*
Foundation: 1
Verified Foundation: 1
[SUCCESS]

> Task :runners:direct-java:runQuickstartJavaDirect
Nov 11, 2020 1:47:18 AM org.apache.beam.sdk.io.FileBasedSource getEstimatedSizeBytes
INFO: Filepattern pom.xml matched 1 files with total size 16323
Nov 11, 2020 1:47:18 AM org.apache.beam.sdk.io.FileBasedSource split
INFO: Splitting filepattern pom.xml into bundles of size 816 took 1 ms and produced 1 files and 20 bundles
Nov 11, 2020 1:47:22 AM org.apache.beam.sdk.io.WriteFiles$WriteShardsIntoTempFilesFn processElement
INFO: Opening writer a7aa6965-27b5-4492-9d15-3c401d0f8318 for window org.apache.beam.sdk.transforms.windowing.GlobalWindow@54efbd8f pane PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0} destination null
Nov 11, 2020 1:47:22 AM org.apache.beam.sdk.io.WriteFiles$WriteShardsIntoTempFilesFn processElement
INFO: Opening writer 63917e6f-00b0-4043-8eba-06f0bb3b882f for window org.apache.beam.sdk.transforms.windowing.GlobalWindow@54efbd8f pane PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0} destination null
Nov 11, 2020 1:47:22 AM org.apache.beam.sdk.io.WriteFiles$WriteShardsIntoTempFilesFn processElement
INFO: Opening writer e9b545cd-2caa-4139-88fb-302500f89286 for window org.apache.beam.sdk.transforms.windowing.GlobalWindow@54efbd8f pane PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0} destination null
Nov 11, 2020 1:47:22 AM org.apache.beam.sdk.io.WriteFiles$WriteShardsIntoTempFilesFn processElement
INFO: Opening writer 4c7a09f2-70af-4a4d-9843-49c83e83cc74 for window org.apache.beam.sdk.transforms.windowing.GlobalWindow@54efbd8f pane PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0} destination null
Nov 11, 2020 1:47:22 AM org.apache.beam.sdk.io.FileBasedSink$Writer close
INFO: Successfully wrote temporary file /tmp/groovy-generated-4510676981313516979-tmpdir/word-count-beam/.temp-beam-cd417433-81a4-4f4b-a3a9-9b0ad9966283/63917e6f-00b0-4043-8eba-06f0bb3b882f
Nov 11, 2020 1:47:22 AM org.apache.beam.sdk.io.FileBasedSink$Writer close
INFO: Successfully wrote temporary file /tmp/groovy-generated-4510676981313516979-tmpdir/word-count-beam/.temp-beam-cd417433-81a4-4f4b-a3a9-9b0ad9966283/e9b545cd-2caa-4139-88fb-302500f89286
Nov 11, 2020 1:47:22 AM org.apache.beam.sdk.io.FileBasedSink$Writer close
INFO: Successfully wrote temporary file /tmp/groovy-generated-4510676981313516979-tmpdir/word-count-beam/.temp-beam-cd417433-81a4-4f4b-a3a9-9b0ad9966283/4c7a09f2-70af-4a4d-9843-49c83e83cc74
Nov 11, 2020 1:47:22 AM org.apache.beam.sdk.io.FileBasedSink$Writer close
INFO: Successfully wrote temporary file /tmp/groovy-generated-4510676981313516979-tmpdir/word-count-beam/.temp-beam-cd417433-81a4-4f4b-a3a9-9b0ad9966283/a7aa6965-27b5-4492-9d15-3c401d0f8318
Nov 11, 2020 1:47:22 AM org.apache.beam.sdk.io.WriteFiles$FinalizeTempFileBundles$FinalizeFn process
INFO: Finalizing 4 file results
Nov 11, 2020 1:47:22 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation createMissingEmptyShards
INFO: Finalizing for destination null num shards 4.
Nov 11, 2020 1:47:22 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-4510676981313516979-tmpdir/word-count-beam/.temp-beam-cd417433-81a4-4f4b-a3a9-9b0ad9966283/4c7a09f2-70af-4a4d-9843-49c83e83cc74, shard=1, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@54efbd8f, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-4510676981313516979-tmpdir/word-count-beam/counts-00001-of-00004
Nov 11, 2020 1:47:22 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-4510676981313516979-tmpdir/word-count-beam/.temp-beam-cd417433-81a4-4f4b-a3a9-9b0ad9966283/e9b545cd-2caa-4139-88fb-302500f89286, shard=3, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@54efbd8f, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-4510676981313516979-tmpdir/word-count-beam/counts-00003-of-00004
Nov 11, 2020 1:47:22 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-4510676981313516979-tmpdir/word-count-beam/.temp-beam-cd417433-81a4-4f4b-a3a9-9b0ad9966283/63917e6f-00b0-4043-8eba-06f0bb3b882f, shard=0, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@54efbd8f, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-4510676981313516979-tmpdir/word-count-beam/counts-00000-of-00004
Nov 11, 2020 1:47:22 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-4510676981313516979-tmpdir/word-count-beam/.temp-beam-cd417433-81a4-4f4b-a3a9-9b0ad9966283/a7aa6965-27b5-4492-9d15-3c401d0f8318, shard=2, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@54efbd8f, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-4510676981313516979-tmpdir/word-count-beam/counts-00002-of-00004
Nov 11, 2020 1:47:22 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-4510676981313516979-tmpdir/word-count-beam/.temp-beam-cd417433-81a4-4f4b-a3a9-9b0ad9966283/a7aa6965-27b5-4492-9d15-3c401d0f8318
Nov 11, 2020 1:47:22 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-4510676981313516979-tmpdir/word-count-beam/.temp-beam-cd417433-81a4-4f4b-a3a9-9b0ad9966283/4c7a09f2-70af-4a4d-9843-49c83e83cc74
Nov 11, 2020 1:47:22 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-4510676981313516979-tmpdir/word-count-beam/.temp-beam-cd417433-81a4-4f4b-a3a9-9b0ad9966283/e9b545cd-2caa-4139-88fb-302500f89286
Nov 11, 2020 1:47:22 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-4510676981313516979-tmpdir/word-count-beam/.temp-beam-cd417433-81a4-4f4b-a3a9-9b0ad9966283/63917e6f-00b0-4043-8eba-06f0bb3b882f
Nov 11, 2020 1:47:22 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
WARNING: Failed to match temporary files under: [/tmp/groovy-generated-4510676981313516979-tmpdir/word-count-beam/.temp-beam-cd417433-81a4-4f4b-a3a9-9b0ad9966283/].
grep Foundation counts*
counts-00000-of-00004:Foundation: 1
Verified Foundation: 1
[SUCCESS]

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:runQuickstartJavaDataflow'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:direct-java:runMobileGamingJavaDirect'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:runMobileGamingJavaDataflow'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 46s
8 actionable tasks: 7 executed, 1 from cache

Publishing build scan...
https://gradle.com/s/kf6barbcg6pg4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostRelease_NightlySnapshot #1167

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostRelease_NightlySnapshot/1167/display/redirect>

Changes:


------------------------------------------
[...truncated 3.77 MB...]
INFO: Adding task set 5.0 with 4 tasks
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 0.0 in stage 5.0 (TID 24, localhost, executor driver, partition 0, NODE_LOCAL, 7927 bytes)
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 1.0 in stage 5.0 (TID 25, localhost, executor driver, partition 1, PROCESS_LOCAL, 7927 bytes)
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 2.0 in stage 5.0 (TID 26, localhost, executor driver, partition 2, PROCESS_LOCAL, 7927 bytes)
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 3.0 in stage 5.0 (TID 27, localhost, executor driver, partition 3, PROCESS_LOCAL, 7927 bytes)
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 1.0 in stage 5.0 (TID 25)
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 0.0 in stage 5.0 (TID 24)
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 2.0 in stage 5.0 (TID 26)
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 3.0 in stage 5.0 (TID 27)
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 0 non-empty blocks including 0 local blocks and 0 remote blocks
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 0 non-empty blocks including 0 local blocks and 0 remote blocks
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 0 non-empty blocks including 0 local blocks and 0 remote blocks
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 0 ms
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 5.0 (TID 26). 5149 bytes result sent to driver
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 5.0 (TID 25). 5149 bytes result sent to driver
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 5.0 (TID 27). 5149 bytes result sent to driver
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 5.0 (TID 25) in 43 ms on localhost (executor driver) (1/4)
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 5.0 (TID 27) in 43 ms on localhost (executor driver) (2/4)
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 5.0 (TID 26) in 44 ms on localhost (executor driver) (3/4)
Nov 10, 2020 11:49:27 PM org.apache.beam.sdk.io.WriteFiles$FinalizeTempFileBundles$FinalizeFn process
INFO: Finalizing 4 file results
Nov 10, 2020 11:49:27 PM org.apache.beam.sdk.io.FileBasedSink$WriteOperation createMissingEmptyShards
INFO: Finalizing for destination null num shards 4.
Nov 10, 2020 11:49:27 PM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-193887247882427536-tmpdir/word-count-beam/.temp-beam-47a019d5-8d07-42b8-a21e-5c153858cba2/2eeef78b-fe56-417e-b276-edb672fd4695, shard=0, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@767565a9, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-193887247882427536-tmpdir/word-count-beam/counts-00000-of-00004
Nov 10, 2020 11:49:27 PM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-193887247882427536-tmpdir/word-count-beam/.temp-beam-47a019d5-8d07-42b8-a21e-5c153858cba2/3627593c-8e1d-4d46-9af1-e0b466e8195e, shard=1, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@767565a9, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-193887247882427536-tmpdir/word-count-beam/counts-00001-of-00004
Nov 10, 2020 11:49:27 PM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-193887247882427536-tmpdir/word-count-beam/.temp-beam-47a019d5-8d07-42b8-a21e-5c153858cba2/a34bfddd-72ca-4013-87ad-144876517e0b, shard=2, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@767565a9, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-193887247882427536-tmpdir/word-count-beam/counts-00002-of-00004
Nov 10, 2020 11:49:27 PM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-193887247882427536-tmpdir/word-count-beam/.temp-beam-47a019d5-8d07-42b8-a21e-5c153858cba2/bdd8bedf-1e8e-4db0-b1b1-1a77c9d70a87, shard=3, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@767565a9, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-193887247882427536-tmpdir/word-count-beam/counts-00003-of-00004
Nov 10, 2020 11:49:27 PM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-193887247882427536-tmpdir/word-count-beam/.temp-beam-47a019d5-8d07-42b8-a21e-5c153858cba2/bdd8bedf-1e8e-4db0-b1b1-1a77c9d70a87
Nov 10, 2020 11:49:27 PM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-193887247882427536-tmpdir/word-count-beam/.temp-beam-47a019d5-8d07-42b8-a21e-5c153858cba2/3627593c-8e1d-4d46-9af1-e0b466e8195e
Nov 10, 2020 11:49:27 PM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-193887247882427536-tmpdir/word-count-beam/.temp-beam-47a019d5-8d07-42b8-a21e-5c153858cba2/2eeef78b-fe56-417e-b276-edb672fd4695
Nov 10, 2020 11:49:27 PM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-193887247882427536-tmpdir/word-count-beam/.temp-beam-47a019d5-8d07-42b8-a21e-5c153858cba2/a34bfddd-72ca-4013-87ad-144876517e0b
Nov 10, 2020 11:49:27 PM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
WARNING: Failed to match temporary files under: [/tmp/groovy-generated-193887247882427536-tmpdir/word-count-beam/.temp-beam-47a019d5-8d07-42b8-a21e-5c153858cba2/].
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 5.0 (TID 24). 12857 bytes result sent to driver
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 5.0 (TID 24) in 111 ms on localhost (executor driver) (4/4)
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 5.0, whose tasks have all completed, from pool 
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: ShuffleMapStage 5 (repartition at GroupCombineFunctions.java:191) finished in 0.120 s
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: looking for newly runnable stages
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: running: Set()
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: waiting: Set(ResultStage 6)
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: failed: Set()
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting ResultStage 6 (WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous).output MapPartitionsRDD[113] at values at TransformTranslator.java:434), which has no missing parents
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_7 stored as values in memory (estimated size 16.0 KB, free 13.5 GB)
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_7_piece0 stored as bytes in memory (estimated size 7.3 KB, free 13.5 GB)
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Added broadcast_7_piece0 in memory on localhost:43829 (size: 7.3 KB, free: 13.5 GB)
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Created broadcast 7 from broadcast at DAGScheduler.scala:1184
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting 4 missing tasks from ResultStage 6 (WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous).output MapPartitionsRDD[113] at values at TransformTranslator.java:434) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Adding task set 6.0 with 4 tasks
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 0.0 in stage 6.0 (TID 28, localhost, executor driver, partition 0, NODE_LOCAL, 7938 bytes)
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 1.0 in stage 6.0 (TID 29, localhost, executor driver, partition 1, NODE_LOCAL, 7938 bytes)
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 2.0 in stage 6.0 (TID 30, localhost, executor driver, partition 2, NODE_LOCAL, 7938 bytes)
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 3.0 in stage 6.0 (TID 31, localhost, executor driver, partition 3, NODE_LOCAL, 7938 bytes)
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 0.0 in stage 6.0 (TID 28)
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 3.0 in stage 6.0 (TID 31)
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 2.0 in stage 6.0 (TID 30)
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 1.0 in stage 6.0 (TID 29)
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 6.0 (TID 28). 6453 bytes result sent to driver
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 6.0 (TID 30). 6496 bytes result sent to driver
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 6.0 (TID 31). 6453 bytes result sent to driver
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 6.0 (TID 29). 6496 bytes result sent to driver
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 6.0 (TID 28) in 17 ms on localhost (executor driver) (1/4)
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 6.0 (TID 31) in 15 ms on localhost (executor driver) (2/4)
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 6.0 (TID 30) in 16 ms on localhost (executor driver) (3/4)
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 6.0 (TID 29) in 18 ms on localhost (executor driver) (4/4)
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 6.0, whose tasks have all completed, from pool 
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: ResultStage 6 (foreach at BoundedDataset.java:127) finished in 0.029 s
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Job 1 finished: foreach at BoundedDataset.java:127, took 0.203331 s
Nov 10, 2020 11:49:27 PM org.apache.beam.runners.spark.SparkRunner lambda$run$1
INFO: Batch pipeline execution complete.
Nov 10, 2020 11:49:27 PM org.spark_project.jetty.server.AbstractConnector doStop
INFO: Stopped Spark@6bb9ff09{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
Nov 10, 2020 11:49:27 PM org.apache.spark.internal.Logging$class logInfo
INFO: Stopped Spark web UI at http://localhost:4040
Nov 10, 2020 11:49:28 PM org.apache.spark.internal.Logging$class logInfo
INFO: MapOutputTrackerMasterEndpoint stopped!
Nov 10, 2020 11:49:28 PM org.apache.spark.internal.Logging$class logInfo
INFO: MemoryStore cleared
Nov 10, 2020 11:49:28 PM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManager stopped
Nov 10, 2020 11:49:28 PM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManagerMaster stopped
Nov 10, 2020 11:49:28 PM org.apache.spark.internal.Logging$class logInfo
INFO: OutputCommitCoordinator stopped!
Nov 10, 2020 11:49:28 PM org.apache.spark.internal.Logging$class logInfo
INFO: Successfully stopped SparkContext
grep Foundation counts*
counts-00000-of-00004:Foundation: 1
Verified Foundation: 1
[SUCCESS]

> Task :runners:google-cloud-dataflow-java:runQuickstartJavaDataflow
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occured while executing the Java class. Failed to construct instance from factory method DataflowRunner#fromOptions(interface org.apache.beam.sdk.options.PipelineOptions): InvocationTargetException: com.google.common.cache.CacheBuilder.expireAfterWrite(Ljava/time/Duration;)Lcom/google/common/cache/CacheBuilder; -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occured while executing the Java class. Failed to construct instance from factory method DataflowRunner#fromOptions(interface org.apache.beam.sdk.options.PipelineOptions): InvocationTargetException: com.google.common.cache.CacheBuilder.expireAfterWrite(Ljava/time/Duration;)Lcom/google/common/cache/CacheBuilder; -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] Failed command

> Task :runners:google-cloud-dataflow-java:runQuickstartJavaDataflow FAILED
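
The InvocationTargetException above unwraps to a missing method,
com.google.common.cache.CacheBuilder.expireAfterWrite(java.time.Duration), raised from inside
DataflowRunner#fromOptions. That Duration overload only exists in recent com.google.guava:guava
releases, so the failure typically indicates that the generated word-count-beam project resolved
an older Guava jar than the one the Dataflow runner was compiled against. A minimal sketch of the
failing call shape, assuming nothing beyond what the stack frame names (GuavaDurationProbe itself
is hypothetical, not part of the quickstart):

    import java.time.Duration;
    import com.google.common.cache.Cache;
    import com.google.common.cache.CacheBuilder;

    public class GuavaDurationProbe {
      public static void main(String[] args) {
        // Compiles against the expireAfterWrite(java.time.Duration) overload;
        // with an older guava jar on the runtime classpath this exact call is
        // what fails with NoSuchMethodError at run time.
        Cache<String, String> cache =
            CacheBuilder.newBuilder()
                .expireAfterWrite(Duration.ofMinutes(10))
                .build();
        cache.put("status", "ok");
        System.out.println(cache.getIfPresent("status"));
      }
    }

If that is the cause, checking the quickstart's effective Guava version (for example with
mvn dependency:tree -Dincludes=com.google.guava:guava inside word-count-beam) should show a
release that predates the Duration overloads.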

FAILURE: Build completed with 4 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:direct-java:runQuickstartJavaDirect'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:runMobileGamingJavaDataflow'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:direct-java:runMobileGamingJavaDirect'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

4: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:runQuickstartJavaDataflow'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 54s
8 actionable tasks: 7 executed, 1 from cache

Publishing build scan...
https://gradle.com/s/qx4fvhux3svik

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostRelease_NightlySnapshot #1166

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostRelease_NightlySnapshot/1166/display/redirect?page=changes>

Changes:

[piotr.szuberski] [BEAM-8719 BEAM-8768 BEAM-8769 BEAM-8770 BEAM-8771] Update minor hadoop

[noreply] [BEAM-11075] Go SDK's synthetic source supports hot keys generation

[samuelw] Add a test for windowed side inputs that do not have a value before the

[noreply] [BEAM-10892] Remove redundant asterisk from the Kafka external table doc

[noreply] [BEAM-9547] Add basic support for `DataFrame.{eval,query}` (#13264)

[noreply] [BEAM-11188] Add wrappers to Go Xlang examples, and adjust front-end.


------------------------------------------
[...truncated 504.91 KB...]
INFO: Finished task 1.0 in stage 5.0 (TID 25) in 41 ms on localhost (executor driver) (1/4)
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 5.0 (TID 27) in 42 ms on localhost (executor driver) (2/4)
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 5.0 (TID 26) in 42 ms on localhost (executor driver) (3/4)
Nov 10, 2020 11:05:45 AM org.apache.beam.sdk.io.WriteFiles$FinalizeTempFileBundles$FinalizeFn process
INFO: Finalizing 4 file results
Nov 10, 2020 11:05:45 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation createMissingEmptyShards
INFO: Finalizing for destination null num shards 4.
Nov 10, 2020 11:05:45 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-1835662629979271799-tmpdir/word-count-beam/.temp-beam-fbc0263e-309c-40eb-bb39-2f4c0563188d/30c9a87a-5938-493c-899d-f8bb96eb258f, shard=0, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@45f938ad, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-1835662629979271799-tmpdir/word-count-beam/counts-00000-of-00004
Nov 10, 2020 11:05:45 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-1835662629979271799-tmpdir/word-count-beam/.temp-beam-fbc0263e-309c-40eb-bb39-2f4c0563188d/253aeae5-1771-4198-9854-153f5a60123c, shard=1, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@45f938ad, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-1835662629979271799-tmpdir/word-count-beam/counts-00001-of-00004
Nov 10, 2020 11:05:45 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-1835662629979271799-tmpdir/word-count-beam/.temp-beam-fbc0263e-309c-40eb-bb39-2f4c0563188d/e9a5600b-53f2-4d49-9f20-90e1859914d3, shard=2, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@45f938ad, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-1835662629979271799-tmpdir/word-count-beam/counts-00002-of-00004
Nov 10, 2020 11:05:45 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-1835662629979271799-tmpdir/word-count-beam/.temp-beam-fbc0263e-309c-40eb-bb39-2f4c0563188d/d520c845-0698-4f7a-a81e-c78c780b6b7e, shard=3, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@45f938ad, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-1835662629979271799-tmpdir/word-count-beam/counts-00003-of-00004
Nov 10, 2020 11:05:45 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-1835662629979271799-tmpdir/word-count-beam/.temp-beam-fbc0263e-309c-40eb-bb39-2f4c0563188d/30c9a87a-5938-493c-899d-f8bb96eb258f
Nov 10, 2020 11:05:45 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-1835662629979271799-tmpdir/word-count-beam/.temp-beam-fbc0263e-309c-40eb-bb39-2f4c0563188d/e9a5600b-53f2-4d49-9f20-90e1859914d3
Nov 10, 2020 11:05:45 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-1835662629979271799-tmpdir/word-count-beam/.temp-beam-fbc0263e-309c-40eb-bb39-2f4c0563188d/253aeae5-1771-4198-9854-153f5a60123c
Nov 10, 2020 11:05:45 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-1835662629979271799-tmpdir/word-count-beam/.temp-beam-fbc0263e-309c-40eb-bb39-2f4c0563188d/d520c845-0698-4f7a-a81e-c78c780b6b7e
Nov 10, 2020 11:05:45 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
WARNING: Failed to match temporary files under: [/tmp/groovy-generated-1835662629979271799-tmpdir/word-count-beam/.temp-beam-fbc0263e-309c-40eb-bb39-2f4c0563188d/].
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 5.0 (TID 24). 12857 bytes result sent to driver
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 5.0 (TID 24) in 105 ms on localhost (executor driver) (4/4)
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 5.0, whose tasks have all completed, from pool 
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: ShuffleMapStage 5 (repartition at GroupCombineFunctions.java:191) finished in 0.117 s
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: looking for newly runnable stages
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: running: Set()
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: waiting: Set(ResultStage 6)
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: failed: Set()
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting ResultStage 6 (WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous).output MapPartitionsRDD[113] at values at TransformTranslator.java:434), which has no missing parents
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_7 stored as values in memory (estimated size 16.0 KB, free 13.5 GB)
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_7_piece0 stored as bytes in memory (estimated size 7.3 KB, free 13.5 GB)
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added broadcast_7_piece0 in memory on localhost:37673 (size: 7.3 KB, free: 13.5 GB)
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Created broadcast 7 from broadcast at DAGScheduler.scala:1184
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting 4 missing tasks from ResultStage 6 (WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous).output MapPartitionsRDD[113] at values at TransformTranslator.java:434) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Adding task set 6.0 with 4 tasks
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 0.0 in stage 6.0 (TID 28, localhost, executor driver, partition 0, NODE_LOCAL, 7938 bytes)
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 1.0 in stage 6.0 (TID 29, localhost, executor driver, partition 1, NODE_LOCAL, 7938 bytes)
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 2.0 in stage 6.0 (TID 30, localhost, executor driver, partition 2, NODE_LOCAL, 7938 bytes)
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 3.0 in stage 6.0 (TID 31, localhost, executor driver, partition 3, NODE_LOCAL, 7938 bytes)
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 1.0 in stage 6.0 (TID 29)
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 2.0 in stage 6.0 (TID 30)
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 3.0 in stage 6.0 (TID 31)
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 0.0 in stage 6.0 (TID 28)
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 6.0 (TID 29). 6453 bytes result sent to driver
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 6.0 (TID 30). 6496 bytes result sent to driver
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 6.0 (TID 31). 6453 bytes result sent to driver
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 6.0 (TID 28). 6453 bytes result sent to driver
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 6.0 (TID 29) in 15 ms on localhost (executor driver) (1/4)
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 6.0 (TID 30) in 15 ms on localhost (executor driver) (2/4)
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 6.0 (TID 31) in 15 ms on localhost (executor driver) (3/4)
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 6.0 (TID 28) in 18 ms on localhost (executor driver) (4/4)
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 6.0, whose tasks have all completed, from pool 
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: ResultStage 6 (foreach at BoundedDataset.java:127) finished in 0.026 s
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Job 1 finished: foreach at BoundedDataset.java:127, took 0.196432 s
Nov 10, 2020 11:05:45 AM org.apache.beam.runners.spark.SparkRunner lambda$run$1
INFO: Batch pipeline execution complete.
Nov 10, 2020 11:05:45 AM org.spark_project.jetty.server.AbstractConnector doStop
INFO: Stopped Spark@2adc9384{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Stopped Spark web UI at http://localhost:4040
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: MapOutputTrackerMasterEndpoint stopped!
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: MemoryStore cleared
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManager stopped
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManagerMaster stopped
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: OutputCommitCoordinator stopped!
Nov 10, 2020 11:05:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Successfully stopped SparkContext
grep Foundation counts*
counts-00000-of-00004:Foundation: 1
Verified Foundation: 1
[SUCCESS]

> Task :runners:twister2:runQuickstartJavaTwister2
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/tmp/groovy-generated-6309273016359667107-tmpdir/.m2/repository/org/slf4j/slf4j-jdk14/1.7.30/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/tmp/groovy-generated-6309273016359667107-tmpdir/.m2/repository/org/slf4j/slf4j-log4j12/1.7.25/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]
Nov 10, 2020 11:06:52 AM org.apache.beam.runners.twister2.Twister2Runner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage {} files. Enable logging at DEBUG level to see which files will be staged376
Nov 10, 2020 11:06:53 AM org.apache.beam.runners.twister2.Twister2Runner run
INFO: Translating pipeline to Twister2 program.

> Task :runners:google-cloud-dataflow-java:runQuickstartJavaDataflow
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occured while executing the Java class. Failed to construct instance from factory method DataflowRunner#fromOptions(interface org.apache.beam.sdk.options.PipelineOptions): InvocationTargetException: com.google.common.cache.CacheBuilder.expireAfterWrite(Ljava/time/Duration;)Lcom/google/common/cache/CacheBuilder; -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occured while executing the Java class. Failed to construct instance from factory method DataflowRunner#fromOptions(interface org.apache.beam.sdk.options.PipelineOptions): InvocationTargetException: com.google.common.cache.CacheBuilder.expireAfterWrite(Ljava/time/Duration;)Lcom/google/common/cache/CacheBuilder; -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] Failed command

> Task :runners:google-cloud-dataflow-java:runQuickstartJavaDataflow FAILED

> Task :runners:direct-java:runMobileGamingJavaDirect
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occured while executing the Java class. java.lang.NoSuchMethodError: com.google.common.cache.CacheBuilder.expireAfterWrite(Ljava/time/Duration;)Lcom/google/common/cache/CacheBuilder; -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occured while executing the Java class. java.lang.NoSuchMethodError: com.google.common.cache.CacheBuilder.expireAfterWrite(Ljava/time/Duration;)Lcom/google/common/cache/CacheBuilder; -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] Failed command

> Task :runners:direct-java:runMobileGamingJavaDirect FAILED
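
The runMobileGamingJavaDirect failure is the same missing
CacheBuilder.expireAfterWrite(java.time.Duration) method, here surfacing directly as
java.lang.NoSuchMethodError rather than wrapped in an InvocationTargetException, which points at
the same Guava version conflict on this task's classpath. When the dependency tree alone is not
conclusive, a small probe run with the identical classpath can report which jar actually supplied
CacheBuilder; a sketch under that assumption (GuavaLocationProbe is hypothetical):

    import com.google.common.cache.CacheBuilder;

    public class GuavaLocationProbe {
      public static void main(String[] args) {
        // Prints the code source (jar) that supplied CacheBuilder on this
        // classpath, i.e. the Guava build the JVM is actually linking against.
        System.out.println(
            CacheBuilder.class.getProtectionDomain().getCodeSource().getLocation());
      }
    }

The printed location identifies the Guava jar loaded at run time, which usually settles whether an
older release slipped in transitively.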

> Task :runners:twister2:runQuickstartJavaTwister2
Nov 10, 2020 11:07:06 AM org.apache.beam.runners.twister2.Twister2Runner run
WARNING: Twister2 Local Mode currently only supports single worker
Nov 10, 2020 11:07:06 AM edu.iu.dsc.tws.rsched.core.ResourceAllocator loadConfig
INFO: Loaded configuration with twister2_home: /tmp and configuration: /tmp/conf/ and cluster: standalone
grep Foundation counts*
Foundation: 1
Verified Foundation: 1
[SUCCESS]

FAILURE: Build completed with 4 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:runMobileGamingJavaDataflow'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:flink:1.10:runQuickstartJavaFlinkLocal'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:runQuickstartJavaDataflow'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

4: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:direct-java:runMobileGamingJavaDirect'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 52s
8 actionable tasks: 7 executed, 1 from cache

Publishing build scan...
https://gradle.com/s/dkrnnzitfbwoi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostRelease_NightlySnapshot #1165

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostRelease_NightlySnapshot/1165/display/redirect>

Changes:


------------------------------------------
[...truncated 3.78 MB...]
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 4.0 (TID 20) in 15 ms on localhost (executor driver) (3/4)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 4.0 (TID 23) in 15 ms on localhost (executor driver) (4/4)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 4.0, whose tasks have all completed, from pool 
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: ShuffleMapStage 4 (repartition at GroupCombineFunctions.java:191) finished in 0.033 s
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: looking for newly runnable stages
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: running: Set()
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: waiting: Set(ShuffleMapStage 5, ResultStage 6)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: failed: Set()
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting ShuffleMapStage 5 (MapPartitionsRDD[106] at repartition at GroupCombineFunctions.java:191), which has no missing parents
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_6 stored as values in memory (estimated size 26.3 KB, free 13.5 GB)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_6_piece0 stored as bytes in memory (estimated size 11.5 KB, free 13.5 GB)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added broadcast_6_piece0 in memory on localhost:39483 (size: 11.5 KB, free: 13.5 GB)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Created broadcast 6 from broadcast at DAGScheduler.scala:1184
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting 4 missing tasks from ShuffleMapStage 5 (MapPartitionsRDD[106] at repartition at GroupCombineFunctions.java:191) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Adding task set 5.0 with 4 tasks
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 0.0 in stage 5.0 (TID 24, localhost, executor driver, partition 0, NODE_LOCAL, 7927 bytes)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 1.0 in stage 5.0 (TID 25, localhost, executor driver, partition 1, PROCESS_LOCAL, 7927 bytes)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 2.0 in stage 5.0 (TID 26, localhost, executor driver, partition 2, PROCESS_LOCAL, 7927 bytes)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 3.0 in stage 5.0 (TID 27, localhost, executor driver, partition 3, PROCESS_LOCAL, 7927 bytes)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 0.0 in stage 5.0 (TID 24)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 1.0 in stage 5.0 (TID 25)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 2.0 in stage 5.0 (TID 26)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 3.0 in stage 5.0 (TID 27)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 0 non-empty blocks including 0 local blocks and 0 remote blocks
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 0 non-empty blocks including 0 local blocks and 0 remote blocks
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 0 non-empty blocks including 0 local blocks and 0 remote blocks
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 0 ms
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 5.0 (TID 26). 5149 bytes result sent to driver
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 5.0 (TID 27). 5149 bytes result sent to driver
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 5.0 (TID 25). 5149 bytes result sent to driver
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 5.0 (TID 25) in 41 ms on localhost (executor driver) (1/4)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 5.0 (TID 26) in 40 ms on localhost (executor driver) (2/4)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 5.0 (TID 27) in 40 ms on localhost (executor driver) (3/4)
Nov 09, 2020 11:08:57 AM org.apache.beam.sdk.io.WriteFiles$FinalizeTempFileBundles$FinalizeFn process
INFO: Finalizing 4 file results
Nov 09, 2020 11:08:57 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation createMissingEmptyShards
INFO: Finalizing for destination null num shards 4.
Nov 09, 2020 11:08:57 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-1893103289173540452-tmpdir/word-count-beam/.temp-beam-e2069e59-61a8-4264-b4c2-dff7fba8604d/a2c4021d-e627-44b3-9aa2-0b7946612187, shard=0, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@7cb5b771, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-1893103289173540452-tmpdir/word-count-beam/counts-00000-of-00004
Nov 09, 2020 11:08:57 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-1893103289173540452-tmpdir/word-count-beam/.temp-beam-e2069e59-61a8-4264-b4c2-dff7fba8604d/eda1c5db-edda-4457-8b52-031831b617da, shard=1, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@7cb5b771, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-1893103289173540452-tmpdir/word-count-beam/counts-00001-of-00004
Nov 09, 2020 11:08:57 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-1893103289173540452-tmpdir/word-count-beam/.temp-beam-e2069e59-61a8-4264-b4c2-dff7fba8604d/ab5e0046-01ea-4b20-8624-33be88fa1967, shard=2, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@7cb5b771, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-1893103289173540452-tmpdir/word-count-beam/counts-00002-of-00004
Nov 09, 2020 11:08:57 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-1893103289173540452-tmpdir/word-count-beam/.temp-beam-e2069e59-61a8-4264-b4c2-dff7fba8604d/46b90119-9cf9-4201-936a-541ccff84789, shard=3, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@7cb5b771, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-1893103289173540452-tmpdir/word-count-beam/counts-00003-of-00004
Nov 09, 2020 11:08:57 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-1893103289173540452-tmpdir/word-count-beam/.temp-beam-e2069e59-61a8-4264-b4c2-dff7fba8604d/46b90119-9cf9-4201-936a-541ccff84789
Nov 09, 2020 11:08:57 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-1893103289173540452-tmpdir/word-count-beam/.temp-beam-e2069e59-61a8-4264-b4c2-dff7fba8604d/eda1c5db-edda-4457-8b52-031831b617da
Nov 09, 2020 11:08:57 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-1893103289173540452-tmpdir/word-count-beam/.temp-beam-e2069e59-61a8-4264-b4c2-dff7fba8604d/ab5e0046-01ea-4b20-8624-33be88fa1967
Nov 09, 2020 11:08:57 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-1893103289173540452-tmpdir/word-count-beam/.temp-beam-e2069e59-61a8-4264-b4c2-dff7fba8604d/a2c4021d-e627-44b3-9aa2-0b7946612187
Nov 09, 2020 11:08:57 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
WARNING: Failed to match temporary files under: [/tmp/groovy-generated-1893103289173540452-tmpdir/word-count-beam/.temp-beam-e2069e59-61a8-4264-b4c2-dff7fba8604d/].
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 5.0 (TID 24). 12857 bytes result sent to driver
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 5.0 (TID 24) in 104 ms on localhost (executor driver) (4/4)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 5.0, whose tasks have all completed, from pool 
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: ShuffleMapStage 5 (repartition at GroupCombineFunctions.java:191) finished in 0.114 s
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: looking for newly runnable stages
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: running: Set()
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: waiting: Set(ResultStage 6)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: failed: Set()
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting ResultStage 6 (WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous).output MapPartitionsRDD[113] at values at TransformTranslator.java:434), which has no missing parents
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_7 stored as values in memory (estimated size 16.0 KB, free 13.5 GB)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_7_piece0 stored as bytes in memory (estimated size 7.3 KB, free 13.5 GB)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added broadcast_7_piece0 in memory on localhost:39483 (size: 7.3 KB, free: 13.5 GB)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Created broadcast 7 from broadcast at DAGScheduler.scala:1184
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting 4 missing tasks from ResultStage 6 (WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous).output MapPartitionsRDD[113] at values at TransformTranslator.java:434) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Adding task set 6.0 with 4 tasks
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 0.0 in stage 6.0 (TID 28, localhost, executor driver, partition 0, NODE_LOCAL, 7938 bytes)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 1.0 in stage 6.0 (TID 29, localhost, executor driver, partition 1, NODE_LOCAL, 7938 bytes)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 2.0 in stage 6.0 (TID 30, localhost, executor driver, partition 2, NODE_LOCAL, 7938 bytes)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 3.0 in stage 6.0 (TID 31, localhost, executor driver, partition 3, NODE_LOCAL, 7938 bytes)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 0.0 in stage 6.0 (TID 28)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 3.0 in stage 6.0 (TID 31)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 1.0 in stage 6.0 (TID 29)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 2.0 in stage 6.0 (TID 30)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 0 ms
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 0 ms
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 6.0 (TID 29). 6453 bytes result sent to driver
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 6.0 (TID 31). 6453 bytes result sent to driver
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 6.0 (TID 30). 6453 bytes result sent to driver
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 6.0 (TID 28). 6453 bytes result sent to driver
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 6.0 (TID 30) in 14 ms on localhost (executor driver) (1/4)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 6.0 (TID 31) in 14 ms on localhost (executor driver) (2/4)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 6.0 (TID 29) in 16 ms on localhost (executor driver) (3/4)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 6.0 (TID 28) in 17 ms on localhost (executor driver) (4/4)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 6.0, whose tasks have all completed, from pool 
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: ResultStage 6 (foreach at BoundedDataset.java:127) finished in 0.026 s
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Job 1 finished: foreach at BoundedDataset.java:127, took 0.189524 s
Nov 09, 2020 11:08:57 AM org.apache.beam.runners.spark.SparkRunner lambda$run$1
INFO: Batch pipeline execution complete.
Nov 09, 2020 11:08:57 AM org.spark_project.jetty.server.AbstractConnector doStop
INFO: Stopped Spark@2221df2b{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Stopped Spark web UI at http://localhost:4040
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: MapOutputTrackerMasterEndpoint stopped!
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: MemoryStore cleared
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManager stopped
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManagerMaster stopped
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: OutputCommitCoordinator stopped!
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Successfully stopped SparkContext
grep Foundation counts*
counts-00000-of-00004:Foundation: 1
Verified Foundation: 1
[SUCCESS]

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:runQuickstartJavaDataflow'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:direct-java:runMobileGamingJavaDirect'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:runMobileGamingJavaDataflow'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 40s
8 actionable tasks: 7 executed, 1 from cache

Publishing build scan...
https://gradle.com/s/2yolp7tbrnlko

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostRelease_NightlySnapshot #1164

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostRelease_NightlySnapshot/1164/display/redirect>

Changes:


------------------------------------------
[...truncated 3.78 MB...]
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 89
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 15
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 60
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_7 stored as values in memory (estimated size 16.0 KB, free 13.5 GB)
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 69
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 13
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 6
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 61
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 79
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 34
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 14
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 74
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 41
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 52
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 83
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 73
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 70
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 2
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_7_piece0 stored as bytes in memory (estimated size 7.3 KB, free 13.5 GB)
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added broadcast_7_piece0 in memory on localhost:42231 (size: 7.3 KB, free: 13.5 GB)
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Created broadcast 7 from broadcast at DAGScheduler.scala:1184
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting 4 missing tasks from ResultStage 6 (WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous).output MapPartitionsRDD[113] at values at TransformTranslator.java:434) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Adding task set 6.0 with 4 tasks
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 0.0 in stage 6.0 (TID 28, localhost, executor driver, partition 0, NODE_LOCAL, 7938 bytes)
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 1.0 in stage 6.0 (TID 29, localhost, executor driver, partition 1, NODE_LOCAL, 7938 bytes)
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 2.0 in stage 6.0 (TID 30, localhost, executor driver, partition 2, NODE_LOCAL, 7938 bytes)
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 3.0 in stage 6.0 (TID 31, localhost, executor driver, partition 3, NODE_LOCAL, 7938 bytes)
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 0.0 in stage 6.0 (TID 28)
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 2.0 in stage 6.0 (TID 30)
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 1.0 in stage 6.0 (TID 29)
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 3.0 in stage 6.0 (TID 31)
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 6.0 (TID 30). 6496 bytes result sent to driver
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 6.0 (TID 28). 6496 bytes result sent to driver
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 6.0 (TID 29). 6496 bytes result sent to driver
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 6.0 (TID 31). 6496 bytes result sent to driver
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 6.0 (TID 30) in 14 ms on localhost (executor driver) (1/4)
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 6.0 (TID 29) in 15 ms on localhost (executor driver) (2/4)
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 6.0 (TID 31) in 15 ms on localhost (executor driver) (3/4)
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 6.0 (TID 28) in 17 ms on localhost (executor driver) (4/4)
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 6.0, whose tasks have all completed, from pool 
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: ResultStage 6 (foreach at BoundedDataset.java:127) finished in 0.027 s
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Job 1 finished: foreach at BoundedDataset.java:127, took 0.242122 s
Nov 08, 2020 11:05:50 AM org.apache.beam.runners.spark.SparkRunner lambda$run$1
INFO: Batch pipeline execution complete.
Nov 08, 2020 11:05:50 AM org.spark_project.jetty.server.AbstractConnector doStop
INFO: Stopped Spark@1b4aca0d{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: Stopped Spark web UI at http://localhost:4040
Nov 08, 2020 11:05:50 AM org.apache.spark.internal.Logging$class logInfo
INFO: MapOutputTrackerMasterEndpoint stopped!
Nov 08, 2020 11:05:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: MemoryStore cleared
Nov 08, 2020 11:05:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManager stopped
Nov 08, 2020 11:05:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManagerMaster stopped
Nov 08, 2020 11:05:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: OutputCommitCoordinator stopped!
Nov 08, 2020 11:05:51 AM org.apache.spark.internal.Logging$class logInfo
INFO: Successfully stopped SparkContext
grep Foundation counts*
counts-00000-of-00004:Foundation: 1
Verified Foundation: 1
[SUCCESS]

> Task :runners:twister2:runQuickstartJavaTwister2
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/tmp/groovy-generated-6551973721783379329-tmpdir/.m2/repository/org/slf4j/slf4j-jdk14/1.7.30/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/tmp/groovy-generated-6551973721783379329-tmpdir/.m2/repository/org/slf4j/slf4j-log4j12/1.7.25/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]
Nov 08, 2020 11:06:56 AM org.apache.beam.runners.twister2.Twister2Runner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 376 files. Enable logging at DEBUG level to see which files will be staged.
Nov 08, 2020 11:06:57 AM org.apache.beam.runners.twister2.Twister2Runner run
INFO: Translating pipeline to Twister2 program.

> Task :runners:google-cloud-dataflow-java:runMobileGamingJavaDataflow
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occured while executing the Java class. Failed to construct instance from factory method DataflowRunner#fromOptions(interface org.apache.beam.sdk.options.PipelineOptions): InvocationTargetException: com.google.common.cache.CacheBuilder.expireAfterWrite(Ljava/time/Duration;)Lcom/google/common/cache/CacheBuilder; -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occured while executing the Java class. Failed to construct instance from factory method DataflowRunner#fromOptions(interface org.apache.beam.sdk.options.PipelineOptions): InvocationTargetException: com.google.common.cache.CacheBuilder.expireAfterWrite(Ljava/time/Duration;)Lcom/google/common/cache/CacheBuilder; -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] Failed command

> Task :runners:google-cloud-dataflow-java:runMobileGamingJavaDataflow FAILED
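
The error above boils down to a Guava version conflict on the generated quickstart classpath: CacheBuilder.expireAfterWrite(java.time.Duration) only exists in newer Guava releases (the Duration overloads arrived around Guava 25.0), so when an older guava jar wins dependency resolution, DataflowRunner#fromOptions fails with exactly this InvocationTargetException. A minimal sketch of the call that trips the error, written here as an illustration rather than taken from the Beam code, assuming only a plain Guava dependency:

    import com.google.common.cache.Cache;
    import com.google.common.cache.CacheBuilder;
    import java.time.Duration;

    public class GuavaDurationRepro {
      public static void main(String[] args) {
        // Compiles against a recent Guava, but throws
        // java.lang.NoSuchMethodError for
        // CacheBuilder.expireAfterWrite(Ljava/time/Duration;) at run time
        // if an older Guava (without the Duration overload) is loaded first.
        Cache<String, String> cache =
            CacheBuilder.newBuilder()
                .expireAfterWrite(Duration.ofMinutes(10))
                .build();
        System.out.println("expireAfterWrite(Duration) resolved; cache size=" + cache.size());
      }
    }

Pinning a Guava release that has the Duration overloads (for example through dependencyManagement in the word-count-beam pom) is one plausible fix; the exact version to pin is not something this log states.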

> Task :runners:direct-java:runQuickstartJavaDirect
Nov 08, 2020 11:07:09 AM org.apache.beam.sdk.io.FileBasedSource getEstimatedSizeBytes
INFO: Filepattern pom.xml matched 1 files with total size 16322
Nov 08, 2020 11:07:09 AM org.apache.beam.sdk.io.FileBasedSource split
INFO: Splitting filepattern pom.xml into bundles of size 816 took 1 ms and produced 1 files and 20 bundles
Nov 08, 2020 11:07:13 AM org.apache.beam.sdk.io.WriteFiles$WriteShardsIntoTempFilesFn processElement
INFO: Opening writer 240d92a3-0ac8-424f-bb26-e0511a8bc958 for window org.apache.beam.sdk.transforms.windowing.GlobalWindow@712a648 pane PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0} destination null
Nov 08, 2020 11:07:13 AM org.apache.beam.sdk.io.WriteFiles$WriteShardsIntoTempFilesFn processElement
INFO: Opening writer cafff8e6-bdd9-4f6f-96cc-c194d17684d7 for window org.apache.beam.sdk.transforms.windowing.GlobalWindow@712a648 pane PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0} destination null
Nov 08, 2020 11:07:13 AM org.apache.beam.sdk.io.WriteFiles$WriteShardsIntoTempFilesFn processElement
INFO: Opening writer 51c48f46-14c2-41ba-8402-5b5f8e27738f for window org.apache.beam.sdk.transforms.windowing.GlobalWindow@712a648 pane PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0} destination null
Nov 08, 2020 11:07:13 AM org.apache.beam.sdk.io.FileBasedSink$Writer close
INFO: Successfully wrote temporary file /tmp/groovy-generated-7528196991931487530-tmpdir/word-count-beam/.temp-beam-9e9e9088-6757-476e-b79c-f6fa4030f063/240d92a3-0ac8-424f-bb26-e0511a8bc958
Nov 08, 2020 11:07:13 AM org.apache.beam.sdk.io.FileBasedSink$Writer close
INFO: Successfully wrote temporary file /tmp/groovy-generated-7528196991931487530-tmpdir/word-count-beam/.temp-beam-9e9e9088-6757-476e-b79c-f6fa4030f063/51c48f46-14c2-41ba-8402-5b5f8e27738f
Nov 08, 2020 11:07:13 AM org.apache.beam.sdk.io.FileBasedSink$Writer close
INFO: Successfully wrote temporary file /tmp/groovy-generated-7528196991931487530-tmpdir/word-count-beam/.temp-beam-9e9e9088-6757-476e-b79c-f6fa4030f063/cafff8e6-bdd9-4f6f-96cc-c194d17684d7
Nov 08, 2020 11:07:13 AM org.apache.beam.sdk.io.WriteFiles$FinalizeTempFileBundles$FinalizeFn process
INFO: Finalizing 3 file results
Nov 08, 2020 11:07:13 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation createMissingEmptyShards
INFO: Finalizing for destination null num shards 3.
Nov 08, 2020 11:07:13 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-7528196991931487530-tmpdir/word-count-beam/.temp-beam-9e9e9088-6757-476e-b79c-f6fa4030f063/240d92a3-0ac8-424f-bb26-e0511a8bc958, shard=2, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@712a648, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-7528196991931487530-tmpdir/word-count-beam/counts-00002-of-00003
Nov 08, 2020 11:07:13 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-7528196991931487530-tmpdir/word-count-beam/.temp-beam-9e9e9088-6757-476e-b79c-f6fa4030f063/51c48f46-14c2-41ba-8402-5b5f8e27738f, shard=1, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@712a648, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-7528196991931487530-tmpdir/word-count-beam/counts-00001-of-00003
Nov 08, 2020 11:07:13 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-7528196991931487530-tmpdir/word-count-beam/.temp-beam-9e9e9088-6757-476e-b79c-f6fa4030f063/cafff8e6-bdd9-4f6f-96cc-c194d17684d7, shard=0, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@712a648, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-7528196991931487530-tmpdir/word-count-beam/counts-00000-of-00003
Nov 08, 2020 11:07:13 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-7528196991931487530-tmpdir/word-count-beam/.temp-beam-9e9e9088-6757-476e-b79c-f6fa4030f063/51c48f46-14c2-41ba-8402-5b5f8e27738f
Nov 08, 2020 11:07:13 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-7528196991931487530-tmpdir/word-count-beam/.temp-beam-9e9e9088-6757-476e-b79c-f6fa4030f063/240d92a3-0ac8-424f-bb26-e0511a8bc958
Nov 08, 2020 11:07:13 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-7528196991931487530-tmpdir/word-count-beam/.temp-beam-9e9e9088-6757-476e-b79c-f6fa4030f063/cafff8e6-bdd9-4f6f-96cc-c194d17684d7
Nov 08, 2020 11:07:13 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
WARNING: Failed to match temporary files under: [/tmp/groovy-generated-7528196991931487530-tmpdir/word-count-beam/.temp-beam-9e9e9088-6757-476e-b79c-f6fa4030f063/].
grep Foundation counts*
counts-00000-of-00003:Foundation: 1
Verified Foundation: 1
[SUCCESS]

> Task :runners:twister2:runQuickstartJavaTwister2
Nov 08, 2020 11:07:10 AM org.apache.beam.runners.twister2.Twister2Runner run
WARNING: Twister2 Local Mode currently only supports single worker
Nov 08, 2020 11:07:10 AM edu.iu.dsc.tws.rsched.core.ResourceAllocator loadConfig
INFO: Loaded configuration with twister2_home: /tmp and configuration: /tmp/conf/ and cluster: standalone
grep Foundation counts*
Foundation: 1
Verified Foundation: 1
[SUCCESS]

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:runQuickstartJavaDataflow'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:direct-java:runMobileGamingJavaDirect'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:runMobileGamingJavaDataflow'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 54s
8 actionable tasks: 7 executed, 1 from cache

Publishing build scan...
https://gradle.com/s/voqdfcivp3lpo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostRelease_NightlySnapshot #1163

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostRelease_NightlySnapshot/1163/display/redirect?page=changes>

Changes:

[Kyle Weaver] [BEAM-10188] Include links in the release checklist to verify release is

[Kyle Weaver] [BEAM-10188] Make publishing its own step for extra clarity.

[noreply] [BEAM-11200] Update useragent version of Go SDK (#13279)

[Brian Hulette] listSubscription should remove TestPubsub's own sub


------------------------------------------
[...truncated 3.78 MB...]
Nov 07, 2020 11:05:56 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-2889065222068766620-tmpdir/word-count-beam/.temp-beam-fb17d6c4-cc6f-4500-b908-6216337f8ed2/77ef277c-ebb8-4f30-b214-e53b1e6dd91b, shard=1, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@6a0f36c2, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-2889065222068766620-tmpdir/word-count-beam/counts-00001-of-00004
Nov 07, 2020 11:05:56 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-2889065222068766620-tmpdir/word-count-beam/.temp-beam-fb17d6c4-cc6f-4500-b908-6216337f8ed2/e8bd591a-5cf4-4ef1-838b-b1668ce2254e, shard=2, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@6a0f36c2, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-2889065222068766620-tmpdir/word-count-beam/counts-00002-of-00004
Nov 07, 2020 11:05:56 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-2889065222068766620-tmpdir/word-count-beam/.temp-beam-fb17d6c4-cc6f-4500-b908-6216337f8ed2/ed593ace-fc21-499e-8b29-f6fa4e635b53, shard=3, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@6a0f36c2, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-2889065222068766620-tmpdir/word-count-beam/counts-00003-of-00004
Nov 07, 2020 11:05:56 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-2889065222068766620-tmpdir/word-count-beam/.temp-beam-fb17d6c4-cc6f-4500-b908-6216337f8ed2/ed593ace-fc21-499e-8b29-f6fa4e635b53
Nov 07, 2020 11:05:56 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-2889065222068766620-tmpdir/word-count-beam/.temp-beam-fb17d6c4-cc6f-4500-b908-6216337f8ed2/77ef277c-ebb8-4f30-b214-e53b1e6dd91b
Nov 07, 2020 11:05:56 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-2889065222068766620-tmpdir/word-count-beam/.temp-beam-fb17d6c4-cc6f-4500-b908-6216337f8ed2/e8bd591a-5cf4-4ef1-838b-b1668ce2254e
Nov 07, 2020 11:05:56 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-2889065222068766620-tmpdir/word-count-beam/.temp-beam-fb17d6c4-cc6f-4500-b908-6216337f8ed2/794e3b85-0436-41c7-8874-457518e49beb
Nov 07, 2020 11:05:56 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
WARNING: Failed to match temporary files under: [/tmp/groovy-generated-2889065222068766620-tmpdir/word-count-beam/.temp-beam-fb17d6c4-cc6f-4500-b908-6216337f8ed2/].
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 5.0 (TID 24). 12857 bytes result sent to driver
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 5.0 (TID 24) in 113 ms on localhost (executor driver) (4/4)
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 5.0, whose tasks have all completed, from pool 
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: ShuffleMapStage 5 (repartition at GroupCombineFunctions.java:191) finished in 0.124 s
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: looking for newly runnable stages
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: running: Set()
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: waiting: Set(ResultStage 6)
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: failed: Set()
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting ResultStage 6 (WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous).output MapPartitionsRDD[113] at values at TransformTranslator.java:434), which has no missing parents
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_7 stored as values in memory (estimated size 16.0 KB, free 13.5 GB)
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_7_piece0 stored as bytes in memory (estimated size 7.3 KB, free 13.5 GB)
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added broadcast_7_piece0 in memory on localhost:40225 (size: 7.3 KB, free: 13.5 GB)
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Created broadcast 7 from broadcast at DAGScheduler.scala:1184
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting 4 missing tasks from ResultStage 6 (WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous).output MapPartitionsRDD[113] at values at TransformTranslator.java:434) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Adding task set 6.0 with 4 tasks
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 0.0 in stage 6.0 (TID 28, localhost, executor driver, partition 0, NODE_LOCAL, 7938 bytes)
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 1.0 in stage 6.0 (TID 29, localhost, executor driver, partition 1, NODE_LOCAL, 7938 bytes)
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 2.0 in stage 6.0 (TID 30, localhost, executor driver, partition 2, NODE_LOCAL, 7938 bytes)
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 3.0 in stage 6.0 (TID 31, localhost, executor driver, partition 3, NODE_LOCAL, 7938 bytes)
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 1.0 in stage 6.0 (TID 29)
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 2.0 in stage 6.0 (TID 30)
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 3.0 in stage 6.0 (TID 31)
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 0.0 in stage 6.0 (TID 28)
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 0 ms
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 0 ms
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 0 ms
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 0 ms
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 6.0 (TID 28). 6496 bytes result sent to driver
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 6.0 (TID 30). 6496 bytes result sent to driver
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 6.0 (TID 29). 6496 bytes result sent to driver
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 6.0 (TID 31). 6496 bytes result sent to driver
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 6.0 (TID 30) in 15 ms on localhost (executor driver) (1/4)
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 6.0 (TID 29) in 16 ms on localhost (executor driver) (2/4)
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 6.0 (TID 28) in 17 ms on localhost (executor driver) (3/4)
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 6.0 (TID 31) in 15 ms on localhost (executor driver) (4/4)
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 6.0, whose tasks have all completed, from pool 
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: ResultStage 6 (foreach at BoundedDataset.java:127) finished in 0.025 s
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Job 1 finished: foreach at BoundedDataset.java:127, took 0.201460 s
Nov 07, 2020 11:05:56 AM org.apache.beam.runners.spark.SparkRunner lambda$run$1
INFO: Batch pipeline execution complete.
Nov 07, 2020 11:05:56 AM org.spark_project.jetty.server.AbstractConnector doStop
INFO: Stopped Spark@5df28835{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Stopped Spark web UI at http://localhost:4040
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: MapOutputTrackerMasterEndpoint stopped!
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: MemoryStore cleared
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManager stopped
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManagerMaster stopped
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: OutputCommitCoordinator stopped!
Nov 07, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Successfully stopped SparkContext
grep Foundation counts*
counts-00000-of-00004:Foundation: 1
Verified Foundation: 1
[SUCCESS]

> Task :runners:twister2:runQuickstartJavaTwister2
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/tmp/groovy-generated-2130156778799337024-tmpdir/.m2/repository/org/slf4j/slf4j-jdk14/1.7.30/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/tmp/groovy-generated-2130156778799337024-tmpdir/.m2/repository/org/slf4j/slf4j-log4j12/1.7.25/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]
Nov 07, 2020 11:07:04 AM org.apache.beam.runners.twister2.Twister2Runner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 376 files. Enable logging at DEBUG level to see which files will be staged.
Nov 07, 2020 11:07:04 AM org.apache.beam.runners.twister2.Twister2Runner run
INFO: Translating pipeline to Twister2 program.

> Task :runners:google-cloud-dataflow-java:runMobileGamingJavaDataflow
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occured while executing the Java class. Failed to construct instance from factory method DataflowRunner#fromOptions(interface org.apache.beam.sdk.options.PipelineOptions): InvocationTargetException: com.google.common.cache.CacheBuilder.expireAfterWrite(Ljava/time/Duration;)Lcom/google/common/cache/CacheBuilder; -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occured while executing the Java class. Failed to construct instance from factory method DataflowRunner#fromOptions(interface org.apache.beam.sdk.options.PipelineOptions): InvocationTargetException: com.google.common.cache.CacheBuilder.expireAfterWrite(Ljava/time/Duration;)Lcom/google/common/cache/CacheBuilder; -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] Failed command

> Task :runners:google-cloud-dataflow-java:runMobileGamingJavaDataflow FAILED

> Task :runners:direct-java:runQuickstartJavaDirect
Nov 07, 2020 11:07:16 AM org.apache.beam.sdk.io.FileBasedSource getEstimatedSizeBytes
INFO: Filepattern pom.xml matched 1 files with total size 16322
Nov 07, 2020 11:07:16 AM org.apache.beam.sdk.io.FileBasedSource split
INFO: Splitting filepattern pom.xml into bundles of size 816 took 0 ms and produced 1 files and 20 bundles
Nov 07, 2020 11:07:20 AM org.apache.beam.sdk.io.WriteFiles$WriteShardsIntoTempFilesFn processElement
INFO: Opening writer faa43013-3bb8-48bd-b7a4-28a5c5cbecea for window org.apache.beam.sdk.transforms.windowing.GlobalWindow@2237a6d6 pane PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0} destination null
Nov 07, 2020 11:07:20 AM org.apache.beam.sdk.io.WriteFiles$WriteShardsIntoTempFilesFn processElement
INFO: Opening writer 2bd5f546-39a1-4ff6-8660-6e1b2d00c9dc for window org.apache.beam.sdk.transforms.windowing.GlobalWindow@2237a6d6 pane PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0} destination null
Nov 07, 2020 11:07:20 AM org.apache.beam.sdk.io.WriteFiles$WriteShardsIntoTempFilesFn processElement
INFO: Opening writer eb303f14-fd85-41c8-b398-f7d98ac1f5de for window org.apache.beam.sdk.transforms.windowing.GlobalWindow@2237a6d6 pane PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0} destination null
Nov 07, 2020 11:07:20 AM org.apache.beam.sdk.io.FileBasedSink$Writer close
INFO: Successfully wrote temporary file /tmp/groovy-generated-5104009913884594215-tmpdir/word-count-beam/.temp-beam-607cf1ff-f234-4cb7-86eb-72b2d19133cc/faa43013-3bb8-48bd-b7a4-28a5c5cbecea
Nov 07, 2020 11:07:20 AM org.apache.beam.sdk.io.FileBasedSink$Writer close
INFO: Successfully wrote temporary file /tmp/groovy-generated-5104009913884594215-tmpdir/word-count-beam/.temp-beam-607cf1ff-f234-4cb7-86eb-72b2d19133cc/eb303f14-fd85-41c8-b398-f7d98ac1f5de
Nov 07, 2020 11:07:20 AM org.apache.beam.sdk.io.FileBasedSink$Writer close
INFO: Successfully wrote temporary file /tmp/groovy-generated-5104009913884594215-tmpdir/word-count-beam/.temp-beam-607cf1ff-f234-4cb7-86eb-72b2d19133cc/2bd5f546-39a1-4ff6-8660-6e1b2d00c9dc
Nov 07, 2020 11:07:20 AM org.apache.beam.sdk.io.WriteFiles$FinalizeTempFileBundles$FinalizeFn process
INFO: Finalizing 3 file results
Nov 07, 2020 11:07:20 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation createMissingEmptyShards
INFO: Finalizing for destination null num shards 3.
Nov 07, 2020 11:07:20 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-5104009913884594215-tmpdir/word-count-beam/.temp-beam-607cf1ff-f234-4cb7-86eb-72b2d19133cc/eb303f14-fd85-41c8-b398-f7d98ac1f5de, shard=2, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@2237a6d6, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-5104009913884594215-tmpdir/word-count-beam/counts-00002-of-00003
Nov 07, 2020 11:07:20 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-5104009913884594215-tmpdir/word-count-beam/.temp-beam-607cf1ff-f234-4cb7-86eb-72b2d19133cc/faa43013-3bb8-48bd-b7a4-28a5c5cbecea, shard=0, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@2237a6d6, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-5104009913884594215-tmpdir/word-count-beam/counts-00000-of-00003
Nov 07, 2020 11:07:20 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-5104009913884594215-tmpdir/word-count-beam/.temp-beam-607cf1ff-f234-4cb7-86eb-72b2d19133cc/2bd5f546-39a1-4ff6-8660-6e1b2d00c9dc, shard=1, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@2237a6d6, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-5104009913884594215-tmpdir/word-count-beam/counts-00001-of-00003
Nov 07, 2020 11:07:20 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-5104009913884594215-tmpdir/word-count-beam/.temp-beam-607cf1ff-f234-4cb7-86eb-72b2d19133cc/faa43013-3bb8-48bd-b7a4-28a5c5cbecea
Nov 07, 2020 11:07:20 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-5104009913884594215-tmpdir/word-count-beam/.temp-beam-607cf1ff-f234-4cb7-86eb-72b2d19133cc/2bd5f546-39a1-4ff6-8660-6e1b2d00c9dc
Nov 07, 2020 11:07:20 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-5104009913884594215-tmpdir/word-count-beam/.temp-beam-607cf1ff-f234-4cb7-86eb-72b2d19133cc/eb303f14-fd85-41c8-b398-f7d98ac1f5de
Nov 07, 2020 11:07:20 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
WARNING: Failed to match temporary files under: [/tmp/groovy-generated-5104009913884594215-tmpdir/word-count-beam/.temp-beam-607cf1ff-f234-4cb7-86eb-72b2d19133cc/].
grep Foundation counts*
counts-00000-of-00003:Foundation: 1
Verified Foundation: 1
[SUCCESS]

> Task :runners:twister2:runQuickstartJavaTwister2
Nov 07, 2020 11:07:17 AM org.apache.beam.runners.twister2.Twister2Runner run
WARNING: Twister2 Local Mode currently only supports single worker
Nov 07, 2020 11:07:17 AM edu.iu.dsc.tws.rsched.core.ResourceAllocator loadConfig
INFO: Loaded configuration with twister2_home: /tmp and configuration: /tmp/conf/ and cluster: standalone
grep Foundation counts*
Foundation: 1
Verified Foundation: 1
[SUCCESS]

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:direct-java:runMobileGamingJavaDirect'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:runQuickstartJavaDataflow'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:runMobileGamingJavaDataflow'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 2s
8 actionable tasks: 7 executed, 1 from cache

Publishing build scan...
https://gradle.com/s/yxilpmbrwfh2y

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostRelease_NightlySnapshot #1162

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostRelease_NightlySnapshot/1162/display/redirect?page=changes>

Changes:

[Pablo Estrada] Updating BigQuery client for Python

[Andrew Pilloud] [BEAM-11165] ZetaSQL Calc only convert referenced columns

[Robin Qiu] Support read/write ZetaSQL DATETIME/NUMERIC types from/to BigQuery

[Robin Qiu] Address comments

[Kenneth Knowles] Suppress nullness errors in new files since last round of suppressions

[Kenneth Knowles] Fix position of @Nullable annotations since last round

[Kenneth Knowles] Exclude nonexistent org.checkerframework:jdk8 from all configurations

[Kenneth Knowles] Fix nullness error in Kotlin WriteOneFilePerWindow

[Kenneth Knowles] Allow checkerframework on API surfaces

[Kenneth Knowles] Enable checkerframework globally

[je.ik] [BEAM-11191] fix ClassCastException when clearing watermark state

[noreply] [BEAM-3736] Let users know that CombineFn.setup and teardown are not

[noreply] [BEAM-11151] Adds the ToString well-known transform URN (#13214)

[noreply] Merge pull request #13164 from Refactoring BigQuery Read utilities into

[Robert Burke] Moving to 2.27.0-SNAPSHOT on master branch.

[Andrew Pilloud] [BEAM-11165] Use the ZetaSQL Streaming API synchronously

[noreply] [BEAM-11159] Use GCP pubsub client for TestPubsub (#13273)


------------------------------------------
[...truncated 3.78 MB...]
INFO: Started 0 remote fetches in 1 ms
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 5.0 (TID 26). 5149 bytes result sent to driver
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 5.0 (TID 25). 5149 bytes result sent to driver
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 5.0 (TID 27). 5149 bytes result sent to driver
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 5.0 (TID 26) in 44 ms on localhost (executor driver) (1/4)
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 5.0 (TID 25) in 45 ms on localhost (executor driver) (2/4)
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 5.0 (TID 27) in 45 ms on localhost (executor driver) (3/4)
Nov 06, 2020 11:05:56 AM org.apache.beam.sdk.io.WriteFiles$FinalizeTempFileBundles$FinalizeFn process
INFO: Finalizing 4 file results
Nov 06, 2020 11:05:56 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation createMissingEmptyShards
INFO: Finalizing for destination null num shards 4.
Nov 06, 2020 11:05:56 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-7264449248025570994-tmpdir/word-count-beam/.temp-beam-352271a3-9096-4c8d-a85c-1652a5693f3f/5ac80302-6cb3-41ba-8407-e12a67381145, shard=0, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@59023e3, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-7264449248025570994-tmpdir/word-count-beam/counts-00000-of-00004
Nov 06, 2020 11:05:56 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-7264449248025570994-tmpdir/word-count-beam/.temp-beam-352271a3-9096-4c8d-a85c-1652a5693f3f/24b867c2-b8b5-4ee2-b3d4-41b95169dfb8, shard=1, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@59023e3, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-7264449248025570994-tmpdir/word-count-beam/counts-00001-of-00004
Nov 06, 2020 11:05:56 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-7264449248025570994-tmpdir/word-count-beam/.temp-beam-352271a3-9096-4c8d-a85c-1652a5693f3f/f47c807a-28f8-4b57-856d-2af8f82e2519, shard=2, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@59023e3, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-7264449248025570994-tmpdir/word-count-beam/counts-00002-of-00004
Nov 06, 2020 11:05:56 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-7264449248025570994-tmpdir/word-count-beam/.temp-beam-352271a3-9096-4c8d-a85c-1652a5693f3f/37babf3d-f394-4103-b3fe-bed925835cbe, shard=3, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@59023e3, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-7264449248025570994-tmpdir/word-count-beam/counts-00003-of-00004
Nov 06, 2020 11:05:56 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-7264449248025570994-tmpdir/word-count-beam/.temp-beam-352271a3-9096-4c8d-a85c-1652a5693f3f/5ac80302-6cb3-41ba-8407-e12a67381145
Nov 06, 2020 11:05:56 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-7264449248025570994-tmpdir/word-count-beam/.temp-beam-352271a3-9096-4c8d-a85c-1652a5693f3f/37babf3d-f394-4103-b3fe-bed925835cbe
Nov 06, 2020 11:05:56 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-7264449248025570994-tmpdir/word-count-beam/.temp-beam-352271a3-9096-4c8d-a85c-1652a5693f3f/f47c807a-28f8-4b57-856d-2af8f82e2519
Nov 06, 2020 11:05:56 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-7264449248025570994-tmpdir/word-count-beam/.temp-beam-352271a3-9096-4c8d-a85c-1652a5693f3f/24b867c2-b8b5-4ee2-b3d4-41b95169dfb8
Nov 06, 2020 11:05:56 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
WARNING: Failed to match temporary files under: [/tmp/groovy-generated-7264449248025570994-tmpdir/word-count-beam/.temp-beam-352271a3-9096-4c8d-a85c-1652a5693f3f/].
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 5.0 (TID 24). 12857 bytes result sent to driver
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 5.0 (TID 24) in 114 ms on localhost (executor driver) (4/4)
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 5.0, whose tasks have all completed, from pool 
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: ShuffleMapStage 5 (repartition at GroupCombineFunctions.java:191) finished in 0.127 s
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: looking for newly runnable stages
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: running: Set()
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: waiting: Set(ResultStage 6)
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: failed: Set()
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting ResultStage 6 (WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous).output MapPartitionsRDD[113] at values at TransformTranslator.java:434), which has no missing parents
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_7 stored as values in memory (estimated size 16.0 KB, free 13.5 GB)
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_7_piece0 stored as bytes in memory (estimated size 7.3 KB, free 13.5 GB)
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added broadcast_7_piece0 in memory on localhost:45073 (size: 7.3 KB, free: 13.5 GB)
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Created broadcast 7 from broadcast at DAGScheduler.scala:1184
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting 4 missing tasks from ResultStage 6 (WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous).output MapPartitionsRDD[113] at values at TransformTranslator.java:434) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Adding task set 6.0 with 4 tasks
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 0.0 in stage 6.0 (TID 28, localhost, executor driver, partition 0, NODE_LOCAL, 7938 bytes)
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 1.0 in stage 6.0 (TID 29, localhost, executor driver, partition 1, NODE_LOCAL, 7938 bytes)
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 2.0 in stage 6.0 (TID 30, localhost, executor driver, partition 2, NODE_LOCAL, 7938 bytes)
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 3.0 in stage 6.0 (TID 31, localhost, executor driver, partition 3, NODE_LOCAL, 7938 bytes)
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 0.0 in stage 6.0 (TID 28)
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 3.0 in stage 6.0 (TID 31)
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 1.0 in stage 6.0 (TID 29)
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 2.0 in stage 6.0 (TID 30)
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 6.0 (TID 28). 6453 bytes result sent to driver
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 6.0 (TID 30). 6453 bytes result sent to driver
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 6.0 (TID 29). 6453 bytes result sent to driver
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 6.0 (TID 31). 6453 bytes result sent to driver
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 6.0 (TID 28) in 17 ms on localhost (executor driver) (1/4)
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 6.0 (TID 30) in 16 ms on localhost (executor driver) (2/4)
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 6.0 (TID 31) in 17 ms on localhost (executor driver) (3/4)
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 6.0 (TID 29) in 18 ms on localhost (executor driver) (4/4)
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 6.0, whose tasks have all completed, from pool 
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: ResultStage 6 (foreach at BoundedDataset.java:127) finished in 0.030 s
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Job 1 finished: foreach at BoundedDataset.java:127, took 0.208097 s
Nov 06, 2020 11:05:56 AM org.apache.beam.runners.spark.SparkRunner lambda$run$1
INFO: Batch pipeline execution complete.
Nov 06, 2020 11:05:56 AM org.spark_project.jetty.server.AbstractConnector doStop
INFO: Stopped Spark@2ea562da{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Stopped Spark web UI at http://localhost:4040
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: MapOutputTrackerMasterEndpoint stopped!
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: MemoryStore cleared
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManager stopped
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManagerMaster stopped
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: OutputCommitCoordinator stopped!
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Successfully stopped SparkContext
grep Foundation counts*
counts-00000-of-00004:Foundation: 1
Verified Foundation: 1
[SUCCESS]

> Task :runners:direct-java:runMobileGamingJavaDirect
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occured while executing the Java class. java.lang.NoSuchMethodError: com.google.common.cache.CacheBuilder.expireAfterWrite(Ljava/time/Duration;)Lcom/google/common/cache/CacheBuilder; -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
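
This is the same missing CacheBuilder.expireAfterWrite(Duration) method surfacing as a bare NoSuchMethodError, which again points at an older Guava jar sitting ahead of the one Beam was compiled against. One quick way to see which jar actually supplied CacheBuilder at run time is a throwaway class like the following (illustrative only, not part of the quickstart):

    import com.google.common.cache.CacheBuilder;

    public class WhichGuava {
      public static void main(String[] args) {
        // Prints the jar (or directory) the loaded CacheBuilder class came
        // from, i.e. the Guava that won on this classpath. getCodeSource()
        // can be null for bootstrap classes, but Guava sits on the
        // application classpath here.
        System.out.println(
            CacheBuilder.class.getProtectionDomain().getCodeSource().getLocation());
      }
    }

Run with the same classpath that exec-maven-plugin assembles for exec:java, the printed path would show whether the stale Guava comes from the quickstart pom itself or from a transitive dependency.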

> Task :runners:google-cloud-dataflow-java:runQuickstartJavaDataflow
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occured while executing the Java class. Failed to construct instance from factory method DataflowRunner#fromOptions(interface org.apache.beam.sdk.options.PipelineOptions): InvocationTargetException: com.google.common.cache.CacheBuilder.expireAfterWrite(Ljava/time/Duration;)Lcom/google/common/cache/CacheBuilder; -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occured while executing the Java class. Failed to construct instance from factory method DataflowRunner#fromOptions(interface org.apache.beam.sdk.options.PipelineOptions): InvocationTargetException: com.google.common.cache.CacheBuilder.expireAfterWrite(Ljava/time/Duration;)Lcom/google/common/cache/CacheBuilder; -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException

> Task :runners:direct-java:runMobileGamingJavaDirect FAILED
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occured while executing the Java class. java.lang.NoSuchMethodError: com.google.common.cache.CacheBuilder.expireAfterWrite(Ljava/time/Duration;)Lcom/google/common/cache/CacheBuilder; -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] Failed command

> Task :runners:google-cloud-dataflow-java:runQuickstartJavaDataflow FAILED
[ERROR] Failed command

> Task :runners:twister2:runQuickstartJavaTwister2
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/tmp/groovy-generated-8920322553541514689-tmpdir/.m2/repository/org/slf4j/slf4j-jdk14/1.7.30/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/tmp/groovy-generated-8920322553541514689-tmpdir/.m2/repository/org/slf4j/slf4j-log4j12/1.7.25/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]
Nov 06, 2020 11:10:16 AM org.apache.beam.runners.twister2.Twister2Runner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 376 files. Enable logging at DEBUG level to see which files will be staged.
Nov 06, 2020 11:10:17 AM org.apache.beam.runners.twister2.Twister2Runner run
INFO: Translating pipeline to Twister2 program.
Nov 06, 2020 11:10:28 AM org.apache.beam.runners.twister2.Twister2Runner run
WARNING: Twister2 Local Mode currently only supports single worker
Nov 06, 2020 11:10:28 AM edu.iu.dsc.tws.rsched.core.ResourceAllocator loadConfig
INFO: Loaded configuration with twister2_home: /tmp and configuration: /tmp/conf/ and cluster: standalone
grep Foundation counts*
Foundation: 1
Verified Foundation: 1
[SUCCESS]

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:runMobileGamingJavaDataflow'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:direct-java:runMobileGamingJavaDirect'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:runQuickstartJavaDataflow'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 10m 12s
8 actionable tasks: 7 executed, 1 from cache

Publishing build scan...
https://gradle.com/s/gdic46w3qilq4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostRelease_NightlySnapshot #1161

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostRelease_NightlySnapshot/1161/display/redirect?page=changes>

Changes:

[Udi Meiri] [BEAM-2717] Implement ProtoCoder.to_type_hint

[Pablo Estrada] Adding display data to BQ File Loads transform

[Robert Bradshaw] Allow use of index as series.

[Robert Bradshaw] Allow setting columns.

[sychen] Fix GroupIntoBatches.test_buffering_timer_in_fixed_window_streaming

[Robert Bradshaw] Add utility to test a set of strings.

[Robert Bradshaw] Add a proxy for pandas' top-level module functions.

[Robert Bradshaw] [BEAM-9547] Implement pd.concat().

[noreply] [BEAM-11091] Allow to specify coder for HadoopFormatIO.Read (#13166)

[noreply] [BEAM-11162] Fetch missing projectId from options (#13234)

[Kenneth Knowles] Add class-level suppression of rawtypes errors

[Kenneth Knowles] Enable rawtype errors globally

[noreply] [BEAM-3736] Add CombineFn.setup and CombineFn.teardown to Python SDK

[noreply] [BEAM-11190] Fix grouping on categorical columns (#13256)

[Robert Bradshaw] todo, lint

[noreply] [BEAM-3736] Disable CombineFnVisitor (#13266)

[noreply] [BEAM-11196] Set parent of fused stages to the lowest common ancestor


------------------------------------------
[...truncated 3.79 MB...]
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 28
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 44
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 24
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 35
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 73
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 92
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 23
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 63
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 85
Nov 05, 2020 11:08:57 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
WARNING: Failed to match temporary files under: [/tmp/groovy-generated-6155051820837814332-tmpdir/word-count-beam/.temp-beam-99528c8e-53d5-4604-b1b8-defb0ed4e306/].
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed broadcast_3_piece0 on localhost:36419 in memory (size: 12.3 KB, free: 13.5 GB)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 5
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 21
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 6
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 5.0 (TID 24). 12900 bytes result sent to driver
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed broadcast_1_piece0 on localhost:36419 in memory (size: 9.8 KB, free: 13.5 GB)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 5.0 (TID 24) in 141 ms on localhost (executor driver) (4/4)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 5.0, whose tasks have all completed, from pool 
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 100
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: ShuffleMapStage 5 (repartition at GroupCombineFunctions.java:191) finished in 0.155 s
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 33
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: looking for newly runnable stages
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 51
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: running: Set()
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: waiting: Set(ResultStage 6)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 43
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: failed: Set()
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 3
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 11
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 42
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 101
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 66
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 94
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting ResultStage 6 (WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous).output MapPartitionsRDD[113] at values at TransformTranslator.java:434), which has no missing parents
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 54
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 91
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 74
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 38
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 47
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 72
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 32
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 50
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 56
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 31
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 41
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 17
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 96
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 87
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 95
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 60
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 13
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 9
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 88
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_7 stored as values in memory (estimated size 16.0 KB, free 13.5 GB)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed broadcast_0_piece0 on localhost:36419 in memory (size: 8.7 KB, free: 13.5 GB)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_7_piece0 stored as bytes in memory (estimated size 7.3 KB, free 13.5 GB)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added broadcast_7_piece0 in memory on localhost:36419 (size: 7.3 KB, free: 13.5 GB)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 8
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 90
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 40
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Created broadcast 7 from broadcast at DAGScheduler.scala:1184
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting 4 missing tasks from ResultStage 6 (WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous).output MapPartitionsRDD[113] at values at TransformTranslator.java:434) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Adding task set 6.0 with 4 tasks
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 0.0 in stage 6.0 (TID 28, localhost, executor driver, partition 0, NODE_LOCAL, 7938 bytes)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 1.0 in stage 6.0 (TID 29, localhost, executor driver, partition 1, NODE_LOCAL, 7938 bytes)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 2.0 in stage 6.0 (TID 30, localhost, executor driver, partition 2, NODE_LOCAL, 7938 bytes)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 3.0 in stage 6.0 (TID 31, localhost, executor driver, partition 3, NODE_LOCAL, 7938 bytes)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 0.0 in stage 6.0 (TID 28)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 1.0 in stage 6.0 (TID 29)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 2.0 in stage 6.0 (TID 30)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 3.0 in stage 6.0 (TID 31)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 0 ms
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 0 ms
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 6.0 (TID 30). 6496 bytes result sent to driver
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 6.0 (TID 29). 6496 bytes result sent to driver
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 6.0 (TID 31). 6496 bytes result sent to driver
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 6.0 (TID 28). 6496 bytes result sent to driver
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 6.0 (TID 28) in 18 ms on localhost (executor driver) (1/4)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 6.0 (TID 30) in 18 ms on localhost (executor driver) (2/4)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 6.0 (TID 31) in 18 ms on localhost (executor driver) (3/4)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 6.0 (TID 29) in 20 ms on localhost (executor driver) (4/4)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 6.0, whose tasks have all completed, from pool 
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: ResultStage 6 (foreach at BoundedDataset.java:127) finished in 0.032 s
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Job 1 finished: foreach at BoundedDataset.java:127, took 0.255782 s
Nov 05, 2020 11:08:57 AM org.apache.beam.runners.spark.SparkRunner lambda$run$1
INFO: Batch pipeline execution complete.
Nov 05, 2020 11:08:57 AM org.spark_project.jetty.server.AbstractConnector doStop
INFO: Stopped Spark@63152571{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Stopped Spark web UI at http://localhost:4040
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: MapOutputTrackerMasterEndpoint stopped!
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: MemoryStore cleared
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManager stopped
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManagerMaster stopped
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: OutputCommitCoordinator stopped!
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Successfully stopped SparkContext
grep Foundation counts*
counts-00000-of-00004:Foundation: 1
Verified Foundation: 1
[SUCCESS]
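
For reference on the file names the verification greps: WordCount's output is written by TextIO with the default shard-name template, so each runner produces counts-0000i-of-0000N where N is whatever shard count the runner chooses. A rough, self-contained sketch of that shape, assuming pom.xml as the input as in the Java quickstart; this is a minimal stand-in, not the exact example source:

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.TextIO;
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Count;
    import org.apache.beam.sdk.transforms.Filter;
    import org.apache.beam.sdk.transforms.FlatMapElements;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.TypeDescriptors;

    public class MinimalWordCountSketch {
      public static void main(String[] args) {
        PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
        Pipeline p = Pipeline.create(options);

        p.apply(TextIO.read().from("pom.xml"))
            .apply(FlatMapElements.into(TypeDescriptors.strings())
                .via((String line) -> Arrays.asList(line.split("[^\\p{L}]+"))))
            .apply(Filter.by((String word) -> !word.isEmpty()))
            .apply(Count.perElement())
            .apply(MapElements.into(TypeDescriptors.strings())
                .via((KV<String, Long> wordCount) ->
                    wordCount.getKey() + ": " + wordCount.getValue()))
            // With no explicit withNumShards(...) the runner picks the shard
            // count, hence counts-00000-of-00004 on Spark here and
            // counts-00013-of-00016 on Flink in the build below.
            .apply("WriteCounts", TextIO.write().to("counts"));

        p.run().waitUntilFinish();
      }
    }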

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:runMobileGamingJavaDataflow'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:runQuickstartJavaDataflow'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:direct-java:runMobileGamingJavaDirect'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 41s
8 actionable tasks: 7 executed, 1 from cache

Publishing build scan...
https://gradle.com/s/cn6bfgmtnwll4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostRelease_NightlySnapshot #1160

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostRelease_NightlySnapshot/1160/display/redirect?page=changes>

Changes:

[tysonjh] [BEAM-11130] Exclude OrderedListState category for Dataflow V2.

[samuelw] [BEAM-11144] Fix trigger prefetching so that the correct trigger index

[noreply] [BEAM-11146] Add fasterCopy option to Flink runner (#13240)

[noreply] [BEAM-10123] Add commit transform. (#12572)

[noreply] [BEAM-5504] Change Pubsub avro table jira task number in CHANGES.md

[noreply] [BEAM-5570] Update javacc dependency (#13094)

[Boyuan Zhang] Exclude SDF test suite because it requires support of self-checkpoint.

[tysonjh] Add Dataflow Runner V2 ValidatesRunner streaming test configuration.

[noreply] Implementing Python Bounded Source Reader DoFn (#13154)

[noreply] [BEAM-11164] Fixes bug in beam.Partition (#13236)

[noreply] [BEAM-10409] Remap all PCollections in KeyWithNone elimination (#13204)

[noreply] [BEAM-10124] Compute number of records before each offset using a

[noreply] [BEAM-10869] Remove unused PubSubSink with_attributes property (#13254)


------------------------------------------
[...truncated 3.78 MB...]
INFO: DataSink (org.apache.flink.api.java.io.DiscardingOutputFormat@a706d6a) (6/16) (79c884e3865344d26cfbb1d33c1ff7b9) switched from RUNNING to FINISHED.
Nov 04, 2020 11:07:47 AM org.apache.flink.runtime.executiongraph.Execution transitionState
INFO: DataSink (org.apache.flink.api.java.io.DiscardingOutputFormat@a706d6a) (11/16) (413a23898c6b9c8121bcbdc43dc9c879) switched from DEPLOYING to RUNNING.
Nov 04, 2020 11:07:47 AM org.apache.flink.runtime.executiongraph.Execution transitionState
INFO: DataSink (org.apache.flink.api.java.io.DiscardingOutputFormat@a706d6a) (14/16) (0e572c591cc77d3bf143784f85b4d7da) switched from DEPLOYING to RUNNING.
Nov 04, 2020 11:07:47 AM org.apache.flink.runtime.executiongraph.Execution transitionState
INFO: DataSink (org.apache.flink.api.java.io.DiscardingOutputFormat@a706d6a) (5/16) (2849c744ba7a92082aa2f0ecf899aeda) switched from RUNNING to FINISHED.
Nov 04, 2020 11:07:47 AM org.apache.flink.runtime.executiongraph.Execution transitionState
INFO: DataSink (org.apache.flink.api.java.io.DiscardingOutputFormat@a706d6a) (10/16) (351622952884742cde40c8d58e90fa10) switched from RUNNING to FINISHED.
Nov 04, 2020 11:07:47 AM org.apache.flink.runtime.executiongraph.Execution transitionState
INFO: DataSink (org.apache.flink.api.java.io.DiscardingOutputFormat@a706d6a) (8/16) (568b771ef9fb6bfa11aae5a4f3772d50) switched from RUNNING to FINISHED.
Nov 04, 2020 11:07:47 AM org.apache.flink.runtime.executiongraph.Execution transitionState
INFO: DataSink (org.apache.flink.api.java.io.DiscardingOutputFormat@a706d6a) (13/16) (0a0eeac1d7ee29a889cb29374ec2fc46) switched from RUNNING to FINISHED.
Nov 04, 2020 11:07:47 AM org.apache.flink.runtime.executiongraph.Execution transitionState
INFO: DataSink (org.apache.flink.api.java.io.DiscardingOutputFormat@a706d6a) (15/16) (a081511269ebb352d0242152e8dcb6e7) switched from DEPLOYING to RUNNING.
Nov 04, 2020 11:07:47 AM org.apache.flink.runtime.executiongraph.Execution transitionState
INFO: DataSink (org.apache.flink.api.java.io.DiscardingOutputFormat@a706d6a) (12/16) (c9e6dd971ae29a477e79e383eb96fd20) switched from RUNNING to FINISHED.
Nov 04, 2020 11:07:47 AM org.apache.flink.runtime.executiongraph.Execution transitionState
INFO: DataSink (org.apache.flink.api.java.io.DiscardingOutputFormat@a706d6a) (11/16) (413a23898c6b9c8121bcbdc43dc9c879) switched from RUNNING to FINISHED.
Nov 04, 2020 11:07:47 AM org.apache.flink.runtime.executiongraph.Execution transitionState
INFO: DataSink (org.apache.flink.api.java.io.DiscardingOutputFormat@a706d6a) (9/16) (2b2c5c1313b55371e4ec27649b37431a) switched from RUNNING to FINISHED.
Nov 04, 2020 11:07:47 AM org.apache.flink.runtime.executiongraph.Execution transitionState
INFO: DataSink (org.apache.flink.api.java.io.DiscardingOutputFormat@a706d6a) (14/16) (0e572c591cc77d3bf143784f85b4d7da) switched from RUNNING to FINISHED.
Nov 04, 2020 11:07:47 AM org.apache.flink.runtime.executiongraph.Execution transitionState
INFO: DataSink (org.apache.flink.api.java.io.DiscardingOutputFormat@a706d6a) (15/16) (a081511269ebb352d0242152e8dcb6e7) switched from RUNNING to FINISHED.
Nov 04, 2020 11:07:47 AM org.apache.flink.runtime.executiongraph.Execution transitionState
INFO: DataSink (org.apache.flink.api.java.io.DiscardingOutputFormat@a706d6a) (16/16) (3b7619734d1ccc8f23bed0ddf23ac04d) switched from DEPLOYING to RUNNING.
Nov 04, 2020 11:07:47 AM org.apache.flink.runtime.executiongraph.Execution transitionState
INFO: DataSink (org.apache.flink.api.java.io.DiscardingOutputFormat@a706d6a) (16/16) (3b7619734d1ccc8f23bed0ddf23ac04d) switched from RUNNING to FINISHED.
Nov 04, 2020 11:07:47 AM org.apache.flink.runtime.executiongraph.ExecutionGraph transitionState
INFO: Job wordcount-jenkins-1104110739-23313859 (fecb998eb35d4463a7528e5bc6d4bb88) switched from state RUNNING to FINISHED.
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.minicluster.MiniCluster closeAsync
INFO: Shutting down Flink Mini Cluster
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.taskexecutor.TaskExecutor onStop
INFO: Stopping TaskExecutor akka://flink/user/taskmanager_0.
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.rest.RestServerEndpoint closeAsync
INFO: Shutting down rest endpoint.
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.dispatcher.Dispatcher jobReachedGloballyTerminalState
INFO: Job fecb998eb35d4463a7528e5bc6d4bb88 reached globally terminal state FINISHED.
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.taskexecutor.TaskExecutor closeResourceManagerConnection
INFO: Close ResourceManager connection 68075e9141c814e36774063dee025b30.
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.jobmaster.JobMaster onStop
INFO: Stopping the JobMaster for job wordcount-jenkins-1104110739-23313859(fecb998eb35d4463a7528e5bc6d4bb88).
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.resourcemanager.ResourceManager closeTaskManagerConnection
INFO: Closing TaskExecutor connection c362b6a3-bb20-4cf2-90f1-3a9192b1e577 because: The TaskExecutor is shutting down.
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal
INFO: Free slot TaskSlot(index:5, state:ACTIVE, resource profile: ResourceProfile{managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 8c5df4fab6bd85932b0d33794674982f, jobId: fecb998eb35d4463a7528e5bc6d4bb88).
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.jobmaster.slotpool.SlotPoolImpl suspend
INFO: Suspending SlotPool.
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.jobmaster.JobMaster dissolveResourceManagerConnection
INFO: Close ResourceManager connection 68075e9141c814e36774063dee025b30: JobManager is shutting down..
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.jobmaster.slotpool.SlotPoolImpl close
INFO: Stopping SlotPool.
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.resourcemanager.ResourceManager closeJobManagerConnection
INFO: Disconnect job manager 8268ed847fe351e3310a8d6525e1488c@akka://flink/user/jobmanager_1 for job fecb998eb35d4463a7528e5bc6d4bb88 from the resource manager.
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.taskexecutor.TaskExecutor$JobLeaderListenerImpl jobManagerLostLeadership
INFO: JobManager for job fecb998eb35d4463a7528e5bc6d4bb88 with leader id 8268ed847fe351e3310a8d6525e1488c lost leadership.
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal
INFO: Free slot TaskSlot(index:3, state:ACTIVE, resource profile: ResourceProfile{managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 8c5ecc12fc951a7021cf28472f109138, jobId: fecb998eb35d4463a7528e5bc6d4bb88).
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal
INFO: Free slot TaskSlot(index:4, state:ACTIVE, resource profile: ResourceProfile{managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 279abb9be5a22c0cb33db4e4091c0566, jobId: fecb998eb35d4463a7528e5bc6d4bb88).
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal
INFO: Free slot TaskSlot(index:6, state:ACTIVE, resource profile: ResourceProfile{managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: c3552fdef6ffb3c8c6ff67149a5b3161, jobId: fecb998eb35d4463a7528e5bc6d4bb88).
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal
INFO: Free slot TaskSlot(index:11, state:ACTIVE, resource profile: ResourceProfile{managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 8c7485703f7d223eb58edbf6df7bf1a9, jobId: fecb998eb35d4463a7528e5bc6d4bb88).
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal
INFO: Free slot TaskSlot(index:8, state:ACTIVE, resource profile: ResourceProfile{managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: a3a3eecb4816b45a2da7aa7f26432bf0, jobId: fecb998eb35d4463a7528e5bc6d4bb88).
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal
INFO: Free slot TaskSlot(index:9, state:ACTIVE, resource profile: ResourceProfile{managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 403b502aa674960631680355becbe9ba, jobId: fecb998eb35d4463a7528e5bc6d4bb88).
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal
INFO: Free slot TaskSlot(index:2, state:ACTIVE, resource profile: ResourceProfile{managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 1de377b68a81d2d60f8be64659481117, jobId: fecb998eb35d4463a7528e5bc6d4bb88).
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal
INFO: Free slot TaskSlot(index:13, state:ACTIVE, resource profile: ResourceProfile{managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 0d6f374b26e1b8db76e714e527e80d45, jobId: fecb998eb35d4463a7528e5bc6d4bb88).
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal
INFO: Free slot TaskSlot(index:12, state:ACTIVE, resource profile: ResourceProfile{managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 907e35f6a9d6976441f038ef55455d32, jobId: fecb998eb35d4463a7528e5bc6d4bb88).
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal
INFO: Free slot TaskSlot(index:14, state:ACTIVE, resource profile: ResourceProfile{managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 8e10447fe906901f34946a17f6de55f9, jobId: fecb998eb35d4463a7528e5bc6d4bb88).
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal
INFO: Free slot TaskSlot(index:0, state:ACTIVE, resource profile: ResourceProfile{managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 8a5c0c89aefd95ac84028ad6fd39eb7f, jobId: fecb998eb35d4463a7528e5bc6d4bb88).
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal
INFO: Free slot TaskSlot(index:1, state:ACTIVE, resource profile: ResourceProfile{managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 7e5718c9326a7ee3b56fec759d066e3d, jobId: fecb998eb35d4463a7528e5bc6d4bb88).
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal
INFO: Free slot TaskSlot(index:7, state:ACTIVE, resource profile: ResourceProfile{managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 6e518b6a4c115ae24f26cd804d8cc2be, jobId: fecb998eb35d4463a7528e5bc6d4bb88).
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal
INFO: Free slot TaskSlot(index:10, state:ACTIVE, resource profile: ResourceProfile{managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: c32605ccf5f6566760c54e8d797e0470, jobId: fecb998eb35d4463a7528e5bc6d4bb88).
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal
INFO: Free slot TaskSlot(index:15, state:ACTIVE, resource profile: ResourceProfile{managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: e02814365b760fc21b70eea40c9ccffc, jobId: fecb998eb35d4463a7528e5bc6d4bb88).
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.taskexecutor.TaskExecutor closeJobManagerConnection
INFO: Close JobManager connection for job fecb998eb35d4463a7528e5bc6d4bb88.
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.taskexecutor.TaskExecutor closeJobManagerConnection
INFO: Close JobManager connection for job fecb998eb35d4463a7528e5bc6d4bb88.
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.taskexecutor.JobLeaderService stop
INFO: Stop job leader service.
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager shutdown
INFO: Shutting down TaskExecutorLocalStateStoresManager.
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.webmonitor.WebMonitorEndpoint lambda$shutDownInternal$5
INFO: Removing cache directory /tmp/flink-web-ui
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.rest.RestServerEndpoint lambda$closeAsync$1
INFO: Shut down complete.
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.io.disk.FileChannelManagerImpl lambda$getFileCloser$0
INFO: FileChannelManager removed spill file directory /tmp/flink-io-730826c1-5d28-497f-8137-09157ae61402
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.io.network.NettyShuffleEnvironment close
INFO: Shutting down the network environment and its components.
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.io.disk.FileChannelManagerImpl lambda$getFileCloser$0
INFO: FileChannelManager removed spill file directory /tmp/flink-netty-shuffle-4df59409-d00d-4a47-a19a-a3488c0808e7
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.taskexecutor.KvStateService shutdown
INFO: Shutting down the kvState service and its components.
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.taskexecutor.JobLeaderService stop
INFO: Stop job leader service.
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.filecache.FileCache shutdown
INFO: removed file cache directory /tmp/flink-dist-cache-f506e247-42b6-49fa-b13f-607d42de2a6b
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.taskexecutor.TaskExecutor handleOnStopException
INFO: Stopped TaskExecutor akka://flink/user/taskmanager_0.
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.resourcemanager.ResourceManager deregisterApplication
INFO: Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed..
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.entrypoint.component.DispatcherResourceManagerComponent closeAsyncInternal
INFO: Closing components.
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.dispatcher.runner.AbstractDispatcherLeaderProcess closeInternal
INFO: Stopping SessionDispatcherLeaderProcess.
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.dispatcher.Dispatcher onStop
INFO: Stopping dispatcher akka://flink/user/dispatcher.
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.dispatcher.Dispatcher terminateJobManagerRunners
INFO: Stopping all currently running jobs of dispatcher akka://flink/user/dispatcher.
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl close
INFO: Closing the SlotManager.
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl suspend
INFO: Suspending the SlotManager.
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.rest.handler.legacy.backpressure.BackPressureRequestCoordinator shutDown
INFO: Shutting down back pressure request coordinator.
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.dispatcher.Dispatcher lambda$onStop$1
INFO: Stopped dispatcher akka://flink/user/dispatcher.
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService stopService
INFO: Stopping Akka RPC service.
Nov 04, 2020 11:07:48 AM org.apache.beam.runners.flink.FlinkRunner createPipelineResult
INFO: Execution finished in 3644 msecs
Nov 04, 2020 11:07:48 AM org.apache.beam.runners.flink.FlinkRunner createPipelineResult
INFO: Final accumulator values:
Nov 04, 2020 11:07:48 AM org.apache.beam.runners.flink.FlinkRunner createPipelineResult
INFO: __metricscontainers : {
  "metrics": {
    "attempted": [{
      "urn": "beam:metric:user:sum_int64:v1",
      "type": "beam:metrics:sum_int64:v1",
      "payload": "Ig==",
      "labels": {
        "NAMESPACE": "org.apache.beam.examples.WordCount$ExtractWordsFn",
        "NAME": "emptyLines",
        "PTRANSFORM": "WordCount.CountWords/ParDo(ExtractWords)/ParMultiDo(ExtractWords)"
      }
    }, {
      "urn": "beam:metric:user:distribution_int64:v1",
      "type": "beam:metrics:distribution_int64:v1",
      "payload": "2wPnewBz",
      "labels": {
        "NAMESPACE": "org.apache.beam.examples.WordCount$ExtractWordsFn",
        "NAME": "lineLenDistro",
        "PTRANSFORM": "WordCount.CountWords/ParDo(ExtractWords)/ParMultiDo(ExtractWords)"
      }
    }]
  }
}
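
The two "attempted" metrics in the dump above are the WordCount example's own user metrics, an emptyLines counter and a lineLenDistro distribution, reported under the WordCount$ExtractWordsFn namespace because the DoFn is nested inside WordCount in the example source. A stripped-down sketch of how metrics like these are declared with the Beam Metrics API; the processing body is illustrative, not the exact example code:

    import org.apache.beam.sdk.metrics.Counter;
    import org.apache.beam.sdk.metrics.Distribution;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.transforms.DoFn;

    class ExtractWordsFn extends DoFn<String, String> {
      // These two declarations produce the NAMESPACE/NAME pairs reported in the
      // accumulator dump above.
      private final Counter emptyLines = Metrics.counter(ExtractWordsFn.class, "emptyLines");
      private final Distribution lineLenDistro =
          Metrics.distribution(ExtractWordsFn.class, "lineLenDistro");

      @ProcessElement
      public void processElement(@Element String line, OutputReceiver<String> out) {
        lineLenDistro.update(line.length());   // feeds the distribution_int64 metric
        if (line.trim().isEmpty()) {
          emptyLines.inc();                    // feeds the sum_int64 metric
        }
        for (String word : line.split("[^\\p{L}]+")) {
          if (!word.isEmpty()) {
            out.output(word);
          }
        }
      }
    }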
Nov 04, 2020 11:07:48 AM akka.event.slf4j.Slf4jLogger$$anonfun$receive$1$$anonfun$applyOrElse$3 apply$mcV$sp
INFO: Shutting down remote daemon.
Nov 04, 2020 11:07:48 AM akka.event.slf4j.Slf4jLogger$$anonfun$receive$1$$anonfun$applyOrElse$3 apply$mcV$sp
INFO: Remote daemon shut down; proceeding with flushing remote transports.
Nov 04, 2020 11:07:48 AM akka.event.slf4j.Slf4jLogger$$anonfun$receive$1$$anonfun$applyOrElse$3 apply$mcV$sp
INFO: Remoting shut down.
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService stopService
INFO: Stopping Akka RPC service.
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService lambda$stopService$5
INFO: Stopped Akka RPC service.
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.blob.AbstractBlobCache close
INFO: Shutting down BLOB cache
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.blob.AbstractBlobCache close
INFO: Shutting down BLOB cache
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.blob.BlobServer close
INFO: Stopped BLOB server at 0.0.0.0:38877
Nov 04, 2020 11:07:48 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService lambda$stopService$5
INFO: Stopped Akka RPC service.
grep Foundation counts*
counts-00013-of-00016:Foundation: 1
Verified Foundation: 1
[SUCCESS]

> Task :runners:twister2:runQuickstartJavaTwister2
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/tmp/groovy-generated-6492822558872845770-tmpdir/.m2/repository/org/slf4j/slf4j-jdk14/1.7.30/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/tmp/groovy-generated-6492822558872845770-tmpdir/.m2/repository/org/slf4j/slf4j-log4j12/1.7.25/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]
Nov 04, 2020 11:10:16 AM org.apache.beam.runners.twister2.Twister2Runner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 376 files. Enable logging at DEBUG level to see which files will be staged.
Nov 04, 2020 11:10:16 AM org.apache.beam.runners.twister2.Twister2Runner run
INFO: Translating pipeline to Twister2 program.
Nov 04, 2020 11:10:28 AM org.apache.beam.runners.twister2.Twister2Runner run
WARNING: Twister2 Local Mode currently only supports single worker
Nov 04, 2020 11:10:28 AM edu.iu.dsc.tws.rsched.core.ResourceAllocator loadConfig
INFO: Loaded configuration with twister2_home: /tmp and configuration: /tmp/conf/ and cluster: standalone
grep Foundation counts*
Foundation: 1
Verified Foundation: 1
[SUCCESS]

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:direct-java:runMobileGamingJavaDirect'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:runQuickstartJavaDataflow'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:runMobileGamingJavaDataflow'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 10m 13s
8 actionable tasks: 7 executed, 1 from cache

Publishing build scan...
https://gradle.com/s/usf2keevkemtc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org