Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/11/21 11:27:29 UTC

Build failed in Jenkins: beam_PostRelease_NightlySnapshot #806

See <https://builds.apache.org/job/beam_PostRelease_NightlySnapshot/806/display/redirect?page=changes>

Changes:

[M.Yzvenn] fix - 1MB is interpreted as 1000, not 1024

[katarzyna.kucharczyk] [BEAM-6335] Added streaming GroupByKey Test that reads SyntheticSource

[katarzyna.kucharczyk] [BEAM-6335] Changed SyntheticDataPublisher to publish String UTF values

[katarzyna.kucharczyk] [BEAM-6335] Added custom PubSub Matcher that stops pipeline after

[robertwb] [BEAM-8629] Don't return mutable class type hints.

[valentyn] Restore original behavior of evaluating worker host on Windows until a

[echauchot] [BEAM-8470] Add an empty spark-structured-streaming runner project

[echauchot] [BEAM-8470] Fix missing dep

[echauchot] [BEAM-8470] Add SparkPipelineOptions

[echauchot] [BEAM-8470] Start pipeline translation

[echauchot] [BEAM-8470] Add global pipeline translation structure

[echauchot] [BEAM-8470] Add nodes translators structure

[echauchot] [BEAM-8470] Wire node translators with pipeline translator

[echauchot] [BEAM-8470] Renames: better differentiate pipeline translator for

[echauchot] [BEAM-8470] Organise methods in PipelineTranslator

[echauchot] [BEAM-8470] Initialise BatchTranslationContext

[echauchot] [BEAM-8470] Refactoring: -move batch/streaming common translation

[echauchot] [BEAM-8470] Make transform translation clearer: renaming, comments

[echauchot] [BEAM-8470] Improve javadocs

[echauchot] [BEAM-8470] Move SparkTransformOverrides to correct package

[echauchot] [BEAM-8470] Move common translation context components to superclass

[echauchot] [BEAM-8470] apply spotless

[echauchot] [BEAM-8470] Make codestyle and firebug happy

[echauchot] [BEAM-8470] Add TODOs

[echauchot] [BEAM-8470] Post-pone batch qualifier in all classes names for

[echauchot] [BEAM-8470] Add precise TODO for multiple TransformTranslator per

[echauchot] [BEAM-8470] Added SparkRunnerRegistrar

[echauchot] [BEAM-8470] Add basic pipeline execution. Refactor translatePipeline()

[echauchot] [BEAM-8470] Create PCollections manipulation methods

[echauchot] [BEAM-8470] Create Datasets manipulation methods

[echauchot] [BEAM-8470] Add Flatten transformation translator

[echauchot] [BEAM-8470] Add primitive GroupByKeyTranslatorBatch implementation

[echauchot] [BEAM-8470] Use Iterators.transform() to return Iterable

[echauchot] [BEAM-8470] Implement read transform

[echauchot] [BEAM-8470] update TODO

[echauchot] [BEAM-8470] Apply spotless

[echauchot] [BEAM-8470] start source instantiation

[echauchot] [BEAM-8470] Improve exception flow

[echauchot] [BEAM-8470] Improve type enforcement in ReadSourceTranslator

[echauchot] [BEAM-8470] Experiment over using spark Catalog to pass in Beam Source

[echauchot] [BEAM-8470] Add source mocks

[echauchot] [BEAM-8470] fix mock, wire mock in translators and create a main test.

[echauchot] [BEAM-8470] Use raw WindowedValue so that spark Encoders could work

[echauchot] [BEAM-8470] clean deps

[echauchot] [BEAM-8470] Move DatasetSourceMock to proper batch mode

[echauchot] [BEAM-8470] Run pipeline in batch mode or in streaming mode

[echauchot] [BEAM-8470] Split batch and streaming sources and translators

[echauchot] [BEAM-8470] Use raw Encoder<WindowedValue> also in regular

[echauchot] [BEAM-8470] Clean

[echauchot] [BEAM-8470] Add ReadSourceTranslatorStreaming

[echauchot] [BEAM-8470] Move Source and translator mocks to a mock package.

[echauchot] [BEAM-8470] Pass Beam Source and PipelineOptions to the spark DataSource

[echauchot] [BEAM-8470] Refactor DatasetSource fields

[echauchot] [BEAM-8470] Wire real SourceTransform and not mock and update the test

[echauchot] [BEAM-8470] Add missing 0-arg public constructor

[echauchot] [BEAM-8470] Use new PipelineOptionsSerializationUtils

[echauchot] [BEAM-8470] Apply spotless and fix checkstyle

[echauchot] [BEAM-8470] Add a dummy schema for reader

[echauchot] [BEAM-8470] Add empty 0-arg constructor for mock source

[echauchot] [BEAM-8470] Clean

[echauchot] [BEAM-8470] Checkstyle and Findbugs

[echauchot] [BEAM-8470] Refactor SourceTest to a UTest instead of a main

[echauchot] [BEAM-8470] Fix pipeline triggering: use a spark action instead of

[echauchot] [BEAM-8470] improve readability of options passing to the source

[echauchot] [BEAM-8470] Clean unneeded fields in DatasetReader

[echauchot] [BEAM-8470] Fix serialization issues

[echauchot] [BEAM-8470] Add SerializationDebugger

[echauchot] [BEAM-8470] Add serialization test

[echauchot] [BEAM-8470] Move SourceTest to same package as tested class

[echauchot] [BEAM-8470] Fix SourceTest

[echauchot] [BEAM-8470] Simplify beam reader creation as it created once the source

[echauchot] [BEAM-8470] Put all transform translators Serializable

[echauchot] [BEAM-8470] Enable test mode

[echauchot] [BEAM-8470] Enable gradle build scan

[echauchot] [BEAM-8470] Add flatten test

[echauchot] [BEAM-8470] First attempt for ParDo primitive implementation

[echauchot] [BEAM-8470] Serialize windowedValue to byte[] in source to be able to

[echauchot] [BEAM-8470] Comment schema choices

[echauchot] [BEAM-8470] Fix errorprone

[echauchot] [BEAM-8470] Fix testMode output to comply with new binary schema

[echauchot] [BEAM-8470] Cleaning

[echauchot] [BEAM-8470] Remove bundleSize parameter and always use spark default

[echauchot] [BEAM-8470] Fix split bug

[echauchot] [BEAM-8470] Clean

[echauchot] [BEAM-8470] Add ParDoTest

[echauchot] [BEAM-8470] Address minor review notes

[echauchot] [BEAM-8470] Clean

[echauchot] [BEAM-8470] Add GroupByKeyTest

[echauchot] [BEAM-8470] Add comments and TODO to GroupByKeyTranslatorBatch

[echauchot] [BEAM-8470] Fix type checking with Encoder of WindowedValue<T>

[echauchot] [BEAM-8470] Port latest changes of ReadSourceTranslatorBatch to

[echauchot] [BEAM-8470] Remove no more needed putDatasetRaw

[echauchot] [BEAM-8470] Add ComplexSourceTest

[echauchot] [BEAM-8470] Fail in case of having SideInputs or State/Timers

[echauchot] [BEAM-8470] Fix Encoders: create an Encoder for every manipulated type

[echauchot] [BEAM-8470] Apply spotless

[echauchot] [BEAM-8470] Fixed Javadoc error

[echauchot] [BEAM-8470] Rename SparkSideInputReader class and rename pruneOutput()

[echauchot] [BEAM-8470] Don't use deprecated

[echauchot] [BEAM-8470] Simplify logic of ParDo translator

[echauchot] [BEAM-8470] Fix kryo issue in GBK translator with a workaround

[echauchot] [BEAM-8470] Rename SparkOutputManager for consistency

[echauchot] [BEAM-8470] Fix for test elements container in GroupByKeyTest

[echauchot] [BEAM-8470] Added "testTwoPardoInRow"

[echauchot] [BEAM-8470] Add a test for the most simple possible Combine

[echauchot] [BEAM-8470] Rename SparkDoFnFilterFunction to DoFnFilterFunction for

[echauchot] [BEAM-8470] Generalize the use of SerializablePipelineOptions in place

[echauchot] [BEAM-8470] Fix getSideInputs

[echauchot] [BEAM-8470] Extract binary schema creation in a helper class

[echauchot] [BEAM-8470] First version of combinePerKey

[echauchot] [BEAM-8470] Improve type checking of Tuple2 encoder

[echauchot] [BEAM-8470] Introduce WindowingHelpers (and helpers package) and use it

[echauchot] [BEAM-8470] Fix combiner using KV as input, use binary encoders in place

[echauchot] [BEAM-8470] Add combinePerKey and CombineGlobally tests

[echauchot] [BEAM-8470] Introduce RowHelpers

[echauchot] [BEAM-8470] Add CombineGlobally translation to avoid translating

[echauchot] [BEAM-8470] Cleaning

[echauchot] [BEAM-8470] Get back to classes in translators resolution because URNs

[echauchot] [BEAM-8470] Fix various type checking issues in Combine.Globally

[echauchot] [BEAM-8470] Update test with Long

[echauchot] [BEAM-8470] Fix combine. For unknown reason GenericRowWithSchema is used

[echauchot] [BEAM-8470] Use more generic Row instead of GenericRowWithSchema

[echauchot] [BEAM-8470] Add explanation about receiving a Row as input in the

[echauchot] [BEAM-8470] Fix encoder bug in combinePerkey

[echauchot] [BEAM-8470] Cleaning

[echauchot] [BEAM-8470] Implement WindowAssignTranslatorBatch

[echauchot] [BEAM-8470] Implement WindowAssignTest

[echauchot] [BEAM-8470] Fix javadoc

[echauchot] [BEAM-8470] Added SideInput support

[echauchot] [BEAM-8470] Fix CheckStyle violations

[echauchot] [BEAM-8470] Don't use Reshuffle translation

[echauchot] [BEAM-8470] Added using CachedSideInputReader

[echauchot] [BEAM-8470] Added TODO comment for ReshuffleTranslatorBatch

[echauchot] [BEAM-8470] Add unchecked warning suppression

[echauchot] [BEAM-8470] Add streaming source initialisation

[echauchot] [BEAM-8470] Implement first streaming source

[echauchot] [BEAM-8470] Add a TODO on spark output modes

[echauchot] [BEAM-8470] Add transformators registry in PipelineTranslatorStreaming

[echauchot] [BEAM-8470] Add source streaming test

[echauchot] [BEAM-8470] Specify checkpointLocation at the pipeline start

[echauchot] [BEAM-8470] Clean unneeded 0 arg constructor in batch source

[echauchot] [BEAM-8470] Clean streaming source

[echauchot] [BEAM-8470] Continue impl of offsets for streaming source

[echauchot] [BEAM-8470] Deal with checkpoint and offset based read

[echauchot] [BEAM-8470] Apply spotless and fix spotbugs warnings

[echauchot] [BEAM-8470] Disable never ending test

[echauchot] [BEAM-8470] Fix access level issues, typos and modernize code to Java 8

[echauchot] [BEAM-8470] Merge Spark Structured Streaming runner into main Spark

[echauchot] [BEAM-8470] Fix non-vendored imports from Spark Streaming Runner classes

[echauchot] [BEAM-8470] Pass doFnSchemaInformation to ParDo batch translation

[echauchot] [BEAM-8470] Fix spotless issues after rebase

[echauchot] [BEAM-8470] Fix logging levels in Spark Structured Streaming translation

[echauchot] [BEAM-8470] Add SparkStructuredStreamingPipelineOptions and

[echauchot] [BEAM-8470] Rename SparkPipelineResult to

[echauchot] [BEAM-8470] Use PAssert in Spark Structured Streaming transform tests

[echauchot] [BEAM-8470] Ignore spark offsets (cf javadoc)

[echauchot] [BEAM-8470] implement source.stop

[echauchot] [BEAM-8470] Update javadoc

[echauchot] [BEAM-8470] Apply Spotless

[echauchot] [BEAM-8470] Enable batch Validates Runner tests for Structured Streaming

[echauchot] [BEAM-8470] Limit the number of partitions to make tests go 300% faster

[echauchot] [BEAM-8470] Fixes ParDo not calling setup and not tearing down if

[echauchot] [BEAM-8470] Pass transform based doFnSchemaInformation in ParDo

[echauchot] [BEAM-8470] Consider null object case on RowHelpers, fixes empty side

[echauchot] [BEAM-8470] Put back batch/simpleSourceTest.testBoundedSource

[echauchot] [BEAM-8470] Update windowAssignTest

[echauchot] [BEAM-8470] Add comment about checkpoint mark

[echauchot] [BEAM-8470] Re-code GroupByKeyTranslatorBatch to conserve windowing

[echauchot] [BEAM-8470] re-enable reduceFnRunner timers for output

[echauchot] [BEAM-8470] Improve visibility of debug messages

[echauchot] [BEAM-8470] Add a test that GBK preserves windowing

[echauchot] [BEAM-8470] Add TODO in Combine translations

[echauchot] [BEAM-8470] Update KVHelpers.extractKey() to deal with WindowedValue and

[echauchot] [BEAM-8470] Fix comment about schemas

[echauchot] [BEAM-8470] Implement reduce part of CombineGlobally translation with

[echauchot] [BEAM-8470] Output data after combine

[echauchot] [BEAM-8470] Implement merge accumulators part of CombineGlobally

[echauchot] [BEAM-8470] Fix encoder in combine call

[echauchot] [BEAM-8470] Revert extractKey while combinePerKey is not done (so that

[echauchot] [BEAM-8470] Apply a groupByKey avoids for some reason that the spark

[echauchot] [BEAM-8470] Fix case when a window does not merge into any other window

[echauchot] [BEAM-8470] Fix wrong encoder in combineGlobally GBK

[echauchot] [BEAM-8470] Fix bug in the window merging logic

[echauchot] [BEAM-8470] Remove the mapPartition that adds a key per partition

[echauchot] [BEAM-8470] Remove CombineGlobally translation because it is less

[echauchot] [BEAM-8470] Now that there is only Combine.PerKey translation, make only

[echauchot] [BEAM-8470] Clean no more needed KVHelpers

[echauchot] [BEAM-8470] Clean no more needed RowHelpers

[echauchot] [BEAM-8470] Clean no more needed WindowingHelpers

[echauchot] [BEAM-8470] Fix javadoc of AggregatorCombiner

[echauchot] [BEAM-8470] Fixed immutable list bug

[echauchot] [BEAM-8470] add comment in combine globally test

[echauchot] [BEAM-8470] Clean groupByKeyTest

[echauchot] [BEAM-8470] Add a test that combine per key preserves windowing

[echauchot] [BEAM-8470] Ignore for now not working test testCombineGlobally

[echauchot] [BEAM-8470] Add metrics support in DoFn

[echauchot] [BEAM-8470] Add missing dependencies to run Spark Structured Streaming

[echauchot] [BEAM-8470] Add setEnableSparkMetricSinks() method

[echauchot] [BEAM-8470] Fix javadoc

[echauchot] [BEAM-8470] Fix accumulators initialization in Combine that prevented

[echauchot] [BEAM-8470] Add a test to check that CombineGlobally preserves windowing

[echauchot] [BEAM-8470] Persist all output Dataset if there are multiple outputs in

[echauchot] [BEAM-8470] Added metrics sinks and tests

[echauchot] [BEAM-8470] Make spotless happy

[echauchot] [BEAM-8470] Add PipelineResults to Spark structured streaming.

[echauchot] [BEAM-8470] Update log4j configuration

[echauchot] [BEAM-8470] Add spark execution plans extended debug messages.

[echauchot] [BEAM-8470] Print number of leaf datasets

[echauchot] [BEAM-8470] fixup! Add PipelineResults to Spark structured streaming.

[echauchot] [BEAM-8470] Remove no more needed AggregatorCombinerPerKey (there is

[echauchot] [BEAM-8470] After testing performance and correctness, launch pipeline

[echauchot] [BEAM-8470] Improve Pardo translation performance: avoid calling a

[echauchot] [BEAM-8470] Use "sparkMaster" in local mode to obtain number of shuffle

[echauchot] [BEAM-8470] Wrap Beam Coders into Spark Encoders using

[echauchot] [BEAM-8470] Wrap Beam Coders into Spark Encoders using

[echauchot] [BEAM-8470] type erasure: spark encoders require a Class<T>, pass Object

[echauchot] [BEAM-8470] Fix scala Product in Encoders to avoid StackOverflow

[echauchot] [BEAM-8470] Conform to spark ExpressionEncoders: pass classTags,

[echauchot] [BEAM-8470] Add a simple spark native test to test Beam coders wrapping

[echauchot] [BEAM-8470] Fix code generation in Beam coder wrapper

[echauchot] [BEAM-8470] Lazy init coder because coder instance cannot be

[echauchot] [BEAM-8470] Fix warning in coder construction by reflection

[echauchot] [BEAM-8470] Fix ExpressionEncoder generated code: typos, try catch, fqcn

[echauchot] [BEAM-8470] Fix getting the output value in code generation

[echauchot] [BEAM-8470] Fix beam coder lazy init using reflection: use .class + try

[echauchot] [BEAM-8470] Remove lazy init of beam coder because there is no generic

[echauchot] [BEAM-8470] Remove example code

[echauchot] [BEAM-8470] Fix equal and hashcode

[echauchot] [BEAM-8470] Fix generated code: uniform exceptions catching, fix

[echauchot] [BEAM-8470] Add an assert of equality in the encoders test

[echauchot] [BEAM-8470] Apply spotless and checkstyle and add javadocs

[echauchot] [BEAM-8470] Wrap exceptions in UserCoderExceptions

[echauchot] [BEAM-8470] Put Encoders expressions serializable

[echauchot] [BEAM-8470] Catch Exception instead of IOException because some coders

[echauchot] [BEAM-8470] Apply new Encoders to CombinePerKey

[echauchot] [BEAM-8470] Apply new Encoders to Read source

[echauchot] [BEAM-8470] Improve performance of source: the mapper already calls

[echauchot] [BEAM-8470] Ignore long time failing test: SparkMetricsSinkTest

[echauchot] [BEAM-8470] Apply new Encoders to Window assign translation

[echauchot] [BEAM-8470] Apply new Encoders to AggregatorCombiner

[echauchot] [BEAM-8470] Create a Tuple2Coder to encode scala tuple2

[echauchot] [BEAM-8470] Apply new Encoders to GroupByKey

[echauchot] [BEAM-8470] Apply new Encoders to Pardo. Replace Tuple2Coder with

[echauchot] [BEAM-8470] Apply spotless, fix typo and javadoc

[echauchot] [BEAM-8470] Use beam encoders also in the output of the source

[echauchot] [BEAM-8470] Remove unneeded cast

[echauchot] [BEAM-8470] Fix: Remove generic hack of using object. Use actual Coder

[echauchot] [BEAM-8470] Remove Encoders based on kryo now that we call Beam coders

[echauchot] [BEAM-8470] Add a jenkins job for validates runner tests in the new

[echauchot] [BEAM-8470] Apply spotless

[echauchot] [BEAM-8470] Rebase on master: pass sideInputMapping in SimpleDoFnRunner

[echauchot] Fix SpotBugs

[echauchot] [BEAM-8470] simplify coders in combinePerKey translation

[echauchot] [BEAM-8470] Fix combiner. Do not reuse instance of accumulator

[echauchot] [BEAM-8470] input windows can arrive exploded (for sliding windows). As

[echauchot] [BEAM-8470] Add a combine test with sliding windows

[echauchot] [BEAM-8470] Add a test to test combine translation on binaryCombineFn

[echauchot] [BEAM-8470] Fix tests: use correct

[echauchot] [BEAM-8470] Fix wrong expected results in

[echauchot] [BEAM-8470] Add disclaimers about this runner being experimental

[echauchot] [BEAM-8470] Fix: create an empty accumulator in

[echauchot] [BEAM-8470] Apply spotless

[echauchot] [BEAM-8470] Add a countPerElement test with sliding windows

[echauchot] [BEAM-8470] Fix the output timestamps of combine: timestamps must be

[echauchot] [BEAM-8470] set log level to info to avoid resource consumption in

[echauchot] [BEAM-8470] Fix CombineTest.testCountPerElementWithSlidingWindows

[aromanenko.dev] [BEAM-8470] Remove "validatesStructuredStreamingRunnerBatch" from

[echauchot] [BEAM-8470] Fix timestamps in combine output: assign the timestamp to

[valentyn] Guard pickling operations with a lock to prevent race condition in

[iemejia] [website] Add Spark Structured Runner VR badge to the github template

[tvalentyn] [BEAM-8575] Add a Python test to test windowing in DoFn finish_bundle()

[github] [BEAM-3419] Flesh out iterable side inputs and key enumeration for

[github] common --> unique

[kcweaver] [BEAM-8795] fix Spark runner build


------------------------------------------
[...truncated 3.43 MB...]
Nov 21, 2019 11:21:24 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable as step s22
Nov 21, 2019 11:21:24 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map as step s23
Nov 21, 2019 11:21:24 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging pipeline description to gs://temp-storage-for-release-validation-tests/nightly-snapshot-validation/tmp/staging/
Nov 21, 2019 11:21:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <66886 bytes, hash 7pCFPTgzBa8Gt0qfLs84Sw> to gs://temp-storage-for-release-validation-tests/nightly-snapshot-validation/tmp/staging/pipeline-7pCFPTgzBa8Gt0qfLs84Sw.pb
Nov 21, 2019 11:21:25 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.18.0-SNAPSHOT
Nov 21, 2019 11:21:25 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$DefaultGcpRegionFactory create
WARNING: Region will default to us-central1. Future releases of Beam will require the user to set the region explicitly. https://cloud.google.com/compute/docs/regions-zones/regions-zones
Nov 21, 2019 11:21:27 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-21_03_21_25-10980024877579191298?project=apache-beam-testing
Nov 21, 2019 11:21:27 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2019-11-21_03_21_25-10980024877579191298
Nov 21, 2019 11:21:27 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2019-11-21_03_21_25-10980024877579191298
Nov 21, 2019 11:21:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:25.894Z: Autoscaling is enabled for job 2019-11-21_03_21_25-10980024877579191298. The number of workers will be between 1 and 1000.
Nov 21, 2019 11:21:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:25.895Z: Autoscaling was automatically enabled for job 2019-11-21_03_21_25-10980024877579191298.
Nov 21, 2019 11:21:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:30.416Z: Checking permissions granted to controller Service Account.
Nov 21, 2019 11:21:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:35.475Z: Worker configuration: n1-standard-1 in us-central1-a.
Nov 21, 2019 11:21:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:36.145Z: Expanding CoGroupByKey operations into optimizable parts.
Nov 21, 2019 11:21:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:36.305Z: Expanding GroupByKey operations into optimizable parts.
Nov 21, 2019 11:21:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:36.340Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Nov 21, 2019 11:21:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:36.543Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Nov 21, 2019 11:21:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:36.582Z: Fusing consumer WriteCounts/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous) into WriteCounts/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource)
Nov 21, 2019 11:21:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:36.617Z: Fusing consumer WriteCounts/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map into WriteCounts/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)
Nov 21, 2019 11:21:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:36.653Z: Fusing consumer WriteCounts/WriteFiles/FinalizeTempFileBundles/Finalize into WriteCounts/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map
Nov 21, 2019 11:21:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:36.678Z: Fusing consumer WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key into WriteCounts/WriteFiles/FinalizeTempFileBundles/Finalize
Nov 21, 2019 11:21:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:36.707Z: Fusing consumer WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign into WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key
Nov 21, 2019 11:21:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:36.745Z: Fusing consumer WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify into WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign
Nov 21, 2019 11:21:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:36.782Z: Fusing consumer WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write into WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify
Nov 21, 2019 11:21:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:36.806Z: Fusing consumer WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow into WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read
Nov 21, 2019 11:21:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:36.828Z: Fusing consumer WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable into WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
Nov 21, 2019 11:21:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:36.859Z: Fusing consumer WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map into WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable
Nov 21, 2019 11:21:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:36.897Z: Unzipping flatten s12 for input s11.org.apache.beam.sdk.values.PCollection.<init>:400#20ff67585e33a8f6
Nov 21, 2019 11:21:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:36.926Z: Fusing unzipped copy of WriteCounts/WriteFiles/GatherTempFileResults/View.AsList/ParDo(ToIsmRecordForGlobalWindow), through flatten WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/Flatten.PCollections, into producer WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum
Nov 21, 2019 11:21:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:36.952Z: Fusing consumer WriteCounts/WriteFiles/GatherTempFileResults/View.AsList/ParDo(ToIsmRecordForGlobalWindow) into WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles
Nov 21, 2019 11:21:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:36.992Z: Fusing consumer WordCount.CountWords/ParDo(ExtractWords) into ReadLines/Read
Nov 21, 2019 11:21:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:37.027Z: Fusing consumer WordCount.CountWords/Count.PerElement/Init/Map into WordCount.CountWords/ParDo(ExtractWords)
Nov 21, 2019 11:21:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:37.064Z: Fusing consumer WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Partial into WordCount.CountWords/Count.PerElement/Init/Map
Nov 21, 2019 11:21:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:37.099Z: Fusing consumer WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Reify into WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Partial
Nov 21, 2019 11:21:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:37.133Z: Fusing consumer WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Write into WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Reify
Nov 21, 2019 11:21:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:37.170Z: Fusing consumer WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues into WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Read
Nov 21, 2019 11:21:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:37.204Z: Fusing consumer WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Extract into WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues
Nov 21, 2019 11:21:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:37.239Z: Fusing consumer MapElements/Map into WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Extract
Nov 21, 2019 11:21:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:37.277Z: Fusing consumer WriteCounts/WriteFiles/RewindowIntoGlobal/Window.Assign into MapElements/Map
Nov 21, 2019 11:21:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:37.312Z: Fusing consumer WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles into WriteCounts/WriteFiles/RewindowIntoGlobal/Window.Assign
Nov 21, 2019 11:21:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:37.347Z: Fusing consumer WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify into WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles
Nov 21, 2019 11:21:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:37.380Z: Fusing consumer WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Write into WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify
Nov 21, 2019 11:21:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:37.416Z: Fusing consumer WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/GroupByWindow into WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Read
Nov 21, 2019 11:21:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:37.450Z: Fusing consumer WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten into WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/GroupByWindow
Nov 21, 2019 11:21:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:37.479Z: Fusing consumer WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum into WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten
Nov 21, 2019 11:21:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:37.861Z: Executing operation WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Create
Nov 21, 2019 11:21:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:37.893Z: Executing operation WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Create
Nov 21, 2019 11:21:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:37.932Z: Executing operation WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Create
Nov 21, 2019 11:21:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:37.943Z: Starting 1 workers in us-central1-a...
Nov 21, 2019 11:21:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:37.982Z: Finished operation WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Create
Nov 21, 2019 11:21:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:38.003Z: Finished operation WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Create
Nov 21, 2019 11:21:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:38.003Z: Finished operation WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Create
Nov 21, 2019 11:21:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:21:38.189Z: Executing operation ReadLines/Read+WordCount.CountWords/ParDo(ExtractWords)+WordCount.CountWords/Count.PerElement/Init/Map+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Partial+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Reify+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Write

> Task :runners:direct-java:runQuickstartJavaDirect
Nov 21, 2019 11:21:58 AM org.apache.beam.sdk.io.FileBasedSource getEstimatedSizeBytes
INFO: Filepattern pom.xml matched 1 files with total size 17698
Nov 21, 2019 11:21:58 AM org.apache.beam.sdk.io.FileBasedSource split
INFO: Splitting filepattern pom.xml into bundles of size 1106 took 1 ms and produced 1 files and 16 bundles
Nov 21, 2019 11:22:02 AM org.apache.beam.sdk.io.WriteFiles$WriteShardsIntoTempFilesFn processElement
INFO: Opening writer ae590f8b-7eb1-44ee-a78e-56061ea9590b for window org.apache.beam.sdk.transforms.windowing.GlobalWindow@241542f0 pane PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0} destination null
Nov 21, 2019 11:22:02 AM org.apache.beam.sdk.io.WriteFiles$WriteShardsIntoTempFilesFn processElement
INFO: Opening writer d1600162-1e05-4dd2-b182-54eafed32a1d for window org.apache.beam.sdk.transforms.windowing.GlobalWindow@241542f0 pane PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0} destination null
Nov 21, 2019 11:22:02 AM org.apache.beam.sdk.io.WriteFiles$WriteShardsIntoTempFilesFn processElement
INFO: Opening writer 7520eedb-8723-437f-8b86-d5518b31e43e for window org.apache.beam.sdk.transforms.windowing.GlobalWindow@241542f0 pane PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0} destination null
Nov 21, 2019 11:22:02 AM org.apache.beam.sdk.io.WriteFiles$WriteShardsIntoTempFilesFn processElement
INFO: Opening writer efd35007-cc2f-46b5-9087-239c84d1e1f8 for window org.apache.beam.sdk.transforms.windowing.GlobalWindow@241542f0 pane PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0} destination null
Nov 21, 2019 11:22:03 AM org.apache.beam.sdk.io.FileBasedSink$Writer close
INFO: Successfully wrote temporary file /tmp/groovy-generated-5215725800202059282-tmpdir/word-count-beam/.temp-beam-2dccae89-964d-45c8-af51-6e3da810759c/7520eedb-8723-437f-8b86-d5518b31e43e
Nov 21, 2019 11:22:03 AM org.apache.beam.sdk.io.FileBasedSink$Writer close
INFO: Successfully wrote temporary file /tmp/groovy-generated-5215725800202059282-tmpdir/word-count-beam/.temp-beam-2dccae89-964d-45c8-af51-6e3da810759c/d1600162-1e05-4dd2-b182-54eafed32a1d
Nov 21, 2019 11:22:03 AM org.apache.beam.sdk.io.FileBasedSink$Writer close
INFO: Successfully wrote temporary file /tmp/groovy-generated-5215725800202059282-tmpdir/word-count-beam/.temp-beam-2dccae89-964d-45c8-af51-6e3da810759c/efd35007-cc2f-46b5-9087-239c84d1e1f8
Nov 21, 2019 11:22:03 AM org.apache.beam.sdk.io.FileBasedSink$Writer close
INFO: Successfully wrote temporary file /tmp/groovy-generated-5215725800202059282-tmpdir/word-count-beam/.temp-beam-2dccae89-964d-45c8-af51-6e3da810759c/ae590f8b-7eb1-44ee-a78e-56061ea9590b
Nov 21, 2019 11:22:03 AM org.apache.beam.sdk.io.WriteFiles$FinalizeTempFileBundles$FinalizeFn process
INFO: Finalizing 4 file results
Nov 21, 2019 11:22:03 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation createMissingEmptyShards
INFO: Finalizing for destination null num shards 4.
Nov 21, 2019 11:22:03 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-5215725800202059282-tmpdir/word-count-beam/.temp-beam-2dccae89-964d-45c8-af51-6e3da810759c/d1600162-1e05-4dd2-b182-54eafed32a1d, shard=3, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@241542f0, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-5215725800202059282-tmpdir/word-count-beam/counts-00003-of-00004
Nov 21, 2019 11:22:03 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-5215725800202059282-tmpdir/word-count-beam/.temp-beam-2dccae89-964d-45c8-af51-6e3da810759c/efd35007-cc2f-46b5-9087-239c84d1e1f8, shard=1, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@241542f0, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-5215725800202059282-tmpdir/word-count-beam/counts-00001-of-00004
Nov 21, 2019 11:22:03 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-5215725800202059282-tmpdir/word-count-beam/.temp-beam-2dccae89-964d-45c8-af51-6e3da810759c/7520eedb-8723-437f-8b86-d5518b31e43e, shard=0, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@241542f0, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-5215725800202059282-tmpdir/word-count-beam/counts-00000-of-00004
Nov 21, 2019 11:22:03 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-5215725800202059282-tmpdir/word-count-beam/.temp-beam-2dccae89-964d-45c8-af51-6e3da810759c/ae590f8b-7eb1-44ee-a78e-56061ea9590b, shard=2, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@241542f0, paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0}} to final location /tmp/groovy-generated-5215725800202059282-tmpdir/word-count-beam/counts-00002-of-00004
Nov 21, 2019 11:22:03 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-5215725800202059282-tmpdir/word-count-beam/.temp-beam-2dccae89-964d-45c8-af51-6e3da810759c/efd35007-cc2f-46b5-9087-239c84d1e1f8
Nov 21, 2019 11:22:03 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-5215725800202059282-tmpdir/word-count-beam/.temp-beam-2dccae89-964d-45c8-af51-6e3da810759c/7520eedb-8723-437f-8b86-d5518b31e43e
Nov 21, 2019 11:22:03 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-5215725800202059282-tmpdir/word-count-beam/.temp-beam-2dccae89-964d-45c8-af51-6e3da810759c/d1600162-1e05-4dd2-b182-54eafed32a1d
Nov 21, 2019 11:22:03 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-5215725800202059282-tmpdir/word-count-beam/.temp-beam-2dccae89-964d-45c8-af51-6e3da810759c/ae590f8b-7eb1-44ee-a78e-56061ea9590b
Nov 21, 2019 11:22:03 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
WARNING: Failed to match temporary files under: [/tmp/groovy-generated-5215725800202059282-tmpdir/word-count-beam/.temp-beam-2dccae89-964d-45c8-af51-6e3da810759c/].
grep Foundation counts*
counts-00000-of-00004:Foundation: 1
Verified Foundation: 1
[SUCCESS]

> Task :runners:google-cloud-dataflow-java:runQuickstartJavaDataflow
Nov 21, 2019 11:22:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2019-11-21T11:22:02.943Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Nov 21, 2019 11:22:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:22:09.503Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
Nov 21, 2019 11:22:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:22:25.374Z: Workers have started successfully.
Nov 21, 2019 11:22:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:22:25.396Z: Workers have started successfully.
Nov 21, 2019 11:24:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:24:03.978Z: Finished operation ReadLines/Read+WordCount.CountWords/ParDo(ExtractWords)+WordCount.CountWords/Count.PerElement/Init/Map+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Partial+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Reify+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Write
Nov 21, 2019 11:24:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:24:04.091Z: Executing operation WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Close
Nov 21, 2019 11:24:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:24:04.139Z: Finished operation WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Close
Nov 21, 2019 11:24:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:24:04.202Z: Executing operation WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Read+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Extract+MapElements/Map+WriteCounts/WriteFiles/RewindowIntoGlobal/Window.Assign+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles+WriteCounts/WriteFiles/GatherTempFileResults/View.AsList/ParDo(ToIsmRecordForGlobalWindow)+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Write
Nov 21, 2019 11:24:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:24:15.277Z: Finished operation WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Read+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Extract+MapElements/Map+WriteCounts/WriteFiles/RewindowIntoGlobal/Window.Assign+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles+WriteCounts/WriteFiles/GatherTempFileResults/View.AsList/ParDo(ToIsmRecordForGlobalWindow)+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Write
Nov 21, 2019 11:24:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:24:15.400Z: Executing operation WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Close
Nov 21, 2019 11:24:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:24:15.456Z: Finished operation WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Close
Nov 21, 2019 11:24:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:24:15.532Z: Executing operation WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Read+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/GroupByWindow+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum+WriteCounts/WriteFiles/GatherTempFileResults/View.AsList/ParDo(ToIsmRecordForGlobalWindow)
Nov 21, 2019 11:24:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:24:18.282Z: Finished operation WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Read+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/GroupByWindow+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum+WriteCounts/WriteFiles/GatherTempFileResults/View.AsList/ParDo(ToIsmRecordForGlobalWindow)
Nov 21, 2019 11:24:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:24:18.437Z: Executing operation WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/Flatten.PCollections/Unzipped-1
Nov 21, 2019 11:24:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:24:18.539Z: Finished operation WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/Flatten.PCollections/Unzipped-1
Nov 21, 2019 11:24:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:24:18.675Z: Executing operation WriteCounts/WriteFiles/GatherTempFileResults/View.AsList/CreateDataflowView
Nov 21, 2019 11:24:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:24:18.722Z: Finished operation WriteCounts/WriteFiles/GatherTempFileResults/View.AsList/CreateDataflowView
Nov 21, 2019 11:24:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:24:18.857Z: Executing operation WriteCounts/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource)+WriteCounts/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)+WriteCounts/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map+WriteCounts/WriteFiles/FinalizeTempFileBundles/Finalize+WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key+WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign+WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify+WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write
Nov 21, 2019 11:24:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:24:23.524Z: Finished operation WriteCounts/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource)+WriteCounts/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)+WriteCounts/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map+WriteCounts/WriteFiles/FinalizeTempFileBundles/Finalize+WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key+WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign+WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify+WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write
Nov 21, 2019 11:24:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:24:23.607Z: Executing operation WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Close
Nov 21, 2019 11:24:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:24:23.653Z: Finished operation WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Close
Nov 21, 2019 11:24:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:24:23.706Z: Executing operation WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read+WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow+WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable+WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map
Nov 21, 2019 11:24:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:24:26.850Z: Finished operation WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read+WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow+WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable+WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map
Nov 21, 2019 11:24:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:24:27.032Z: Cleaning up.
Nov 21, 2019 11:24:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:24:27.101Z: Stopping worker pool...
Nov 21, 2019 11:27:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:27:16.367Z: Autoscaling: Resized worker pool from 1 to 0.
Nov 21, 2019 11:27:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-21T11:27:16.399Z: Worker pool stopped.
Nov 21, 2019 11:27:22 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2019-11-21_03_21_25-10980024877579191298 finished with status DONE.
gsutil cat gs://temp-storage-for-release-validation-tests/nightly-snapshot-validation/count* | grep Montague:
Montague: 47
Verified Montague: 47
gsutil rm gs://temp-storage-for-release-validation-tests/nightly-snapshot-validation/count*
Removing gs://temp-storage-for-release-validation-tests/nightly-snapshot-validation/counts-00000-of-00003...
/ [1 objects]                                                                   
Removing gs://temp-storage-for-release-validation-tests/nightly-snapshot-validation/counts-00001-of-00003...
/ [2 objects]                                                                   
Removing gs://temp-storage-for-release-validation-tests/nightly-snapshot-validation/counts-00002-of-00003...
/ [3 objects]                                                                   
Operation completed over 3 objects.                                              
[SUCCESS]

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:runMobileGamingJavaDataflow'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
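
As a minimal sketch only (assuming a local checkout of the Beam repository and its Gradle wrapper), the failing task named above can be re-run with the suggested flags to capture the stack trace and verbose output:

# re-run only the failing task with a stack trace and verbose logging (sketch; assumes the Beam repo root)
./gradlew :runners:google-cloud-dataflow-java:runMobileGamingJavaDataflow --stacktrace --info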

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 27m 8s
8 actionable tasks: 7 executed, 1 from cache

Publishing build scan...
https://gradle.com/s/nehmctsthclme

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostRelease_NightlySnapshot #807

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostRelease_NightlySnapshot/807/display/redirect?page=changes>

