Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2020/01/15 12:33:56 UTC

Build failed in Jenkins: beam_LoadTests_Java_Combine_Portable_Flink_Streaming #2

See <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Streaming/2/display/redirect?page=changes>

Changes:

[mzobii.baig] Beam-2535 : Pursue pull request 4700 with manual apply changes from

[mzobii.baig] Beam-2535 : Pursue pull request 4700 with manual apply changes from

[mzobii.baig] Beam-2535 : Pursue pull request 4700 with manual apply changes from

[mzobii.baig] Beam-2535 : Replace timeStamp with outputTimeStamp

[mzobii.baig] Beam-2535 : Apply Spotless

[mzobii.baig] Beam-2535 : Pass outputTimestamp param in onTimer method

[mzobii.baig] Beam-2535 : Minor changed

[rehman.muradali] [BEAM-2535] : Add Commit State in ParDoEvaluator

[rehman.muradali] [BEAM-2535] : Add outputTimestamp in compare method, Revert

[mzobii.baig] Beam-2535 : Modifying default minimum target and GC time

[rehman.muradali] BEAM-2535 : Removal of extra lines

[mzobii.baig] Beam-2535 : Proposed changes

[mzobii.baig] Beam-2535 : Added original PR watermark hold functionality.

[rehman.muradali] [BEAM-2535] Apply Spotless

[mzobii.baig] [Beam-2535] Variable renaming and added output timestamp in

[mzobii.baig] Beam-2535 : Apply Spotless

[mzobii.baig] [Beam-2535] Modify test case

[mzobii.baig] [Beam-2535] Added comments

[mzobii.baig] [Beam-2535] Apply Spotless

[mzobii.baig] [Beam-2535] Set Processing Time with outputTimestamp

[mzobii.baig] [Beam-2535] Minor renaming

[rehman.muradali] [BEAM-2535] Revert Processing Time, Addition of OutputTimestamp

[rehman.muradali] [BEAM-2535] Revert TimerReceiver outputTimestamp

[kirillkozlov] Modify AggregateProjectMergeRule to have a condition

[ehudm] [BEAM-8269] Convert from_callable type hints to Beam types

[kirillkozlov] SpotlesApply

[kirillkozlov] Test for a query with a predicate

[kirillkozlov] A list of visited nodes should be unique per onMatch invocation

[rehman.muradali] [BEAM-2535] Revert TimerReceiver outputTimestamp

[kirillkozlov] Make sure all nodes are explored

[dcavazos] [BEAM-7390] Add code snippet for Min

[dcavazos] [BEAM-7390] Add code snippet for Max

[rehman.muradali] [BEAM-2535] Making OnTimer compatible

[rehman.muradali] [BEAM-2535] Making OnTimer compatible

[kirillkozlov] Add a new Jenkins job for SQL perf tests

[kirillkozlov] Test boilerplate

[rehman.muradali] Adding OutputTimestamp in Timer Object

[rehman.muradali] Apply Spotless and checkstyle

[kirillkozlov] Table proxy to add TimeMonitor after the IO

[kirillkozlov] Tests for direct_read w/o push-down and default methods

[mzobii.baig] [Beam-2535] Added watermark functionality for the dataflow runner

[kenn] Use more informative assertions in some py tests

[mzobii.baig] [Beam-2535] Used boolean instead boxed type

[kirillkozlov] Cleanup

[dcavazos] [BEAM-7390] Add code snippet for Sum

[mzobii.baig] [Beam-2535] Modify required watermark hold functionality

[kirillkozlov] Monitor total number of fields read from an IO

[ehudm] Fix _get_args for typing.Tuple in Py3.5.2

[kcweaver] Add FlinkMiniClusterEntryPoint for testing the uber jar submission

[kcweaver] [BEAM-8512] Add integration tests for flink_runner.py.

[kcweaver] Build mini cluster jar excluding unwanted classes.

[kcweaver] Rename to testFlinkUberJarPyX.Y

[kcweaver] Increase timeout on beam_PostCommit_PortableJar_Flink.

[kamil.wasilewski] [BEAM-1440] Provide functions for waiting for BQ job and exporting

[kamil.wasilewski] [BEAM-1440] Create _BigQuerySource that implements iobase.BoundedSource

[kamil.wasilewski] [BEAM-1440] Reorganised BigQuery read IT tests

[kamil.wasilewski] [BEAM-1440] Create postCommitIT jobs running on Flink Runner

[kamil.wasilewski] [BEAM-1440] Convert strings to bytes on Python 3 if field type is BYTES

[kamil.wasilewski] [BEAM-1440]: Support RECORD fields in coder

[kamil.wasilewski] [BEAM-1440] Remove json files after reading

[kamil.wasilewski] [BEAM-1440] Marked classes as private

[kamil.wasilewski] [BEAM-1440] Do not force to create temp dataset when using dry run

[echauchot] [BEAM-5192] Migrate ElasticsearchIO to v7

[echauchot] [BEAM-5192] Minor change of ESIO public configuration API:

[robinyqiu] BeamZetaSqlCalcRel prototype

[valentyn] Install SDK after tarball is generated to avoid a race in proto stubs

[kamil.wasilewski] [BEAM-8671] Add Python 3.7 support for LoadTestBuilder

[kamil.wasilewski] [BEAM-8671] Add ParDo test running on Python 3.7

[ehudm] Fix cleanPython race with :clean

[robinyqiu] Fix bug in SingleRowScanConverter

[robinyqiu] Use BeamBigQuerySqlDialect

[boyuanz] [BEAM-8536] Migrate using requested_execution_time to

[pabloem] Initialize logging configuration in Pipeline object

[daniel.o.programmer] [BEAM-7970] Touch-up on Go protobuf generation instructions.

[kamil.wasilewski] [BEAM-8979] Remove mypy-protobuf dependency

[echauchot] [BEAM-5192] Fix missing ifs for ES7 specificities.

[echauchot] [BEAM-5192] Remove unneeded transitive dependencies, upgrade ES and

[echauchot] [BEAM-5192] Disable MockHttpTransport plugin to enabe http dialog to

[mikhail] Update release docs

[relax] Merge pull request #10311: [BEAM-8810] Detect stuck commits in

[kcweaver] Import freezegun for Python time testing.

[kcweaver] Allow message stream to yield duplicates.

[mikhail] Blogpost stub

[kcweaver] [BEAM-8891] Create and submit Spark portable jar in Python.

[kcweaver] [BEAM-8296] containerize spark job server

[robinyqiu] Address comments

[github] [GoSDK] Make data channel splits idempotent (#10406)

[pabloem] Initialize logging configuration in PipelineOptions object.

[rehman.muradali] EarliestTimestamp Fix for outputTimestamp

[lukasz.gajowy] [BEAM-5495] Make PipelineResourcesDetectorAbstractFactory an inner

[lukasz.gajowy] [BEAM-5495] Change detect() return type to List

[lukasz.gajowy] [BEAM-5495] Minor docs and test fixes

[mxm] [BEAM-8959] Invert metrics flag in Flink Runner

[lukasz.gajowy] [BEAM-5495] Re-add test verifying order of resources detection

[echauchot] [BEAM-5192] Fix util class, elasticsearch changed their json output of

[mikhail] Add blogpost file

[tysonjh] CachingShuffleBatchReader use bytes to limit size.

[mikhail] Add blogpost highlights

[github] Update release guide for cherry picks (#10399)

[heejong] [BEAM-8902] parameterize input type of Java external transform

[lukasz.gajowy] [BEAM-5495] Prevent nested jar scanning (jarfiles in jarfiles)

[ehudm] Dicts are not valid DoFn.process return values

[github] Update release notes version to correct one.

[valentyn] Sickbay VR tests that don't pass

[chamikara] Makes environment ID a top level attribute of PTransform.

[angoenka] [BEAM-8944] Change to use single thread in py sdk bundle progress report

[aaltay] [BEAM-8335] Background caching job (#10405)

[ehudm] Light cleanup of opcodes.py

[chamikara] Setting environment ID for ParDo and Combine transforms

[pawel.pasterz] [BEAM-8978] Publish table size of data written during HadoopFormatIOIT

[echauchot] [BEAM-5192] Set a custom json serializer for document metadata to be

[echauchot] [BEAM-5192] Remove testWritePartialUpdateWithErrors because triggering

[sunjincheng121] [BEAM-7949] Add time-based cache threshold support in the data service

[mxm] [BEAM-8996] Auto-generate pipeline options documentation for FlinkRunner

[mxm] Regenerate Flink options table with the latest master

[mxm] [BEAM-8996] Improvements to the Flink runner page

[ehudm] Upgrade parameterized version to 0.7.0+

[kawaigin] [BEAM-8977] Resolve test flakiness

[dpcollins] Modify PubsubClient to use the proto message throughout.

[lukecwik] [BEAM-9004] Migrate org.mockito.Matchers#anyString to

[suztomo] GenericJsonAssert

[suztomo] Refactoring with assertEqualsAsJson

[chamikara] Fixes Go formatting.

[robertwb] [BEAM-8335] Add a TestStreamService Python Implementation (#10120)

[lukecwik] Minor cleanup of tests using TestStream. (#10188)

[pabloem] [BEAM-2572] Python SDK S3 Filesystem (#9955)

[github] [BEAM-8974] Wait for log messages to be processed before checking them.

[sunjincheng121] [BEAM-7949] Introduce PeriodicThread for time-based cache threshold

[github] Merge pull request #10356: [BEAM-7274] Infer a Beam Schema from a

[echauchot] [BEAM-9019] Improve Encoders: replace as much as possible of catalyst

[lukecwik] [BEAM-8623] Add status_endpoint field to provision api ProvisionInfo

[github] [BEAM-8999] Respect timestamp combiners in PGBKCVOperation. (#10425)

[github] Update dataflow container images to beam-master-20191220 (#10448)

[zyichi] [BEAM-8824] Add support to allow specify window allowed_lateness in

[chamikara] Adds documentation for environment_id fields.

[lukecwik] [BEAM-8846] Update documentation about stream observers and factories,

[apilloud] [BEAM-9023] Upgrade to ZetaSQL 2019.12.1

[lukecwik] [BEAM-7951] Allow runner to configure customization WindowedValue coder.

[chamikara] Sets missing environmentId in several locations.

[bhulette] [BEAM-8988] RangeTracker for _CustomBigQuerySource (#10412)

[ehudm] Set TMPDIR for tox environments

[robinyqiu] Address comments

[robinyqiu] Address comments

[kcweaver] Refactor shared uber jar generation code into common subclass.

[sunjincheng121] [BEAM-8935] Fail fast if sdk harness startup failed.

[echauchot] [BEAM-5192] use <= and >= in version specific code instead of == to be

[relax] Merge pull request #10444: [BEAM-9010] Proper TableRow size calculation

[github] Merge pull request #10449: [BEAM-7274] Implement the Protobuf schema

[sunjincheng121] [BEAM-9030] Bump grpc to 1.26.0

[github] Python example parameters fix

[bhulette] [BEAM-9026] Clean up RuntimeValueProvider.runtime_options (#10457)

[kirillkozlov] Fix BytesValue unparsing

[kirillkozlov] Fix floating point literals

[kirillkozlov] Fix string literals

[kirillkozlov] Add null check for SqlTypeFamily

[kirillkozlov] ZetaSqlCalcRule should be disaled by defualt

[kirillkozlov] spotles

[ehudm] [BEAM-9025] Update Dataflow Java container worker

[heejong] [BEAM-9034] Update environment_id for ExternalTransform in Python SDK

[sunjincheng121] [BEAM-9030] Update the dependencies to make sure the dependency linkage

[mxm] [BEAM-8962] Report Flink metric accumulator only when pipeline ends

[github] Revert "[BEAM-8932]  Modify PubsubClient to use the proto message

[ehudm] junitxml_report: Add failure tag support

[github] Catch __module__ is None.

[relax] Merge pull request #10422: [BEAM-2535] TimerData signature update

[rehman.muradali] Rebase TimerData PR

[udim] [BEAM-9012] Change __init__ hints so they work with pytype (#10466)

[github] [BEAM-9039] Fix race on reading channel readErr. (#10456)

[lcwik] [BEAM-5605] Increase precision of fraction used during splitting.

[github] [BEAM-8487] Convert forward references to Any (#9888)

[lukecwik] [BEAM-9020] LengthPrefixUnknownCodersTest to avoid relying on

[lukecwik] [BEAM-7951] Improve the docs for beam_runner_api.proto and

[sunjincheng121] [BEAM-9006] Improve ProcessManager for shutdown hook handling.

[kamil.wasilewski] [BEAM-8671] Fix Python 3.7 ParDo test job name

[github] [BEAM-5600] Add unimplemented split API to Runner side SDF libraries.

[github] [BEAM-5605] Fix type used to describe channel splits to match type used

[github] [BEAM-5605] Ensure that split calls are routed to the active bundle

[suztomo] protobuf 3.11.1

[jeff] BEAM-8745 More fine-grained controls for the size of a BigQuery Load job

[kcweaver] Make Spark REST URL a separate pipeline option.

[kirillkozlov] Address comments

[aaltay] [BEAM-8335] On Unbounded Source change (#10442)

[aaltay] [BEAM-9013] TestStream fix for DataflowRunner (#10445)

[angoenka] [BEAM-8575] Refactor test_do_fn_with_windowing_in_finish_bundle to work

[sunjincheng121] [BEAM-9055] Unify the config names of Fn Data API across languages.

[rehman.muradali] onTimer/setTimer signature updates

[sunjincheng121] fixup

[davidsabater] [BEAM-9053] Improve error message when unable to get the correct

[mxm] [BEAM-8577] Initialize FileSystems during Coder deserialization in

[github] Update _posts_2019-12-16-beam-2.17.0.md

[github] Cleanup formatting.

[suztomo] google_auth_version 0.19.0

[github] Update release date.

[lcwik] [BEAM-9059] Migrate PTransformTranslation to use string constants

[iemejia] [BEAM-5546] Update commons-codec to version 1.14

[iemejia] [BEAM-8701] Remove unused commons-io_1x dependency

[iemejia] [BEAM-8701] Update commons-io to version 2.6

[iemejia] [BEAM-5544] Update cassandra-all dependency to version 3.11.5

[iemejia] [BEAM-8749] Update cassandra-driver-mapping to version 3.8.0

[mxm] Rename FlinkClassloading to Workarounds

[mxm] [BEAM-9060] Restore stdout/stderr in case Flink's

[echauchot] Fix link in javadoc to accumulators

[github] Restrict the upper bound for pyhamcrest, since new version does not work

[apilloud] [BEAM-9027] [SQL] Fix ZetaSQL Byte Literals

[github] [BEAM-9058] Fix line-too-long exclusion regex and re-enable

[altay] Readability/Lint fixes

[hannahjiang] BEAM-8780 reuse RC images instead of recreate images

[iemejia] [BEAM-8716] Update commons-csv to version 1.7

[iemejia] [BEAM-9041] Add missing equals methods for GenericRecord <-> Row

[iemejia] [BEAM-9042] Fix RowToGenericRecordFn Avro schema serialization

[iemejia] [BEAM-9042] Update SchemaCoder doc with info about functions requiring

[iemejia] [BEAM-9042] Test serializability and equality of Row<->GenericRecord

[tvalentyn] [BEAM-9062] Improve assertion error for equal_to (#10504)

[iemejia] [BEAM-8717] Update commons-lang3 to version 3.9

[iemejia] [BEAM-8717] Make non core modules use only the repackaged commons-lang3

[chamikara] [BEAM-8960]: Add an option for user to opt out of using insert id for

[ehudm] Small fixes to verify_release_build.sh

[kirillkozlov] Metric name should not be constant

[36090911+boyuanzz] [BEAM-8932] [BEAM-9036] Revert reverted commit to use PubsubMessage as

[sunjincheng121] fixup

[github] Update ParDoTest.java

[rehman.muradali] Apply spotless

[rehman.muradali] Compilation Fix PardoTest

[rehman.muradali] Reverting outputTimestamp validation

[rehman.muradali] CheckStyle Fix

[rehman.muradali] Adding Category to exclude Flink Runner

[jkai] [BEAM-8496] remove SDF translators in flink streaming transform

[github] Fix blogpost typo (#10532)

[kcweaver] [BEAM-9070] tests use absolute paths for job server jars

[12602502+Ardagan] Fix headings in downloads.md

[github] Add # pytype: skip-file before first import statement in each py file

[apilloud] [BEAM-9027] Unparse DOY/DOW/WEEK Enums properly for ZetaSQL

[33895511+aromanenko-dev] [BEAM-8953] Extend ParquetIO read builders for AvroParquetReader

[brad.g.west] [BEAM-9078] Pass total_size to storage.Upload

[hannahjiang] BEAM-7861 add direct_running_mode option

[github] [BEAM-9075] Disable JoinCommuteRule for ZetaSQL planner (#10542)

[bhulette] [BEAM-9075] add a test case. (#10545)

[12602502+Ardagan] [BEAM-8821] Document Python SDK 2.17.0 deps (#10212)

[kirillkozlov] Missing commit

[hannahjiang] [BEAM-7861] rephrase direct_running_mode option checking

[kcweaver] [BEAM-8337] Hard-code Flink versions.

[echauchot] [BEAM-9019] Remove BeamCoderWrapper to avoid extra object allocation and

[lukecwik] [BEAM-8624] Implement Worker Status FnService in Dataflow runner

[github] [BEAM-5605] Add support for executing pair with restriction, split

[kcweaver] fix indentation

[kcweaver] Update release guide

[lostluck] [BEAM-9080] Support KVs in the Go SDK's Partition

[github] Rephrasing lull logging to avoid alarming users (#10446)

[robertwb] [BEAM-8575] Added counter tests for CombineFn (#10190)

[github] [BEAM-8490] Fix instance_to_type for empty containers (#9894)

[apilloud] [BEAM-8630] Use column numbers for BeamZetaSqlCalRel

[apilloud] [BEAM-9027] Backport BigQuerySqlDialect fixes

[robertwb] [BEAM-8575] Test hot-key fanout with accumulation modes. (#10159)

[github] [BEAM-9059] Use string constants in PTransformTranslation instead of

[iemejia] [BEAM-8956] Begin unifying contributor instructions into a single

[pawel.pasterz] [BEAM-7115] Fix metrics being incorrectly gathered

[mxm] Remove incorrectly tagged test annotation from test case

[mxm] [BEAM-6008] Propagate errors during pipeline execution in Java's

[github] Tighten language and remove distracting link

[pabloem] [BEAM-7390] Add code snippet for Top (#10179)

[bhulette] [BEAM-8993] [SQL] MongoDB predicate push down. (#10417)

[lukecwik] [BEAM-8740] Remove unused dependency from Spark runner (#10564)

[robertwb] [BEAM-6587] Remove hacks due to missing common string coder.

[kirillkozlov] Update data source for SQL performance tests

[github] [BEAM-5605] Add support for channel splitting to the gRPC read "source"

[github] [BEAM-5605] Add support for additional parameters to SplittableDofn

[chadrik] [BEAM-7746] Address changes in code since annotations were introduced

[chadrik] [BEAM-7746]  Typing fixes that require runtime code changes

[chadrik] [BEAM-7746] Avoid creating attributes dynamically, so that they can be

[chadrik] [BEAM-7746] Bugfix: coder id is expected to be str in python3

[chadrik] [BEAM-7746] Explicitly unpack tuple to avoid inferring unbounded tuple

[chadrik] [BEAM-7746] Generate files with protobuf urns as part of gen_protos

[chadrik] [BEAM-7746] Move name and coder to base StateSpec class

[chadrik] [BEAM-7746] Remove reference to missing attribute in

[chadrik] [BEAM-7746] Non-Optional arguments cannot default to None

[chadrik] [BEAM-7746] Avoid reusing variables with different data types

[chadrik] [BEAM-7746] Add StateHandler abstract base class

[chadrik] [BEAM-7746] Add TODO about fixing assignment to

[chadrik] [BEAM-7746] Fix functions that were defined twice

[chadrik] [BEAM-7746] Fix tests that have the same name

[iemejia] [BEAM-9040] Add skipQueries option to skip queries in a Nexmark suite

[iemejia] [BEAM-9040] Add Spark Structured Streaming Runner to Nexmark PostCommit

[valentyn] Switch to unittest.SkipTest instead of using nose.

[mxm] [BEAM-6008] Make sure to end stream only after sending all messages and

[chamikara] Sets the correct coder when clustering is enabled for the

[robertwb] Always initalize output processor on construction.

[github] [Go SDK Doc] Update Dead Container Link (#10585)

[github] Merge pull request #10582 for [INFRA-19670] Add .asf.yaml for Github


------------------------------------------
[...truncated 72.32 KB...]
+ gsutil cp -r init-actions/beam.sh init-actions/docker.sh init-actions/flink.sh gs://beam-flink-cluster/init-actions
Copying file://init-actions/beam.sh [Content-Type=text/x-sh]...
Copying file://init-actions/docker.sh [Content-Type=text/x-sh]...
Copying file://init-actions/flink.sh [Content-Type=text/x-sh]...
Operation completed over 3 objects/13.4 KiB.
+ create_cluster
+ local metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1,
+ metadata+=hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar
+ [[ -n gcr.io/apache-beam-testing/beam_portability/java_sdk:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/java_sdk:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_workers=17
+ gcloud dataproc clusters create streaming-2 --region=global --num-workers=17 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/java_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/e34295eb-ece1-3713-82e9-49349c987993].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
...done.
Created [https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/global/clusters/streaming-2] Cluster placed in zone [us-central1-a].
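The `create_cluster` trace above builds the `--metadata` argument by shell string concatenation, appending the harness and job-server image entries only when the corresponding variables are non-empty. A minimal sketch of that assembly in Python (the `build_metadata` helper is hypothetical, values abbreviated from the trace):

```python
# Sketch (not part of the build): assemble a Dataproc --metadata string the way
# the create_cluster shell trace above does -- key=value pairs joined by commas,
# with optional image entries appended only when set.
def build_metadata(pairs, harness_image=None, job_server_image=None):
    parts = ["%s=%s" % (k, v) for k, v in pairs]
    if harness_image:
        parts.append("beam-sdk-harness-images-to-pull=" + harness_image)
    if job_server_image:
        parts.append("beam-job-server-image=" + job_server_image)
    return ",".join(parts)

metadata = build_metadata(
    [("flink-start-yarn-session", "true"), ("flink-taskmanager-slots", "1")],
    harness_image="gcr.io/apache-beam-testing/beam_portability/java_sdk:latest",
)
print(metadata)
```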
+ get_leader
+ local i=0
+ local application_ids
+ local application_masters
+ echo 'Yarn Applications'
Yarn Applications
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@streaming-2-m '--command=yarn application -list'
++ grep streaming-2
Warning: Permanently added 'compute.120247318236805896' (ECDSA) to the list of known hosts.
20/01/15 12:33:39 INFO client.RMProxy: Connecting to ResourceManager at streaming-2-m/10.128.1.115:8032
+ read line
+ echo application_1579091538647_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://streaming-2-w-14.c.apache-beam-testing.internal:33777
application_1579091538647_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://streaming-2-w-14.c.apache-beam-testing.internal:33777
++ echo application_1579091538647_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://streaming-2-w-14.c.apache-beam-testing.internal:33777
++ sed 's/ .*//'
+ application_ids[$i]=application_1579091538647_0001
++ echo application_1579091538647_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://streaming-2-w-14.c.apache-beam-testing.internal:33777
++ sed 's/.*streaming-2/streaming-2/'
++ sed 's/ .*//'
+ application_masters[$i]=streaming-2-w-14.c.apache-beam-testing.internal:33777
+ i=1
+ read line
+ '[' 1 '!=' 1 ']'
+ YARN_APPLICATION_MASTER=streaming-2-w-14.c.apache-beam-testing.internal:33777
+ echo 'Using Yarn Application master: streaming-2-w-14.c.apache-beam-testing.internal:33777'
Using Yarn Application master: streaming-2-w-14.c.apache-beam-testing.internal:33777
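The `get_leader` trace above extracts the application id and application master address from each `yarn application -list` row with a pair of `sed` substitutions. A hedged Python equivalent of that parsing (the `parse_yarn_row` helper is hypothetical; the sample row is copied from the log):

```python
# Sketch (not part of the build): replicate the sed pipeline above.
#   sed 's/ .*//'              -> first whitespace-separated field
#   sed 's/.*streaming-2/...'  -> greedy, so it keeps the LAST occurrence of
#                                 the cluster name onward (the master host:port)
def parse_yarn_row(line, cluster_name="streaming-2"):
    app_id = line.split()[0]
    rest = line[line.rindex(cluster_name):]
    app_master = rest.split()[0]
    return app_id, app_master

row = ("application_1579091538647_0001 flink-dataproc Apache Flink yarn default "
       "RUNNING UNDEFINED 100% "
       "http://streaming-2-w-14.c.apache-beam-testing.internal:33777")
print(parse_yarn_row(row))
```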
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
+ start_job_server
+ gcloud compute ssh --zone=us-central1-a --quiet yarn@streaming-2-m '--command=sudo --user yarn docker run --detach --publish 8099:8099 --publish 8098:8098 --publish 8097:8097 --volume ~/.config/gcloud:/root/.config/gcloud gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest --flink-master=streaming-2-w-14.c.apache-beam-testing.internal:33777 --artifacts-dir=gs://beam-flink-cluster/streaming-2'
c3cdee9cac30aa5d640768cff906facb6f64eb65323984be7b6b958c045f8179
+ start_tunnel
++ gcloud compute ssh --quiet --zone=us-central1-a yarn@streaming-2-m '--command=curl -s "http://streaming-2-w-14.c.apache-beam-testing.internal:33777/jobmanager/config"'
+ local 'job_server_config=[{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1579091538647_0001"},{"key":"jobmanager.rpc.address","value":"streaming-2-w-14.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"FLINK_PLUGINS_DIR","value":"/usr/lib/flink/plugins"},{"key":"web.tmpdir","value":"/tmp/flink-web-87e0051e-f012-447d-92c3-30911aa38c5d"},{"key":"jobmanager.rpc.port","value":"45747"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1579091538647_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"16"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"streaming-2-w-14.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local key=jobmanager.rpc.port
++ echo streaming-2-w-14.c.apache-beam-testing.internal:33777
++ cut -d : -f1
+ local yarn_application_master_host=streaming-2-w-14.c.apache-beam-testing.internal
++ python -c 'import sys, json; print [ e['\''value'\''] for e in json.load(sys.stdin) if e['\''key'\''] == u'\''jobmanager.rpc.port'\''][0]'
++ echo '[{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1579091538647_0001"},{"key":"jobmanager.rpc.address","value":"streaming-2-w-14.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"FLINK_PLUGINS_DIR","value":"/usr/lib/flink/plugins"},{"key":"web.tmpdir","value":"/tmp/flink-web-87e0051e-f012-447d-92c3-30911aa38c5d"},{"key":"jobmanager.rpc.port","value":"45747"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1579091538647_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"16"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"streaming-2-w-14.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local jobmanager_rpc_port=45747
++ [[ true == \t\r\u\e ]]
++ echo ' -Nf >& /dev/null'
+ local 'detached_mode_params= -Nf >& /dev/null'
++ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
++ echo '-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'job_server_ports_forwarding=-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'tunnel_command=gcloud compute ssh --zone=us-central1-a --quiet yarn@streaming-2-m -- -L 8081:streaming-2-w-14.c.apache-beam-testing.internal:33777 -L 45747:streaming-2-w-14.c.apache-beam-testing.internal:45747 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080  -Nf >& /dev/null'
+ eval gcloud compute ssh --zone=us-central1-a --quiet yarn@streaming-2-m -- -L 8081:streaming-2-w-14.c.apache-beam-testing.internal:33777 -L 45747:streaming-2-w-14.c.apache-beam-testing.internal:45747 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf '>&' /dev/null
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@streaming-2-m -- -L 8081:streaming-2-w-14.c.apache-beam-testing.internal:33777 -L 45747:streaming-2-w-14.c.apache-beam-testing.internal:45747 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf
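The RPC-port lookup in the `start_tunnel` trace above uses a Python 2 one-liner (`print [...]` as a statement) over the Flink `/jobmanager/config` JSON. A Python 3 sketch of the same selection, with an abbreviated config standing in for the full response (the `config_value` helper is hypothetical; the real script pipes the JSON through stdin):

```python
import json

# Abbreviated from the /jobmanager/config response shown in the log above.
job_server_config = json.dumps([
    {"key": "web.port", "value": "0"},
    {"key": "jobmanager.rpc.address",
     "value": "streaming-2-w-14.c.apache-beam-testing.internal"},
    {"key": "jobmanager.rpc.port", "value": "45747"},
])

def config_value(config_json, key):
    # Same selection as the one-liner: first entry whose "key" matches.
    return [e["value"] for e in json.loads(config_json) if e["key"] == key][0]

print(config_value(job_server_config, "jobmanager.rpc.port"))
```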
[beam_LoadTests_Java_Combine_Portable_Flink_Streaming] $ /bin/bash -xe /tmp/jenkins5845669632463696402.sh
+ echo src Load test: fanout 4 times with 2GB 10-byte records total on Flink in Portable mode src
src Load test: fanout 4 times with 2GB 10-byte records total on Flink in Portable mode src
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Streaming/ws/src/gradlew> -PloadTest.mainClass=org.apache.beam.sdk.loadtests.CombineLoadTest -Prunner=:runners:portability:java '-PloadTest.args=--project=apache-beam-testing --appName=load_tests_Java_Portable_Flink_streaming_Combine_4 --tempLocation=gs://temp-storage-for-perf-tests/loadtests --publishToBigQuery=true --bigQueryDataset=load_test --bigQueryTable=java_portable_flink_streaming_Combine_4 --sourceOptions={"numRecords":5000000,"keySizeBytes":10,"valueSizeBytes":90} --fanout=4 --iterations=1 --topCount=20 --sdkWorkerParallelism=16 --perKeyCombiner=TOP_LARGEST --streaming=true --jobEndpoint=localhost:8099 --defaultEnvironmentConfig=gcr.io/apache-beam-testing/beam_portability/java_sdk:latest --defaultEnvironmentType=DOCKER --runner=PortableRunner' --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :sdks:java:testing:load-tests:run
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :model:job-management:extractProto UP-TO-DATE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :model:fn-execution:extractProto UP-TO-DATE
> Task :runners:local-java:processResources NO-SOURCE
> Task :runners:direct-java:processResources NO-SOURCE
> Task :runners:portability:java:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:extractProto UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :model:fn-execution:processResources UP-TO-DATE
> Task :model:job-management:processResources UP-TO-DATE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :sdks:java:io:synthetic:processResources NO-SOURCE
> Task :sdks:java:io:kinesis:processResources NO-SOURCE
> Task :sdks:java:testing:load-tests:processResources NO-SOURCE
> Task :sdks:java:core:generateGrammarSource UP-TO-DATE
> Task :sdks:java:core:processResources UP-TO-DATE
> Task :model:pipeline:extractIncludeProto UP-TO-DATE
> Task :model:pipeline:extractProto UP-TO-DATE
> Task :model:pipeline:generateProto UP-TO-DATE
> Task :model:pipeline:compileJava UP-TO-DATE
> Task :model:pipeline:processResources UP-TO-DATE
> Task :model:pipeline:classes UP-TO-DATE
> Task :model:pipeline:jar UP-TO-DATE
> Task :model:pipeline:shadowJar UP-TO-DATE
> Task :model:job-management:extractIncludeProto UP-TO-DATE
> Task :model:fn-execution:extractIncludeProto UP-TO-DATE
> Task :model:job-management:generateProto UP-TO-DATE
> Task :model:fn-execution:generateProto UP-TO-DATE
> Task :model:job-management:compileJava UP-TO-DATE
> Task :model:job-management:classes UP-TO-DATE
> Task :model:fn-execution:compileJava UP-TO-DATE
> Task :model:fn-execution:classes UP-TO-DATE
> Task :model:job-management:shadowJar UP-TO-DATE
> Task :model:fn-execution:shadowJar UP-TO-DATE
> Task :sdks:java:core:compileJava UP-TO-DATE
> Task :sdks:java:core:classes UP-TO-DATE
> Task :sdks:java:core:shadowJar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractIncludeProto UP-TO-DATE
> Task :sdks:java:fn-execution:compileJava UP-TO-DATE
> Task :runners:core-construction-java:compileJava UP-TO-DATE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:fn-execution:jar UP-TO-DATE
> Task :runners:local-java:compileJava FROM-CACHE
> Task :runners:local-java:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:compileJava UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :runners:core-construction-java:jar UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava UP-TO-DATE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar UP-TO-DATE
> Task :sdks:java:io:synthetic:compileJava FROM-CACHE
> Task :runners:local-java:jar
> Task :sdks:java:io:synthetic:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:compileJava UP-TO-DATE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:io:kinesis:compileJava FROM-CACHE
> Task :sdks:java:io:kinesis:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar UP-TO-DATE
> Task :runners:core-java:compileJava UP-TO-DATE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar UP-TO-DATE
> Task :sdks:java:io:synthetic:jar
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:io:kinesis:jar
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:harness:compileJava UP-TO-DATE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar
> Task :sdks:java:harness:shadowJar UP-TO-DATE
> Task :runners:java-fn-execution:compileJava UP-TO-DATE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar UP-TO-DATE
> Task :runners:portability:java:compileJava FROM-CACHE
> Task :runners:portability:java:classes UP-TO-DATE
> Task :runners:portability:java:jar
> Task :sdks:java:io:google-cloud-platform:compileJava UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar UP-TO-DATE
> Task :runners:direct-java:compileJava FROM-CACHE
> Task :runners:direct-java:classes UP-TO-DATE
> Task :runners:direct-java:shadowJar
> Task :sdks:java:testing:load-tests:compileJava FROM-CACHE
> Task :sdks:java:testing:load-tests:classes UP-TO-DATE
> Task :sdks:java:testing:load-tests:jar

> Task :sdks:java:testing:load-tests:run
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Exception in thread "main" java.lang.IllegalStateException: GroupByKey cannot be applied to non-bounded PCollection in the GlobalWindow without a trigger. Use a Window.into or Window.triggering transform prior to GroupByKey.
	at org.apache.beam.sdk.transforms.GroupByKey.applicableTo(GroupByKey.java:156)
	at org.apache.beam.sdk.transforms.GroupByKey.expand(GroupByKey.java:226)
	at org.apache.beam.sdk.transforms.GroupByKey.expand(GroupByKey.java:110)
	at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:539)
	at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:473)
	at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:355)
	at org.apache.beam.sdk.transforms.Combine$PerKey.expand(Combine.java:1596)
	at org.apache.beam.sdk.transforms.Combine$PerKey.expand(Combine.java:1485)
	at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:539)
	at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:490)
	at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:368)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.loadTest(CombineLoadTest.java:134)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:96)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.run(CombineLoadTest.java:66)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.main(CombineLoadTest.java:169)

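[Editor's note] The exception above is Beam's standard guard: `Combine.perKey` expands to a `GroupByKey` (visible in the trace), and a `GroupByKey` over an unbounded PCollection is only legal once a windowing strategy or trigger has been applied. A minimal sketch of the usual fix, assuming Beam's Java SDK — the class, method, and element type below are illustrative, not taken from CombineLoadTest:

```java
import org.apache.beam.sdk.transforms.Combine;
import org.apache.beam.sdk.transforms.Sum;
import org.apache.beam.sdk.transforms.windowing.FixedWindows;
import org.apache.beam.sdk.transforms.windowing.Window;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;
import org.joda.time.Duration;

class WindowBeforeCombine {
  // 'input' stands in for the unbounded PCollection produced by the load
  // test's synthetic source; KV<String, Long> is an illustrative element type.
  static PCollection<KV<String, Long>> sumPerKey(
      PCollection<KV<String, Long>> input) {
    return input
        // Assign elements to fixed 1-minute windows so that the GroupByKey
        // inside Combine.perKey becomes applicable to unbounded input.
        .apply(Window.<KV<String, Long>>into(
            FixedWindows.of(Duration.standardMinutes(1))))
        // Sum the Long values per key within each window.
        .apply(Combine.perKey(Sum.ofLongs()));
  }
}
```

Equivalently, a `Window.triggering(...)` configuration on the global window would satisfy the same check; either way the transform must precede the GroupByKey, as the error message states.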
> Task :sdks:java:testing:load-tests:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5s
61 actionable tasks: 8 executed, 7 from cache, 46 up-to-date

Publishing build scan...
https://gradle.com/s/373uiefrcaoik

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_Combine_Portable_Flink_Streaming #14

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Streaming/14/display/redirect?page=changes>

Changes:

[zyichi] Update BigQuery source in bigquery_tornadoes example

[ehudm] [BEAM-9398] runtime_type_check: support setup

[relax] switch cogbk to use Beam transform

[relax] finish join

[robertwb] [BEAM-9496] Evaluation of deferred dataframes via Beam operations.

[relax] support side-input joins

[relax] support side-input joins

[relax] spotless

[kcweaver] [BEAM-9509] Improve error message for bad job server URL.

[relax] make FieldAccessDescriptor always be field-insertion order

[lcwik] [BEAM-9339, BEAM-2939] Drop splittable field from proto, add splittable

[relax] fix side-input joins

[relax] fix bug

[relax] remove obsolete test

[relax] add javadoc

[rohde.samuel] Add dependency comment in streaming cache

[robertwb] Fix and test tuple inputs and outputs.

[kawaigin] Remove the excessive logging from capturable sources property.

[ehudm] [BEAM-8280] Enable type hint annotations

[piotr.szuberski] [BEAM-9563] Change ToListCombineFn access level to private

[boyuanz] Add Timer to Elements proto representation.

[robertwb] [BEAM-9340] Plumb requirements through Java SDK.

[robertwb] [BEAM-9340] Populate requirements for Java DoFn properties.

[daniel.o.programmer] [BEAM-3301] Bugfix in DoFn validation.

[robertwb] [BEAM-9558] Add an explicit end field to the data channel protos.

[robertwb] [BEAM-9558] Regenerate go protos.

[robertwb] [BEAM-9558] Produce and respect data channel end bit in runners and

[github] Merge pull request #11153 from [BEAM-9537] Adding a new module for

[kcweaver] [BEAM-9446] Retain unknown arguments when using uber jar job server.

[lcwik] [BEAM-9565] Fix threading issue with WatermarkEstimatorsTest

[relax] add unit tests

[relax] update sql transform

[lcwik] [BEAM-9430] Fix coder sent to Dataflow service for non-portable

[kcweaver] Enable '--option=value' and '-option' syntax.

[github] Merge pull request #10990: [BEAM-9569] disable coder inference for rows

[rohde.samuel] Adds a streaming wordcount integration test

[rohde.samuel] Address leftover styling comments from PR/10892

[github] optionally import grpc (#11187)

[github] [BEAM-9305] Allow value provider query strings in _CustomBigQuerySource

[rohde.samuel] address comments

[github] Merge pull request #11198 from [BEAM-7923] Obfuscates display ids

[github] Merge pull request #11074: Store logical type values in Row instead of

[piotr.szuberski] [BEAM-9507] Fix python dependency check task

[mxm] [BEAM-9573] Correct computing of watermark hold for timer output

[github] [BEAM-7923] Pop failed transform when error is raised (#11174)

[pabloem] Fixing bqtest

[suztomo] google-api-client 1.30.9

[github] [BEAM-9579] Fix numpy logic operators (#11204)

[mxm] [BEAM-9580] Downgrade Flink version to 1.9 for Nexmark and

[github] Merge pull request #11163 from [BEAM-9548] Add better error handling to

[daniel.o.programmer] [BEAM-3301] Adding restriction trackers and validation.

[ehudm] [BEAM-8280] Type hints via annotations snippets

[rohde.samuel] changed data to be less repetitive

[github] Merge pull request #11215 from [BEAM-9601] unbreaking precommits

[pabloem] Starting refactor of FnApiRunner

[pabloem] fixup

[pabloem] Fix lint issues

[pabloem] Creating file with worker handlers

[pabloem] Fixup

[pabloem] Fixing lint. Formatting

[pabloem] Fixup

[jozo.vilcek] [BEAM-9420] Configurable timeout for blocking kafka API call(s)

[piotr.szuberski] [BEAM-9606] Add missing parameters in combine_test.py gradle example

[github] Go changes for model updates. (#11211)

[alex] [BEAM-9605] BIP-1: Rename setRowOption to setOption on Option builder

[robertwb] Add base SDK version to environment capabilities for Python and Java.

[apilloud] [BEAM-9609] Upgrade to ZetaSQL 2020.03.2

[robertwb] [BEAM-9340] Validate pipeline requirements in PipelineValidator.

[pabloem] Renaming method to be more appropriate

[daniel.o.programmer] [BEAM-3301] Fix another bug in DoFn validation, in exec.

[github] fix formatter

[github] [BEAM-8078] streaming_wordcount_debugging.py is missing a test (#10914)

[rohde.samuel] [BEAM-9601] Skip the streaming wordcount test because it uses a

[github] Update Go Protos (#11230)

[github] add @RequiresTimeSortedInput to CHANGES.md (#11228)

[github] Add notes to change log.

[ehudm] [BEAM-8078] Disable test_streaming_wordcount_it

[zyichi] [BEAM-9263] Bump up python sdk dataflow environment major versions

[github] [BEAM-9371] Add SideInputLoadTest to Java SDK (#10949)

[github] [BEAM-7505] Add side input load test to Python SDK  (#11136)

[mxm] [BEAM-9566] Mitigate performance issue for output timestamp watermark

[robertwb] [BEAM-9614] Add SDK id for go.

[ehudm] [BEAM-8078] Disable test_streaming_wordcount_debugging_it

[github] [BEAM-9495] Make DataCatalogTableProvider AutoCloseable (#11116)

[ehudm] [BEAM-8466] Make strip_iterable more strict

[robertwb] [BEAM-4150] Use explicit map for data channel coders.

[robertwb] [BEAM-4150] Don't window PCollection coders.

[ehudm] [BEAM-5422] Document DynamicDestinations.getTable uniqueness requirement

[ehudm] [BEAM-1894] Remove obsolete EagerRunner test

[github] [BEAM-9574] Ensure that instances of generated namedtuple classes can be

[github] Update the range for pyarrow

[boyuanz] Remove TimeSpec from proto

[github] add missing bracket

[alex] [BEAM-9605] BIP-1: Rename setRowOption to setOption on Option builder

[lcwik] [BEAM-4374] Update protos related to MonitoringInfo.

[github] Update the Go SDK roadmap for April 2020 (#11239)

[github] Merge pull request #10883: [BEAM-9331] Add better Row builders

[github] [BEAM-8292] Portable Reshuffle for Go SDK (#11197)

[robertwb] Side input signals for legacy worker.

[lostluck] accept generated metrics

[github] Use split instead of rsplit

[lostluck] Remove mType and move type urns to urnToType

[lostluck] add missing pcollection payload

[github] [BEAM-9557] Fix timer window boundary checking (#11252)

[github] Update documentation

[echauchot] [BEAM-5980] Change load-tests build to include spark-sql for spark

[echauchot] [BEAM-9436] avoid one flatmap step and a KV creation per element by

[jozsi] Update Jet version to 4.0

[apilloud] [BEAM-9512] Map anonymous structs to schema

[kawaigin] [BEAM-7923] Fix datatable on notebook reloading

[kyoungha] [BEAM-9325] Added Proper Write Method in UnownedOutputStream

[kyoungha] fix format warning

[valentyn] Fix a Py2/3 incompatibility in profiler.

[kcweaver] [BEAM-9638] Strengthen worker region & zone options tests.

[samuelw] [BEAM-9399] Change the redirection of System.err to be a custom

[kyoungha] [BEAM-9325] reflect comment : inline testing methods

[robinyqiu] Clean up code in ZetaSQLDialectSpecTest

[boyuanz] [BEAM-9454] Add Deduplication PTransform

[robertwb] [BEAM-9577] Rename the Artifact{Staging,Retrieval}Service.

[robertwb] [BEAM-9577] Define the new Artifact{Staging,Retrieval}Service.

[robertwb] [BEAM-9577] Regenerate protos.

[jozsi] Update Jet Runner web page with info about 4.0

[jozsi] Add Beam-Jet compatibility table

[robertwb] [BEAM-9577] Implement the new Artifact{Staging,Retrieval}Services in

[samuelw] Fix missing test import

[kyoungha] [BEAM-9325] reflect comment : Fix JAXBCoder + change test

[chamikara] Refactors X-Lang test pipelines.

[robertwb] [BEAM-9340] Populate requirement for timer families.

[kcweaver] [BEAM-9199] Require Dataflow --region in Python SDK.

[kcweaver] Add --region to tests where needed.

[kcweaver] [BEAM-9199] Require --region option for Dataflow in Java SDK.

[kcweaver] Add --region to Java GCP tests.

[pabloem] [BEAM-9608] Increase reliance on Context Managers for FnApiRunner

[pabloem] Revert "Merge pull request #11104 from y1chi/update_tornado_test"

[daniel.o.programmer] [BEAM-9642] Create runtime invokers for SDF methods.

[lcwik] [BEAM-9668] Disable tests till Dataflow containers are updated.

[github] [BEAM-9652] Ensure that the multipartition write sets the correct coder

[github] [BEAM-8889]add experiment flag use_grpc_for_gcs (#11183)

[robertwb] [BEAM-9322] [BEAM-1833] Better naming for composite transform output

[ameihm] [BEAM-9476] KinesisIO retry LimitExceededException

[github] [BEAM-7923] An indicator of progress in notebooks (#11276)

[robertwb] [BEAM-9577] Add dependency information to provision info.

[robertwb] Update go protos.

[lcwik] [BEAM-9677] Fix path -> url typo in ArtifactUrlPayload

[lcwik] [BEAM-9562] Update missed TimerSpec conversion in Go SDK

[kcweaver] Fix DataflowRunnerTest.

[kcweaver] Fix more Java unit tests missing --region.

[github] [BEAM-9667] Allow metrics in DoFn Setup (#11287)

[amaliujia] add 2.20.0 blog post

[kcweaver] Add --region to DF streaming example tests.

[github] [BEAM-9624] Adds Convert to Accumulators operator for use in combiner

[github] Fix minor typo

[github] Fix minor typo

[github] Merge pull request #11290: [BEAM-9670] Fix nullability widening in

[iemejia] [BEAM-9686] Get default TmpCheckpointDir value from PipelineOptions

[github] [BEAM-4374] Short IDs for the Python SDK (#11286)

[spoorti] [BEAM-9660]: Add an explicit check for integer overflow.

[github] [BEAM-9136]Add licenses for dependencies for Python (#11067)

[boyuanz] Populate source data from SDF

[boyuanz] Update Timer encoding

[mxm] [BEAM-9645] Fix premature removal of Docker container and logs

[alex] [BEAM-9044] Protobuf options to Schema options

[kcweaver] Add unit tests for get_default_gcp_region

[kcweaver] Add --region to Dataflow runner webpage.

[mxm] [BEAM-8201] Cleanup FnServices from DockerEnvironmentFactory and

[kcweaver] lint

[kcweaver] Add --region to more Java tests and examples.

[kcweaver] Add --region to more Python tests and examples.

[kcweaver] format

[robertwb] [BEAM-9577] Update container boot code to stage from dependencies, if

[rohde.samuel] Change delimeter to a dash as it is a reserved symbol in Windows

[valentyn] Fixes platform-dependent assumptions in subprocess_server_test.py.

[valentyn] Switches a test helper to a Py3-version thereof.

[github] Apply suggestions from code review

[robertwb] Use pointer recievers.

[aldaircr] Change: Fixing typos on javadoc

[robertwb] Attempt to stage resources via new API in portable runner.

[robertwb] ResolveArtifact -> ResolveArtifacts

[robertwb] Regenerate protos.

[veblush] Upgrades gcsio to 2.1.2

[github] Merge pull request #11259: Use attachValues in SQL

[alex] Add Beam Schema Options to changelog

[iemejia] [website] Update information about Beam's LTS policy

[alex] [BEAM-9704] Deprecate FieldType metadata

[eekkaaadrian] [BEAM-9705] Go sdk add value length validation checking on write to

[kcweaver] Remove unrecognized --region option from non-DF tests.

[robertwb] [BEAM-9618] Add protocol for requesting process bundle descriptors.

[robertwb] [BEAM-9618] Update Python to support process bundle descriptor fetching.

[robertwb] [BEAM-9618] Java FnApiClient support for process bundle descriptor

[github] [BEAM-8019] Python SDK support for cross-langauge pipelines in Dataflow.

[robertwb] Typo fix.

[github] remove nose (#11307)

[lcwik] [BEAM-4374, BEAM-6189] Delete and remove deprecated Metrics proto

[ecapoccia] [BEAM-9434] Improve Spark runner reshuffle translation to maximize

[github] [BEAM-9685] remove Go SDK container from release process (#11308)

[kcweaver] [BEAM-9716] Alias zone to worker_zone and warn user.

[github] Merge pull request #11226: [BEAM-9557] Fix timer window boundary

[github] Merge pull request #11244 from [BEAM-3097] _ReadFromBigQuery supports

[pabloem] [BEAM-9691] Ensuring BQSource is avoided on FnApi

[pabloem] [BEAM-9715] Ensuring annotations_test passes in all

[github] Name the pipeline_v1 proto import

[github] Update materialize_test.go

[ankurgoenka] [BEAM-9707] Hardcode Unified harness image for fixing dataflow VR 2

[crites] Updates documentation for WINDOWED_VALUE coder.

[rohde.samuel] Fix flaky interactive_runner_test

[github] Merge pull request #11205 [BEAM-9578] Defer expensive artifact

[robertwb] Update go protos.

[github] Fix some Go SDK linter/vet warnings. (#11330)

[robertwb] [BEAM-9577] Plumb resources through Python job service and runner.

[robertwb] [BEAM-9618] Pull bundle descriptors for Go.

[github] [BEAM-9529] Remove datastore.v1, googledatastore (#11175)

[github] Update session.go

[github] Update stage.go

[github] Update server_test.go

[github] Update materialize.go

[github] Update materialize_test.go

[github] Update stage_test.go

[github] Update artifact.go

[github] Update provision.go

[github] Update retrieval.go

[github] Update staging.go

[github] Update translate.go

[github] Update datamgr.go

[github] Update datamgr_test.go

[github] Update logging.go

[github] Update logging_test.go

[github] Update monitoring.go

[github] Update session.go

[github] Update statemgr.go

[github] Update statemgr_test.go

[github] Update replace.go

[github] Update replace_test.go

[github] Update provision.go

[github] Update execute.go

[github] Update job.go

[github] Update translate.go

[github] Update translate.go

[github] Update job.go

[github] Update materialize.go

[mxm] [BEAM-9580] Allow Flink 1.10 processing timers to finish on pipeline

[kamil.wasilewski] [BEAM-9721] Add --region to Dataflow-based load tests

[kamil.wasilewski] [BEAM-9721] LoadTestConfig: handle --region parameter and put default

[github] [BEAM-9147] Add a VideoIntelligence transform to Java SDK (#11261)

[mxm] Revert "[BEAM-9580] Downgrade Flink version to 1.9 for Nexmark and

[kcweaver] [BEAM-9714] [Go SDK] Require --region flag in Dataflow runner.

[github] Update translate.go

[mxm] [BEAM-9557] Fix strings used to verify test output

[github] Update session.go

[github] Update materialize_test.go

[mxm] [BEAM-9596] Ensure metrics are available in PipelineResult when the

[samuelw] Ensure that empty messages are not flushed to handler.

[crites] Uses iterable coder for windows and copies all of timestamp encoding

[github] Update session.go (#11352)

[github] [BEAM-9618] Java SDK worker support for pulling bundle descriptors.

[chamikara] Adds nose back under packages needed for testing.

[robertwb] [BEAM-9618] Mark push registration as deprecated.

[github] [Beam-9063]update documentation (#10952)

[kcweaver] [BEAM-9726] [py] Make region optional for non-service Dataflow.

[kcweaver] [BEAM-9726] [java] Make region optional for non-service runner.

[github] [BEAM-9550] Increase JVM Metaspace size for the TaskExecutors. (#11193)

[github] [BEAM-9721]Conditionally add Dataflow region to Dataflow-based

[michael.jacoby] [BEAM-9647] fixes MQTT clientId to long

[lcwik] [BEAM-4374] Fix missing deletion of metrics.

[github] [BEAM-8280] Document Python 3 annotations support (#11232)

[github] [BEAM-9731] Include more detail in passert.Equals errors. (#11359)

[github] [BEAM-9085] Fix performance regression in SyntheticSource on Python 3

[amaliujia] add a known issue

[samuelw] [BEAM-9651] Prevent StreamPool and stream initialization livelock

[boyuanz] [BEAM-9562, BEAM-6274] Fix-up timers to use Elements.Timer proto in data

[robertwb] Allow unset write threshold for state backed iterable coder.

[github] Revert "[BEAM-9651] Prevent StreamPool and stream initialization

[ankurgoenka] [BEAM-9735] Adding Always trigger and using it in Reshuffle

[samuelw] [BEAM-9651] Prevent StreamPool and stream initialization livelock

[github] [BEAM-9727] Automatically set required experiment flags for dataflow

[github] Update environments.py to add a method to specify container image

[kcweaver] Moving to 2.22.0-SNAPSHOT on master branch.

[kamil.wasilewski] [BEAM-8671] Migrate Load Tests to Python 3.7

[michal.walenia] [BEAM-9734] Revert #11122

[github] Add --region to changelog

[pabloem] Fix from_container_image call

[ankurgoenka] TOIL: Update Unified worker image

[boyuanz] [BEAM-9562] Update Element.timer, Element.Timer to Element.timers and

[robertwb] Comments and clarification.

[github] [BEAM-9443] support direct_num_workers=0 (#11372)

[chamikara] Updates Dataflow stateful DoFn setup to support external transforms

[github] [BEAM-9738] Update dataflow to setup correct docker environment options.

[github] [BEAM-9136]Add licenses for dependencies for Java (#11243)

[kcweaver] [BEAM-9744] Add missing region option to py perf tests.

[lcwik] [BEAM-9562] Fix output timestamp to be inferred from scheduled time when

[kcweaver] [BEAM-9744] Remove --region option from SQL tests.

[lcwik] [BEAM-2939] Update unbounded source as SDF wrapper to resume

[pabloem] Fixing type names for BQ Avro Tools

[github] Merge pull request #11389 from Refactor the BCJ and capture controls to

[github] [BEAM-i9751] upgrade zetasql to 2020.04.1 (#11410)

[xhan] Documentation bug fix for FlatMapElements#via() SimpleFunction in the

[github] [BEAM-9650] Add PeriodicImpulse Transform and slowly changing side input

[github] [BEAM-7923] Screendiff Integration Tests (#11338)

[lcwik] fixup! Fix spotbugs warning

[kcweaver] [BEAM-9756] Nexmark: only use --region in Dataflow.

[github] [BEAM-9642] Add SDF execution units. (#11327)

[lcwik] [BEAM-9577] Fix test to create urls from paths which are compatible with

[github] [BEAM-9136] reduce third_party_dependencies size (#11416)

[thw] Fix py37-lint

[thw] Maven compatible publish repository authentication via settings.xml

[github] [BEAM-9746] check for 0 length copies from state (#11413)

[pabloem] Removing underscore from _ReadFromBigQuery to make it external. It

[daniel.o.programmer] [BEAM-9642] Fix infinite recursion.

[kamil.wasilewski] Remove outdated doc for ReadFromBigQuery transform

[ehudm] [BEAM-9119] Disable flaky test

[github] [BEAM-8889] add gRPC suport in GCS connector (behind an

[amaliujia] fixup! update 2.20.0 date

[github] [BEAM-9729, BEAM-8486] Runner-side bundle registration cleanup. (#11358)

[github] Add new release 2.20.0 to beam website (#11285)

[jkw] Fix typo

[github] Merge pull request #11151 from [BEAM-9468]  Hl7v2 io


------------------------------------------
[...truncated 158.87 KB...]
Successfully pulled java_third_party_licenses/randomizedtesting-runner-2.5.2.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/randomizedtesting-runner-2.7.5.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/hppc-0.5.4.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/hppc-0.7.1.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/hppc-0.7.2.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/hppc-0.8.1.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/stream-2.5.2.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/stream-2.7.0.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/stream-2.9.5.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/cassandra-driver-core-3.8.0.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/cassandra-driver-extras-3.6.0.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/cassandra-driver-mapping-3.8.0.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/netlet-1.3.2.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/parso-2.0.11.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.html
Successfully pulled java_third_party_licenses/kryo-2.21.jar/LICENSE from http://www.opensource.org/licenses/bsd-license.php
Successfully pulled java_third_party_licenses/kryo-2.24.0.jar/LICENSE from http://www.opensource.org/licenses/bsd-license.php
Successfully pulled java_third_party_licenses/minlog-1.2.jar/LICENSE from http://www.opensource.org/licenses/bsd-license.php
Successfully pulled java_third_party_licenses/reflectasm-1.07.jar/LICENSE from http://www.opensource.org/licenses/bsd-license.php
Successfully pulled java_third_party_licenses/kryo-4.0.2.jar/LICENSE from https://opensource.org/licenses/BSD-3-Clause
Successfully pulled java_third_party_licenses/kryo-shaded-4.0.2.jar/LICENSE from https://opensource.org/licenses/BSD-3-Clause
Successfully pulled java_third_party_licenses/minlog-1.3.0.jar/LICENSE from http://www.opensource.org/licenses/bsd-license.php
Successfully pulled java_third_party_licenses/reflectasm-1.11.3.jar/LICENSE from http://www.opensource.org/licenses/bsd-license.php
Successfully pulled java_third_party_licenses/esri-geometry-api-2.2.0.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
License/notice for hadoop-apache2-3.2.0-1 were pulled automatically.
License/notice for jackson-annotations-2.10.0 were pulled automatically.
License/notice for jackson-annotations-2.10.2 were pulled automatically.
License/notice for jackson-core-2.10.0 were pulled automatically.
License/notice for jackson-core-2.10.2 were pulled automatically.
License/notice for jackson-databind-2.10.0 were pulled automatically.
License/notice for jackson-databind-2.10.2 were pulled automatically.
Successfully pulled java_third_party_licenses/jackson-dataformat-cbor-2.10.2.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/jackson-dataformat-csv-2.10.2.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
License/notice for jackson-dataformat-smile-2.5.4 were pulled automatically.
Successfully pulled java_third_party_licenses/jackson-dataformat-smile-2.8.1.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/jackson-dataformat-smile-2.8.10.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/jackson-dataformat-smile-2.8.11.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/jackson-dataformat-smile-2.8.6.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
License/notice for jackson-dataformat-xml-2.10.2 were pulled automatically.
License/notice for jackson-dataformat-yaml-2.10.2 were pulled automatically.
License/notice for jackson-dataformat-yaml-2.9.8 were pulled automatically.
License/notice for jackson-datatype-joda-2.10.2 were pulled automatically.
License/notice for jackson-module-jaxb-annotations-2.10.2 were pulled automatically.
License/notice for jackson-module-paranamer-2.10.2 were pulled automatically.
License/notice for jackson-module-scala_2.11-2.10.2 were pulled automatically.
Successfully pulled java_third_party_licenses/woodstox-core-5.0.3.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
License/notice for woodstox-core-6.0.3 were pulled automatically.
Successfully pulled java_third_party_licenses/caffeine-2.2.6.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/caffeine-2.7.0.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
License/notice for jai-imageio-core-1.4.0 were pulled automatically.
Successfully pulled java_third_party_licenses/jamm-0.3.0.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0
Successfully pulled java_third_party_licenses/jffi-1.2.16.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/jnr-constants-0.9.9.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/jnr-ffi-2.1.7.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/jnr-posix-3.0.44.jar/LICENSE from http://www.gnu.org/licenses/lgpl.html
Successfully pulled java_third_party_licenses/jnr-x86asm-1.0.2.jar/LICENSE from http://www.opensource.org/licenses/mit-license.php
License/notice for dropwizard-metrics-hadoop-metrics2-reporter-0.1.0 were pulled automatically.
Successfully pulled java_third_party_licenses/lzma-java-1.3.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/junrar-4.0.0.jar/LICENSE from https://raw.github.com/junrar/junrar/master/license.txt
Successfully pulled java_third_party_licenses/software-and-algorithms-1.0.jar/LICENSE from http://www.opensource.org/licenses/mit-license.php
Successfully pulled java_third_party_licenses/embedded-redis-0.6.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/zstd-jni-1.3.8-3.jar/LICENSE from https://opensource.org/licenses/BSD-2-Clause
Successfully pulled java_third_party_licenses/openjson-1.0.11.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/better-files_2.12-2.17.1.jar/LICENSE from https://github.com/pathikrit/better-files/blob/master/LICENSE
Successfully pulled java_third_party_licenses/snowball-stemmer-1.3.0.581.1.jar/LICENSE from http://www.opensource.org/licenses/bsd-license.html
Successfully pulled java_third_party_licenses/scopt_2.11-3.5.0.jar/LICENSE from http://www.opensource.org/licenses/mit-license.php
License/notice for spotbugs-3.1.12 were pulled automatically.
Successfully pulled java_third_party_licenses/spotbugs-annotations-3.1.12.jar/LICENSE from https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
Successfully pulled java_third_party_licenses/compiler-0.9.3.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/findbugs-annotations-1.3.9-1.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/jcip-annotations-1.0-1.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
> Task :sdks:go:resolveBuildDependencies
Resolving google.golang.org/api: commit='386d4e5f4f92f86e6aec85985761bba4b938a2d5', urls=[https://code.googlesource.com/google-api-go-client]
Resolving google.golang.org/genproto: commit='2b5a72b8730b0b16380010cfe5286c42108d88e7', urls=[https://github.com/google/go-genproto]
Resolving google.golang.org/grpc: commit='7646b5360d049a7ca31e9133315db43456f39e2e', urls=[https://github.com/grpc/grpc-go]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]

> Task :sdks:java:container:generateThirdPartyLicenses
Successfully pulled java_third_party_licenses/named-regexp-0.2.3.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0
Successfully pulled java_third_party_licenses/curvesapi-1.06.jar/LICENSE from http://opensource.org/licenses/BSD-3-Clause
Successfully pulled java_third_party_licenses/annotations-4.1.1.4.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0
Successfully pulled java_third_party_licenses/google-api-client-1.30.9.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/google-api-client-jackson2-1.30.9.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/google-api-client-java6-1.30.9.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/grpc-google-cloud-bigtable-admin-v2-1.9.1.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/grpc-google-cloud-bigtable-v2-1.9.1.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/grpc-google-cloud-pubsub-v1-1.85.1.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/grpc-google-common-protos-1.12.0.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/grpc-google-common-protos-1.17.0.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/proto-google-cloud-bigquerystorage-v1-0.90.0.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/proto-google-cloud-bigquerystorage-v1alpha2-0.90.0.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/proto-google-cloud-bigquerystorage-v1beta1-0.85.1.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/proto-google-cloud-bigquerystorage-v1beta2-0.90.0.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/proto-google-cloud-bigtable-admin-v2-1.9.1.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/proto-google-cloud-bigtable-v2-1.9.1.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/proto-google-cloud-datacatalog-v1beta1-0.32.1.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/proto-google-cloud-datastore-v1-0.85.0.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/proto-google-cloud-pubsub-v1-1.85.1.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/proto-google-cloud-spanner-admin-database-v1-1.49.1.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/proto-google-cloud-spanner-admin-instance-v1-1.49.1.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/proto-google-cloud-spanner-v1-1.49.1.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/proto-google-cloud-video-intelligence-v1-1.2.0.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/proto-google-cloud-video-intelligence-v1beta2-0.84.0.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/proto-google-cloud-video-intelligence-v1p1beta1-0.84.0.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/proto-google-cloud-video-intelligence-v1p2beta1-0.84.0.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/proto-google-cloud-video-intelligence-v1p3beta1-0.84.0.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/proto-google-common-protos-1.12.0.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/proto-google-common-protos-1.17.0.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/proto-google-iam-v1-0.13.0.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/api-common-1.8.1.jar/LICENSE from https://github.com/googleapis/api-common-java/blob/master/LICENSE
Successfully pulled java_third_party_licenses/gax-1.54.0.jar/LICENSE from https://github.com/googleapis/gax-java/blob/master/LICENSE
Successfully pulled java_third_party_licenses/gax-grpc-1.54.0.jar/LICENSE from https://github.com/googleapis/gax-java/blob/master/LICENSE
Successfully pulled java_third_party_licenses/gax-httpjson-0.71.0.jar/LICENSE from https://github.com/googleapis/gax-java/blob/master/LICENSE
Successfully pulled java_third_party_licenses/google-api-services-bigquery-v2-rev20191211-1.30.9.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/google-api-services-clouddebugger-v2-rev20200313-1.30.9.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/google-api-services-cloudresourcemanager-v1-rev20200311-1.30.9.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/google-api-services-dataflow-v1b3-rev20200305-1.30.9.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/google-api-services-healthcare-v1beta1-rev20200307-1.30.9.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/google-api-services-pubsub-v1-rev20200312-1.30.9.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/google-api-services-storage-v1-rev20200226-1.30.9.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/google-auth-library-credentials-0.18.0.jar/LICENSE from http://opensource.org/licenses/BSD-3-Clause
Successfully pulled java_third_party_licenses/google-auth-library-credentials-0.19.0.jar/LICENSE from http://opensource.org/licenses/BSD-3-Clause
Successfully pulled java_third_party_licenses/google-auth-library-oauth2-http-0.19.0.jar/LICENSE from http://opensource.org/licenses/BSD-3-Clause
Successfully pulled java_third_party_licenses/auto-service-1.0-rc2.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/auto-service-1.0-rc6.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/auto-service-annotations-1.0-rc6.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/auto-value-1.7.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/auto-value-annotations-1.6.3.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/auto-value-annotations-1.7.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/auto-common-0.10.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/auto-common-0.4.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/gcsio-2.1.2.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/util-2.1.2.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/bigtable-client-core-1.13.0.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/google-cloud-dataflow-java-proto-library-all-0.5.160304.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/datastore-v1-proto-client-1.6.3.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/google-cloud-bigquery-1.108.0.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/google-cloud-bigquerystorage-0.125.0-beta.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/google-cloud-bigtable-1.9.1.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/google-cloud-core-1.92.2.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/google-cloud-core-grpc-1.92.2.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/google-cloud-core-http-1.93.0.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/google-cloud-datacatalog-0.32.1.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/google-cloud-spanner-1.49.1.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/google-cloud-video-intelligence-1.2.0.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/jformatstring-3.0.0.jar/LICENSE from http://www.gnu.org/licenses/lgpl.html
Successfully pulled java_third_party_licenses/jsr305-3.0.0.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/jsr305-3.0.2.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
License/notice for google-collect-snapshot-20080530 were pulled automatically.
Successfully pulled java_third_party_licenses/gson-2.2.4.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/gson-2.8.5.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/gson-2.8.6.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/error_prone_annotation-2.3.1.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/error_prone_annotation-2.3.4.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/error_prone_annotations-2.0.15.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/error_prone_annotations-2.1.3.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/error_prone_annotations-2.2.0.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/error_prone_annotations-2.3.1.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/error_prone_annotations-2.3.2.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/error_prone_annotations-2.3.3.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/error_prone_annotations-2.3.4.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/error_prone_check_api-2.3.1.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/error_prone_check_api-2.3.4.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/error_prone_core-2.3.1.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/error_prone_core-2.3.4.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/error_prone_type_annotations-2.3.1.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/error_prone_type_annotations-2.3.4.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/javac-9+181-r4173-1.jar/LICENSE from http://openjdk.java.net/legal/gplv2+ce.html
Successfully pulled java_third_party_licenses/flogger-0.5.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/flogger-system-backend-0.5.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/google-extensions-0.5.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/failureaccess-1.0.1.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/guava-19.0.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/guava-23.5-jre.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/guava-25.1-jre.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/guava-26.0-jre.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/guava-27.0.1-jre.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/guava-28.0-jre.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/guava-28.1-android.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/guava-testlib-25.1-jre.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/google-http-client-1.34.0.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/google-http-client-appengine-1.34.2.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/google-http-client-jackson-1.29.2.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/google-http-client-jackson2-1.34.0.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/google-http-client-protobuf-1.34.0.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
License/notice for guice-assistedinject-3.0 were pulled automatically.
License/notice for guice-servlet-3.0 were pulled automatically.
License/notice for guice-3.0 were pulled automatically.
Successfully pulled java_third_party_licenses/j2objc-annotations-1.1.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/j2objc-annotations-1.3.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/google-oauth-client-1.30.6.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/google-oauth-client-java6-1.30.6.jar/LICENSE from https://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/protobuf-javanano-3.0.0-alpha-5.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/protobuf-java-3.11.0.jar/LICENSE from https://opensource.org/licenses/BSD-3-Clause
Successfully pulled java_third_party_licenses/protobuf-java-3.11.1.jar/LICENSE from https://opensource.org/licenses/BSD-3-Clause
Successfully pulled java_third_party_licenses/protobuf-java-3.4.0.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Traceback (most recent call last):
  File "sdks/java/container/license_scripts/pull_licenses_java.py", line 138, in <module>
    license_url = dep['moduleLicenseUrl']
KeyError: 'moduleLicenseUrl'
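The traceback above is an unguarded dictionary lookup: `dep['moduleLicenseUrl']` raises KeyError as soon as one entry in the generated dependency report has no license URL recorded. A minimal sketch of a guarded lookup, assuming each entry is a plain dict (the function name `resolve_license_url` and the None fallback are illustrative, not the script's actual API):

```python
def resolve_license_url(dep):
    """Return the recorded license URL for one dependency entry, or None.

    `dep` is assumed to be one dict parsed from the dependency report;
    entries without a recorded license lack the 'moduleLicenseUrl' key
    entirely, which is what made the unguarded dep['moduleLicenseUrl']
    lookup raise KeyError.
    """
    url = dep.get('moduleLicenseUrl')
    if not url:
        # No URL recorded: let the caller fall back to a manual license
        # list instead of crashing the whole license pull.
        return None
    return url
```

A caller could then skip or specially handle entries where this returns None rather than aborting the task.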
Successfully pulled java_third_party_licenses/protobuf-java-util-3.11.0.jar/LICENSE from https://opensource.org/licenses/BSD-3-Clause
Successfully pulled java_third_party_licenses/protobuf-java-util-3.11.1.jar/LICENSE from https://opensource.org/licenses/BSD-3-Clause
Successfully pulled java_third_party_licenses/protoc-3.11.0.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/protoc-3.11.1.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt
Successfully pulled java_third_party_licenses/zetasketch-0.1.0.jar/LICENSE from http://www.apache.org/licenses/LICENSE-2.0.txt

> Task :sdks:java:container:generateThirdPartyLicenses FAILED
> Task :sdks:go:installDependencies
> Task :sdks:java:container:copyDockerfileDependencies
> Task :sdks:java:container:dockerClean UP-TO-DATE

> Task :sdks:java:container:goPrepare
Use project GOPATH: <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Streaming/ws/src/sdks/java/container/.gogradle/project_gopath>

> Task :sdks:go:buildLinuxAmd64
> Task :sdks:go:goBuild

> Task :sdks:java:container:resolveBuildDependencies
Resolving ./github.com/apache/beam/sdks/go@<https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Streaming/ws/src/sdks/go>

> Task :sdks:java:container:installDependencies
> Task :sdks:java:container:buildLinuxAmd64
> Task :sdks:java:container:goBuild

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:container:generateThirdPartyLicenses'.
> Process 'command './sdks/java/container/license_scripts/license_script.sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2m 2s
51 actionable tasks: 38 executed, 12 from cache, 1 up-to-date

Publishing build scan...
https://gradle.com/s/gpvux2egq6tv4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_Combine_Portable_Flink_Streaming #13

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Streaming/13/display/redirect?page=changes>

Changes:

[crites] Clean up of TestStreamTranscriptTests. Removes check for final field in

[crites] Adds clearing of pane info state when windows get merged away.

[lcwik] [BEAM-9430] Migrate from ProcessContext#updateWatermark to

[lcwik] [BEAM-9540] Rename beam:source:runner:0.1/beam:sink:runner:0.1 to

[robertwb] [BEAM-9535] Remove unused ParDoPayload.Parameters.

[robertwb] [BEAM-9339] Declare capabilities in the Java SDK.

[lcwik] [BEAM-4374] Define the protos for a "short" id mechanism for metrics

[robertwb] [BEAM-9339] Add additional Java capabilities.

[jfarr1] [BEAM-9470] fix flaky unit test in :sdks:java:io:kinesis

[github] [BEAM-9551] Environment PB Pointer cleanup (#11164)

[iemejia] Move CHANGES template related items into template section

[lcwik] fixup! Address PR comments.

[github] Merge pull request #11166 from [BEAM-7923] Emit info when capture

[github] fix typo at Python Package name (#11098)

[github]  [BEAM-9552] Bump TestPubsub subscription creation ACK deadline to 60s

[daniel.o.programmer] [BEAM-3301] Perform SDF validation (missing RestrictionTrackers).

[github] Merge pull request #11128 from [BEAM-9524] Fix for ib.show() executing

[lcwik] [BEAM-9430] Update CHANGES.md to reflect removal of


------------------------------------------
[...truncated 78.15 KB...]
+ get_leader
+ local i=0
+ local application_ids
+ local application_masters
+ echo 'Yarn Applications'
Yarn Applications
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-13-m '--command=yarn application -list'
++ grep beam-loadtests-java-portable-flink-streaming-13
Warning: Permanently added 'compute.6350465416834909336' (ECDSA) to the list of known hosts.
20/03/20 12:47:09 INFO client.RMProxy: Connecting to ResourceManager at beam-loadtests-java-portable-flink-streaming-13-m/10.128.0.125:8032
+ read line
+ echo application_1584708356546_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-streaming-13-w-14.c.apache-beam-testing.internal:35371
application_1584708356546_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-streaming-13-w-14.c.apache-beam-testing.internal:35371
++ echo application_1584708356546_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-streaming-13-w-14.c.apache-beam-testing.internal:35371
++ sed 's/ .*//'
+ application_ids[$i]=application_1584708356546_0001
++ echo application_1584708356546_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-streaming-13-w-14.c.apache-beam-testing.internal:35371
++ sed 's/.*beam-loadtests-java-portable-flink-streaming-13/beam-loadtests-java-portable-flink-streaming-13/'
++ sed 's/ .*//'
+ application_masters[$i]=beam-loadtests-java-portable-flink-streaming-13-w-14.c.apache-beam-testing.internal:35371
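The sed pipeline above pulls two fields out of each `yarn application -list` row: the application id (everything before the first space) and the application-master host:port (the trailing URL with its scheme stripped). A readable sketch of the same extraction, assuming the row format shown in the log (`parse_yarn_row` is illustrative, not part of the test harness):

```python
def parse_yarn_row(line):
    # One row of `yarn application -list` output, e.g.
    # "application_... flink-dataproc Apache Flink yarn default RUNNING
    #  UNDEFINED 100% http://<application-master-host>:<port>"
    fields = line.split()
    app_id = fields[0]                             # sed 's/ .*//'
    master = fields[-1].replace('http://', '', 1)  # strip the URL scheme
    return app_id, master
```

Running it on the row captured above yields the same values the script stores in `application_ids` and `application_masters`.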
+ i=1
+ read line
+ '[' 1 '!=' 1 ']'
+ YARN_APPLICATION_MASTER=beam-loadtests-java-portable-flink-streaming-13-w-14.c.apache-beam-testing.internal:35371
+ echo 'Using Yarn Application master: beam-loadtests-java-portable-flink-streaming-13-w-14.c.apache-beam-testing.internal:35371'
Using Yarn Application master: beam-loadtests-java-portable-flink-streaming-13-w-14.c.apache-beam-testing.internal:35371
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest ]]
+ start_job_server
+ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-13-m '--command=sudo --user yarn docker run --detach --publish 8099:8099 --publish 8098:8098 --publish 8097:8097 --volume ~/.config/gcloud:/root/.config/gcloud gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest --flink-master=beam-loadtests-java-portable-flink-streaming-13-w-14.c.apache-beam-testing.internal:35371 --artifacts-dir=gs://beam-flink-cluster/beam-loadtests-java-portable-flink-streaming-13'
5e91aecf7b88b0ae4f75bad4f1f3cb9e8c1dfb9b70e3831d35b6a2310b0d7674
+ start_tunnel
++ gcloud compute ssh --quiet --zone=us-central1-a yarn@beam-loadtests-java-portable-flink-streaming-13-m '--command=curl -s "http://beam-loadtests-java-portable-flink-streaming-13-w-14.c.apache-beam-testing.internal:35371/jobmanager/config"'
+ local 'job_server_config=[{"key":"taskmanager.memory.process.size","value":"12 gb"},{"key":"yarn.flink-dist-jar","value":"file:/usr/lib/flink/lib/flink-dist_2.11-1.10.0.jar"},{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1584708356546_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-java-portable-flink-streaming-13-w-14.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"execution.target","value":"yarn-per-job"},{"key":"web.tmpdir","value":"/tmp/flink-web-b8a22da7-a7b9-4366-98f1-ac460c730c7e"},{"key":"jobmanager.rpc.port","value":"39963"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1584708356546_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"execution.attached","value":"false"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"16"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-java-portable-flink-streaming-13-w-14.c.apache-beam-testing.internal"},{"key":"yarn.application.name","value":"flink-dataproc"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"$internal.yarn.log-config-file","value":"/usr/lib/flink/conf/log4j.properties"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local key=jobmanager.rpc.port
++ echo beam-loadtests-java-portable-flink-streaming-13-w-14.c.apache-beam-testing.internal:35371
++ cut -d : -f1
+ local yarn_application_master_host=beam-loadtests-java-portable-flink-streaming-13-w-14.c.apache-beam-testing.internal
++ echo '[{"key":"taskmanager.memory.process.size","value":"12' 'gb"},{"key":"yarn.flink-dist-jar","value":"file:/usr/lib/flink/lib/flink-dist_2.11-1.10.0.jar"},{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1584708356546_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-java-portable-flink-streaming-13-w-14.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"execution.target","value":"yarn-per-job"},{"key":"web.tmpdir","value":"/tmp/flink-web-b8a22da7-a7b9-4366-98f1-ac460c730c7e"},{"key":"jobmanager.rpc.port","value":"39963"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1584708356546_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"execution.attached","value":"false"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"16"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-java-portable-flink-streaming-13-w-14.c.apache-beam-testing.internal"},{"key":"yarn.application.name","value":"flink-dataproc"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"$internal.yarn.log-config-file","value":"/usr/lib/flink/conf/log4j.properties"},{"key":"jobmanager.heap.size","value":"12288m"}]'
++ python -c 'import sys, json; print([e['\''value'\''] for e in json.load(sys.stdin) if e['\''key'\''] == u'\''jobmanager.rpc.port'\''][0])'
+ local jobmanager_rpc_port=39963
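The inline `python -c` above reads the Flink `/jobmanager/config` response, which is a JSON list of `{"key": ..., "value": ...}` objects, and picks out `jobmanager.rpc.port`. An equivalent, readable sketch of that lookup (`config_value` is an illustrative name, not a real helper in the script):

```python
import json

def config_value(config_json, key):
    """Return the value for `key` from Flink's /jobmanager/config JSON,
    a list of {"key": ..., "value": ...} objects."""
    return [e['value'] for e in json.loads(config_json) if e['key'] == key][0]
```

Like the one-liner, this raises IndexError if the key is absent, which in this script would surface as a tunnel-setup failure rather than a silently wrong port.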
++ [[ true == \t\r\u\e ]]
++ echo ' -Nf >& /dev/null'
+ local 'detached_mode_params= -Nf >& /dev/null'
++ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest ]]
++ echo '-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'job_server_ports_forwarding=-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'tunnel_command=gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-13-m -- -L 8081:beam-loadtests-java-portable-flink-streaming-13-w-14.c.apache-beam-testing.internal:35371 -L 39963:beam-loadtests-java-portable-flink-streaming-13-w-14.c.apache-beam-testing.internal:39963 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080  -Nf >& /dev/null'
+ eval gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-13-m -- -L 8081:beam-loadtests-java-portable-flink-streaming-13-w-14.c.apache-beam-testing.internal:35371 -L 39963:beam-loadtests-java-portable-flink-streaming-13-w-14.c.apache-beam-testing.internal:39963 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf '>&' /dev/null
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-13-m -- -L 8081:beam-loadtests-java-portable-flink-streaming-13-w-14.c.apache-beam-testing.internal:35371 -L 39963:beam-loadtests-java-portable-flink-streaming-13-w-14.c.apache-beam-testing.internal:39963 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf
[beam_LoadTests_Java_Combine_Portable_Flink_Streaming] $ /bin/bash -xe /tmp/jenkins2906156011005745951.sh
+ echo src Load test: fanout 8 times with 1GB 10-byte records total on Flink in Portable mode src
src Load test: fanout 8 times with 1GB 10-byte records total on Flink in Portable mode src
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Streaming/ws/src/gradlew> -PloadTest.mainClass=org.apache.beam.sdk.loadtests.CombineLoadTest -Prunner=:runners:portability:java '-PloadTest.args=--project=apache-beam-testing --appName=load_tests_Java_Portable_Flink_streaming_Combine_5 --tempLocation=gs://temp-storage-for-perf-tests/loadtests --publishToBigQuery=true --bigQueryDataset=load_test --bigQueryTable=java_portable_flink_streaming_Combine_5 --sourceOptions={"numRecords":12500000,"keySizeBytes":1,"valueSizeBytes":9} --fanout=8 --iterations=1 --topCount=20 --sdkWorkerParallelism=16 --perKeyCombiner=TOP_LARGEST --streaming=true --jobEndpoint=localhost:8099 --defaultEnvironmentConfig=gcr.io/apache-beam-testing/beam_portability/beam_java_sdk:latest --defaultEnvironmentType=DOCKER --experiments=beam_fn_api --inputWindowDurationSec=1200 --runner=PortableRunner' --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :sdks:java:testing:load-tests:run
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :sdks:java:expansion-service:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :model:job-management:extractProto UP-TO-DATE
> Task :runners:local-java:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :runners:direct-java:processResources NO-SOURCE
> Task :model:fn-execution:extractProto UP-TO-DATE
> Task :runners:portability:java:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:extractProto UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :sdks:java:io:kinesis:processResources NO-SOURCE
> Task :sdks:java:io:synthetic:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :sdks:java:testing:load-tests:processResources NO-SOURCE
> Task :model:fn-execution:processResources UP-TO-DATE
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :model:job-management:processResources UP-TO-DATE
> Task :sdks:java:core:generateGrammarSource UP-TO-DATE
> Task :sdks:java:core:processResources UP-TO-DATE
> Task :model:pipeline:extractIncludeProto UP-TO-DATE
> Task :model:pipeline:extractProto UP-TO-DATE
> Task :model:pipeline:generateProto UP-TO-DATE
> Task :model:pipeline:compileJava UP-TO-DATE
> Task :model:pipeline:processResources UP-TO-DATE
> Task :model:pipeline:classes UP-TO-DATE
> Task :model:pipeline:jar UP-TO-DATE
> Task :model:pipeline:shadowJar UP-TO-DATE
> Task :model:job-management:extractIncludeProto UP-TO-DATE
> Task :model:fn-execution:extractIncludeProto UP-TO-DATE
> Task :model:job-management:generateProto UP-TO-DATE
> Task :model:fn-execution:generateProto UP-TO-DATE
> Task :model:job-management:compileJava UP-TO-DATE
> Task :model:job-management:classes UP-TO-DATE
> Task :model:fn-execution:compileJava UP-TO-DATE
> Task :model:fn-execution:classes UP-TO-DATE
> Task :model:job-management:shadowJar UP-TO-DATE
> Task :model:fn-execution:shadowJar UP-TO-DATE
> Task :sdks:java:core:compileJava UP-TO-DATE
> Task :sdks:java:core:classes UP-TO-DATE
> Task :sdks:java:core:shadowJar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractIncludeProto UP-TO-DATE
> Task :sdks:java:fn-execution:compileJava UP-TO-DATE
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :runners:core-construction-java:compileJava UP-TO-DATE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :runners:local-java:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :runners:local-java:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar UP-TO-DATE
> Task :runners:core-construction-java:jar UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar UP-TO-DATE
> Task :sdks:java:io:synthetic:compileJava FROM-CACHE
> Task :sdks:java:io:synthetic:classes UP-TO-DATE
> Task :runners:local-java:jar
> Task :sdks:java:expansion-service:compileJava UP-TO-DATE
> Task :sdks:java:extensions:protobuf:compileJava UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:compileJava UP-TO-DATE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :runners:core-java:compileJava UP-TO-DATE
> Task :runners:core-java:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar UP-TO-DATE
> Task :sdks:java:io:synthetic:jar
> Task :vendor:sdks-java-extensions-protobuf:shadowJar UP-TO-DATE
> Task :runners:core-java:jar UP-TO-DATE
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:io:kinesis:compileJava FROM-CACHE
> Task :sdks:java:io:kinesis:classes UP-TO-DATE
> Task :sdks:java:harness:compileJava UP-TO-DATE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar
> Task :sdks:java:harness:jar UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava UP-TO-DATE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar UP-TO-DATE
> Task :sdks:java:harness:shadowJar UP-TO-DATE
> Task :sdks:java:io:kinesis:jar
> Task :runners:java-fn-execution:compileJava UP-TO-DATE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar UP-TO-DATE
> Task :runners:portability:java:compileJava FROM-CACHE
> Task :runners:portability:java:classes UP-TO-DATE
> Task :runners:portability:java:jar
> Task :sdks:java:io:google-cloud-platform:compileJava UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar UP-TO-DATE
> Task :runners:direct-java:compileJava FROM-CACHE
> Task :runners:direct-java:classes UP-TO-DATE
> Task :runners:direct-java:shadowJar
> Task :sdks:java:testing:load-tests:compileJava FROM-CACHE
> Task :sdks:java:testing:load-tests:classes UP-TO-DATE
> Task :sdks:java:testing:load-tests:jar

> Task :sdks:java:testing:load-tests:run
Mar 20, 2020 12:47:24 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Collect end time metric
Mar 20, 2020 12:47:26 PM org.apache.beam.runners.portability.PortableRunner run
INFO: Using job server endpoint: localhost:8099
Mar 20, 2020 12:47:27 PM org.apache.beam.runners.portability.PortableRunner run
INFO: PrepareJobResponse: preparation_id: "load0tests0java0portable0flink0streaming0combine05-jenkins-0320124725-b79ca355_bbf676e9-3ae9-46e3-a181-b6167ab95377"
artifact_staging_endpoint {
  url: "localhost:8098"
}
staging_session_token: "{\"sessionId\":\"load0tests0java0portable0flink0streaming0combine05-jenkins-0320124725-b79ca355_bbf676e9-3ae9-46e3-a181-b6167ab95377\",\"basePath\":\"gs://beam-flink-cluster/beam-loadtests-java-portable-flink-streaming-13\"}"

Mar 20, 2020 12:47:27 PM org.apache.beam.runners.core.construction.ArtifactServiceStager stage
INFO: Staging 175 files (token: {"sessionId":"load0tests0java0portable0flink0streaming0combine05-jenkins-0320124725-b79ca355_bbf676e9-3ae9-46e3-a181-b6167ab95377","basePath":"gs://beam-flink-cluster/beam-loadtests-java-portable-flink-streaming-13"})
Mar 20, 2020 12:47:40 PM org.apache.beam.runners.core.construction.ArtifactServiceStager stageManifest
INFO: Staged 175 files (token: {"sessionId":"load0tests0java0portable0flink0streaming0combine05-jenkins-0320124725-b79ca355_bbf676e9-3ae9-46e3-a181-b6167ab95377","basePath":"gs://beam-flink-cluster/beam-loadtests-java-portable-flink-streaming-13"})
Mar 20, 2020 12:47:41 PM org.apache.beam.runners.portability.PortableRunner run
INFO: RunJobResponse: job_id: "load0tests0java0portable0flink0streaming0combine05-jenkins-0320124725-b79ca355_b9623fac-a5c8-4e19-b392-4560e657a1fd"

Exception in thread "main" java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: The Runner experienced the following error during execution:
java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_e01_1584708356546_0001_01_000002 timed out.
	at org.apache.beam.runners.portability.JobServicePipelineResult.waitUntilFinish(JobServicePipelineResult.java:98)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:99)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.run(CombineLoadTest.java:66)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.main(CombineLoadTest.java:169)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: The Runner experienced the following error during execution:
java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_e01_1584708356546_0001_01_000002 timed out.
	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1928)
	at org.apache.beam.runners.portability.JobServicePipelineResult.waitUntilFinish(JobServicePipelineResult.java:90)
	... 3 more
Caused by: java.lang.RuntimeException: The Runner experienced the following error during execution:
java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_e01_1584708356546_0001_01_000002 timed out.
	at org.apache.beam.runners.portability.JobServicePipelineResult.propagateErrors(JobServicePipelineResult.java:165)
	at org.apache.beam.runners.portability.JobServicePipelineResult.waitUntilFinish(JobServicePipelineResult.java:110)
	at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1604)
	at java.util.concurrent.CompletableFuture$AsyncSupply.exec(CompletableFuture.java:1596)
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
	at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 12m 11s
63 actionable tasks: 8 executed, 7 from cache, 48 up-to-date

Publishing build scan...
https://gradle.com/s/iau774vmryj52

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_Combine_Portable_Flink_Streaming #12

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Streaming/12/display/redirect?page=changes>

Changes:

[apilloud] [BEAM-7832] Translate ZetaSQL joins without condition

[github] [BEAM-9526] Add missing unmarshalling in top.LargestPerKey. (#11143)

[github] Merge pull request #11147 from [BEAM-7923] Support dict and iterable

[coheigea] BEAM-8924 - Update Apache Tika to 1.24

[kawaigin] [BEAM-7923] Change Transform Label Prefix Syntax

[github] Specify return types of window start/end functions explicitly (#11152)

[apilloud] [BEAM-9511] Uncollect takes arbitrary expressions

[apilloud] [BEAM-9515] Add test

[kcweaver] [BEAM-9553] Use latest Flink job server image as default.

[github] Merge pull request #11158 from [BEAM-9533] Fixing tox.ini variants

[alex] [BEAM-9035] BIP-1: Typed options for Row Schema and Field

[iemejia] [BEAM-9279] Refactor HBase to disminish relying on Serializable wrappers

[iemejia] [BEAM-9279] Make HBase.ReadAll based on Reads instead of HBaseQuery


------------------------------------------
[...truncated 78.65 KB...]
20/03/19 12:39:50 INFO client.RMProxy: Connecting to ResourceManager at beam-loadtests-java-portable-flink-streaming-12-m/10.128.0.27:8032
+ read line
+ echo application_1584621499758_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-streaming-12-w-7.c.apache-beam-testing.internal:37873
application_1584621499758_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-streaming-12-w-7.c.apache-beam-testing.internal:37873
++ echo application_1584621499758_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-streaming-12-w-7.c.apache-beam-testing.internal:37873
++ sed 's/ .*//'
+ application_ids[$i]=application_1584621499758_0001
++ echo application_1584621499758_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-streaming-12-w-7.c.apache-beam-testing.internal:37873
++ sed 's/.*beam-loadtests-java-portable-flink-streaming-12/beam-loadtests-java-portable-flink-streaming-12/'
++ sed 's/ .*//'
+ application_masters[$i]=beam-loadtests-java-portable-flink-streaming-12-w-7.c.apache-beam-testing.internal:37873
+ i=1
+ read line
+ '[' 1 '!=' 1 ']'
+ YARN_APPLICATION_MASTER=beam-loadtests-java-portable-flink-streaming-12-w-7.c.apache-beam-testing.internal:37873
+ echo 'Using Yarn Application master: beam-loadtests-java-portable-flink-streaming-12-w-7.c.apache-beam-testing.internal:37873'
Using Yarn Application master: beam-loadtests-java-portable-flink-streaming-12-w-7.c.apache-beam-testing.internal:37873
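The trace above shows the cluster script pulling the YARN application id and the application-master address out of a `yarn application -list` status line with `sed`. A minimal Python sketch of the same extraction (the sample line is copied from the log; the helper name is hypothetical, not part of the Beam scripts):

```python
# Sketch: parse one status line of `yarn application -list` output, the way
# the bash trace above does with sed. Assumes the tracking URL is the last
# whitespace-separated field, as in the logged line.
def parse_yarn_application_line(line):
    """Return (application_id, application_master) from a YARN status line."""
    fields = line.split()
    application_id = fields[0]  # e.g. application_1584621499758_0001
    # The tracking URL is the last field; the application master is its host:port.
    tracking_url = fields[-1]
    application_master = tracking_url.split("://", 1)[-1].rstrip("/")
    return application_id, application_master

line = ("application_1584621499758_0001 flink-dataproc Apache Flink yarn "
        "default RUNNING UNDEFINED 100% "
        "http://beam-loadtests-java-portable-flink-streaming-12-w-7"
        ".c.apache-beam-testing.internal:37873")
app_id, master = parse_yarn_application_line(line)
```

This recovers the same `application_ids[$i]` and `application_masters[$i]` values the shell trace assigns.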
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest ]]
+ start_job_server
+ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-12-m '--command=sudo --user yarn docker run --detach --publish 8099:8099 --publish 8098:8098 --publish 8097:8097 --volume ~/.config/gcloud:/root/.config/gcloud gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest --flink-master=beam-loadtests-java-portable-flink-streaming-12-w-7.c.apache-beam-testing.internal:37873 --artifacts-dir=gs://beam-flink-cluster/beam-loadtests-java-portable-flink-streaming-12'
413e677adc43b7cf22d6de1866d1528d65af445a8c06dbbf6a7059b5b6e6261c
+ start_tunnel
++ gcloud compute ssh --quiet --zone=us-central1-a yarn@beam-loadtests-java-portable-flink-streaming-12-m '--command=curl -s "http://beam-loadtests-java-portable-flink-streaming-12-w-7.c.apache-beam-testing.internal:37873/jobmanager/config"'
+ local 'job_server_config=[{"key":"taskmanager.memory.process.size","value":"12 gb"},{"key":"yarn.flink-dist-jar","value":"file:/usr/lib/flink/lib/flink-dist_2.11-1.10.0.jar"},{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1584621499758_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-java-portable-flink-streaming-12-w-7.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"execution.target","value":"yarn-per-job"},{"key":"web.tmpdir","value":"/tmp/flink-web-07fb3bfe-ddbb-4e92-9031-3ffa5a042a01"},{"key":"jobmanager.rpc.port","value":"38677"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1584621499758_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"execution.attached","value":"false"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"16"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-java-portable-flink-streaming-12-w-7.c.apache-beam-testing.internal"},{"key":"yarn.application.name","value":"flink-dataproc"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"$internal.yarn.log-config-file","value":"/usr/lib/flink/conf/log4j.properties"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local key=jobmanager.rpc.port
++ echo beam-loadtests-java-portable-flink-streaming-12-w-7.c.apache-beam-testing.internal:37873
++ cut -d : -f1
+ local yarn_application_master_host=beam-loadtests-java-portable-flink-streaming-12-w-7.c.apache-beam-testing.internal
++ echo '[{"key":"taskmanager.memory.process.size","value":"12' 'gb"},{"key":"yarn.flink-dist-jar","value":"file:/usr/lib/flink/lib/flink-dist_2.11-1.10.0.jar"},{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1584621499758_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-java-portable-flink-streaming-12-w-7.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"execution.target","value":"yarn-per-job"},{"key":"web.tmpdir","value":"/tmp/flink-web-07fb3bfe-ddbb-4e92-9031-3ffa5a042a01"},{"key":"jobmanager.rpc.port","value":"38677"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1584621499758_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"execution.attached","value":"false"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"16"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-java-portable-flink-streaming-12-w-7.c.apache-beam-testing.internal"},{"key":"yarn.application.name","value":"flink-dataproc"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"$internal.yarn.log-config-file","value":"/usr/lib/flink/conf/log4j.properties"},{"key":"jobmanager.heap.size","value":"12288m"}]'
++ python -c 'import sys, json; print([e['\''value'\''] for e in json.load(sys.stdin) if e['\''key'\''] == u'\''jobmanager.rpc.port'\''][0])'
+ local jobmanager_rpc_port=38677
++ [[ true == \t\r\u\e ]]
++ echo ' -Nf >& /dev/null'
+ local 'detached_mode_params= -Nf >& /dev/null'
++ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest ]]
++ echo '-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'job_server_ports_forwarding=-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'tunnel_command=gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-12-m -- -L 8081:beam-loadtests-java-portable-flink-streaming-12-w-7.c.apache-beam-testing.internal:37873 -L 38677:beam-loadtests-java-portable-flink-streaming-12-w-7.c.apache-beam-testing.internal:38677 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080  -Nf >& /dev/null'
+ eval gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-12-m -- -L 8081:beam-loadtests-java-portable-flink-streaming-12-w-7.c.apache-beam-testing.internal:37873 -L 38677:beam-loadtests-java-portable-flink-streaming-12-w-7.c.apache-beam-testing.internal:38677 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf '>&' /dev/null
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-12-m -- -L 8081:beam-loadtests-java-portable-flink-streaming-12-w-7.c.apache-beam-testing.internal:37873 -L 38677:beam-loadtests-java-portable-flink-streaming-12-w-7.c.apache-beam-testing.internal:38677 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf
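The tunnel setup above fetches Flink's `/jobmanager/config` endpoint, which returns a JSON list of `{"key": ..., "value": ...}` entries, and picks out `jobmanager.rpc.port` with an inline `python -c` one-liner. The same lookup as a small standalone sketch (the config fragment is abbreviated from the log; the function name is hypothetical):

```python
import json

# Sketch of the inline `python -c` lookup in the trace above: select the
# value whose key matches from the /jobmanager/config key/value list.
def flink_config_value(config_json, key):
    """Return the value for `key` from Flink's /jobmanager/config JSON."""
    return [e["value"] for e in json.loads(config_json) if e["key"] == key][0]

# Abbreviated fragment of the config captured in the log above.
config_json = (
    '[{"key":"taskmanager.memory.process.size","value":"12 gb"},'
    '{"key":"jobmanager.rpc.port","value":"38677"},'
    '{"key":"parallelism.default","value":"16"}]'
)
rpc_port = flink_config_value(config_json, "jobmanager.rpc.port")
```

The resulting port (`38677` in this run) is what the script then forwards over the SSH tunnel alongside the job-server ports 8097-8099.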
[beam_LoadTests_Java_Combine_Portable_Flink_Streaming] $ /bin/bash -xe /tmp/jenkins1209566164238109162.sh
+ echo src Load test: fanout 8 times with 1GB 10-byte records total on Flink in Portable mode src
src Load test: fanout 8 times with 1GB 10-byte records total on Flink in Portable mode src
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Streaming/ws/src/gradlew> -PloadTest.mainClass=org.apache.beam.sdk.loadtests.CombineLoadTest -Prunner=:runners:portability:java '-PloadTest.args=--project=apache-beam-testing --appName=load_tests_Java_Portable_Flink_streaming_Combine_5 --tempLocation=gs://temp-storage-for-perf-tests/loadtests --publishToBigQuery=true --bigQueryDataset=load_test --bigQueryTable=java_portable_flink_streaming_Combine_5 --sourceOptions={"numRecords":12500000,"keySizeBytes":1,"valueSizeBytes":9} --fanout=8 --iterations=1 --topCount=20 --sdkWorkerParallelism=16 --perKeyCombiner=TOP_LARGEST --streaming=true --jobEndpoint=localhost:8099 --defaultEnvironmentConfig=gcr.io/apache-beam-testing/beam_portability/beam_java_sdk:latest --defaultEnvironmentType=DOCKER --experiments=beam_fn_api --inputWindowDurationSec=1200 --runner=PortableRunner' --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :sdks:java:testing:load-tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.
> Task :model:fn-execution:extractProto UP-TO-DATE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :model:job-management:extractProto UP-TO-DATE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :sdks:java:expansion-service:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :runners:direct-java:processResources NO-SOURCE
> Task :runners:local-java:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :sdks:java:io:kinesis:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:extractProto UP-TO-DATE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :sdks:java:testing:load-tests:processResources NO-SOURCE
> Task :runners:portability:java:processResources NO-SOURCE
> Task :sdks:java:io:synthetic:processResources NO-SOURCE
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :model:job-management:processResources UP-TO-DATE
> Task :model:fn-execution:processResources UP-TO-DATE
> Task :sdks:java:core:generateGrammarSource UP-TO-DATE
> Task :sdks:java:core:processResources UP-TO-DATE
> Task :model:pipeline:extractIncludeProto UP-TO-DATE
> Task :model:pipeline:extractProto UP-TO-DATE
> Task :model:pipeline:generateProto UP-TO-DATE
> Task :model:pipeline:compileJava UP-TO-DATE
> Task :model:pipeline:processResources UP-TO-DATE
> Task :model:pipeline:classes UP-TO-DATE
> Task :model:pipeline:jar UP-TO-DATE
> Task :model:pipeline:shadowJar UP-TO-DATE
> Task :model:fn-execution:extractIncludeProto UP-TO-DATE
> Task :model:job-management:extractIncludeProto UP-TO-DATE
> Task :model:job-management:generateProto UP-TO-DATE
> Task :model:fn-execution:generateProto UP-TO-DATE
> Task :model:job-management:compileJava UP-TO-DATE
> Task :model:job-management:classes UP-TO-DATE
> Task :model:fn-execution:compileJava UP-TO-DATE
> Task :model:fn-execution:classes UP-TO-DATE
> Task :model:job-management:shadowJar UP-TO-DATE
> Task :model:fn-execution:shadowJar UP-TO-DATE
> Task :sdks:java:core:compileJava UP-TO-DATE
> Task :sdks:java:core:classes UP-TO-DATE
> Task :sdks:java:core:shadowJar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractIncludeProto UP-TO-DATE
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:compileJava UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:fn-execution:compileJava UP-TO-DATE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar UP-TO-DATE
> Task :runners:local-java:compileJava FROM-CACHE
> Task :runners:local-java:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:compileJava UP-TO-DATE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar UP-TO-DATE
> Task :runners:core-construction-java:compileJava UP-TO-DATE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :runners:core-construction-java:jar UP-TO-DATE
> Task :runners:local-java:jar
> Task :sdks:java:expansion-service:compileJava UP-TO-DATE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar UP-TO-DATE
> Task :runners:core-java:compileJava UP-TO-DATE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar UP-TO-DATE
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava UP-TO-DATE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:harness:compileJava UP-TO-DATE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar
> Task :sdks:java:io:kafka:jar UP-TO-DATE
> Task :sdks:java:harness:jar UP-TO-DATE
> Task :sdks:java:harness:shadowJar UP-TO-DATE
> Task :runners:java-fn-execution:compileJava UP-TO-DATE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:compileJava UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar UP-TO-DATE
> Task :runners:direct-java:compileJava FROM-CACHE
> Task :runners:direct-java:classes UP-TO-DATE
> Task :runners:direct-java:shadowJar

> Task :sdks:java:io:synthetic:compileJava
Note: <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Streaming/ws/src/sdks/java/io/synthetic/src/main/java/org/apache/beam/sdk/io/synthetic/SyntheticBoundedSource.java> uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.

> Task :sdks:java:io:synthetic:classes
> Task :sdks:java:io:synthetic:jar
> Task :sdks:java:io:kinesis:compileJava
> Task :sdks:java:io:kinesis:classes
> Task :sdks:java:io:kinesis:jar

> Task :sdks:java:testing:load-tests:compileJava
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:testing:load-tests:classes
> Task :sdks:java:testing:load-tests:jar
> Task :runners:portability:java:compileJava
> Task :runners:portability:java:classes
> Task :runners:portability:java:jar

> Task :sdks:java:testing:load-tests:run
Mar 19, 2020 12:40:26 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Collect end time metric
Mar 19, 2020 12:40:28 PM org.apache.beam.runners.portability.PortableRunner run
INFO: Using job server endpoint: localhost:8099
Mar 19, 2020 12:40:29 PM org.apache.beam.runners.portability.PortableRunner run
INFO: PrepareJobResponse: preparation_id: "load0tests0java0portable0flink0streaming0combine05-jenkins-0319124027-cf6b062c_f6a0e926-96c4-4d8b-a473-4518ec9ad055"
artifact_staging_endpoint {
  url: "localhost:8098"
}
staging_session_token: "{\"sessionId\":\"load0tests0java0portable0flink0streaming0combine05-jenkins-0319124027-cf6b062c_f6a0e926-96c4-4d8b-a473-4518ec9ad055\",\"basePath\":\"gs://beam-flink-cluster/beam-loadtests-java-portable-flink-streaming-12\"}"

Mar 19, 2020 12:40:29 PM org.apache.beam.runners.core.construction.ArtifactServiceStager stage
INFO: Staging 175 files (token: {"sessionId":"load0tests0java0portable0flink0streaming0combine05-jenkins-0319124027-cf6b062c_f6a0e926-96c4-4d8b-a473-4518ec9ad055","basePath":"gs://beam-flink-cluster/beam-loadtests-java-portable-flink-streaming-12"})
Mar 19, 2020 12:40:43 PM org.apache.beam.runners.core.construction.ArtifactServiceStager stageManifest
INFO: Staged 175 files (token: {"sessionId":"load0tests0java0portable0flink0streaming0combine05-jenkins-0319124027-cf6b062c_f6a0e926-96c4-4d8b-a473-4518ec9ad055","basePath":"gs://beam-flink-cluster/beam-loadtests-java-portable-flink-streaming-12"})
Mar 19, 2020 12:40:44 PM org.apache.beam.runners.portability.PortableRunner run
INFO: RunJobResponse: job_id: "load0tests0java0portable0flink0streaming0combine05-jenkins-0319124027-cf6b062c_80d0cb9f-e17a-454d-89ae-88b8c20e693d"

Exception in thread "main" java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: The Runner experienced the following error during execution:
java.util.concurrent.TimeoutException: The heartbeat of TaskManager with id container_e01_1584621499758_0001_01_000002  timed out.
	at org.apache.beam.runners.portability.JobServicePipelineResult.waitUntilFinish(JobServicePipelineResult.java:98)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:99)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.run(CombineLoadTest.java:66)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.main(CombineLoadTest.java:169)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: The Runner experienced the following error during execution:
java.util.concurrent.TimeoutException: The heartbeat of TaskManager with id container_e01_1584621499758_0001_01_000002  timed out.
	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1928)
	at org.apache.beam.runners.portability.JobServicePipelineResult.waitUntilFinish(JobServicePipelineResult.java:90)
	... 3 more
Caused by: java.lang.RuntimeException: The Runner experienced the following error during execution:
java.util.concurrent.TimeoutException: The heartbeat of TaskManager with id container_e01_1584621499758_0001_01_000002  timed out.
	at org.apache.beam.runners.portability.JobServicePipelineResult.propagateErrors(JobServicePipelineResult.java:165)
	at org.apache.beam.runners.portability.JobServicePipelineResult.waitUntilFinish(JobServicePipelineResult.java:110)
	at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1604)
	at java.util.concurrent.CompletableFuture$AsyncSupply.exec(CompletableFuture.java:1596)
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
	at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)
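The trace above shows the runner error surfacing through `CompletableFuture.get()` inside `JobServicePipelineResult.waitUntilFinish`: the job service completes the terminal-state future exceptionally, and `get()` rethrows the runner's `RuntimeException` as the cause of an `ExecutionException`. A minimal stdlib sketch of that propagation pattern (the class name and message are illustrative, not Beam code):

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutionException;

public class PipelineResultErrorSketch {
    // Builds a future that fails the way the job service did, then shows how
    // get() surfaces the original runner error as the ExecutionException cause.
    static String describeFailure() {
        CompletableFuture<Void> terminalState = new CompletableFuture<>();
        terminalState.completeExceptionally(
            new RuntimeException(
                "The Runner experienced the following error during execution: ..."));
        try {
            terminalState.get(); // blocks, like waitUntilFinish()
            return "finished";
        } catch (ExecutionException e) {
            // The runner's RuntimeException is preserved as the cause.
            return e.getCause().getMessage();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return "interrupted";
        }
    }

    public static void main(String[] args) {
        System.out.println(describeFailure());
    }
}
```

This is why the log prints the same heartbeat-timeout message three times: once per layer of wrapping as the cause chain is unwound.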

> Task :sdks:java:testing:load-tests:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 12m 42s
63 actionable tasks: 12 executed, 3 from cache, 48 up-to-date

Publishing build scan...
https://gradle.com/s/odjsa2sd23j44

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_Combine_Portable_Flink_Streaming #11

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Streaming/11/display/redirect?page=changes>

Changes:

[jfarr1] [BEAM-8374] Add alternate SnsIO PublishResult coders

[kcweaver] [BEAM-9503] Insert missing comma in process worker script.

[kcweaver] [BEAM-8866] Use unique temp dir for output of portable word count tests.

[davidyan] [BEAM-9510] Fixing version incompatibilities in

[davidyan] Bring the dep versions up to par with

[valentyn] Install typing only on 3.5.2 or earlier versions of Python.

[kawaigin] [BEAM-7923] Include side effects in p.run

[github] [BEAM-9498] Include descriptor and type of unsupported fields in RowJson

[github] Merge pull request #11149 from [BEAM-9533] Adding tox cloud tests

[github] Flink 1.10 yarn deployment fix (#11146)

[github] [BEAM-9539] Fix copy-pasted comment in load-tests' build.gradle (#11155)


------------------------------------------
[...truncated 75.97 KB...]
+ metadata+=flink-taskmanager-slots=1,
+ metadata+=hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-9.0/flink-shaded-hadoop-2-uber-2.8.3-9.0.jar
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_java_sdk:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/beam_java_sdk:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_workers=17
+ gcloud dataproc clusters create beam-loadtests-java-portable-flink-streaming-11 --region=global --num-workers=17 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.10.0/flink-1.10.0-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-9.0/flink-shaded-hadoop-2-uber-2.8.3-9.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/beam_java_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/dc7f197a-4155-3751-933a-4f9ce6e30926].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
...............................................................................................................................................................................................done.
Created [https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/global/clusters/beam-loadtests-java-portable-flink-streaming-11] Cluster placed in zone [us-central1-a].
+ get_leader
+ local i=0
+ local application_ids
+ local application_masters
+ echo 'Yarn Applications'
Yarn Applications
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-11-m '--command=yarn application -list'
++ grep beam-loadtests-java-portable-flink-streaming-11
Warning: Permanently added 'compute.5116960911187224432' (ECDSA) to the list of known hosts.
20/03/18 12:56:00 INFO client.RMProxy: Connecting to ResourceManager at beam-loadtests-java-portable-flink-streaming-11-m/10.128.0.71:8032
+ read line
+ echo application_1584536068077_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-streaming-11-w-0.c.apache-beam-testing.internal:44221
application_1584536068077_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-streaming-11-w-0.c.apache-beam-testing.internal:44221
++ echo application_1584536068077_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-streaming-11-w-0.c.apache-beam-testing.internal:44221
++ sed 's/ .*//'
+ application_ids[$i]=application_1584536068077_0001
++ echo application_1584536068077_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-streaming-11-w-0.c.apache-beam-testing.internal:44221
++ sed 's/.*beam-loadtests-java-portable-flink-streaming-11/beam-loadtests-java-portable-flink-streaming-11/'
++ sed 's/ .*//'
+ application_masters[$i]=beam-loadtests-java-portable-flink-streaming-11-w-0.c.apache-beam-testing.internal:44221
+ i=1
+ read line
+ '[' 1 '!=' 1 ']'
+ YARN_APPLICATION_MASTER=beam-loadtests-java-portable-flink-streaming-11-w-0.c.apache-beam-testing.internal:44221
+ echo 'Using Yarn Application master: beam-loadtests-java-portable-flink-streaming-11-w-0.c.apache-beam-testing.internal:44221'
Using Yarn Application master: beam-loadtests-java-portable-flink-streaming-11-w-0.c.apache-beam-testing.internal:44221
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest ]]
+ start_job_server
+ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-11-m '--command=sudo --user yarn docker run --detach --publish 8099:8099 --publish 8098:8098 --publish 8097:8097 --volume ~/.config/gcloud:/root/.config/gcloud gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest --flink-master=beam-loadtests-java-portable-flink-streaming-11-w-0.c.apache-beam-testing.internal:44221 --artifacts-dir=gs://beam-flink-cluster/beam-loadtests-java-portable-flink-streaming-11'
85b7d7b1b8f7f6955963e3163b35406e6ec39c33d71058ca28f5b3bf8808847f
+ start_tunnel
++ gcloud compute ssh --quiet --zone=us-central1-a yarn@beam-loadtests-java-portable-flink-streaming-11-m '--command=curl -s "http://beam-loadtests-java-portable-flink-streaming-11-w-0.c.apache-beam-testing.internal:44221/jobmanager/config"'
+ local 'job_server_config=[{"key":"taskmanager.memory.process.size","value":"12 gb"},{"key":"yarn.flink-dist-jar","value":"file:/usr/lib/flink/lib/flink-dist_2.11-1.10.0.jar"},{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1584536068077_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-java-portable-flink-streaming-11-w-0.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"execution.target","value":"yarn-per-job"},{"key":"web.tmpdir","value":"/tmp/flink-web-11980282-bc2b-46d1-8137-8f3ee033c460"},{"key":"jobmanager.rpc.port","value":"33355"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1584536068077_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"execution.attached","value":"false"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"16"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-java-portable-flink-streaming-11-w-0.c.apache-beam-testing.internal"},{"key":"yarn.application.name","value":"flink-dataproc"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"$internal.yarn.log-config-file","value":"/usr/lib/flink/conf/log4j.properties"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local key=jobmanager.rpc.port
++ echo beam-loadtests-java-portable-flink-streaming-11-w-0.c.apache-beam-testing.internal:44221
++ cut -d : -f1
+ local yarn_application_master_host=beam-loadtests-java-portable-flink-streaming-11-w-0.c.apache-beam-testing.internal
++ echo '[{"key":"taskmanager.memory.process.size","value":"12' 'gb"},{"key":"yarn.flink-dist-jar","value":"file:/usr/lib/flink/lib/flink-dist_2.11-1.10.0.jar"},{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1584536068077_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-java-portable-flink-streaming-11-w-0.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"execution.target","value":"yarn-per-job"},{"key":"web.tmpdir","value":"/tmp/flink-web-11980282-bc2b-46d1-8137-8f3ee033c460"},{"key":"jobmanager.rpc.port","value":"33355"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1584536068077_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"execution.attached","value":"false"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"16"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-java-portable-flink-streaming-11-w-0.c.apache-beam-testing.internal"},{"key":"yarn.application.name","value":"flink-dataproc"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"$internal.yarn.log-config-file","value":"/usr/lib/flink/conf/log4j.properties"},{"key":"jobmanager.heap.size","value":"12288m"}]'
++ python -c 'import sys, json; print([e['\''value'\''] for e in json.load(sys.stdin) if e['\''key'\''] == u'\''jobmanager.rpc.port'\''][0])'
+ local jobmanager_rpc_port=33355
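The script above extracts `jobmanager.rpc.port` from the JobManager's `/jobmanager/config` payload (a flat JSON array of `{"key":...,"value":...}` pairs) with an inline `python -c` one-liner. The same lookup can be sketched in stdlib Java with a regex, assuming the flat key/value shape shown in the log (class and method names are illustrative):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class FlinkConfigLookup {
    // Extracts "value" for the given "key" from Flink's flat
    // [{"key":...,"value":...}, ...] /jobmanager/config payload.
    static String lookup(String json, String key) {
        Pattern p = Pattern.compile(
            "\\{\"key\":\"" + Pattern.quote(key) + "\",\"value\":\"([^\"]*)\"\\}");
        Matcher m = p.matcher(json);
        return m.find() ? m.group(1) : null;
    }

    public static void main(String[] args) {
        String config = "[{\"key\":\"jobmanager.rpc.port\",\"value\":\"33355\"},"
            + "{\"key\":\"taskmanager.numberOfTaskSlots\",\"value\":\"1\"}]";
        System.out.println(lookup(config, "jobmanager.rpc.port")); // prints 33355
    }
}
```

The extracted port (33355 here) is what the SSH tunnel below forwards so the job server can reach the JobManager RPC endpoint.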
++ [[ true == \t\r\u\e ]]
++ echo ' -Nf >& /dev/null'
+ local 'detached_mode_params= -Nf >& /dev/null'
++ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest ]]
++ echo '-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'job_server_ports_forwarding=-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'tunnel_command=gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-11-m -- -L 8081:beam-loadtests-java-portable-flink-streaming-11-w-0.c.apache-beam-testing.internal:44221 -L 33355:beam-loadtests-java-portable-flink-streaming-11-w-0.c.apache-beam-testing.internal:33355 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080  -Nf >& /dev/null'
+ eval gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-11-m -- -L 8081:beam-loadtests-java-portable-flink-streaming-11-w-0.c.apache-beam-testing.internal:44221 -L 33355:beam-loadtests-java-portable-flink-streaming-11-w-0.c.apache-beam-testing.internal:33355 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf '>&' /dev/null
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-11-m -- -L 8081:beam-loadtests-java-portable-flink-streaming-11-w-0.c.apache-beam-testing.internal:44221 -L 33355:beam-loadtests-java-portable-flink-streaming-11-w-0.c.apache-beam-testing.internal:33355 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf
[beam_LoadTests_Java_Combine_Portable_Flink_Streaming] $ /bin/bash -xe /tmp/jenkins7593058059167738087.sh
+ echo src Load test: fanout 8 times with 2GB 10-byte records total on Flink in Portable mode src
src Load test: fanout 8 times with 2GB 10-byte records total on Flink in Portable mode src
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Streaming/ws/src/gradlew> -PloadTest.mainClass=org.apache.beam.sdk.loadtests.CombineLoadTest -Prunner=:runners:portability:java '-PloadTest.args=--project=apache-beam-testing --appName=load_tests_Java_Portable_Flink_streaming_Combine_5 --tempLocation=gs://temp-storage-for-perf-tests/loadtests --publishToBigQuery=true --bigQueryDataset=load_test --bigQueryTable=java_portable_flink_streaming_Combine_5 --sourceOptions={"numRecords":25000000,"keySizeBytes":1,"valueSizeBytes":9} --fanout=8 --iterations=1 --topCount=20 --sdkWorkerParallelism=16 --perKeyCombiner=TOP_LARGEST --streaming=true --jobEndpoint=localhost:8099 --defaultEnvironmentConfig=gcr.io/apache-beam-testing/beam_portability/beam_java_sdk:latest --defaultEnvironmentType=DOCKER --experiments=beam_fn_api --runner=PortableRunner' --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :sdks:java:testing:load-tests:run
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:expansion-service:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :model:fn-execution:extractProto UP-TO-DATE
> Task :model:job-management:extractProto UP-TO-DATE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :runners:local-java:processResources NO-SOURCE
> Task :runners:direct-java:processResources NO-SOURCE
> Task :runners:portability:java:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:extractProto UP-TO-DATE
> Task :model:job-management:processResources UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :model:fn-execution:processResources UP-TO-DATE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :sdks:java:io:kinesis:processResources NO-SOURCE
> Task :sdks:java:io:synthetic:processResources NO-SOURCE
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :sdks:java:testing:load-tests:processResources NO-SOURCE
> Task :sdks:java:core:generateGrammarSource UP-TO-DATE
> Task :sdks:java:core:processResources UP-TO-DATE
> Task :model:pipeline:extractIncludeProto UP-TO-DATE
> Task :model:pipeline:extractProto UP-TO-DATE
> Task :model:pipeline:generateProto UP-TO-DATE
> Task :model:pipeline:compileJava UP-TO-DATE
> Task :model:pipeline:processResources UP-TO-DATE
> Task :model:pipeline:classes UP-TO-DATE
> Task :model:pipeline:jar UP-TO-DATE
> Task :model:pipeline:shadowJar UP-TO-DATE
> Task :model:fn-execution:extractIncludeProto UP-TO-DATE
> Task :model:job-management:extractIncludeProto UP-TO-DATE
> Task :model:fn-execution:generateProto UP-TO-DATE
> Task :model:job-management:generateProto UP-TO-DATE
> Task :model:fn-execution:compileJava UP-TO-DATE
> Task :model:fn-execution:classes UP-TO-DATE
> Task :model:job-management:compileJava UP-TO-DATE
> Task :model:job-management:classes UP-TO-DATE
> Task :model:fn-execution:shadowJar UP-TO-DATE
> Task :model:job-management:shadowJar UP-TO-DATE
> Task :sdks:java:core:compileJava UP-TO-DATE
> Task :sdks:java:core:classes UP-TO-DATE
> Task :sdks:java:core:shadowJar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractIncludeProto UP-TO-DATE
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:fn-execution:compileJava UP-TO-DATE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :runners:core-construction-java:compileJava UP-TO-DATE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar UP-TO-DATE
> Task :runners:local-java:compileJava FROM-CACHE
> Task :runners:local-java:classes UP-TO-DATE
> Task :runners:core-construction-java:jar UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar UP-TO-DATE
> Task :runners:local-java:jar
> Task :vendor:sdks-java-extensions-protobuf:compileJava UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:expansion-service:compileJava UP-TO-DATE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :runners:core-java:compileJava UP-TO-DATE
> Task :sdks:java:extensions:protobuf:compileJava UP-TO-DATE
> Task :runners:core-java:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar UP-TO-DATE
> Task :runners:core-java:jar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar UP-TO-DATE
> Task :sdks:java:harness:compileJava UP-TO-DATE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar UP-TO-DATE
> Task :sdks:java:harness:shadowJar UP-TO-DATE
> Task :runners:java-fn-execution:compileJava UP-TO-DATE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar UP-TO-DATE
> Task :runners:direct-java:compileJava FROM-CACHE
> Task :runners:direct-java:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar
> Task :sdks:java:io:kafka:compileJava UP-TO-DATE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:compileJava UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar UP-TO-DATE
> Task :runners:direct-java:shadowJar

> Task :sdks:java:io:synthetic:compileJava
Note: <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Streaming/ws/src/sdks/java/io/synthetic/src/main/java/org/apache/beam/sdk/io/synthetic/SyntheticBoundedSource.java> uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.

> Task :sdks:java:io:synthetic:classes
> Task :sdks:java:io:synthetic:jar
> Task :sdks:java:io:kinesis:compileJava
> Task :sdks:java:io:kinesis:classes
> Task :sdks:java:io:kinesis:jar

> Task :sdks:java:testing:load-tests:compileJava
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:testing:load-tests:classes
> Task :sdks:java:testing:load-tests:jar
> Task :runners:portability:java:compileJava
> Task :runners:portability:java:classes
> Task :runners:portability:java:jar

> Task :sdks:java:testing:load-tests:run
Exception in thread "main" java.lang.IllegalStateException: GroupByKey cannot be applied to non-bounded PCollection in the GlobalWindow without a trigger. Use a Window.into or Window.triggering transform prior to GroupByKey.
	at org.apache.beam.sdk.transforms.GroupByKey.applicableTo(GroupByKey.java:156)
	at org.apache.beam.sdk.transforms.GroupByKey.expand(GroupByKey.java:226)
	at org.apache.beam.sdk.transforms.GroupByKey.expand(GroupByKey.java:110)
	at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
	at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:476)
	at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:355)
	at org.apache.beam.sdk.transforms.Combine$PerKey.expand(Combine.java:1596)
	at org.apache.beam.sdk.transforms.Combine$PerKey.expand(Combine.java:1485)
	at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
	at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
	at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:368)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.loadTest(CombineLoadTest.java:134)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:96)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.run(CombineLoadTest.java:66)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.main(CombineLoadTest.java:169)
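The exception message itself names the fix: apply a `Window.into` (or `Window.triggering`) transform before the `Combine.perKey`, whose expansion contains the failing `GroupByKey`. A minimal sketch of that shape, assuming the Beam Java SDK on the classpath and a hypothetical unbounded `input` of type `PCollection<KV<String, Long>>` (the window size and combiner are illustrative, not the load test's actual configuration):

```java
import org.apache.beam.sdk.transforms.Combine;
import org.apache.beam.sdk.transforms.Sum;
import org.apache.beam.sdk.transforms.windowing.FixedWindows;
import org.apache.beam.sdk.transforms.windowing.Window;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;
import org.joda.time.Duration;

// Window the unbounded input before combining, so the implicit GroupByKey
// no longer runs in the GlobalWindow without a trigger.
PCollection<KV<String, Long>> windowed =
    input.apply(Window.<KV<String, Long>>into(
        FixedWindows.of(Duration.standardMinutes(1))));
PCollection<KV<String, Long>> sums =
    windowed.apply(Combine.perKey(Sum.ofLongs()));
```

Without the windowing step, a streaming pipeline would never be able to emit a final per-key result, which is exactly what `GroupByKey.applicableTo` rejects here.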

> Task :sdks:java:testing:load-tests:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 19s
63 actionable tasks: 12 executed, 3 from cache, 48 up-to-date

Publishing build scan...
https://gradle.com/s/6p2yluhknbdew

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_Combine_Portable_Flink_Streaming #10

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Streaming/10/display/redirect?page=changes>

Changes:

[git] Remove optionality and add sensible defaults to PubsubIO builders.

[jkai] [BEAM-8331] rewrite the calcite JDBC urls

[boyuanz] Update verify_release_build script to run python tests with dev version.

[robertwb] Supporting infrastructure for dataframes on beam.

[robertwb] Basic deferred data frame implementation.

[robertwb] yapf, py2

[filiperegadas] Add BigQuery useAvroLogicalTypes option

[filiperegadas] fixup! Add BigQuery useAvroLogicalTypes option

[jvilcek] [BEAM-9360] Fix equivalence check for FieldType

[github] typings and docs for expressions.py

[chamikara] Logs BQ insert failures

[iemejia] [BEAM-9384] Add SchemaRegistry.getSchemaCoder to get SchemaCoders for

[lcwik] [BEAM-9397] Pass all but output receiver parameters to start

[kcweaver] [BEAM-9401] bind Flink MiniCluster to localhost

[sunjincheng121] [BEAM-9288] Not bundle conscrypt in gRPC vendor

[mxm] [BEAM-9345] Fix source of test flakiness in FlinkSubmissionTest

[kamil.wasilewski] Add integration test for AnnotateImage transform

[github] Add integration test for AnnotateText transform (#10977)

[chadrik] [BEAM-9405] Fix post-commit error about create_job_service

[chadrik] more typing fixes

[chadrik] Fix typing issue with python 3.5.2

[chadrik] fixes

[chadrik] Address more issues discovered after rebase

[chadrik] Improve the idiom used for conditional imports

[chadrik] Fix more issues

[chadrik] Update to latest mypy version

[amaliujia] Moving to 2.21.0-SNAPSHOT on master branch.

[github] [BEAM-8487] Handle nested forward references (#10932)

[github] [BEAM-9287] Add Postcommit tests for dataflow runner v2  (#10998)

[chadrik] [BEAM-7746] Runtime change to timestamp/duration equality

[github] Adds DisplayData for StateSpecs used by stateful ParDos

[iemejia] Fix non correctly formatted class in sdks/java/core

[iemejia] [BEAM-9342] Update bytebuddy to version 1.10.8

[aromanenko.dev] [BEAM-8925] Tika version update to 1.23

[12602502+Ardagan] [BEAM-8327] Override Gradle cache for community metrics prober

[ehudm] Reduce warnings in pytest runs.

[heejong] [BEAM-9415] fix postcommit xvr tests

[github] Merge pull request #10968 from [BEAM-9381] Adding display data to

[github] [BEAM-8335] Add PCollection to DataFrame logic for InteractiveRunner.

[robertwb] Remove excessive logging.

[github] [BEAM-2939] Java UnboundedSource SDF wrapper (#10897)

[iemejia] [website] Update link to environment_type (SDK harness configuration)

[iemejia] Fix typo on python code

[kamil.wasilewski] Fix: skip test if GCP dependencies are not installed

[fernandodiaz] [BEAM-9424] Allow grouping by LogicalType

[github] Revert "[BEAM-8335] Add PCollection to DataFrame logic for

[echauchot] Add metrics export to documentation on the website.

[github] [BEAM-8382] Add rate limit policy to KinesisIO.Read (#9765)

[lcwik] [BEAM-9288] Bump version number vendored gRPC build.

[chadrik] [BEAM-9274] Support running yapf in a git pre-commit hook

[rohde.samuel] [BEAM-8335] Add PCollection to Dataframe logic for InteractiveRunner.

[github] [BEAM-8575] Modified trigger test to work for different runners.

[github] [BEAM-9413] fix beam_PostCommit_Py_ValCon (#11023)

[rohde.samuel] ReverseTestStream Implementation

[github] Update lostluck's info on the Go SDK roadmap

[suztomo] Google-cloud-bigquery 1.108.0

[github] [BEAM-9432] Move expansion service into its own project. (#11035)

[ehudm] [BEAM-3713] Remove nosetests from tox.ini

[github] Merge pull request #11025: [BEAM-6428] Improve select performance with

[github] Switch contact email to apache.org.

[github] [BEAM-6374] Emit PCollection metrics from GoSDK (#10942)

[amaliujia] [BEAM-9288] Not bundle conscrypt in gRPC vendor in META-INF/

[kcweaver] [BEAM-9448] Fix log message for job server cache.

[github] Update container image tags used by Dataflow runner for Beam master

[github] [BEAM-8328] Disable community metrics integration test in 'test' task

[iemejia] [BEAM-9450] Update www.apache.org/dist/ links to downloads.apache.org

[iemejia] [BEAM-9450] Convert links available via https to use https

[github] Add integration test for AnnotateVideoWithContext transform (#10986)

[lcwik] [BEAM-9452] Update classgraph to latest version to resolve windows

[hktang] [BEAM-9453] Changed new string creation to use StandardCharsets.UTF_8

[chuck.yang] Use Avro format for file loads to BigQuery

[jkai] [Hotfix] fix rabbitmq spotless check

[kcweaver] Downgrade cache log level from warn->info.

[github] Revert "[BEAM-6374] Emit PCollection metrics from GoSDK (#10942)"

[github] Merge pull request #11032 from [BEAM-8335] Display rather than logging

[github] Fix a bug in performance test for reading data from BigQuery (#11062)

[suztomo] grpc 1.27.2 and gax 1.54.0

[suztomo] bigquerystorage 0.125.0-beta

[apilloud] [BEAM-9463] Bump ZetaSQL to 2020.03.1

[lcwik] [BEAM-2939, BEAM-9458] Add deduplication transform for SplittableDoFns

[lcwik] [BEAM-9464] Fix WithKeys to respect parameterized types

[ankurgoenka] [BEAM-9465] Fire repeatedly in reshuffle

[lcwik] [BEAM-2939, BEAM-9458] Use deduplication transform for UnboundedSources

[echauchot] Fix wrong generated code comment.

[github] [BEAM-9396] Fix Docker image name in CoGBK test for Python on Flink

[lcwik] [BEAM-9288] Update to use vendored gRPC without shaded conscrypt

[github] [BEAM-9319] Clean up start topic in TestPubsubSignal (#11072)

[lcwik] [BEAM-2939] Follow-up on comment in pr/11065

[lcwik] [BEAM-9473] Dont copy over META-INF index/checksum/signing files during

[apilloud] [BEAM-9411] Enable BigQuery DIRECT_READ by default in SQL

[hannahjiang] update CHANGE.md for 2.20

[lcwik] [BEAM-9475] Fix typos and shore up expectations on type

[rohde.samuel] BEAM[8335] TestStreamService integration with DirectRunner

[github] [BEAM-7926] Update Data Visualization (#11020)

[ankurgoenka] [BEAM-9402] Remove options overwrite

[chadrik] Add pre-commit hook for pylint

[github] Additional new Python Katas (#11078)

[github] [BEAM-9478] Update samza runner page to reflect post 1.0 changes

[suztomo] grpc-google-cloud-pubsub-v1 1.85.1

[pabloem] Updating BigQuery client APIs

[github] [BEAM-9481] Exclude signature files from expansion service test

[github] Install typing package only for Python < 3.5.3 (#10821)

[heejong] [BEAM-9056] Staging artifacts from environment

[sunjincheng121] [BEAM-9295] Add Flink 1.10 build target and Make FlinkRunner compatible

[ankurgoenka] [BEAM-9485] Raise error when transform urn is not implemented

[12602502+Ardagan] [BEAM-9431] Remove ReadFromPubSub/Read-out0-ElementCount from the

[github] Update Python roadmap for 2.7 eol

[mxm] [BEAM-9474] Improve robustness of BundleFactory and ProcessEnvironment

[github] [BEAM-7815] update MemoryReporter comments about using guppy3 (#11073)

[rohde.samuel] [BEAM-8335] Modify the StreamingCache to subclass the CacheManager

[sunjincheng121] [BEAM-9298] Drop support for Flink 1.7

[github] Fixing apache_beam.io.gcp.bigquery_test:PubSubBigQueryIT. at head

[mxm] [BEAM-9490] Guard referencing for environment expiration via a lock

[github] Verify schema early in ToJson and JsonToRow (#11105)

[lcwik] [BEAM-9481] fix indentation

[github] Merge pull request #11103 from [BEAM-9494] Reifying outputs from BQ file

[github] [BEAM-8335] Implemented Capture Size limitation (#11050)

[github] [BEAM-9294] Move RowJsonException out of RowJsonSerializer (#11102)

[github] Merge pull request #11046: [BEAM-9442] Properly handle nullable fields

[ankurgoenka] [BEAM-9287] disable validates runner test which uses teststreams for

[sunjincheng121] [BEAM-9299-PR]Upgrade Flink Runner 1.8x to 1.8.3 and 1.9x to 1.9.2

[lcwik] [BEAM-2939] Implement interfaces and concrete watermark estimators

[ankurgoenka] [BEAM-9499] Sickbay test_multi_triggered_gbk_side_input for streaming

[robertwb] Minor cleanup, lint.

[robertwb] [BEAM-9433] Create expansion service artifact for common Java IOs.

[thw] [BEAM-9490] Use the lock that belongs to the cache when bundle load

[github] Update Dataflow py container version (#11120)

[github] [BEAM-7923] Streaming support and pipeline pruning when instrumenting a

[github] Update default value in Java snippet

[ankurgoenka] [BEAM-9504] Sickbay streaming test for batch VR

[rohde.samuel] [BEAM-8335] Final PR to merge the InteractiveBeam feature branch

[github] [BEAM-9477] RowCoder should be hashable and picklable (#11088)

[apilloud] [BEAM-8057] Reject Infinite or NaN literals at parse time

[robertwb] Log in a daemon thread.

[thw] [BEAM-8815] Skip removal of manifest when no artifacts were staged.

[github] [BEAM-9346] Improve the efficiency of TFRecordIO (#11122)

[kawaigin] [BEAM-8335] Refactor IPythonLogHandler

[apilloud] [BEAM-8070] Preserve type for empty array

[github] Merge pull request #10991 [BEAM-3301] Refactor DoFn validation & allow

[github] Update dataflow py container ver to 20200317 (#11145)


------------------------------------------
[...truncated 41.54 KB...]
ac3e2c206c49: Layer already exists
3663b7fed4c9: Layer already exists
832f129ebea4: Layer already exists
6670e930ed33: Layer already exists
c7f27a4eb870: Layer already exists
e70dfb4c3a48: Layer already exists
1c76bd0dc325: Layer already exists
c3881ea6fdcf: Pushed
c0f158bb7e27: Pushed
80a789adf151: Pushed
db164c127812: Pushed
latest: digest: sha256:2da04d75aee454dae2e5e58b5ae470cf8c7301c3b397b3ed34366229805a4d44 size: 3470
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Streaming/ws/src/gradlew> --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Pdocker-repository-root=gcr.io/apache-beam-testing/beam_portability -Pdocker-tag=latest :runners:flink:1.10:job-server-container:docker
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.
> Task :sdks:java:expansion-service:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :model:fn-execution:extractProto UP-TO-DATE
> Task :model:job-management:extractProto UP-TO-DATE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :runners:flink:1.10:copyResourcesOverrides NO-SOURCE
> Task :runners:flink:1.10:job-server:processResources NO-SOURCE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :model:job-management:processResources UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractProto
> Task :model:fn-execution:processResources UP-TO-DATE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :runners:flink:1.10:job-server-container:copyLicenses
> Task :runners:flink:1.10:job-server-container:dockerClean UP-TO-DATE
> Task :runners:flink:1.10:copySourceOverrides
> Task :sdks:java:core:generateGrammarSource UP-TO-DATE
> Task :runners:flink:1.10:copyTestResourcesOverrides NO-SOURCE
> Task :sdks:java:core:processResources UP-TO-DATE
> Task :runners:flink:1.10:processResources
> Task :sdks:java:build-tools:compileJava FROM-CACHE
> Task :model:pipeline:extractIncludeProto UP-TO-DATE
> Task :model:pipeline:extractProto UP-TO-DATE
> Task :sdks:java:build-tools:processResources
> Task :sdks:java:build-tools:classes
> Task :model:pipeline:generateProto UP-TO-DATE
> Task :sdks:java:build-tools:jar
> Task :model:pipeline:compileJava UP-TO-DATE
> Task :model:pipeline:processResources UP-TO-DATE
> Task :model:pipeline:classes UP-TO-DATE
> Task :model:pipeline:jar UP-TO-DATE
> Task :model:pipeline:shadowJar UP-TO-DATE
> Task :model:job-management:extractIncludeProto UP-TO-DATE
> Task :model:fn-execution:extractIncludeProto UP-TO-DATE
> Task :model:job-management:generateProto UP-TO-DATE
> Task :model:fn-execution:generateProto UP-TO-DATE
> Task :model:job-management:compileJava UP-TO-DATE
> Task :model:job-management:classes UP-TO-DATE
> Task :model:job-management:shadowJar UP-TO-DATE
> Task :model:fn-execution:compileJava UP-TO-DATE
> Task :model:fn-execution:classes UP-TO-DATE
> Task :model:fn-execution:shadowJar UP-TO-DATE
> Task :sdks:java:core:compileJava UP-TO-DATE
> Task :sdks:java:core:classes UP-TO-DATE
> Task :sdks:java:core:shadowJar UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:fn-execution:compileJava UP-TO-DATE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar UP-TO-DATE
> Task :runners:core-construction-java:compileJava UP-TO-DATE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :runners:core-construction-java:jar UP-TO-DATE
> Task :sdks:java:expansion-service:compileJava UP-TO-DATE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar
> Task :runners:core-java:compileJava UP-TO-DATE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava UP-TO-DATE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar UP-TO-DATE
> Task :sdks:java:harness:compileJava UP-TO-DATE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :sdks:java:harness:shadowJar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :runners:flink:1.10:compileJava FROM-CACHE
> Task :runners:flink:1.10:classes
> Task :runners:flink:1.10:jar
> Task :runners:flink:1.10:job-server:compileJava NO-SOURCE
> Task :runners:flink:1.10:job-server:classes UP-TO-DATE
> Task :runners:flink:1.10:job-server:shadowJar
> Task :runners:flink:1.10:job-server-container:copyDockerfileDependencies
> Task :runners:flink:1.10:job-server-container:dockerPrepare
> Task :runners:flink:1.10:job-server-container:docker

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD SUCCESSFUL in 57s
61 actionable tasks: 17 executed, 6 from cache, 38 up-to-date

Publishing build scan...
https://gradle.com/s/x4b4lh3n5rbfg

[beam_LoadTests_Java_Combine_Portable_Flink_Streaming] $ /bin/bash -xe /tmp/jenkins8998135900036901681.sh
+ echo 'Tagging image...'
Tagging image...
[beam_LoadTests_Java_Combine_Portable_Flink_Streaming] $ /bin/bash -xe /tmp/jenkins4942192846555960994.sh
+ docker tag gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest
[beam_LoadTests_Java_Combine_Portable_Flink_Streaming] $ /bin/bash -xe /tmp/jenkins5272199491344354398.sh
+ echo 'Pushing image...'
Pushing image...
[beam_LoadTests_Java_Combine_Portable_Flink_Streaming] $ /bin/bash -xe /tmp/jenkins2493098167671188565.sh
+ docker push gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest
The push refers to repository [gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server]
cc5b8f6da91b: Preparing
09e4376309cb: Preparing
9249809e65e4: Preparing
6b84f79b6d95: Preparing
3c9a565ae0aa: Preparing
ac3e2c206c49: Preparing
3663b7fed4c9: Preparing
832f129ebea4: Preparing
6670e930ed33: Preparing
c7f27a4eb870: Preparing
e70dfb4c3a48: Preparing
1c76bd0dc325: Preparing
6670e930ed33: Waiting
e70dfb4c3a48: Waiting
1c76bd0dc325: Waiting
ac3e2c206c49: Waiting
3663b7fed4c9: Waiting
832f129ebea4: Waiting
cc5b8f6da91b: Pushed
09e4376309cb: Pushed
ac3e2c206c49: Layer already exists
3663b7fed4c9: Layer already exists
832f129ebea4: Layer already exists
6670e930ed33: Layer already exists
9249809e65e4: Pushed
e70dfb4c3a48: Layer already exists
c7f27a4eb870: Layer already exists
1c76bd0dc325: Layer already exists
3c9a565ae0aa: Pushed
6b84f79b6d95: Pushed
latest: digest: sha256:d5e9223d88d8120b61f4bd59acd787e226bb738624e267241ba71d2b351295ef size: 2841
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content 
JOB_SERVER_IMAGE=gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest
CLUSTER_NAME=beam-loadtests-java-portable-flink-streaming-10
DETACHED_MODE=true
HARNESS_IMAGES_TO_PULL=gcr.io/apache-beam-testing/beam_portability/beam_java_sdk:latest
FLINK_NUM_WORKERS=5
FLINK_DOWNLOAD_URL=https://archive.apache.org/dist/flink/flink-1.10.0/flink-1.10.0-bin-scala_2.11.tgz
GCS_BUCKET=gs://beam-flink-cluster
HADOOP_DOWNLOAD_URL=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-9.0/flink-shaded-hadoop-2-uber-2.8.3-9.0.jar
FLINK_TASKMANAGER_SLOTS=1
ARTIFACTS_DIR=gs://beam-flink-cluster/beam-loadtests-java-portable-flink-streaming-10
GCLOUD_ZONE=us-central1-a

[EnvInject] - Variables injected successfully.
[beam_LoadTests_Java_Combine_Portable_Flink_Streaming] $ /bin/bash -xe /tmp/jenkins3214212142738852914.sh
+ echo Setting up flink cluster
Setting up flink cluster
[beam_LoadTests_Java_Combine_Portable_Flink_Streaming] $ /bin/bash -xe /tmp/jenkins4749361340216384720.sh
+ cd <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Streaming/ws/src/.test-infra/dataproc>
+ ./flink_cluster.sh create
+ GCLOUD_ZONE=us-central1-a
+ DATAPROC_VERSION=1.2
+ MASTER_NAME=beam-loadtests-java-portable-flink-streaming-10-m
+ INIT_ACTIONS_FOLDER_NAME=init-actions
+ FLINK_INIT=gs://beam-flink-cluster/init-actions/flink.sh
+ BEAM_INIT=gs://beam-flink-cluster/init-actions/beam.sh
+ DOCKER_INIT=gs://beam-flink-cluster/init-actions/docker.sh
+ FLINK_LOCAL_PORT=8081
+ FLINK_TASKMANAGER_SLOTS=1
+ TASK_MANAGER_MEM=10240
+ YARN_APPLICATION_MASTER=
+ create
+ upload_init_actions
+ echo 'Uploading initialization actions to GCS bucket: gs://beam-flink-cluster'
Uploading initialization actions to GCS bucket: gs://beam-flink-cluster
+ gsutil cp -r init-actions/beam.sh init-actions/docker.sh init-actions/flink.sh gs://beam-flink-cluster/init-actions
Copying file://init-actions/beam.sh [Content-Type=text/x-sh]...
Copying file://init-actions/beam.sh [Content-Type=text/x-sh]...
Copying file://init-actions/docker.sh [Content-Type=text/x-sh]...
Copying file://init-actions/flink.sh [Content-Type=text/x-sh]...
Operation completed over 3 objects/13.4 KiB.
+ create_cluster
+ local metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.10.0/flink-1.10.0-bin-scala_2.11.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1,
+ metadata+=hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-9.0/flink-shaded-hadoop-2-uber-2.8.3-9.0.jar
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_java_sdk:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/beam_java_sdk:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_workers=6
+ gcloud dataproc clusters create beam-loadtests-java-portable-flink-streaming-10 --region=global --num-workers=6 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.10.0/flink-1.10.0-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-9.0/flink-shaded-hadoop-2-uber-2.8.3-9.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/beam_java_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/6860445a-030a-33a0-9077-8e2b73d3c02e].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
............
WARNING: Cluster beam-loadtests-java-portable-flink-streaming-10 failed to create. Beginning automated resource cleanup process.
done.
ERROR: (gcloud.dataproc.clusters.create) Operation [projects/apache-beam-testing/regions/global/operations/6860445a-030a-33a0-9077-8e2b73d3c02e] failed: Initialization action timed out. Failed action 'gs://beam-flink-cluster/init-actions/flink.sh', see output in: gs://dataproc-6c5fbcbb-a2de-406e-9cf7-8c1ce0b6a604-us/google-cloud-dataproc-metainfo/d5abfe1c-6208-464d-a2fc-36bd9314a4ff/beam-loadtests-java-portable-flink-streaming-10-m/dataproc-initialization-script-2_output.
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_Combine_Portable_Flink_Streaming #9

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Streaming/9/display/redirect?page=changes>

Changes:

[amogh.tiwari] lzo-addons

[amogh.tiwari] 3rd dec 2019, 12:43AM

[amogh.tiwari] PR corrections

[amogh.tiwari] PR javaPreCommit update

[amogh.tiwari] PR changes: added testLzopSpilttale()

[amogh.tiwari] updated gradle for supporting optional dependency of lzo- 2:39 AM IST

[iemejia] [BEAM-9162] Upgrade Jackson to version 2.10.2

[veblush] Upgrades gcsio to 2.0.0

[jonathan.warburton09] [BEAM-8916] Rename external_test_it so that it is picked up by pytest

[huangry] Create validation runner test for metrics (limited to user counter in

[millsd] Update Dataflow monitoring URL

[ankurgoenka] [BEAM-9287] Add Python Streaming Validates runner tests for Unified

[robertwb] Add capabilities and requirements to beam protos.

[github] Change static Map fields in ReflectUtils to be concurrent

[iemejia] [BEAM-8561] Add ThriftIO to support IO for Thrift files

[github] [BEAM-9258] Integrate Google Cloud Data loss prevention functionality

[github] [BEAM-9291] Upload graph option in dataflow's python sdk (#10829)

[amogh.tiwari] update 19/02/2020 2:32 AM added static class, removed wrappers, updated

[chlarsen] Removed compile time generation of test Thrift class.

[github] [BEAM-1080] Skip tests that required GCP credentials

[github] Exclude tests that are not passing under currect Avro IO requirements.

[lcwik] [BEAM-5605] Honor the bounded source timestamps timestamp.

[chlarsen] Added ThriftIO to list of supported I/O on website and to change log.

[github] [BEAM-7246] Added Google Spanner Write Transform (#10712)

[github] Apply suggestions from code review

[github] [BEAM-1833] Fixes BEAM-1833

[bhulette] Don't exclude UsesUnboundedPCollections in Dataflow VR tests

[heejong] [BEAM-9335] update hard-coded coder id when translating Java external

[huangry] Fixups.

[github] [BEAM-9146] Integrate GCP Video Intelligence functionality for Python

[iemejia] Mark Test categories as internal and improve categorization

[github] Add DataCatalogPipelineOptionsRegistrar (#10896)

[github] Allow unknown non-merging WindowFns of know window type. (#10875)

[iemejia] [BEAM-9326] Make JsonToRow transform input <String> instead of <?

[github] [BEAM-8575] Removed MAX_TIMESTAMP from testing data (#10835)

[github] Update python sdk container to beam-master-20200219 (#10903)

[heejong] [BEAM-9338] add postcommit XVR spark badges

[github] [BEAM-3545] Fix race condition w/plan metrics. (#10906)

[robertwb] Update go beam runner generated protos.

[heejong] [BEAM-9341] postcommit xvr flink fix

[github] Update

[github] Update

[github] Update

[github] Update

[github] Update

[github] Update

[github] Update

[github] Update

[shubham.srivastava] finishing touch 20/02/2020 6:43PM

[github] [BEAM-9085] Fix performance regression in SyntheticSource (#10885)

[github] Update google-cloud-videointelligence dependency

[robertwb] Add standard protocol capabilities to protos.

[github] [BEAM-8280] no_annotations decorator (#10904)

[kcweaver] [BEAM-9225] Fix Flink uberjar job termination bug.

[kcweaver] Reuse get_state method.

[chamikara] Updates DataflowRunner to support multiple SDK environments.

[github] [BEAM-8280] Enable and improve IOTypeHints debug_str traceback (#10894)

[github] [BEAM-9343]Upgrade ZetaSQL to 2020.02.1 (#10918)

[robertwb] [BEAM-9339] Declare capabilities for Go SDK.

[lcwik] [BEAM-5605] Eagerly close the BoundedReader once we have read everything

[github] [BEAM-9229] Adding dependency information to Environment proto (#10733)

[lcwik] [BEAM-9349] Update joda-time version

[lcwik] fixup! Fix SpotBugs failure

[kcweaver] [BEAM-9022] publish Spark job server Docker image

[drubinstein] Bump google cloud bigquery to 1.24.0

[github] Revert "[BEAM-9085] Fix performance regression in SyntheticSource

[github] [BEAM-8537] Provide WatermarkEstimator to track watermark (#10375)

[github] Make sure calling try_claim(0) more than once also trows exception.

[robertwb] [BEAM-9339] Declare capabilities for Python SDK.

[robertwb] Add some standard requirement URNs to the protos.

[kcweaver] [BEAM-9356] reduce Flink test logs to warn

[github] [BEAM-9063] migrate docker images to apache (#10612)

[github] [BEAM-9252] Exclude jboss's Main and module-info.java (#10930)

[boyuanz] Clean up and add type-hints to SDF API

[robertwb] [BEAM-9340] Populate requirements for Python DoFn properties.

[hannahjiang] fix postcommit failure

[robertwb] [BEAM-8019] Branch on having multiple environments.

[github] [BEAM-9359] Switch to Data Catalog client (#10917)

[github] [BEAM-9344] Add support for bundle finalization execution to the Beam

[iemejia] [BEAM-9342] Upgrade vendored bytebuddy to version 1.10.8

[chadrik] Create a class to encapsulate the work required to submit a pipeline to

[iemejia] Add Dataflow Java11 ValidatesRunner badge to the PR template

[github] Merge pull request #10944: [BEAM-7274] optimize oneOf handling

[github] [BEAM-8280] Fix IOTypeHints origin traceback on partials (#10927)

[relax] Support null fields in rows with ByteBuddy generated code.

[robertwb] Allow metrics update to be tolerant to uninitalized metric containers.

[github] [GoSDK] Fix race condition in statemgr & test (#10941)

[rohde.samuel] Move TestStream implementation to replacement transform

[github] [BEAM-9347] Don't overwrite default runner harness for unified worker

[boyuanz] Update docstring of ManualWatermarkEstimator.set_watermark()

[kcweaver] [BEAM-9373] Spark/Flink tests fix string concat

[boyuanz] Address comments

[boyuanz] Address comments again

[github] [BEAM-9228] Support further partition for FnApi ListBuffer (#10847)

[github] [BEAM-7926] Data-centric Interactive Part3 (#10731)

[boyuanz] Use NoOpWatermarkEstimator in sdf_direct_runner

[chamikara] Updates Dataflow client

[github] [BEAM-9240]: Check for Nullability in typesEqual() method of FieldType

[amogh.tiwari] 25/02/2020 updated imports Amogh Tiwari & Shubham Srivastava

[iemejia] [BEAM-8616] Make hadoop-client a provided dependency on ParquetIO

[mxm] [BEAM-9345] Remove workaround to restore stdout/stderr during JobGraph

[iemejia] [BEAM-9364] Refactor KafkaIO to use DeserializerProviders

[mxm] [BEAM-9345] Add end-to-end Flink job submission test

[iemejia] [BEAM-9352] Align version of transitive jackson dependencies with Beam

[michal.walenia] [BEAM-9258] Add integration test for Cloud DLP

[iemejia] [BEAM-9329] Support request of schemas by version on KafkaIO + CSR

[lcwik] [BEAM-9252] Update to vendored gRPC without problematic

[github] Update

[github] Update

[github] Update

[lcwik] [BEAM-2822, BEAM-2939, BEAM-6189, BEAM-4374] Enable passing completed

[crites] Changes TestStreamTranscriptTest to only emit two elements so that its

[alex] [BEAM-7274] Add DynamicMessage Schema support

[github] [BEAM-9322] Fix tag output names within Dataflow to be consistent with

[iemejia] [BEAM-9342] Exclude module-info.class from vendored Byte Buddy 1.10.8

[iemejia] Add KafkaIO support for Confluent Schema Registry to the CHANGEs file

[github] [BEAM-9247] Integrate GCP Vision API functionality (#10959)

[github] Fix kotlin warnings (#10976)

[github] Update python sdk container version to beam-master-20200225 (#10965)

[github] [BEAM-9248] Integrate Google Cloud Natural Language functionality for

[iemejia] Refine access level for `sdks/java/extensions/protobuf`

[github] [BEAM-9355] Basic support for NewType (#10928)

[github] [BEAM-8979] reintroduce mypy-protobuf stub generation (#10734)

[github] [BEAM-8335] Background Caching job (#10899)

[github] [BEAM-8458] Add option to set temp dataset in BigQueryIO.Read (#9852)

[iemejia] Make logger naming consistent with Apache Beam LOG standard

[kcweaver] [BEAM-9300] convert struct literal in ZetaSQL

[github] fix breakage (#10934)

[github] Merge pull request #10901 from [BEAM-8965] Remove duplicate sideinputs

[pabloem] Fix formatting

[github] [BEAM-8618] Tear down unused DoFns periodically in Python SDK harness.

[alex] [BEAM-9394] DynamicMessage handling of empty map violates schema

[github] Merge pull request #10854: State timers documentation

[lcwik] [BEAM-5524] Fix minor issue in style guide.

[github] [BEAM-8201] Pass all other endpoints through provisioning service.

[suztomo] Linkage Checker 1.1.4

[robinyqiu] Bump Dataflow Java worker container version

[kcweaver] Test schema does not need to be nullable.

[github] [BEAM-9396] Match Docker image names in Jenkins jobs with those

[github] [BEAM-9392] Fix Multi TestStream assertion errors (#10982)


------------------------------------------
[...truncated 73.61 KB...]
Operation completed over 3 objects/13.4 KiB.
+ create_cluster
+ local metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1,
+ metadata+=hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_java_sdk:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/beam_java_sdk:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.9_job_server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.9_job_server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_workers=6
+ gcloud dataproc clusters create beam-loadtests-java-portable-flink-streaming-9 --region=global --num-workers=6 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/beam_java_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.9_job_server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/91f72923-a2dc-3dfd-8599-82e1eddcb139].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
............done.
Created [https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/global/clusters/beam-loadtests-java-portable-flink-streaming-9] Cluster placed in zone [us-central1-a].
+ get_leader
+ local i=0
+ local application_ids
+ local application_masters
+ echo 'Yarn Applications'
Yarn Applications
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-9-m '--command=yarn application -list'
++ grep beam-loadtests-java-portable-flink-streaming-9
Warning: Permanently added 'compute.4035892865119074581' (ECDSA) to the list of known hosts.
20/02/28 12:33:24 INFO client.RMProxy: Connecting to ResourceManager at beam-loadtests-java-portable-flink-streaming-9-m/10.128.15.196:8032
+ read line
+ echo application_1582893139077_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-streaming-9-w-2.c.apache-beam-testing.internal:43565
application_1582893139077_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-streaming-9-w-2.c.apache-beam-testing.internal:43565
++ echo application_1582893139077_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-streaming-9-w-2.c.apache-beam-testing.internal:43565
++ sed 's/ .*//'
+ application_ids[$i]=application_1582893139077_0001
++ echo application_1582893139077_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-streaming-9-w-2.c.apache-beam-testing.internal:43565
++ sed 's/.*beam-loadtests-java-portable-flink-streaming-9/beam-loadtests-java-portable-flink-streaming-9/'
++ sed 's/ .*//'
+ application_masters[$i]=beam-loadtests-java-portable-flink-streaming-9-w-2.c.apache-beam-testing.internal:43565
+ i=1
+ read line
+ '[' 1 '!=' 1 ']'
+ YARN_APPLICATION_MASTER=beam-loadtests-java-portable-flink-streaming-9-w-2.c.apache-beam-testing.internal:43565
+ echo 'Using Yarn Application master: beam-loadtests-java-portable-flink-streaming-9-w-2.c.apache-beam-testing.internal:43565'
Using Yarn Application master: beam-loadtests-java-portable-flink-streaming-9-w-2.c.apache-beam-testing.internal:43565
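[Editor's note] The get_leader step above scrapes `yarn application -list` output with a pair of `sed` expressions to recover the application id and the application master host:port. A minimal standalone sketch of that parsing (not part of the build; it simplifies the sed pipeline by taking the first and last whitespace-separated tokens of the line, which matches the output shown above):

```python
def parse_yarn_application(line):
    """Extract the application id and the application master host:port
    from one line of `yarn application -list` output."""
    tokens = line.split()
    # First token is the application id (mirrors `sed 's/ .*//'`).
    application_id = tokens[0]
    # Last token is the tracking URL; stripping the scheme yields host:port
    # (mirrors the two chained `sed` expressions in flink_cluster.sh).
    application_master = tokens[-1].replace("http://", "", 1)
    return application_id, application_master

# Example line taken verbatim from the log above.
line = ("application_1582893139077_0001 flink-dataproc Apache Flink yarn "
        "default RUNNING UNDEFINED 100% "
        "http://beam-loadtests-java-portable-flink-streaming-9-w-2.c.apache-beam-testing.internal:43565")
app_id, master = parse_yarn_application(line)
print(app_id)   # application_1582893139077_0001
print(master)   # beam-loadtests-java-portable-flink-streaming-9-w-2.c.apache-beam-testing.internal:43565
```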
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.9_job_server:latest ]]
+ start_job_server
+ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-9-m '--command=sudo --user yarn docker run --detach --publish 8099:8099 --publish 8098:8098 --publish 8097:8097 --volume ~/.config/gcloud:/root/.config/gcloud gcr.io/apache-beam-testing/beam_portability/beam_flink1.9_job_server:latest --flink-master=beam-loadtests-java-portable-flink-streaming-9-w-2.c.apache-beam-testing.internal:43565 --artifacts-dir=gs://beam-flink-cluster/beam-loadtests-java-portable-flink-streaming-9'
3498ee428562b105422f8e7b74415add631056f0235aa538073e3c0f28574af6
+ start_tunnel
++ gcloud compute ssh --quiet --zone=us-central1-a yarn@beam-loadtests-java-portable-flink-streaming-9-m '--command=curl -s "http://beam-loadtests-java-portable-flink-streaming-9-w-2.c.apache-beam-testing.internal:43565/jobmanager/config"'
+ local 'job_server_config=[{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1582893139077_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-java-portable-flink-streaming-9-w-2.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"FLINK_PLUGINS_DIR","value":"/usr/lib/flink/plugins"},{"key":"web.tmpdir","value":"/tmp/flink-web-982fe49c-44bf-4f88-a2d6-1121e5291da5"},{"key":"jobmanager.rpc.port","value":"41101"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1582893139077_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-java-portable-flink-streaming-9-w-2.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local key=jobmanager.rpc.port
++ echo beam-loadtests-java-portable-flink-streaming-9-w-2.c.apache-beam-testing.internal:43565
++ cut -d : -f1
+ local yarn_application_master_host=beam-loadtests-java-portable-flink-streaming-9-w-2.c.apache-beam-testing.internal
++ echo '[{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1582893139077_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-java-portable-flink-streaming-9-w-2.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"FLINK_PLUGINS_DIR","value":"/usr/lib/flink/plugins"},{"key":"web.tmpdir","value":"/tmp/flink-web-982fe49c-44bf-4f88-a2d6-1121e5291da5"},{"key":"jobmanager.rpc.port","value":"41101"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1582893139077_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-java-portable-flink-streaming-9-w-2.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
++ python -c 'import sys, json; print([e['\''value'\''] for e in json.load(sys.stdin) if e['\''key'\''] == u'\''jobmanager.rpc.port'\''][0])'
+ local jobmanager_rpc_port=41101
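(Editor's note: the inline one-liner above pipes the JSON returned by Flink's `/jobmanager/config` REST endpoint through Python to pull out `jobmanager.rpc.port`. A standalone sketch of the same parse — the `sample` config below is abbreviated from the log, not a full response:)

```python
import json

def jobmanager_rpc_port(config_json: str) -> int:
    """Extract jobmanager.rpc.port from Flink's /jobmanager/config response,
    which is a JSON array of {"key": ..., "value": ...} objects."""
    entries = json.loads(config_json)
    values = [e["value"] for e in entries if e["key"] == "jobmanager.rpc.port"]
    return int(values[0])

# Abbreviated sample of the config echoed in the log above.
sample = '[{"key":"web.port","value":"0"},{"key":"jobmanager.rpc.port","value":"41101"}]'
print(jobmanager_rpc_port(sample))  # 41101
```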
++ [[ true == \t\r\u\e ]]
++ echo ' -Nf >& /dev/null'
+ local 'detached_mode_params= -Nf >& /dev/null'
++ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.9_job_server:latest ]]
++ echo '-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'job_server_ports_forwarding=-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'tunnel_command=gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-9-m -- -L 8081:beam-loadtests-java-portable-flink-streaming-9-w-2.c.apache-beam-testing.internal:43565 -L 41101:beam-loadtests-java-portable-flink-streaming-9-w-2.c.apache-beam-testing.internal:41101 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080  -Nf >& /dev/null'
+ eval gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-9-m -- -L 8081:beam-loadtests-java-portable-flink-streaming-9-w-2.c.apache-beam-testing.internal:43565 -L 41101:beam-loadtests-java-portable-flink-streaming-9-w-2.c.apache-beam-testing.internal:41101 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf '>&' /dev/null
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-9-m -- -L 8081:beam-loadtests-java-portable-flink-streaming-9-w-2.c.apache-beam-testing.internal:43565 -L 41101:beam-loadtests-java-portable-flink-streaming-9-w-2.c.apache-beam-testing.internal:41101 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf
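(Editor's note: the tunnel command above forwards the Flink web UI to local 8081, the JobManager RPC port to itself, and the three Beam job server ports to localhost. A hypothetical helper sketching how those `-L` flags are composed — the real script assembles a single string, and the names here are illustrative:)

```python
def tunnel_flags(app_master, rpc_port, job_server_ports=(8099, 8098, 8097)):
    """Compose ssh port-forwarding flags: Flink UI on local 8081, the
    JobManager RPC port mapped to itself, job server ports to localhost."""
    host = app_master.split(":")[0]
    flags = ["-L", f"8081:{app_master}"]
    flags += ["-L", f"{rpc_port}:{host}:{rpc_port}"]
    for p in job_server_ports:
        flags += ["-L", f"{p}:localhost:{p}"]
    return flags

print(" ".join(tunnel_flags("yarn-am.internal:43565", 41101)))
```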
[beam_LoadTests_Java_Combine_Portable_Flink_Streaming] $ /bin/bash -xe /tmp/jenkins3395950930550275201.sh
+ echo '*** Load test: 2GB of 10B records on Flink in Portable mode ***'
*** Load test: 2GB of 10B records on Flink in Portable mode ***
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Streaming/ws/src/gradlew> -PloadTest.mainClass=org.apache.beam.sdk.loadtests.CombineLoadTest -Prunner=:runners:portability:java '-PloadTest.args=--project=apache-beam-testing --appName=load_tests_Java_Portable_Flink_streaming_Combine_1 --tempLocation=gs://temp-storage-for-perf-tests/loadtests --publishToBigQuery=true --bigQueryDataset=load_test --bigQueryTable=java_portable_flink_streaming_Combine_1 --sourceOptions={"numRecords":200000000,"keySizeBytes":1,"valueSizeBytes":9} --fanout=1 --iterations=1 --topCount=20 --sdkWorkerParallelism=5 --perKeyCombiner=TOP_LARGEST --streaming=true --jobEndpoint=localhost:8099 --defaultEnvironmentConfig=gcr.io/apache-beam-testing/beam_portability/beam_java_sdk:latest --defaultEnvironmentType=DOCKER --runner=PortableRunner' --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :sdks:java:testing:load-tests:run
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :model:job-management:extractProto UP-TO-DATE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :model:fn-execution:extractProto UP-TO-DATE
> Task :runners:local-java:processResources NO-SOURCE
> Task :runners:direct-java:processResources NO-SOURCE
> Task :model:job-management:processResources UP-TO-DATE
> Task :runners:portability:java:processResources NO-SOURCE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:extractProto UP-TO-DATE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :sdks:java:io:synthetic:processResources NO-SOURCE
> Task :sdks:java:io:kinesis:processResources NO-SOURCE
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :sdks:java:testing:load-tests:processResources NO-SOURCE
> Task :model:fn-execution:processResources UP-TO-DATE
> Task :sdks:java:core:generateGrammarSource UP-TO-DATE
> Task :sdks:java:core:processResources UP-TO-DATE
> Task :model:pipeline:extractIncludeProto UP-TO-DATE
> Task :model:pipeline:extractProto UP-TO-DATE
> Task :model:pipeline:generateProto UP-TO-DATE
> Task :model:pipeline:compileJava UP-TO-DATE
> Task :model:pipeline:processResources UP-TO-DATE
> Task :model:pipeline:classes UP-TO-DATE
> Task :model:pipeline:jar UP-TO-DATE
> Task :model:pipeline:shadowJar UP-TO-DATE
> Task :model:job-management:extractIncludeProto UP-TO-DATE
> Task :model:fn-execution:extractIncludeProto UP-TO-DATE
> Task :model:fn-execution:generateProto UP-TO-DATE
> Task :model:job-management:generateProto UP-TO-DATE
> Task :model:job-management:compileJava UP-TO-DATE
> Task :model:job-management:classes UP-TO-DATE
> Task :model:fn-execution:compileJava UP-TO-DATE
> Task :model:fn-execution:classes UP-TO-DATE
> Task :model:job-management:shadowJar UP-TO-DATE
> Task :model:fn-execution:shadowJar UP-TO-DATE
> Task :sdks:java:core:compileJava UP-TO-DATE
> Task :sdks:java:core:classes UP-TO-DATE
> Task :sdks:java:core:shadowJar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractIncludeProto UP-TO-DATE
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :runners:core-construction-java:compileJava UP-TO-DATE
> Task :sdks:java:fn-execution:compileJava UP-TO-DATE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :runners:local-java:compileJava FROM-CACHE
> Task :runners:local-java:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar UP-TO-DATE
> Task :runners:core-construction-java:jar UP-TO-DATE
> Task :runners:local-java:jar
> Task :vendor:sdks-java-extensions-protobuf:compileJava UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:compileJava UP-TO-DATE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar UP-TO-DATE
> Task :runners:core-java:compileJava UP-TO-DATE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar UP-TO-DATE
> Task :sdks:java:harness:compileJava UP-TO-DATE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar UP-TO-DATE
> Task :sdks:java:harness:shadowJar UP-TO-DATE
> Task :sdks:java:io:synthetic:compileJava FROM-CACHE
> Task :sdks:java:io:synthetic:classes UP-TO-DATE
> Task :sdks:java:io:synthetic:jar
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava UP-TO-DATE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar
> Task :sdks:java:io:kafka:jar UP-TO-DATE
> Task :runners:java-fn-execution:compileJava UP-TO-DATE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar UP-TO-DATE
> Task :sdks:java:io:kinesis:compileJava FROM-CACHE
> Task :sdks:java:io:kinesis:classes UP-TO-DATE
> Task :runners:portability:java:compileJava FROM-CACHE
> Task :runners:portability:java:classes UP-TO-DATE
> Task :sdks:java:io:kinesis:jar
> Task :sdks:java:io:google-cloud-platform:compileJava UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :runners:portability:java:jar
> Task :sdks:java:io:google-cloud-platform:jar UP-TO-DATE
> Task :runners:direct-java:compileJava FROM-CACHE
> Task :runners:direct-java:classes UP-TO-DATE
> Task :runners:direct-java:shadowJar

> Task :sdks:java:testing:load-tests:compileJava
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:testing:load-tests:classes
> Task :sdks:java:testing:load-tests:jar

> Task :sdks:java:testing:load-tests:run
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Exception in thread "main" java.lang.IllegalStateException: GroupByKey cannot be applied to non-bounded PCollection in the GlobalWindow without a trigger. Use a Window.into or Window.triggering transform prior to GroupByKey.
	at org.apache.beam.sdk.transforms.GroupByKey.applicableTo(GroupByKey.java:156)
	at org.apache.beam.sdk.transforms.GroupByKey.expand(GroupByKey.java:226)
	at org.apache.beam.sdk.transforms.GroupByKey.expand(GroupByKey.java:110)
	at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
	at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:476)
	at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:355)
	at org.apache.beam.sdk.transforms.Combine$PerKey.expand(Combine.java:1596)
	at org.apache.beam.sdk.transforms.Combine$PerKey.expand(Combine.java:1485)
	at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
	at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
	at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:368)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.loadTest(CombineLoadTest.java:134)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:96)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.run(CombineLoadTest.java:66)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.main(CombineLoadTest.java:169)
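(Editor's note: the failure above is Beam's precondition check — `Combine.perKey` expands to a `GroupByKey`, which may not be applied to an unbounded PCollection in the global window with the default trigger. A minimal sketch of windowing that satisfies the check, shown with the Beam Python SDK for brevity (the failing test is Java) and assuming `apache_beam` is installed; the fixed window size and pipeline shape are illustrative, not the load test's actual configuration:)

```python
import apache_beam as beam
from apache_beam.transforms import window

with beam.Pipeline() as p:
    (p
     | beam.Create([("a", 1), ("a", 2), ("b", 3)])
     # For an unbounded source, a non-global window (or an explicit
     # trigger on the global window) must precede GroupByKey/Combine.perKey.
     | beam.WindowInto(window.FixedWindows(60))
     | beam.CombinePerKey(sum)
     | beam.Map(print))
```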

> Task :sdks:java:testing:load-tests:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 18s
61 actionable tasks: 9 executed, 6 from cache, 46 up-to-date

Publishing build scan...
https://gradle.com/s/rye5ddowzgygs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_Combine_Portable_Flink_Streaming #8

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Streaming/8/display/redirect?page=changes>

Changes:

[chadrik] Add attributes defined in operations.pxd but missing in operations.py

[robertwb] Minor FnAPI proto cleanups.

[je.ik] [BEAM-9273] Explicitly disable @RequiresTimeSortedInput on unsupported

[je.ik] [BEAM-9273] code review - to be squashed

[kcweaver] [BEAM-9212] fix zetasql struct exception

[kcweaver] [BEAM-9211] Spark reuse Flink portable jar test script

[kcweaver] test_pipeline_jar Use single jar arg for both Flink and Spark.

[iemejia] Pin Avro dependency in Python SDK to be consistent with Avro versioning

[apilloud] [BEAM-9311] ZetaSQL Named Parameters are case-insensitive

[github] Bump dataflow container version (#10861)

[github] [BEAM-8335] Update StreamingCache with new Protos (#10856)

[github] [BEAM-9317] Fix portable test executions to specify the beam_fn_api

[je.ik] [BEAM-9265] @RequiresTimeSortedInput respects allowedLateness

[github] [BEAM-9289] Improve performance for metrics update of samza runner

[github] = instead of -eq

[iemejia] [BEAM-6857] Classify unbounded dynamic timers tests in the

[iemejia] Exclude Unbounded PCollection tests from Flink Portable runner batch

[github] [BEAM-9317] Fix Dataflow tests to not perform SplittableDoFn expansion

[iemejia] [BEAM-9315] Allow multiple paths via HADOOP_CONF_DIR in

[github] Update container images used by Dataflow runner with unreleased SDKs.

[github] [BEAM-9314] Make dot output deterministic (#10864)

[ccy] [BEAM-9277] Fix exception when running in IPython notebook.

[github] Remove experimental parallelization (-j 8) flags from sphinx

[iemejia] [BEAM-9301] Checkout the hash of master instead of the branch in beam

[github] [BEAM-8399] Add --hdfs_full_urls option (#10223)

[iemejia] Fix typo on runners/extensions-java label for github PR autolabeler

[github] Merge pull request #10862: [BEAM-9320] Add AlwaysFetched annotation


------------------------------------------
[...truncated 73.09 KB...]
+ gsutil cp -r init-actions/beam.sh init-actions/docker.sh init-actions/flink.sh gs://beam-flink-cluster/init-actions
Copying file://init-actions/beam.sh [Content-Type=text/x-sh]...
Copying file://init-actions/docker.sh [Content-Type=text/x-sh]...
Copying file://init-actions/flink.sh [Content-Type=text/x-sh]...
Operation completed over 3 objects/13.4 KiB.
+ create_cluster
+ local metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1,
+ metadata+=hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar
+ [[ -n gcr.io/apache-beam-testing/beam_portability/java_sdk:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/java_sdk:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_workers=6
+ gcloud dataproc clusters create beam-loadtests-java-portable-flink-streaming-8 --region=global --num-workers=6 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/java_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/3f572013-2f39-3deb-8a15-68ed8fdb39dc].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
.......................................................................................................................................................done.
Created [https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/global/clusters/beam-loadtests-java-portable-flink-streaming-8] Cluster placed in zone [us-central1-a].
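(Editor's note: the `metadata` string passed to `gcloud dataproc clusters create` above is built incrementally, appending the Beam image keys only when the corresponding variables are set. A hypothetical, abridged sketch of that assembly — `build_metadata` and its parameters are illustrative names, and the hadoop-jar-url key is omitted for brevity:)

```python
def build_metadata(flink_url, harness_images=None, job_server_image=None):
    """Assemble Dataproc cluster metadata as comma-separated key=value
    pairs, with optional Beam image keys appended only when provided."""
    pairs = [
        ("flink-snapshot-url", flink_url),
        ("flink-start-yarn-session", "true"),
        ("flink-taskmanager-slots", "1"),
    ]
    if harness_images:
        pairs.append(("beam-sdk-harness-images-to-pull", harness_images))
    if job_server_image:
        pairs.append(("beam-job-server-image", job_server_image))
    return ",".join(f"{k}={v}" for k, v in pairs)

print(build_metadata("https://example.invalid/flink.tgz",
                     job_server_image="img:latest"))
```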
+ get_leader
+ local i=0
+ local application_ids
+ local application_masters
+ echo 'Yarn Applications'
Yarn Applications
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-8-m '--command=yarn application -list'
++ grep beam-loadtests-java-portable-flink-streaming-8
Warning: Permanently added 'compute.3522865425407227466' (ECDSA) to the list of known hosts.
20/02/17 12:38:23 INFO client.RMProxy: Connecting to ResourceManager at beam-loadtests-java-portable-flink-streaming-8-m/10.128.0.240:8032
+ read line
+ echo application_1581943041694_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-streaming-8-w-3.c.apache-beam-testing.internal:46817
application_1581943041694_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-streaming-8-w-3.c.apache-beam-testing.internal:46817
++ echo application_1581943041694_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-streaming-8-w-3.c.apache-beam-testing.internal:46817
++ sed 's/ .*//'
+ application_ids[$i]=application_1581943041694_0001
++ echo application_1581943041694_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-streaming-8-w-3.c.apache-beam-testing.internal:46817
++ sed 's/.*beam-loadtests-java-portable-flink-streaming-8/beam-loadtests-java-portable-flink-streaming-8/'
++ sed 's/ .*//'
+ application_masters[$i]=beam-loadtests-java-portable-flink-streaming-8-w-3.c.apache-beam-testing.internal:46817
+ i=1
+ read line
+ '[' 1 '!=' 1 ']'
+ YARN_APPLICATION_MASTER=beam-loadtests-java-portable-flink-streaming-8-w-3.c.apache-beam-testing.internal:46817
+ echo 'Using Yarn Application master: beam-loadtests-java-portable-flink-streaming-8-w-3.c.apache-beam-testing.internal:46817'
Using Yarn Application master: beam-loadtests-java-portable-flink-streaming-8-w-3.c.apache-beam-testing.internal:46817
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
+ start_job_server
+ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-8-m '--command=sudo --user yarn docker run --detach --publish 8099:8099 --publish 8098:8098 --publish 8097:8097 --volume ~/.config/gcloud:/root/.config/gcloud gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest --flink-master=beam-loadtests-java-portable-flink-streaming-8-w-3.c.apache-beam-testing.internal:46817 --artifacts-dir=gs://beam-flink-cluster/beam-loadtests-java-portable-flink-streaming-8'
2732bc351a62985b40f24ae29adce9fe476149a564cde8bcadd5a74bb73fadae
+ start_tunnel
++ gcloud compute ssh --quiet --zone=us-central1-a yarn@beam-loadtests-java-portable-flink-streaming-8-m '--command=curl -s "http://beam-loadtests-java-portable-flink-streaming-8-w-3.c.apache-beam-testing.internal:46817/jobmanager/config"'
+ local 'job_server_config=[{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1581943041694_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-java-portable-flink-streaming-8-w-3.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"FLINK_PLUGINS_DIR","value":"/usr/lib/flink/plugins"},{"key":"web.tmpdir","value":"/tmp/flink-web-6d32e2bc-3ec7-4426-aeac-9c8e5f837ca3"},{"key":"jobmanager.rpc.port","value":"37375"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1581943041694_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-java-portable-flink-streaming-8-w-3.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local key=jobmanager.rpc.port
++ echo beam-loadtests-java-portable-flink-streaming-8-w-3.c.apache-beam-testing.internal:46817
++ cut -d : -f1
+ local yarn_application_master_host=beam-loadtests-java-portable-flink-streaming-8-w-3.c.apache-beam-testing.internal
++ echo '[{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1581943041694_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-java-portable-flink-streaming-8-w-3.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"FLINK_PLUGINS_DIR","value":"/usr/lib/flink/plugins"},{"key":"web.tmpdir","value":"/tmp/flink-web-6d32e2bc-3ec7-4426-aeac-9c8e5f837ca3"},{"key":"jobmanager.rpc.port","value":"37375"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1581943041694_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-java-portable-flink-streaming-8-w-3.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
++ python -c 'import sys, json; print([e['\''value'\''] for e in json.load(sys.stdin) if e['\''key'\''] == u'\''jobmanager.rpc.port'\''][0])'
+ local jobmanager_rpc_port=37375
++ [[ true == \t\r\u\e ]]
++ echo ' -Nf >& /dev/null'
+ local 'detached_mode_params= -Nf >& /dev/null'
++ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
++ echo '-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'job_server_ports_forwarding=-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'tunnel_command=gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-8-m -- -L 8081:beam-loadtests-java-portable-flink-streaming-8-w-3.c.apache-beam-testing.internal:46817 -L 37375:beam-loadtests-java-portable-flink-streaming-8-w-3.c.apache-beam-testing.internal:37375 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080  -Nf >& /dev/null'
+ eval gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-8-m -- -L 8081:beam-loadtests-java-portable-flink-streaming-8-w-3.c.apache-beam-testing.internal:46817 -L 37375:beam-loadtests-java-portable-flink-streaming-8-w-3.c.apache-beam-testing.internal:37375 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf '>&' /dev/null
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-8-m -- -L 8081:beam-loadtests-java-portable-flink-streaming-8-w-3.c.apache-beam-testing.internal:46817 -L 37375:beam-loadtests-java-portable-flink-streaming-8-w-3.c.apache-beam-testing.internal:37375 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf
[beam_LoadTests_Java_Combine_Portable_Flink_Streaming] $ /bin/bash -xe /tmp/jenkins112638241398306019.sh
+ echo '*** Load test: 2GB of 10B records on Flink in Portable mode ***'
*** Load test: 2GB of 10B records on Flink in Portable mode ***
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Streaming/ws/src/gradlew> -PloadTest.mainClass=org.apache.beam.sdk.loadtests.CombineLoadTest -Prunner=:runners:portability:java '-PloadTest.args=--project=apache-beam-testing --appName=load_tests_Java_Portable_Flink_streaming_Combine_1 --tempLocation=gs://temp-storage-for-perf-tests/loadtests --publishToBigQuery=true --bigQueryDataset=load_test --bigQueryTable=java_portable_flink_streaming_Combine_1 --sourceOptions={"numRecords":200000000,"keySizeBytes":1,"valueSizeBytes":9} --fanout=1 --iterations=1 --topCount=20 --sdkWorkerParallelism=5 --perKeyCombiner=TOP_LARGEST --streaming=true --jobEndpoint=localhost:8099 --defaultEnvironmentConfig=gcr.io/apache-beam-testing/beam_portability/java_sdk:latest --defaultEnvironmentType=DOCKER --runner=PortableRunner' --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :sdks:java:testing:load-tests:run
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :model:job-management:extractProto UP-TO-DATE
> Task :model:fn-execution:extractProto UP-TO-DATE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :runners:local-java:processResources NO-SOURCE
> Task :runners:portability:java:processResources NO-SOURCE
> Task :model:job-management:processResources UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractProto UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :runners:direct-java:processResources NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :sdks:java:io:kinesis:processResources NO-SOURCE
> Task :sdks:java:io:synthetic:processResources NO-SOURCE
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :sdks:java:testing:load-tests:processResources NO-SOURCE
> Task :model:fn-execution:processResources UP-TO-DATE
> Task :sdks:java:core:generateGrammarSource UP-TO-DATE
> Task :sdks:java:core:processResources UP-TO-DATE
> Task :model:pipeline:extractIncludeProto UP-TO-DATE
> Task :model:pipeline:extractProto UP-TO-DATE
> Task :model:pipeline:generateProto UP-TO-DATE
> Task :model:pipeline:compileJava UP-TO-DATE
> Task :model:pipeline:processResources UP-TO-DATE
> Task :model:pipeline:classes UP-TO-DATE
> Task :model:pipeline:jar UP-TO-DATE
> Task :model:pipeline:shadowJar UP-TO-DATE
> Task :model:job-management:extractIncludeProto UP-TO-DATE
> Task :model:fn-execution:extractIncludeProto UP-TO-DATE
> Task :model:job-management:generateProto UP-TO-DATE
> Task :model:fn-execution:generateProto UP-TO-DATE
> Task :model:job-management:compileJava UP-TO-DATE
> Task :model:job-management:classes UP-TO-DATE
> Task :model:fn-execution:compileJava UP-TO-DATE
> Task :model:fn-execution:classes UP-TO-DATE
> Task :model:job-management:shadowJar UP-TO-DATE
> Task :model:fn-execution:shadowJar UP-TO-DATE
> Task :sdks:java:core:compileJava UP-TO-DATE
> Task :sdks:java:core:classes UP-TO-DATE
> Task :sdks:java:core:shadowJar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractIncludeProto UP-TO-DATE
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:fn-execution:compileJava UP-TO-DATE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :runners:core-construction-java:compileJava UP-TO-DATE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :runners:local-java:compileJava FROM-CACHE
> Task :runners:local-java:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:compileJava UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :runners:core-construction-java:jar UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar UP-TO-DATE
> Task :runners:local-java:jar
> Task :vendor:sdks-java-extensions-protobuf:shadowJar UP-TO-DATE
> Task :sdks:java:io:synthetic:compileJava FROM-CACHE
> Task :sdks:java:io:synthetic:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:compileJava UP-TO-DATE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava UP-TO-DATE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar UP-TO-DATE
> Task :sdks:java:io:kafka:jar UP-TO-DATE
> Task :runners:core-java:compileJava UP-TO-DATE
> Task :runners:core-java:classes UP-TO-DATE
> Task :sdks:java:io:synthetic:jar
> Task :runners:core-java:jar UP-TO-DATE
> Task :sdks:java:io:kinesis:compileJava FROM-CACHE
> Task :sdks:java:io:kinesis:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:harness:compileJava UP-TO-DATE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar
> Task :sdks:java:harness:jar UP-TO-DATE
> Task :sdks:java:io:kinesis:jar
> Task :sdks:java:harness:shadowJar UP-TO-DATE
> Task :runners:java-fn-execution:compileJava UP-TO-DATE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:compileJava UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar UP-TO-DATE
> Task :runners:direct-java:compileJava FROM-CACHE
> Task :runners:direct-java:classes UP-TO-DATE
> Task :runners:direct-java:shadowJar
> Task :sdks:java:testing:load-tests:compileJava FROM-CACHE
> Task :sdks:java:testing:load-tests:classes UP-TO-DATE
> Task :sdks:java:testing:load-tests:jar
> Task :runners:portability:java:compileJava
> Task :runners:portability:java:classes
> Task :runners:portability:java:jar

> Task :sdks:java:testing:load-tests:run
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Exception in thread "main" java.lang.IllegalStateException: GroupByKey cannot be applied to non-bounded PCollection in the GlobalWindow without a trigger. Use a Window.into or Window.triggering transform prior to GroupByKey.
	at org.apache.beam.sdk.transforms.GroupByKey.applicableTo(GroupByKey.java:156)
	at org.apache.beam.sdk.transforms.GroupByKey.expand(GroupByKey.java:226)
	at org.apache.beam.sdk.transforms.GroupByKey.expand(GroupByKey.java:110)
	at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
	at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:476)
	at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:355)
	at org.apache.beam.sdk.transforms.Combine$PerKey.expand(Combine.java:1596)
	at org.apache.beam.sdk.transforms.Combine$PerKey.expand(Combine.java:1485)
	at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
	at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
	at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:368)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.loadTest(CombineLoadTest.java:134)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:96)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.run(CombineLoadTest.java:66)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.main(CombineLoadTest.java:169)
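[Editorial note] The IllegalStateException above is Beam's standard guard: in streaming mode the synthetic source is unbounded, so the GroupByKey inside Combine.PerKey is only legal once the collection has a non-global window or a trigger, exactly as the message says. A minimal sketch of that kind of windowing fix follows; the 10-second duration, the "Window"/"CountPerKey" step names, and the use of Count.perKey() are illustrative assumptions, not what CombineLoadTest actually configures:

```java
import org.apache.beam.sdk.transforms.Count;
import org.apache.beam.sdk.transforms.windowing.FixedWindows;
import org.apache.beam.sdk.transforms.windowing.Window;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;
import org.joda.time.Duration;

// `input` stands for the unbounded PCollection<KV<byte[], byte[]>> produced by
// the synthetic source when --streaming=true.
PCollection<KV<byte[], byte[]>> windowed =
    input.apply("Window",
        Window.into(FixedWindows.of(Duration.standardSeconds(10))));

// With a non-global window in place, the GroupByKey that Combine.PerKey
// expands to no longer trips the applicableTo() check.
windowed.apply("CountPerKey", Count.perKey());
```

This is a fragment, not a compilable unit; it only shows where a Window.into transform would have to sit relative to the combining step that failed above.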

> Task :sdks:java:testing:load-tests:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 17s
61 actionable tasks: 9 executed, 6 from cache, 46 up-to-date

Publishing build scan...
https://gradle.com/s/zuq6sulgt6vtq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_Combine_Portable_Flink_Streaming #7

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Streaming/7/display/redirect?page=changes>

Changes:

[dpcollins] Move external PubsubIO hooks outside of PubsubIO.

[github] [BEAM-9188] CassandraIO split performance improvement - cache size of

[robertwb] Only cache first page of paginated state.

[robertwb] Perform bundle-level caching if no cache token is given.

[robertwb] [BEAM-8298] Support side input cache tokens.

[radoslaws] spotless fixes

[robertwb] fix continuation token iter

[robertwb] lint for side input tokens

[github] Rename "word" to "line" for better readability

[github] Rename "words" to "line" also in docs

[radoslaws] comments and tests

[suztomo] bigtable-client-core 1.13.0 and exclusion and gax

[robinyqiu] Cleanup ZetaSQLQueryPlanner and ExpressionConverter code

[suztomo] Controlling grpc-grpclb and grpc-core

[robertwb] Fix state cache test.

[robertwb] TODO about two-level caching.

[robertwb] CachingStateHandler unit test.

[github] "Upgrade" google-cloud-spanner version to 1.13.0

[github] Removing none instead of bare return

[michal.walenia] [BEAM-9226] Set max age of 3h for Dataproc Flink clusters

[je.ik] [BEAM-8550] @RequiresTimeSortedInput: working with legacy flink and

[kamil.wasilewski] Generate 100kB records in GroupByKey Load test 3

[robertwb] [BEAM-9227] Defer bounded source size estimation to the workers.

[chadrik] [BEAM-8271] Properly encode/decode StateGetRequest/Response

[github] [BEAM-8042] [ZetaSQL] Fix aggregate column reference (#10649)

[robertwb] test lint

[robertwb] Fix extending non-list.

[robertwb] Fix some missing (but unused) output_processor constructor arguments.

[chadrik] [BEAM-7746] Avoid errors about Unsupported operand types for >= ("int"

[robertwb] Fix flink counters test.

[github] [BEAM-8590] Support unsubscripted native types (#10042)

[github] Revert "[BEAM-9226] Set max age of 3h for Dataproc Flink clusters"

[radoslaws] spottless

[mxm] [BEAM-9132] Avoid logging misleading error messages during pipeline

[github] [BEAM-8889] Cleanup Beam to GCS connector interfacing code so it uses

[heejong] [BEAM-7961] Add tests for all runner native transforms and some widely

[github] [BEAM-9233] Support -buildmode=pie -ldflags=-w with unregistered Go

[github] [BEAM-9167] Metrics extraction refactoring. (#10716)

[kenn] Clarify exceptions in SQL modules

[github] Update Beam Python container release

[github] No longer reporting Lulls as errors in the worker.

[iemejia] [BEAM-9236] Mark missing Schema based classes and methods as

[iemejia] [BEAM-9236] Remove unneeded schema related class FieldValueSetterFactory

[iemejia] [BEAM-9236] Remove unused schema related class FieldValueGetterFactory

[iemejia] [BEAM-6857] Recategorize UsesTimerMap tests to ValidatesRunner

[hsuryawirawan] Update Beam Katas Java to use Beam version 2.18.0

[kamil.wasilewski] Remove some tests in Python GBK on Flink suite

[hsuryawirawan] Update Beam Katas Python to use Beam version 2.18.0

[kamil.wasilewski] [BEAM-9234] Avoid using unreleased versions of PerfKitBenchmarker

[github] Adding new source tests for Py BQ source (#10732)

[suztomo] Introducing google-http-client.version

[github] [BEAM-8280][BEAM-8629] Make IOTypeHints immutable (#10735)

[heejong] [BEAM-9230] Enable CrossLanguageValidateRunner test for Spark runner

[suztomo] Property google-api-client

[ehudm] [BEAM-8095] Remove no_xdist for test

[zyichi] Remove managing late data not supported by python sdk note

[echauchot] Embed audio podcasts players to webpage instead of links that play the

[iemejia] [BEAM-9236] Mark missing Schema based classes and methods as

[yoshiki.obata] [BEAM-9163] update sphinx_rtd_theme to newest

[iemejia] [BEAM-7310] Add support of Confluent Schema Registry for KafkaIO

[altay] Add CHANGES.md file

[robinyqiu] Support all ZetaSQL TIMESTAMP functions

[github] [BEAM-4150] Remove fallback case for coder not specified within

[github] [BEAM-9009] Add pytest-timeout plugin, set timeout (#10437)

[github] [BEAM-3221] Expand/clarify timestamp comments within

[boyuanz] Add new release 2.19.0 to beam website.

[boyuanz] Update beam 2.19.0 release blog

[ehudm] Convert repo.spring.io to use https + 1 other

[ehudm] [BEAM-9251] Fix :sdks:java:io:kafka:updateOfflineRepository

[gleb] Fix AvroIO javadoc for deprecated methods

[github] [BEAM-5605] Migrate splittable DoFn methods to use "new" DoFn style

[github] [BEAM-6703] Make Dataflow ValidatesRunner test use Java 11 in test

[daniel.o.programmer] [BEAM-3301] Small cleanup to FullValue code.

[apilloud] [BEAM-8630] Add logical types, make public

[github] [BEAM-9037] Instant and duration as logical type (#10486)

[github] [BEAM-2645] Define the display data model type

[kamil.wasilewski] [BEAM-9175] Add yapf autoformatter

[kamil.wasilewski] [BEAM-9175] Yapf everywhere!

[kamil.wasilewski] [BEAM-9175] Fix pylint issues

[kamil.wasilewski] [BEAM-9175] Add pre-commit Jenkins job

[kamil.wasilewski] [BEAM-9175] Disable bad-continuation check in pylint

[amyrvold] [BEAM-9261] Add LICENSE and NOTICE to Docker images

[github] [BEAM-8951] Stop using nose in load tests (#10435)

[robertwb] [BEAM-7746] Cleanup historical DnFnRunner-as-Receiver cruft.

[robertwb] [BEAM-8976] Initalize logging configuration at a couple of other entry

[chadrik] [BEAM-7746] Add typing for try_split

[zyichi] Fix race exception in python worker status thread dump

[iemejia] [BEAM-9264] Upgrade Spark to version 2.4.5

[hsuryawirawan] Update Beam Katas Java to use Beam version 2.19.0

[hsuryawirawan] Update Beam Katas Python to use Beam version 2.19.0

[hsuryawirawan] Update Beam Katas Python on Stepik

[hsuryawirawan] Update Built-in IOs task type to theory

[hsuryawirawan] Update Beam Katas Java on Stepik

[kamil.wasilewski] Fix method name in Combine and coGBK tests

[github] [BEAM-3453] Use project specified in pipeline_options when creating

[robertwb] [BEAM-9266] Remove unused fields from provisioning API.

[github] [BEAM-9262] Clean-up endpoints.proto to a stable state (#10789)

[lcwik] [BEAM-3595] Migrate to "v1" URNs for standard window fns.

[daniel.o.programmer] [BEAM-3301] (Go SDK) Adding restriction plumbing to graph construction.

[robertwb] Remove one more reference to provision resources.

[github] Merge pull request #10766: [BEAM-4461] Add Selected.flattenedSchema

[robertwb] Reject unsupported WindowFns and Window types.

[github] Merge pull request #10804: [BEAM-2535] Fix timer map

[github] Merge pull request #10627:[BEAM-2535] Support outputTimestamp and

[iemejia] [BEAM-7092] Fix invalid import of Guava coming from transitive Spark dep

[alex] [BEAM-9241] Fix inconsistent proto nullability

[kamil.wasilewski] Move imports and variables out of global namespace

[iemejia] [BEAM-9281] Update commons-csv to version 1.8

[iemejia] [website] Update Java 11 and Spark roadmap

[apilloud] [BEAM-8630] Validate prepared expression on expand

[github] [BEAM-9268] SpannerIO: Add more documentation and warnings for unknown

[iemejia] [BEAM-9231] Add Experimental(Kind.PORTABILITY) and tag related classes

[iemejia] [BEAM-9231] Tag SplittableDoFn related classes/methods as Experimental

[iemejia] [BEAM-9231] Make Experimental annotations homogeneous in

[iemejia] [BEAM-9231] Untag Experimental/Internal classes not needed to write

[iemejia] [BEAM-9231] Tag beam-sdks-java-core internal classes as Internal

[iemejia] [BEAM-9231] Tag DoFn.OnTimerContext as Experimental(Kind.TIMERS)

[iemejia] [BEAM-9231] Tag Experimental/Internal packages in beam-sdks-java-core

[iemejia] [BEAM-9231] Tag Experimental/Internal packages in IOs and extensions

[iemejia] [BEAM-9231] Tag public but internal IOs and extensions classes as

[yoshiki.obata] [BEAM-7198] rename ToStringCoder to ToBytesCoder for proper

[iemejia] [BEAM-9160] Update AWS SDK to support Pod Level Identity

[yoshiki.obata] [BEAM-7198] add comment

[ankurgoenka] [BEAM-9290] Support runner_harness_container_image in released python

[boyuanz] Move ThreadsafeRestrictionTracker and RestrictionTrackerView out from

[github] Remove tables and refer to dependency locations in code (#10745)

[ehudm] fix lint

[valentyn] Cleanup MappingProxy reducer since dill supports it natively now.

[suztomo] beam-linkage-check.sh

[iemejia] Enable probot autolabeler action to label github pull requests

[iemejia] Remove prefixes in autolabeler configuration to improve readability

[iemejia] [BEAM-9160] Removed WebIdentityTokenCredentialsProvider explicit json

[suztomo] copyright

[yoshiki.obata] [BEAM-7198] fixup: reformatted with yapf

[github] [BEAM-3221] Clarify documentation for StandardTransforms.Primitives,

[aromanenko.dev] [BEAM-9292] Provide an ability to specify additional maven repositories

[aromanenko.dev] [BEAM-9292] KafkaIO: add io.confluent repository to published POM

[github] [BEAM-8201] Add other endpoint fields to provision API. (#10839)

[github] [BEAM-9269] Add commit deadline for Spanner writes. (#10752)

[github] [AVRO-2737] Exclude a buggy avro version from requirements spec.

[iemejia] Refine labels/categories for PR autolabeling

[github] Update roadmap page for python 3 support

[iemejia] [BEAM-9160] Removed WebIdentityTokenCredentialsProvider explicit json

[iemejia] Remove unused ReduceFnRunnerHelper class

[iemejia] Do not set options.filesToStage in case of spark local execution in

[iemejia] Do not set options.filesToStage in case of spark local execution in

[github] [BEAM-6522] [BEAM-7455] Unskip Avro IO tests that are now passing.

[github] [BEAM-5605] Convert all BoundedSources to SplittableDoFns when using

[github] [BEAM-8758] Google-cloud-spanner upgrade to 1.49.1 (#10765)

[github] Ensuring appropriate write_disposition and create_disposition for jobs

[github] [BEAM-3545] Return metrics as MonitoringInfos (#10777)

[github] Modify the TestStreamFileRecord to use TestStreamPayload events.

[iemejia] [BEAM-9280] Update commons-compress to version 1.20


------------------------------------------
[...truncated 73.82 KB...]
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1,
+ metadata+=hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar
+ [[ -n gcr.io/apache-beam-testing/beam_portability/java_sdk:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/java_sdk:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_workers=6
+ gcloud dataproc clusters create beam-loadtests-java-portable-flink-streaming-7 --region=global --num-workers=6 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/java_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/e4f1fe7e-8d12-393b-94e6-b95105b11326].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
.......................................................................................................................................................done.
Created [https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/global/clusters/beam-loadtests-java-portable-flink-streaming-7] Cluster placed in zone [us-central1-a].
+ get_leader
+ local i=0
+ local application_ids
+ local application_masters
+ echo 'Yarn Applications'
Yarn Applications
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-7-m '--command=yarn application -list'
++ grep beam-loadtests-java-portable-flink-streaming-7
Warning: Permanently added 'compute.2682972823939132776' (ECDSA) to the list of known hosts.
20/02/14 12:35:44 INFO client.RMProxy: Connecting to ResourceManager at beam-loadtests-java-portable-flink-streaming-7-m/10.128.0.164:8032
+ read line
+ echo application_1581683685065_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-streaming-7-w-4.c.apache-beam-testing.internal:40143
application_1581683685065_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-streaming-7-w-4.c.apache-beam-testing.internal:40143
++ echo application_1581683685065_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-streaming-7-w-4.c.apache-beam-testing.internal:40143
++ sed 's/ .*//'
+ application_ids[$i]=application_1581683685065_0001
++ echo application_1581683685065_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-streaming-7-w-4.c.apache-beam-testing.internal:40143
++ sed 's/.*beam-loadtests-java-portable-flink-streaming-7/beam-loadtests-java-portable-flink-streaming-7/'
++ sed 's/ .*//'
+ application_masters[$i]=beam-loadtests-java-portable-flink-streaming-7-w-4.c.apache-beam-testing.internal:40143
+ i=1
+ read line
+ '[' 1 '!=' 1 ']'
+ YARN_APPLICATION_MASTER=beam-loadtests-java-portable-flink-streaming-7-w-4.c.apache-beam-testing.internal:40143
+ echo 'Using Yarn Application master: beam-loadtests-java-portable-flink-streaming-7-w-4.c.apache-beam-testing.internal:40143'
Using Yarn Application master: beam-loadtests-java-portable-flink-streaming-7-w-4.c.apache-beam-testing.internal:40143
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
+ start_job_server
+ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-7-m '--command=sudo --user yarn docker run --detach --publish 8099:8099 --publish 8098:8098 --publish 8097:8097 --volume ~/.config/gcloud:/root/.config/gcloud gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest --flink-master=beam-loadtests-java-portable-flink-streaming-7-w-4.c.apache-beam-testing.internal:40143 --artifacts-dir=gs://beam-flink-cluster/beam-loadtests-java-portable-flink-streaming-7'
3486f21fb857dfd18bf9022026435b4564902f08347fececfb51f6851286716b
+ start_tunnel
++ gcloud compute ssh --quiet --zone=us-central1-a yarn@beam-loadtests-java-portable-flink-streaming-7-m '--command=curl -s "http://beam-loadtests-java-portable-flink-streaming-7-w-4.c.apache-beam-testing.internal:40143/jobmanager/config"'
+ local 'job_server_config=[{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1581683685065_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-java-portable-flink-streaming-7-w-4.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"FLINK_PLUGINS_DIR","value":"/usr/lib/flink/plugins"},{"key":"web.tmpdir","value":"/tmp/flink-web-9c3c6a14-63d4-4a22-80b8-8b476e847064"},{"key":"jobmanager.rpc.port","value":"36033"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1581683685065_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-java-portable-flink-streaming-7-w-4.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local key=jobmanager.rpc.port
++ echo beam-loadtests-java-portable-flink-streaming-7-w-4.c.apache-beam-testing.internal:40143
++ cut -d : -f1
+ local yarn_application_master_host=beam-loadtests-java-portable-flink-streaming-7-w-4.c.apache-beam-testing.internal
++ python -c 'import sys, json; print([e['\''value'\''] for e in json.load(sys.stdin) if e['\''key'\''] == u'\''jobmanager.rpc.port'\''][0])'
++ echo '[{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1581683685065_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-java-portable-flink-streaming-7-w-4.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"FLINK_PLUGINS_DIR","value":"/usr/lib/flink/plugins"},{"key":"web.tmpdir","value":"/tmp/flink-web-9c3c6a14-63d4-4a22-80b8-8b476e847064"},{"key":"jobmanager.rpc.port","value":"36033"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1581683685065_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-java-portable-flink-streaming-7-w-4.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local jobmanager_rpc_port=36033
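[Editorial note] The shell trace above digs jobmanager.rpc.port out of the JSON returned by Flink's /jobmanager/config endpoint with an inline Python one-liner. The same lookup as a small standalone sketch (the config list here is abbreviated from the log):

```python
import json

# Abbreviated form of the key/value list returned by /jobmanager/config.
CONFIG_JSON = """[
  {"key": "jobmanager.rpc.address",
   "value": "beam-loadtests-java-portable-flink-streaming-7-w-4.c.apache-beam-testing.internal"},
  {"key": "jobmanager.rpc.port", "value": "36033"},
  {"key": "taskmanager.numberOfTaskSlots", "value": "1"}
]"""


def lookup(config_text, key):
    """Return the first value whose entry matches `key`, as the one-liner does."""
    return [e["value"] for e in json.loads(config_text) if e["key"] == key][0]


print(lookup(CONFIG_JSON, "jobmanager.rpc.port"))  # prints 36033
```

The value comes back as a string ("36033"), which is all the tunnel command above needs for its -L port-forward argument.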
++ [[ true == \t\r\u\e ]]
++ echo ' -Nf >& /dev/null'
+ local 'detached_mode_params= -Nf >& /dev/null'
++ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
++ echo '-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'job_server_ports_forwarding=-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'tunnel_command=gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-7-m -- -L 8081:beam-loadtests-java-portable-flink-streaming-7-w-4.c.apache-beam-testing.internal:40143 -L 36033:beam-loadtests-java-portable-flink-streaming-7-w-4.c.apache-beam-testing.internal:36033 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080  -Nf >& /dev/null'
+ eval gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-7-m -- -L 8081:beam-loadtests-java-portable-flink-streaming-7-w-4.c.apache-beam-testing.internal:40143 -L 36033:beam-loadtests-java-portable-flink-streaming-7-w-4.c.apache-beam-testing.internal:36033 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf '>&' /dev/null
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-7-m -- -L 8081:beam-loadtests-java-portable-flink-streaming-7-w-4.c.apache-beam-testing.internal:40143 -L 36033:beam-loadtests-java-portable-flink-streaming-7-w-4.c.apache-beam-testing.internal:36033 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf
[beam_LoadTests_Java_Combine_Portable_Flink_Streaming] $ /bin/bash -xe /tmp/jenkins6769699700540123965.sh
+ echo src Load test: 2GB of 10B records on Flink in Portable mode src
src Load test: 2GB of 10B records on Flink in Portable mode src
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Streaming/ws/src/gradlew> -PloadTest.mainClass=org.apache.beam.sdk.loadtests.CombineLoadTest -Prunner=:runners:portability:java '-PloadTest.args=--project=apache-beam-testing --appName=load_tests_Java_Portable_Flink_streaming_Combine_1 --tempLocation=gs://temp-storage-for-perf-tests/loadtests --publishToBigQuery=true --bigQueryDataset=load_test --bigQueryTable=java_portable_flink_streaming_Combine_1 --sourceOptions={"numRecords":200000000,"keySizeBytes":1,"valueSizeBytes":9} --fanout=1 --iterations=1 --topCount=20 --sdkWorkerParallelism=5 --perKeyCombiner=TOP_LARGEST --streaming=true --jobEndpoint=localhost:8099 --defaultEnvironmentConfig=gcr.io/apache-beam-testing/beam_portability/java_sdk:latest --defaultEnvironmentType=DOCKER --runner=PortableRunner' --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :sdks:java:testing:load-tests:run
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :model:fn-execution:extractProto UP-TO-DATE
> Task :model:job-management:extractProto UP-TO-DATE
> Task :runners:local-java:processResources NO-SOURCE
> Task :runners:direct-java:processResources NO-SOURCE
> Task :runners:portability:java:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:extractProto UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :sdks:java:io:kinesis:processResources NO-SOURCE
> Task :sdks:java:io:synthetic:processResources NO-SOURCE
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :sdks:java:testing:load-tests:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :model:fn-execution:processResources UP-TO-DATE
> Task :model:job-management:processResources UP-TO-DATE
> Task :sdks:java:core:generateGrammarSource UP-TO-DATE
> Task :sdks:java:core:processResources UP-TO-DATE
> Task :model:pipeline:extractIncludeProto UP-TO-DATE
> Task :model:pipeline:extractProto UP-TO-DATE
> Task :model:pipeline:generateProto UP-TO-DATE
> Task :model:pipeline:compileJava UP-TO-DATE
> Task :model:pipeline:processResources UP-TO-DATE
> Task :model:pipeline:classes UP-TO-DATE
> Task :model:pipeline:jar UP-TO-DATE
> Task :model:pipeline:shadowJar UP-TO-DATE
> Task :model:fn-execution:extractIncludeProto UP-TO-DATE
> Task :model:job-management:extractIncludeProto UP-TO-DATE
> Task :model:job-management:generateProto UP-TO-DATE
> Task :model:fn-execution:generateProto UP-TO-DATE
> Task :model:job-management:compileJava UP-TO-DATE
> Task :model:job-management:classes UP-TO-DATE
> Task :model:fn-execution:compileJava UP-TO-DATE
> Task :model:fn-execution:classes UP-TO-DATE
> Task :model:job-management:shadowJar UP-TO-DATE
> Task :model:fn-execution:shadowJar UP-TO-DATE
> Task :sdks:java:core:compileJava UP-TO-DATE
> Task :sdks:java:core:classes UP-TO-DATE
> Task :sdks:java:core:shadowJar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractIncludeProto UP-TO-DATE
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:fn-execution:compileJava UP-TO-DATE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :runners:local-java:compileJava FROM-CACHE
> Task :runners:local-java:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar UP-TO-DATE
> Task :runners:core-construction-java:compileJava UP-TO-DATE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:compileJava UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar UP-TO-DATE
> Task :runners:core-construction-java:jar UP-TO-DATE
> Task :runners:local-java:jar
> Task :sdks:java:extensions:protobuf:compileJava UP-TO-DATE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar UP-TO-DATE
> Task :runners:core-java:compileJava UP-TO-DATE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar UP-TO-DATE
> Task :sdks:java:harness:compileJava UP-TO-DATE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar UP-TO-DATE
> Task :sdks:java:harness:shadowJar UP-TO-DATE
> Task :runners:java-fn-execution:compileJava UP-TO-DATE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar UP-TO-DATE
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava UP-TO-DATE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar UP-TO-DATE
> Task :runners:direct-java:compileJava FROM-CACHE
> Task :runners:direct-java:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar
> Task :runners:portability:java:compileJava FROM-CACHE
> Task :runners:portability:java:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:compileJava UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :runners:portability:java:jar
> Task :sdks:java:io:google-cloud-platform:jar UP-TO-DATE
> Task :runners:direct-java:shadowJar

> Task :sdks:java:io:synthetic:compileJava
Note: <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Streaming/ws/src/sdks/java/io/synthetic/src/main/java/org/apache/beam/sdk/io/synthetic/SyntheticBoundedSource.java> uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.

> Task :sdks:java:io:synthetic:classes
> Task :sdks:java:io:synthetic:jar
> Task :sdks:java:io:kinesis:compileJava
> Task :sdks:java:io:kinesis:classes
> Task :sdks:java:io:kinesis:jar

> Task :sdks:java:testing:load-tests:compileJava
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:testing:load-tests:classes
> Task :sdks:java:testing:load-tests:jar

> Task :sdks:java:testing:load-tests:run
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Exception in thread "main" java.lang.IllegalStateException: GroupByKey cannot be applied to non-bounded PCollection in the GlobalWindow without a trigger. Use a Window.into or Window.triggering transform prior to GroupByKey.
	at org.apache.beam.sdk.transforms.GroupByKey.applicableTo(GroupByKey.java:156)
	at org.apache.beam.sdk.transforms.GroupByKey.expand(GroupByKey.java:226)
	at org.apache.beam.sdk.transforms.GroupByKey.expand(GroupByKey.java:110)
	at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
	at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:476)
	at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:355)
	at org.apache.beam.sdk.transforms.Combine$PerKey.expand(Combine.java:1596)
	at org.apache.beam.sdk.transforms.Combine$PerKey.expand(Combine.java:1485)
	at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
	at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
	at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:368)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.loadTest(CombineLoadTest.java:134)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:96)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.run(CombineLoadTest.java:66)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.main(CombineLoadTest.java:169)
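The IllegalStateException above is Beam's pre-submission check: Combine.perKey expands into a GroupByKey (visible in the trace), and GroupByKey rejects an unbounded PCollection that is still in the GlobalWindow with no trigger, since such a grouping could never emit output. A non-runnable sketch of the remedy the message itself names (Beam Java SDK assumed on the classpath; `input` and `combineFn` are placeholders, not names from CombineLoadTest):

```java
// Sketch only -- requires the Beam Java SDK; `input` stands in for the
// unbounded PCollection from the synthetic source.
// Any non-global windowing (or a trigger) applied before the combine
// satisfies GroupByKey.applicableTo for an unbounded PCollection.
input
    .apply(Window.<KV<byte[], byte[]>>into(
        FixedWindows.of(Duration.standardSeconds(60))))
    .apply(Combine.perKey(combineFn));
```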

> Task :sdks:java:testing:load-tests:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 15s
61 actionable tasks: 11 executed, 4 from cache, 46 up-to-date

Publishing build scan...
https://gradle.com/s/gsguzkjvjsboe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_Combine_Portable_Flink_Streaming #6

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Streaming/6/display/redirect?page=changes>

Changes:

[kmj] Update BQ Storage API documentation

[chadrik] [BEAM-7746] Silence a bunch of errors about "Cannot instantiate abstract

[mxm] [BEAM-9161] Ensure non-volatile access of field variables by processing

[github] Merge pull request #10680 from Indefinite retries to wait for a BQ Load

[chamikara] Fixes an issue where FileBasedSink may suppress exceptions.

[github] [BEAM-7847] enabled to generate SDK docs with Python3 (#10141)

[ankurgoenka] [BEAM-9220] Adding argument use_runner_v2 for dataflow unified worker

[suztomo] Linkage Checker 1.1.3


------------------------------------------
[...truncated 72.61 KB...]
+ gsutil cp -r init-actions/beam.sh init-actions/docker.sh init-actions/flink.sh gs://beam-flink-cluster/init-actions
Copying file://init-actions/beam.sh [Content-Type=text/x-sh]...
Copying file://init-actions/docker.sh [Content-Type=text/x-sh]...
Copying file://init-actions/flink.sh [Content-Type=text/x-sh]...
Operation completed over 3 objects/13.4 KiB.
+ create_cluster
+ local metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1,
+ metadata+=hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar
+ [[ -n gcr.io/apache-beam-testing/beam_portability/java_sdk:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/java_sdk:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_workers=6
+ gcloud dataproc clusters create beam-loadtests-java-portable-flink-streaming-6 --region=global --num-workers=6 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/java_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/ad8ee15a-316b-3d6b-9dac-8d2aae7a6514].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
done.
Created [https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/global/clusters/beam-loadtests-java-portable-flink-streaming-6] Cluster placed in zone [us-central1-a].
+ get_leader
+ local i=0
+ local application_ids
+ local application_masters
+ echo 'Yarn Applications'
Yarn Applications
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-6-m '--command=yarn application -list'
++ grep beam-loadtests-java-portable-flink-streaming-6
Warning: Permanently added 'compute.2375112437195043017' (ECDSA) to the list of known hosts.
20/01/30 12:40:32 INFO client.RMProxy: Connecting to ResourceManager at beam-loadtests-java-portable-flink-streaming-6-m/10.128.0.198:8032
+ read line
+ echo application_1580387975132_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-streaming-6-w-1.us-central1-a.c.apache-beam-testing.internal:38913
application_1580387975132_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-streaming-6-w-1.us-central1-a.c.apache-beam-testing.internal:38913
++ echo application_1580387975132_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-streaming-6-w-1.us-central1-a.c.apache-beam-testing.internal:38913
++ sed 's/ .*//'
+ application_ids[$i]=application_1580387975132_0001
++ echo application_1580387975132_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-streaming-6-w-1.us-central1-a.c.apache-beam-testing.internal:38913
++ sed 's/.*beam-loadtests-java-portable-flink-streaming-6/beam-loadtests-java-portable-flink-streaming-6/'
++ sed 's/ .*//'
+ application_masters[$i]=beam-loadtests-java-portable-flink-streaming-6-w-1.us-central1-a.c.apache-beam-testing.internal:38913
+ i=1
+ read line
+ '[' 1 '!=' 1 ']'
+ YARN_APPLICATION_MASTER=beam-loadtests-java-portable-flink-streaming-6-w-1.us-central1-a.c.apache-beam-testing.internal:38913
+ echo 'Using Yarn Application master: beam-loadtests-java-portable-flink-streaming-6-w-1.us-central1-a.c.apache-beam-testing.internal:38913'
Using Yarn Application master: beam-loadtests-java-portable-flink-streaming-6-w-1.us-central1-a.c.apache-beam-testing.internal:38913
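The `get_leader` trace above pulls two things out of each `yarn application -list` row with `sed`: the application id (everything before the first space) and the application master host:port (the tracking URL with its scheme stripped). The same extraction as a small self-contained sketch, using the sample row captured in the log:

```python
def parse_yarn_application_row(row: str):
    """Split a `yarn application -list` row into (application_id, master host:port)."""
    fields = row.split()
    application_id = fields[0]   # first field, e.g. application_1580387975132_0001
    tracking_url = fields[-1]    # last field, e.g. http://<master-host>:<port>
    # Drop the scheme, mirroring the sed substitution in the trace.
    master = tracking_url.split("://", 1)[-1]
    return application_id, master

row = ("application_1580387975132_0001 flink-dataproc Apache Flink yarn default "
       "RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-streaming"
       "-6-w-1.us-central1-a.c.apache-beam-testing.internal:38913")
app_id, master = parse_yarn_application_row(row)
```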
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
+ start_job_server
+ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-6-m '--command=sudo --user yarn docker run --detach --publish 8099:8099 --publish 8098:8098 --publish 8097:8097 --volume ~/.config/gcloud:/root/.config/gcloud gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest --flink-master=beam-loadtests-java-portable-flink-streaming-6-w-1.us-central1-a.c.apache-beam-testing.internal:38913 --artifacts-dir=gs://beam-flink-cluster/beam-loadtests-java-portable-flink-streaming-6'
1475caec41ec9be1b878d0374d44dfa830be053fbc33be54c9173589658cdee1
+ start_tunnel
++ gcloud compute ssh --quiet --zone=us-central1-a yarn@beam-loadtests-java-portable-flink-streaming-6-m '--command=curl -s "http://beam-loadtests-java-portable-flink-streaming-6-w-1.us-central1-a.c.apache-beam-testing.internal:38913/jobmanager/config"'
+ local 'job_server_config=[{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1580387975132_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-java-portable-flink-streaming-6-w-1.us-central1-a.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"FLINK_PLUGINS_DIR","value":"/usr/lib/flink/plugins"},{"key":"web.tmpdir","value":"/tmp/flink-web-9b8fd9bc-a9f7-41a6-9062-82d70a4f6d18"},{"key":"jobmanager.rpc.port","value":"45853"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1580387975132_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-java-portable-flink-streaming-6-w-1.us-central1-a.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local key=jobmanager.rpc.port
++ echo beam-loadtests-java-portable-flink-streaming-6-w-1.us-central1-a.c.apache-beam-testing.internal:38913
++ cut -d : -f1
+ local yarn_application_master_host=beam-loadtests-java-portable-flink-streaming-6-w-1.us-central1-a.c.apache-beam-testing.internal
++ echo '[{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1580387975132_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-java-portable-flink-streaming-6-w-1.us-central1-a.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"FLINK_PLUGINS_DIR","value":"/usr/lib/flink/plugins"},{"key":"web.tmpdir","value":"/tmp/flink-web-9b8fd9bc-a9f7-41a6-9062-82d70a4f6d18"},{"key":"jobmanager.rpc.port","value":"45853"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1580387975132_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-java-portable-flink-streaming-6-w-1.us-central1-a.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
++ python -c 'import sys, json; print([e['\''value'\''] for e in json.load(sys.stdin) if e['\''key'\''] == u'\''jobmanager.rpc.port'\''][0])'
+ local jobmanager_rpc_port=45853
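The inline `python -c` above selects one value from Flink's `/jobmanager/config` response, which is a JSON list of `{"key": ..., "value": ...}` entries. The equivalent extraction as a standalone sketch, against an abbreviated copy of the config captured in the trace:

```python
import json

def flink_config_value(config_json: str, key: str) -> str:
    """Return the value for `key` from Flink's /jobmanager/config JSON,
    a list of {"key": ..., "value": ...} entries."""
    entries = json.loads(config_json)
    return [e["value"] for e in entries if e["key"] == key][0]

# Abbreviated copy of the jobmanager config shown in the trace above.
config = ('[{"key":"web.port","value":"0"},'
          '{"key":"jobmanager.rpc.port","value":"45853"}]')
port = flink_config_value(config, "jobmanager.rpc.port")
```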
++ [[ true == \t\r\u\e ]]
++ echo ' -Nf >& /dev/null'
+ local 'detached_mode_params= -Nf >& /dev/null'
++ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
++ echo '-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'job_server_ports_forwarding=-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'tunnel_command=gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-6-m -- -L 8081:beam-loadtests-java-portable-flink-streaming-6-w-1.us-central1-a.c.apache-beam-testing.internal:38913 -L 45853:beam-loadtests-java-portable-flink-streaming-6-w-1.us-central1-a.c.apache-beam-testing.internal:45853 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080  -Nf >& /dev/null'
+ eval gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-6-m -- -L 8081:beam-loadtests-java-portable-flink-streaming-6-w-1.us-central1-a.c.apache-beam-testing.internal:38913 -L 45853:beam-loadtests-java-portable-flink-streaming-6-w-1.us-central1-a.c.apache-beam-testing.internal:45853 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf '>&' /dev/null
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-6-m -- -L 8081:beam-loadtests-java-portable-flink-streaming-6-w-1.us-central1-a.c.apache-beam-testing.internal:38913 -L 45853:beam-loadtests-java-portable-flink-streaming-6-w-1.us-central1-a.c.apache-beam-testing.internal:45853 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf
[beam_LoadTests_Java_Combine_Portable_Flink_Streaming] $ /bin/bash -xe /tmp/jenkins1456970003222714176.sh
+ echo src Load test: 2GB of 10B records on Flink in Portable mode src
src Load test: 2GB of 10B records on Flink in Portable mode src
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Streaming/ws/src/gradlew> -PloadTest.mainClass=org.apache.beam.sdk.loadtests.CombineLoadTest -Prunner=:runners:portability:java '-PloadTest.args=--project=apache-beam-testing --appName=load_tests_Java_Portable_Flink_streaming_Combine_1 --tempLocation=gs://temp-storage-for-perf-tests/loadtests --publishToBigQuery=true --bigQueryDataset=load_test --bigQueryTable=java_portable_flink_streaming_Combine_1 --sourceOptions={"numRecords":200000000,"keySizeBytes":1,"valueSizeBytes":9} --fanout=1 --iterations=1 --topCount=20 --sdkWorkerParallelism=5 --perKeyCombiner=TOP_LARGEST --streaming=true --jobEndpoint=localhost:8099 --defaultEnvironmentConfig=gcr.io/apache-beam-testing/beam_portability/java_sdk:latest --defaultEnvironmentType=DOCKER --runner=PortableRunner' --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :sdks:java:testing:load-tests:run
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :model:job-management:extractProto UP-TO-DATE
> Task :runners:local-java:processResources NO-SOURCE
> Task :model:fn-execution:extractProto UP-TO-DATE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :runners:direct-java:processResources NO-SOURCE
> Task :runners:portability:java:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:extractProto UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :sdks:java:io:kinesis:processResources NO-SOURCE
> Task :sdks:java:io:synthetic:processResources NO-SOURCE
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :sdks:java:core:generateGrammarSource UP-TO-DATE
> Task :sdks:java:testing:load-tests:processResources NO-SOURCE
> Task :sdks:java:core:processResources UP-TO-DATE
> Task :model:fn-execution:processResources UP-TO-DATE
> Task :model:job-management:processResources UP-TO-DATE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :model:pipeline:extractIncludeProto UP-TO-DATE
> Task :model:pipeline:extractProto UP-TO-DATE
> Task :model:pipeline:generateProto UP-TO-DATE
> Task :model:pipeline:compileJava UP-TO-DATE
> Task :model:pipeline:processResources UP-TO-DATE
> Task :model:pipeline:classes UP-TO-DATE
> Task :model:pipeline:jar UP-TO-DATE
> Task :model:pipeline:shadowJar UP-TO-DATE
> Task :model:job-management:extractIncludeProto UP-TO-DATE
> Task :model:fn-execution:extractIncludeProto UP-TO-DATE
> Task :model:job-management:generateProto UP-TO-DATE
> Task :model:job-management:compileJava UP-TO-DATE
> Task :model:job-management:classes UP-TO-DATE
> Task :model:fn-execution:generateProto UP-TO-DATE
> Task :model:job-management:shadowJar UP-TO-DATE
> Task :model:fn-execution:compileJava UP-TO-DATE
> Task :model:fn-execution:classes UP-TO-DATE
> Task :model:fn-execution:shadowJar UP-TO-DATE
> Task :sdks:java:core:compileJava UP-TO-DATE
> Task :sdks:java:core:classes UP-TO-DATE
> Task :sdks:java:core:shadowJar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractIncludeProto UP-TO-DATE
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :runners:core-construction-java:compileJava UP-TO-DATE
> Task :sdks:java:fn-execution:compileJava UP-TO-DATE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :runners:core-construction-java:jar UP-TO-DATE
> Task :runners:local-java:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava UP-TO-DATE
> Task :runners:local-java:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:compileJava UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar UP-TO-DATE
> Task :runners:local-java:jar
> Task :sdks:java:extensions:protobuf:compileJava UP-TO-DATE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :runners:core-java:compileJava UP-TO-DATE
> Task :runners:core-java:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar UP-TO-DATE
> Task :runners:core-java:jar UP-TO-DATE
> Task :sdks:java:harness:compileJava UP-TO-DATE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar UP-TO-DATE
> Task :sdks:java:io:synthetic:compileJava FROM-CACHE
> Task :sdks:java:io:synthetic:classes UP-TO-DATE
> Task :sdks:java:harness:shadowJar UP-TO-DATE
> Task :sdks:java:io:synthetic:jar
> Task :runners:java-fn-execution:compileJava UP-TO-DATE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava UP-TO-DATE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :runners:portability:java:compileJava FROM-CACHE
> Task :runners:portability:java:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar UP-TO-DATE
> Task :runners:portability:java:jar
> Task :runners:direct-java:compileJava FROM-CACHE
> Task :runners:direct-java:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar
> Task :sdks:java:io:kinesis:compileJava FROM-CACHE
> Task :sdks:java:io:kinesis:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:compileJava UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar UP-TO-DATE
> Task :sdks:java:io:kinesis:jar
> Task :runners:direct-java:shadowJar
> Task :sdks:java:testing:load-tests:compileJava FROM-CACHE
> Task :sdks:java:testing:load-tests:classes UP-TO-DATE
> Task :sdks:java:testing:load-tests:jar

> Task :sdks:java:testing:load-tests:run
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Exception in thread "main" java.lang.IllegalStateException: GroupByKey cannot be applied to non-bounded PCollection in the GlobalWindow without a trigger. Use a Window.into or Window.triggering transform prior to GroupByKey.
	at org.apache.beam.sdk.transforms.GroupByKey.applicableTo(GroupByKey.java:156)
	at org.apache.beam.sdk.transforms.GroupByKey.expand(GroupByKey.java:226)
	at org.apache.beam.sdk.transforms.GroupByKey.expand(GroupByKey.java:110)
	at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:539)
	at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:473)
	at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:355)
	at org.apache.beam.sdk.transforms.Combine$PerKey.expand(Combine.java:1596)
	at org.apache.beam.sdk.transforms.Combine$PerKey.expand(Combine.java:1485)
	at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:539)
	at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:490)
	at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:368)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.loadTest(CombineLoadTest.java:134)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:96)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.run(CombineLoadTest.java:66)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.main(CombineLoadTest.java:169)

> Task :sdks:java:testing:load-tests:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9s
61 actionable tasks: 8 executed, 7 from cache, 46 up-to-date

Publishing build scan...
https://gradle.com/s/z67yu2ev24lf4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Java_Combine_Portable_Flink_Streaming #5

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Streaming/5/display/redirect?page=changes>

Changes:

[iambruceactor] added more meetups

[suztomo] Google-cloud-clients to use 2019 versions

[lcwik] [BEAM-8298] Fully specify the necessary details to support side input

[chadrik] [BEAM-7746] Introduce a protocol to handle various types of partitioning

[iemejia] [BEAM-6957] Enable Counter/Distribution metrics tests for Portable Spark

[kcweaver] [BEAM-9200] fix portable jar test version property

[iemejia] [BEAM-9204] Refactor HBaseUtils methods to depend on Ranges

[iemejia] [BEAM-9204] Fix HBase SplitRestriction to be based on provided Range

[echauchot] [BEAM-9205] Add ValidatesRunner annotation to the MetricsPusherTest

[echauchot] [BEAM-9205] Fix validatesRunner tests configuration in spark module

[jbonofre] [BEAM-7427] Refactore JmsCheckpointMark to be usage via Coder

[iemejia] [BEAM-7427] Adjust JmsIO access levels and other minor fixes

[pabloem] Merge pull request #10346 from [BEAM-7926] Data-centric Interactive

[chamikara] Fix Spanner auth endpoints

[chadrik] [BEAM-7746] Stop automatically creating staticmethods in register_urn


------------------------------------------
[...truncated 73.05 KB...]
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1,
+ metadata+=hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar
+ [[ -n gcr.io/apache-beam-testing/beam_portability/java_sdk:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/java_sdk:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_workers=6
+ gcloud dataproc clusters create beam-loadtests-java-portable-flink-streaming-5 --region=global --num-workers=6 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/java_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/0c750709-08cc-3c2a-9134-d4bf91d750bb].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
done.
Created [https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/global/clusters/beam-loadtests-java-portable-flink-streaming-5] Cluster placed in zone [us-central1-a].
+ get_leader
+ local i=0
+ local application_ids
+ local application_masters
+ echo 'Yarn Applications'
Yarn Applications
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-5-m '--command=yarn application -list'
++ grep beam-loadtests-java-portable-flink-streaming-5
Warning: Permanently added 'compute.1319237140275316081' (ECDSA) to the list of known hosts.
20/01/29 12:36:28 INFO client.RMProxy: Connecting to ResourceManager at beam-loadtests-java-portable-flink-streaming-5-m/10.128.0.182:8032
+ read line
+ echo application_1580301327971_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-streaming-5-w-5.us-central1-a.c.apache-beam-testing.internal:43909
application_1580301327971_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-streaming-5-w-5.us-central1-a.c.apache-beam-testing.internal:43909
++ echo application_1580301327971_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-streaming-5-w-5.us-central1-a.c.apache-beam-testing.internal:43909
++ sed 's/ .*//'
+ application_ids[$i]=application_1580301327971_0001
++ sed 's/.*beam-loadtests-java-portable-flink-streaming-5/beam-loadtests-java-portable-flink-streaming-5/'
++ echo application_1580301327971_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-streaming-5-w-5.us-central1-a.c.apache-beam-testing.internal:43909
++ sed 's/ .*//'
+ application_masters[$i]=beam-loadtests-java-portable-flink-streaming-5-w-5.us-central1-a.c.apache-beam-testing.internal:43909
+ i=1
+ read line
+ '[' 1 '!=' 1 ']'
+ YARN_APPLICATION_MASTER=beam-loadtests-java-portable-flink-streaming-5-w-5.us-central1-a.c.apache-beam-testing.internal:43909
+ echo 'Using Yarn Application master: beam-loadtests-java-portable-flink-streaming-5-w-5.us-central1-a.c.apache-beam-testing.internal:43909'
Using Yarn Application master: beam-loadtests-java-portable-flink-streaming-5-w-5.us-central1-a.c.apache-beam-testing.internal:43909
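The trace above extracts the YARN application id and the application-master host:port from one row of `yarn application -list` output using two `sed` passes. The same parsing can be sketched as a standalone Python helper (a hypothetical function mirroring the log's `sed` expressions, not part of the actual cluster script):

```python
import re

def parse_yarn_application(line, cluster_prefix):
    """Split one `yarn application -list` row into (application_id, master_host_port).

    Mirrors the sed passes in the log: `s/ .*//` keeps the first
    whitespace-delimited field (the application id), while
    `s/.*<prefix>/<prefix>/` followed by `s/ .*//` isolates the
    host:port at the end of the tracking URL.
    """
    # First field of the row is the application id.
    application_id = re.sub(r"\s.*", "", line)
    # Greedy .* drops everything up to the cluster-name prefix inside the URL,
    # leaving "<prefix>...internal:<port>".
    master = re.sub(r".*(" + re.escape(cluster_prefix) + r")", r"\1", line)
    master = re.sub(r"\s.*", "", master)
    return application_id, master
```

For the row logged above, this yields `application_1580301327971_0001` and the `...-w-5...internal:43909` master address, matching the values the script stores in `application_ids` and `application_masters`.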
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
+ start_job_server
+ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-5-m '--command=sudo --user yarn docker run --detach --publish 8099:8099 --publish 8098:8098 --publish 8097:8097 --volume ~/.config/gcloud:/root/.config/gcloud gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest --flink-master=beam-loadtests-java-portable-flink-streaming-5-w-5.us-central1-a.c.apache-beam-testing.internal:43909 --artifacts-dir=gs://beam-flink-cluster/beam-loadtests-java-portable-flink-streaming-5'
8cce79bd3494da44955f4dd261c99abb9c413605aec1dd0ef6b068f2ed888509
+ start_tunnel
++ gcloud compute ssh --quiet --zone=us-central1-a yarn@beam-loadtests-java-portable-flink-streaming-5-m '--command=curl -s "http://beam-loadtests-java-portable-flink-streaming-5-w-5.us-central1-a.c.apache-beam-testing.internal:43909/jobmanager/config"'
+ local 'job_server_config=[{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1580301327971_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-java-portable-flink-streaming-5-w-5.us-central1-a.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"FLINK_PLUGINS_DIR","value":"/usr/lib/flink/plugins"},{"key":"web.tmpdir","value":"/tmp/flink-web-5b1a7763-76ff-4682-9927-e9baf07285b8"},{"key":"jobmanager.rpc.port","value":"35241"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1580301327971_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-java-portable-flink-streaming-5-w-5.us-central1-a.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local key=jobmanager.rpc.port
++ echo beam-loadtests-java-portable-flink-streaming-5-w-5.us-central1-a.c.apache-beam-testing.internal:43909
++ cut -d : -f1
+ local yarn_application_master_host=beam-loadtests-java-portable-flink-streaming-5-w-5.us-central1-a.c.apache-beam-testing.internal
++ python -c 'import sys, json; print([e['\''value'\''] for e in json.load(sys.stdin) if e['\''key'\''] == u'\''jobmanager.rpc.port'\''][0])'
++ echo '[{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1580301327971_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-java-portable-flink-streaming-5-w-5.us-central1-a.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"FLINK_PLUGINS_DIR","value":"/usr/lib/flink/plugins"},{"key":"web.tmpdir","value":"/tmp/flink-web-5b1a7763-76ff-4682-9927-e9baf07285b8"},{"key":"jobmanager.rpc.port","value":"35241"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1580301327971_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-java-portable-flink-streaming-5-w-5.us-central1-a.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local jobmanager_rpc_port=35241
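The RPC port above is pulled out of the Flink REST endpoint's `/jobmanager/config` response, which is a JSON array of `{"key": ..., "value": ...}` objects, using the inline `python -c` one-liner in the trace. The same lookup as a self-contained sketch (stdlib only):

```python
import json

def flink_config_value(config_json, key):
    """Look up one key in Flink's /jobmanager/config response.

    The response is a JSON array of {"key": ..., "value": ...} entries;
    like the one-liner in the log, this takes the first match and raises
    IndexError if the key is absent.
    """
    return [e["value"] for e in json.loads(config_json) if e["key"] == key][0]
```

Applied to the `job_server_config` captured above with key `jobmanager.rpc.port`, this returns the string `"35241"` that the script assigns to `jobmanager_rpc_port`.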
++ [[ true == \t\r\u\e ]]
++ echo ' -Nf >& /dev/null'
+ local 'detached_mode_params= -Nf >& /dev/null'
++ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
++ echo '-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'job_server_ports_forwarding=-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'tunnel_command=gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-5-m -- -L 8081:beam-loadtests-java-portable-flink-streaming-5-w-5.us-central1-a.c.apache-beam-testing.internal:43909 -L 35241:beam-loadtests-java-portable-flink-streaming-5-w-5.us-central1-a.c.apache-beam-testing.internal:35241 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080  -Nf >& /dev/null'
+ eval gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-5-m -- -L 8081:beam-loadtests-java-portable-flink-streaming-5-w-5.us-central1-a.c.apache-beam-testing.internal:43909 -L 35241:beam-loadtests-java-portable-flink-streaming-5-w-5.us-central1-a.c.apache-beam-testing.internal:35241 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf '>&' /dev/null
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-5-m -- -L 8081:beam-loadtests-java-portable-flink-streaming-5-w-5.us-central1-a.c.apache-beam-testing.internal:43909 -L 35241:beam-loadtests-java-portable-flink-streaming-5-w-5.us-central1-a.c.apache-beam-testing.internal:35241 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf
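The tunnel command assembled above forwards local port 8081 to the Flink REST endpoint on the application master, forwards the JobManager RPC port to itself, optionally forwards the job-server ports 8097-8099, and opens a SOCKS proxy on 1080. A hypothetical helper sketching how those `-L`/`-D` arguments compose (illustrative only, not the actual script):

```python
def build_tunnel_args(master_host_port, rpc_port, forward_job_server=True):
    """Assemble the ssh port-forwarding arguments seen in the log.

    master_host_port is "<host>:<rest_port>"; rpc_port is the
    jobmanager.rpc.port value read from /jobmanager/config.
    """
    # Local 8081 -> Flink REST API on the application master.
    args = ["-L", f"8081:{master_host_port}"]
    host = master_host_port.split(":")[0]
    # JobManager RPC port forwarded to the same port locally.
    args += ["-L", f"{rpc_port}:{host}:{rpc_port}"]
    if forward_job_server:
        # Beam job server: job endpoint 8099, artifact 8098, expansion 8097.
        for port in (8099, 8098, 8097):
            args += ["-L", f"{port}:localhost:{port}"]
    # SOCKS proxy for browsing the YARN/Flink web UIs.
    args += ["-D", "1080"]
    return args
```

The job-server forwarding is what lets the load test below talk to `--jobEndpoint=localhost:8099` from the Jenkins worker.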
[beam_LoadTests_Java_Combine_Portable_Flink_Streaming] $ /bin/bash -xe /tmp/jenkins4059123383787980564.sh
+ echo src Load test: 2GB of 10B records on Flink in Portable mode src
src Load test: 2GB of 10B records on Flink in Portable mode src
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Streaming/ws/src/gradlew> -PloadTest.mainClass=org.apache.beam.sdk.loadtests.CombineLoadTest -Prunner=:runners:portability:java '-PloadTest.args=--project=apache-beam-testing --appName=load_tests_Java_Portable_Flink_streaming_Combine_1 --tempLocation=gs://temp-storage-for-perf-tests/loadtests --publishToBigQuery=true --bigQueryDataset=load_test --bigQueryTable=java_portable_flink_streaming_Combine_1 --sourceOptions={"numRecords":200000000,"keySizeBytes":1,"valueSizeBytes":9} --fanout=1 --iterations=1 --topCount=20 --sdkWorkerParallelism=5 --perKeyCombiner=TOP_LARGEST --streaming=true --jobEndpoint=localhost:8099 --defaultEnvironmentConfig=gcr.io/apache-beam-testing/beam_portability/java_sdk:latest --defaultEnvironmentType=DOCKER --runner=PortableRunner' --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :sdks:java:testing:load-tests:run
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :model:job-management:extractProto UP-TO-DATE
> Task :model:fn-execution:extractProto UP-TO-DATE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :runners:local-java:processResources NO-SOURCE
> Task :runners:direct-java:processResources NO-SOURCE
> Task :runners:portability:java:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:extractProto UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :model:job-management:processResources UP-TO-DATE
> Task :model:fn-execution:processResources UP-TO-DATE
> Task :sdks:java:io:kinesis:processResources NO-SOURCE
> Task :sdks:java:io:synthetic:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :sdks:java:testing:load-tests:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :sdks:java:core:generateGrammarSource UP-TO-DATE
> Task :sdks:java:core:processResources UP-TO-DATE
> Task :model:pipeline:extractIncludeProto UP-TO-DATE
> Task :model:pipeline:extractProto UP-TO-DATE
> Task :model:pipeline:generateProto UP-TO-DATE
> Task :model:pipeline:compileJava UP-TO-DATE
> Task :model:pipeline:processResources UP-TO-DATE
> Task :model:pipeline:classes UP-TO-DATE
> Task :model:pipeline:jar UP-TO-DATE
> Task :model:pipeline:shadowJar UP-TO-DATE
> Task :model:fn-execution:extractIncludeProto UP-TO-DATE
> Task :model:job-management:extractIncludeProto UP-TO-DATE
> Task :model:fn-execution:generateProto UP-TO-DATE
> Task :model:job-management:generateProto UP-TO-DATE
> Task :model:job-management:compileJava UP-TO-DATE
> Task :model:fn-execution:compileJava UP-TO-DATE
> Task :model:job-management:classes UP-TO-DATE
> Task :model:fn-execution:classes UP-TO-DATE
> Task :model:job-management:shadowJar UP-TO-DATE
> Task :model:fn-execution:shadowJar UP-TO-DATE
> Task :sdks:java:core:compileJava UP-TO-DATE
> Task :sdks:java:core:classes UP-TO-DATE
> Task :sdks:java:core:shadowJar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractIncludeProto UP-TO-DATE
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :runners:core-construction-java:compileJava UP-TO-DATE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:fn-execution:compileJava UP-TO-DATE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :runners:core-construction-java:jar UP-TO-DATE
> Task :sdks:java:fn-execution:jar UP-TO-DATE
> Task :runners:local-java:compileJava FROM-CACHE
> Task :runners:local-java:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:compileJava UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :runners:local-java:jar
> Task :vendor:sdks-java-extensions-protobuf:shadowJar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:compileJava UP-TO-DATE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar UP-TO-DATE
> Task :runners:core-java:compileJava UP-TO-DATE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar UP-TO-DATE
> Task :sdks:java:harness:compileJava UP-TO-DATE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar UP-TO-DATE
> Task :sdks:java:harness:shadowJar UP-TO-DATE
> Task :runners:java-fn-execution:compileJava UP-TO-DATE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar UP-TO-DATE
> Task :runners:portability:java:compileJava FROM-CACHE
> Task :runners:portability:java:classes UP-TO-DATE
> Task :runners:portability:java:jar
> Task :runners:direct-java:compileJava FROM-CACHE
> Task :runners:direct-java:classes UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava UP-TO-DATE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:compileJava UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar UP-TO-DATE
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar
> Task :runners:direct-java:shadowJar

> Task :sdks:java:io:synthetic:compileJava
Note: <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Streaming/ws/src/sdks/java/io/synthetic/src/main/java/org/apache/beam/sdk/io/synthetic/SyntheticBoundedSource.java> uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.

> Task :sdks:java:io:synthetic:classes
> Task :sdks:java:io:synthetic:jar
> Task :sdks:java:io:kinesis:compileJava
> Task :sdks:java:io:kinesis:classes
> Task :sdks:java:io:kinesis:jar

> Task :sdks:java:testing:load-tests:compileJava
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:testing:load-tests:classes
> Task :sdks:java:testing:load-tests:jar

> Task :sdks:java:testing:load-tests:run
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Exception in thread "main" java.lang.IllegalStateException: GroupByKey cannot be applied to non-bounded PCollection in the GlobalWindow without a trigger. Use a Window.into or Window.triggering transform prior to GroupByKey.
	at org.apache.beam.sdk.transforms.GroupByKey.applicableTo(GroupByKey.java:156)
	at org.apache.beam.sdk.transforms.GroupByKey.expand(GroupByKey.java:226)
	at org.apache.beam.sdk.transforms.GroupByKey.expand(GroupByKey.java:110)
	at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:539)
	at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:473)
	at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:355)
	at org.apache.beam.sdk.transforms.Combine$PerKey.expand(Combine.java:1596)
	at org.apache.beam.sdk.transforms.Combine$PerKey.expand(Combine.java:1485)
	at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:539)
	at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:490)
	at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:368)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.loadTest(CombineLoadTest.java:134)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:96)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.run(CombineLoadTest.java:66)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.main(CombineLoadTest.java:169)

> Task :sdks:java:testing:load-tests:run FAILED
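The failure is instructive: with `--streaming=true` the synthetic source is unbounded, `Combine.perKey` expands to a `GroupByKey`, and Beam rejects a `GroupByKey` on an unbounded PCollection in the global window with no trigger. Build #4 later in this thread adds `--inputWindowDurationSec=1200`, which places the input into fixed windows before combining. The window assignment that makes the grouping legal can be modeled in plain Python (a conceptual sketch of fixed windowing, not Beam's actual implementation):

```python
def fixed_window(timestamp_sec, size_sec=1200):
    """Assign an event timestamp to a fixed window [start, end).

    Fixed windows tile the time axis every size_sec seconds, so every
    element lands in exactly one bounded window.
    """
    start = timestamp_sec - (timestamp_sec % size_sec)
    return (start, start + size_sec)
```

Once every element carries a bounded window, the runner knows when each window is complete (when the watermark passes the window's end), so GroupByKey can emit results and the IllegalStateException above no longer applies.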

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 14s
61 actionable tasks: 11 executed, 4 from cache, 46 up-to-date

Publishing build scan...
https://gradle.com/s/4e657qlw6kc34

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_Combine_Portable_Flink_Streaming #4

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Streaming/4/display/redirect?page=changes>

Changes:

[kcweaver] [DO NOT MERGE][BEAM-9177] Update Flink runner webpage for 2.18

[kcweaver] Update Beam version chart

[robinyqiu] Turn on BeamZetaSqlCalcRel

[pawel.pasterz] [BEAM-8941] Implement simple DSL for load tests

[iemejia] [website] Add warning on Beam 2.18.0 blog post for Avro 1.9.0 users

[github] [BEAM-9183, BEAM-9026] Initialize and cleanup the state of

[tvalentyn] [BEAM-9184] Add ToSet combiner (#10636)

[github] Fixing Lint

[github] [BEAM-9201] Release scripts fixes: run_rc_validation.sh,

[altay] Change Dataflow Python containers

[angoenka] [BEAM-8626] Implement status fn api handler in python sdk (#10598)

[tvalentyn] [BEAM-9186] Allow injection of custom equality function. (#10637)


------------------------------------------
[...truncated 85.69 KB...]
+ local metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1,
+ metadata+=hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar
+ [[ -n gcr.io/apache-beam-testing/beam_portability/java_sdk:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/java_sdk:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_workers=6
+ gcloud dataproc clusters create beam-loadtests-java-portable-flink-streaming-4 --region=global --num-workers=6 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/java_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/8ab7121d-e6d2-3ccc-b49c-1e767ef0557e].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
......................................................................................................................................................done.
Created [https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/global/clusters/beam-loadtests-java-portable-flink-streaming-4] Cluster placed in zone [us-central1-a].
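The `metadata+=` lines above build the Dataproc metadata string incrementally, appending the SDK-harness and job-server image keys only when the corresponding image is configured. A hypothetical sketch of that assembly (illustrative, not the actual cluster script):

```python
def build_dataproc_metadata(base, sdk_harness_image=None, job_server_image=None):
    """Mirror the metadata+= concatenation in the log.

    base maps the always-present Flink settings (snapshot URL, yarn-session
    flag, taskmanager slots, hadoop jar URL) to their values; the two image
    keys are appended conditionally, as the [[ -n ... ]] guards do above.
    """
    metadata = ",".join(f"{k}={v}" for k, v in base.items())
    if sdk_harness_image:
        metadata += f",beam-sdk-harness-images-to-pull={sdk_harness_image}"
    if job_server_image:
        metadata += f",beam-job-server-image={job_server_image}"
    return metadata
```

The resulting string is passed to `gcloud dataproc clusters create --metadata ...`, where init actions on the cluster read these keys to download Flink and pre-pull the Beam images.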
+ get_leader
+ local i=0
+ local application_ids
+ local application_masters
+ echo 'Yarn Applications'
Yarn Applications
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-4-m '--command=yarn application -list'
++ grep beam-loadtests-java-portable-flink-streaming-4
Warning: Permanently added 'compute.6543196662580221212' (ECDSA) to the list of known hosts.
20/01/28 12:43:56 INFO client.RMProxy: Connecting to ResourceManager at beam-loadtests-java-portable-flink-streaming-4-m/10.128.0.124:8032
+ read line
+ echo application_1580215379496_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-streaming-4-w-3.us-central1-a.c.apache-beam-testing.internal:44031
application_1580215379496_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-streaming-4-w-3.us-central1-a.c.apache-beam-testing.internal:44031
++ sed 's/ .*//'
++ echo application_1580215379496_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-streaming-4-w-3.us-central1-a.c.apache-beam-testing.internal:44031
+ application_ids[$i]=application_1580215379496_0001
++ echo application_1580215379496_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-streaming-4-w-3.us-central1-a.c.apache-beam-testing.internal:44031
++ sed 's/.*beam-loadtests-java-portable-flink-streaming-4/beam-loadtests-java-portable-flink-streaming-4/'
++ sed 's/ .*//'
+ application_masters[$i]=beam-loadtests-java-portable-flink-streaming-4-w-3.us-central1-a.c.apache-beam-testing.internal:44031
+ i=1
+ read line
+ '[' 1 '!=' 1 ']'
+ YARN_APPLICATION_MASTER=beam-loadtests-java-portable-flink-streaming-4-w-3.us-central1-a.c.apache-beam-testing.internal:44031
+ echo 'Using Yarn Application master: beam-loadtests-java-portable-flink-streaming-4-w-3.us-central1-a.c.apache-beam-testing.internal:44031'
Using Yarn Application master: beam-loadtests-java-portable-flink-streaming-4-w-3.us-central1-a.c.apache-beam-testing.internal:44031
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
+ start_job_server
+ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-4-m '--command=sudo --user yarn docker run --detach --publish 8099:8099 --publish 8098:8098 --publish 8097:8097 --volume ~/.config/gcloud:/root/.config/gcloud gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest --flink-master=beam-loadtests-java-portable-flink-streaming-4-w-3.us-central1-a.c.apache-beam-testing.internal:44031 --artifacts-dir=gs://beam-flink-cluster/beam-loadtests-java-portable-flink-streaming-4'
da6dfe63bdbd6f5462db069741095469fc685e976c6c9a781ca948b6c65b7725
+ start_tunnel
++ gcloud compute ssh --quiet --zone=us-central1-a yarn@beam-loadtests-java-portable-flink-streaming-4-m '--command=curl -s "http://beam-loadtests-java-portable-flink-streaming-4-w-3.us-central1-a.c.apache-beam-testing.internal:44031/jobmanager/config"'
+ local 'job_server_config=[{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1580215379496_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-java-portable-flink-streaming-4-w-3.us-central1-a.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"FLINK_PLUGINS_DIR","value":"/usr/lib/flink/plugins"},{"key":"web.tmpdir","value":"/tmp/flink-web-adce1986-8e43-427c-bb11-f780457d6851"},{"key":"jobmanager.rpc.port","value":"36083"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1580215379496_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-java-portable-flink-streaming-4-w-3.us-central1-a.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local key=jobmanager.rpc.port
++ echo beam-loadtests-java-portable-flink-streaming-4-w-3.us-central1-a.c.apache-beam-testing.internal:44031
++ cut -d : -f1
+ local yarn_application_master_host=beam-loadtests-java-portable-flink-streaming-4-w-3.us-central1-a.c.apache-beam-testing.internal
++ echo '[{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1580215379496_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-java-portable-flink-streaming-4-w-3.us-central1-a.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"FLINK_PLUGINS_DIR","value":"/usr/lib/flink/plugins"},{"key":"web.tmpdir","value":"/tmp/flink-web-adce1986-8e43-427c-bb11-f780457d6851"},{"key":"jobmanager.rpc.port","value":"36083"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1580215379496_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-java-portable-flink-streaming-4-w-3.us-central1-a.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
++ python -c 'import sys, json; print([e['\''value'\''] for e in json.load(sys.stdin) if e['\''key'\''] == u'\''jobmanager.rpc.port'\''][0])'
+ local jobmanager_rpc_port=36083
++ [[ true == \t\r\u\e ]]
++ echo ' -Nf >& /dev/null'
+ local 'detached_mode_params= -Nf >& /dev/null'
++ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
++ echo '-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'job_server_ports_forwarding=-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'tunnel_command=gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-4-m -- -L 8081:beam-loadtests-java-portable-flink-streaming-4-w-3.us-central1-a.c.apache-beam-testing.internal:44031 -L 36083:beam-loadtests-java-portable-flink-streaming-4-w-3.us-central1-a.c.apache-beam-testing.internal:36083 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080  -Nf >& /dev/null'
+ eval gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-4-m -- -L 8081:beam-loadtests-java-portable-flink-streaming-4-w-3.us-central1-a.c.apache-beam-testing.internal:44031 -L 36083:beam-loadtests-java-portable-flink-streaming-4-w-3.us-central1-a.c.apache-beam-testing.internal:36083 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf '>&' /dev/null
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-streaming-4-m -- -L 8081:beam-loadtests-java-portable-flink-streaming-4-w-3.us-central1-a.c.apache-beam-testing.internal:44031 -L 36083:beam-loadtests-java-portable-flink-streaming-4-w-3.us-central1-a.c.apache-beam-testing.internal:36083 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf
[beam_LoadTests_Java_Combine_Portable_Flink_Streaming] $ /bin/bash -xe /tmp/jenkins7725079212212448308.sh
+ echo src Load test: 2GB of 10B records on Flink in Portable mode src
src Load test: 2GB of 10B records on Flink in Portable mode src
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Streaming/ws/src/gradlew> -PloadTest.mainClass=org.apache.beam.sdk.loadtests.CombineLoadTest -Prunner=:runners:portability:java '-PloadTest.args=--project=apache-beam-testing --appName=load_tests_Java_Portable_Flink_streaming_Combine_1 --tempLocation=gs://temp-storage-for-perf-tests/loadtests --publishToBigQuery=true --bigQueryDataset=load_test --bigQueryTable=java_portable_flink_streaming_Combine_1 --sourceOptions={"numRecords":200000000,"keySizeBytes":1,"valueSizeBytes":9} --fanout=1 --iterations=1 --topCount=20 --sdkWorkerParallelism=5 --perKeyCombiner=TOP_LARGEST --streaming=true --jobEndpoint=localhost:8099 --defaultEnvironmentConfig=gcr.io/apache-beam-testing/beam_portability/java_sdk:latest --defaultEnvironmentType=DOCKER --inputWindowDurationSec=1200 --runner=PortableRunner' --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :sdks:java:testing:load-tests:run
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :model:fn-execution:extractProto UP-TO-DATE
> Task :model:job-management:extractProto UP-TO-DATE
> Task :runners:local-java:processResources NO-SOURCE
> Task :runners:portability:java:processResources NO-SOURCE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :runners:direct-java:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:extractProto UP-TO-DATE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :model:job-management:processResources UP-TO-DATE
> Task :sdks:java:io:kinesis:processResources NO-SOURCE
> Task :sdks:java:io:synthetic:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :model:fn-execution:processResources UP-TO-DATE
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :sdks:java:testing:load-tests:processResources NO-SOURCE
> Task :sdks:java:core:generateGrammarSource UP-TO-DATE
> Task :sdks:java:core:processResources UP-TO-DATE
> Task :model:pipeline:extractIncludeProto UP-TO-DATE
> Task :model:pipeline:extractProto UP-TO-DATE
> Task :model:pipeline:generateProto UP-TO-DATE
> Task :model:pipeline:compileJava UP-TO-DATE
> Task :model:pipeline:processResources UP-TO-DATE
> Task :model:pipeline:classes UP-TO-DATE
> Task :model:pipeline:jar UP-TO-DATE
> Task :model:pipeline:shadowJar UP-TO-DATE
> Task :model:job-management:extractIncludeProto UP-TO-DATE
> Task :model:fn-execution:extractIncludeProto UP-TO-DATE
> Task :model:fn-execution:generateProto UP-TO-DATE
> Task :model:job-management:generateProto UP-TO-DATE
> Task :model:fn-execution:compileJava UP-TO-DATE
> Task :model:job-management:compileJava UP-TO-DATE
> Task :model:fn-execution:classes UP-TO-DATE
> Task :model:job-management:classes UP-TO-DATE
> Task :model:job-management:shadowJar UP-TO-DATE
> Task :model:fn-execution:shadowJar UP-TO-DATE
> Task :sdks:java:core:compileJava UP-TO-DATE
> Task :sdks:java:core:classes UP-TO-DATE
> Task :sdks:java:core:shadowJar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractIncludeProto UP-TO-DATE
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:fn-execution:compileJava UP-TO-DATE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :runners:core-construction-java:compileJava UP-TO-DATE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:compileJava UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :runners:local-java:compileJava FROM-CACHE
> Task :runners:local-java:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar UP-TO-DATE
> Task :sdks:java:io:synthetic:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:compileJava UP-TO-DATE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:synthetic:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar UP-TO-DATE
> Task :runners:core-construction-java:jar UP-TO-DATE
> Task :sdks:java:io:kafka:jar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:compileJava UP-TO-DATE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :runners:local-java:jar
> Task :sdks:java:extensions:protobuf:jar UP-TO-DATE
> Task :sdks:java:io:kinesis:compileJava FROM-CACHE
> Task :sdks:java:io:kinesis:classes UP-TO-DATE
> Task :runners:core-java:compileJava UP-TO-DATE
> Task :runners:core-java:classes UP-TO-DATE
> Task :sdks:java:io:synthetic:jar
> Task :runners:core-java:jar UP-TO-DATE
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:io:kinesis:jar
> Task :sdks:java:testing:test-utils:jar
> Task :sdks:java:harness:compileJava UP-TO-DATE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar UP-TO-DATE
> Task :sdks:java:harness:shadowJar UP-TO-DATE
> Task :runners:java-fn-execution:compileJava UP-TO-DATE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:compileJava UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :runners:portability:java:compileJava FROM-CACHE
> Task :runners:portability:java:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar UP-TO-DATE
> Task :runners:portability:java:jar
> Task :runners:direct-java:compileJava FROM-CACHE
> Task :runners:direct-java:classes UP-TO-DATE
> Task :runners:direct-java:shadowJar
> Task :sdks:java:testing:load-tests:compileJava FROM-CACHE
> Task :sdks:java:testing:load-tests:classes UP-TO-DATE
> Task :sdks:java:testing:load-tests:jar

> Task :sdks:java:testing:load-tests:run
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
ERROR StatusLogger Log4j2 could not find a logging implementation. Please add log4j-core to the classpath. Using SimpleLogger to log to the console...
Exception in thread "main" java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: The Runner experienced the following error during execution:
java.lang.ClassNotFoundException: org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource
	at org.apache.beam.runners.portability.JobServicePipelineResult.waitUntilFinish(JobServicePipelineResult.java:98)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:99)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.run(CombineLoadTest.java:66)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.main(CombineLoadTest.java:169)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: The Runner experienced the following error during execution:
java.lang.ClassNotFoundException: org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource
	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1928)
	at org.apache.beam.runners.portability.JobServicePipelineResult.waitUntilFinish(JobServicePipelineResult.java:90)
	... 3 more
Caused by: java.lang.RuntimeException: The Runner experienced the following error during execution:
java.lang.ClassNotFoundException: org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource
	at org.apache.beam.runners.portability.JobServicePipelineResult.propagateErrors(JobServicePipelineResult.java:165)
	at org.apache.beam.runners.portability.JobServicePipelineResult.waitUntilFinish(JobServicePipelineResult.java:110)
	at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1604)
	at java.util.concurrent.CompletableFuture$AsyncSupply.exec(CompletableFuture.java:1596)
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
	at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 24s
61 actionable tasks: 8 executed, 7 from cache, 46 up-to-date

Publishing build scan...
https://gradle.com/s/ieznby2xnr2i4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_Combine_Portable_Flink_Streaming #3

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Streaming/3/display/redirect?page=changes>

Changes:

[ehudm] [BEAM-8525] Support Const base in binary_subscr

[ehudm] Do not perform test on Py2.7

[echauchot] [BEAM-9065] Reset MetricsContainerStepMapAccumulator upon initialization

[kamil.wasilewski] [BEAM-8108] Allow run_chicago.sh to take variable argument number

[kamil.wasilewski] [BEAM-8108] Use _ReadFromBigQuery transform

[kamil.wasilewski] [BEAM-8108] Create Jenkins job that runs Chicago Taxi Example on Flink

[kamil.wasilewski] [BEAM-8108] Rewrite Python print statement in dataproc init action

[github] Update release guide with extended information.

[github] Add link to mass comment script

[github] Fix typo

[ankurgoenka] [BEAM-9002] Add test_flatten_same_pcollections to fnapi runner

[github] Address comments

[hannahjiang] [BEAM-9084] cleaning up docker image tag

[iemejia] [BEAM-5086] Update kudu-client to version 1.11.1

[valentyn] Add version guards to requirements file for integration tests.

[robertwb] Automatically convert to with pipeline syntax.

[robertwb] Quick pass through failed auto-conversions.

[robertwb] Automatic conversion of more pipelines.

[robertwb] Fix lint and tests due to autoconversion.

[robertwb] A couple more conversions.

[robertwb] Fix lint and tests due to autoconversion.

[robertwb] Return non-None result for Dataflow dry run.

[robertwb] Fix lint and tests due to autoconversion.

[hannahjiang] [BEAM-9084] fix Java spotless

[kamil.wasilewski] [BEAM-8939] A bash script that cancels stale dataflow jobs

[chadrik] [BEAM-7746] Fix a typing issue where SourceBase was assumed to have a

[marek.simunek] [BEAM-9123] HadoopResourceId returns wrong directoryName bugfix

[iemejia] [website] Added security page

[iemejia] [website] Update the 2.17.0 release blog post to include security issues

[boyuanz] Moving to 2.20.0-SNAPSHOT on master branch.

[github] Update twine install details.

[crites] Changes watermark advance from 1001 to 1000 since Dataflow TestStream

[lukecwik] [BEAM-8676] sdks/java: gax and grpc upgrades (#10554)

[lukecwik] [BEAM-9030] Migrate Beam to use beam-vendor-grpc-1_26_0 (#10578)

[lcwik] [BEAM-9030] Align version of protoc/protoc-gen-grpc-java to vendored

[kirillkozlov] Create initial DataStoreV1Table

[kirillkozlov] Added tests

[kirillkozlov] Implement getTableStatistics

[kirillkozlov] buildIOWriter

[kirillkozlov] Read other types

[kirillkozlov] Better conversion, support complex types

[kirillkozlov] Store DataStore key as VARBINARY

[kirillkozlov] Wrap DoFns in PTransforms

[kirillkozlov] Table property for specifying key field

[kirillkozlov] JavaDoc

[kirillkozlov] Infer schema for RowToEntity

[kirillkozlov] Better conversion performance

[kirillkozlov] Mark Table as `Internal` and `Experimental`

[kirillkozlov] Review changes

[boyuanz] Exclude testOutputTimestamp from flink streaming tests.

[lukecwik] [BEAM-7951] Supports multiple inputs/outputs for wire coder settings.

[mxm] [BEAM-9116] Limit the number of past invocations stored in JobService

[kirillkozlov] Add IT that does not rely on SQL

[lukecwik] [BEAM-9124] Linkage Checker 1.1.2 to use Maven Central HTTPS URL

[robertwb] lint, reviewer comments

[relax] Merge pull request #10316: [BEAM-6857] Support Dynamic Timers

[robertwb] lint

[github] Merge pull request #10577 from Adding Python test for ReadFromBigQuery

[kirillkozlov] fix style

[github] [BEAM-9127] Fix output type declaration in xlang wordcount. (#10605)

[robertwb] fix merge

[apilloud] [BEAM-9140] Upgrade to ZetaSQL 2020.01.1

[angoenka] [BEAM-8625] Implement servlet for exposing sdk harness statuses in Da…

[iemejia] [BEAM-9143] Make RedisIO code follow standard Beam conventions

[iemejia] [BEAM-9143] Add withOutputParallelization to RedisIO.Read/ReadAll

[suztomo] Beam's own TimestampConversion

[suztomo] Link in Javadoc

[ehudm] Migrate HDFS IT to use tox env.

[ehudm] Fix for py2

[lukecwik] [BEAM-8695] Upgrade google-http-client to 1.34.0 (#10614)

[rohde.samuel] Make apply_TestStream not a test b/c Nose thinks it is a test

[chamikara] [BEAM-7246] Add Google Spanner IO Read on Python SDK (#9606)

[boyuanz] [BEAM-9151] Fix misconfigured legacy dataflow tests.

[hannahjiang] [BEAM-7861] update documentation

[github] Merge pull request #10622: [BEAM=6857] Fix timermap test to not use

[sunjincheng121] [BEAM-9153] Fix release guide heading level

[kamil.wasilewski] Report status code 0 when no stale jobs are found

[hsuryawirawan] Update to Python 3.7 and upgrade to Beam v2.17.0

[hsuryawirawan] Change "Built-in IOs" task type to "theory"

[hsuryawirawan] Remove "run_common_tests()" in all tests.py to make it work with Python

[hsuryawirawan] Update the course on Stepik

[hsuryawirawan] Add back the license header

[github] Fix typo in 2017-02-13-stateful-processing.md

[github] cleanup typo

[github] [BEAM-9120] Make WordCountsMatcher extend

[github] Revert "Merge pull request #10582 for [INFRA-19670] Add .asf.yaml for

[angoenka] [BEAM-9122] Add uses_keyed_state step property in python dataflow run…

[github] [BEAM-9120] Make BigqueryMatcher extend TypeSafeMatcher<TableAndQuery>

[github] [BEAM-9120] Make FileChecksumMatcher extend TypeSafeMatcher<ShardedFile>

[pabloem] Merge pull request #10459 from [BEAM-9029]Fix two bugs in Python SDK S3

[lcwik] [BEAM-3419] Support iterable on Dataflow runner when using the unified

[ehudm] [BEAM-9168] Temporary fix for RunnerAPIPTransformHolder usage

[keijiy] Fix comments on bigquery.py * beam.io.gcp.WriteToBigQuery ->

[github] Merge pull request #10534: Beam-2535: Support timer output timestamps in

[github] [BEAM-8992]  ignore Go Vet failures. (#10661)

[github] [BEAM-4725] Preallocate CoGBK value buffers (#10660)

[kirillkozlov] Use custom escape method

[github] typehint fixes to DoOutputsTuple (#10494)

[github] [BEAM-3419, BEAM-9173] Add TODO comment

[github] [BEAM-9167] Reduce Go SDK metric overhead (#10654)

[iemejia] [BEAM-9172] Add missing parameter to Nexmark CI execution for Flink

[github] Chnage dataflow container version

[hannahjiang] [BEAM-9084] cleaning up docker image tag

[iemejia] Make Spark use of batch based invocation in Nexmark CI jobs cleaner

[kirillkozlov] Inline, link JIRA

[ehudm] Update beam website for 2.18.0

[apilloud] [BEAM-9027] Unparse LIKE as binary operator

[github] Blog post for release 2.18.0 (#10575)

[kcweaver] upgrade auto-value to version 1.7

[kcweaver] [BEAM-9149] Add SQL query parameters to public API and enable positional

[bhulette] [BEAM-9072] [SQL] Primitive types should fall through (#10681)

[elias.djurfeldt] Added valueprovider support for Datastore query namespaces

[iemejia] [BEAM-9170] Unify Jenkins job names related to Java 11 testing

[github] Add a minor comment.

[aaltay] Bump tensorflow from 1.14.0 to 1.15.0 in /sdks/python/container (#10392)

[github] [BEAM-9093] Log invalid overwrites in pipeline options (#10613)

[github] [BEAM-8492] Allow None, Optional return hints for DoFn.process and


------------------------------------------
[...truncated 40.08 KB...]
0ca7f54856c0: Waiting
1f59a4b2e206: Waiting
cc9df27874c1: Pushed
f4d0cfc445db: Pushed
1919f6cf82c6: Pushed
a6ded049566a: Layer already exists
e7fe5541de5f: Layer already exists
03ff63c55220: Layer already exists
1f59a4b2e206: Layer already exists
bee1e39d7c3a: Layer already exists
0ca7f54856c0: Layer already exists
ebb9ae013834: Layer already exists
6c1acb506b04: Pushed
c8cae99f371f: Pushed
031e02db837f: Pushed
latest: digest: sha256:6784d996d243ec7e9ff738d3b0bab12aeb6d444f48367d37020dee98da69ae16 size: 3056
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Streaming/ws/src/gradlew> --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Pdocker-repository-root=gcr.io/apache-beam-testing/beam_portability -Pdocker-tag=latest :runners:flink:1.9:job-server-container:docker
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :model:fn-execution:extractProto UP-TO-DATE
> Task :model:job-management:extractProto UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :runners:flink:1.9:copyResourcesOverrides NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :runners:flink:1.9:job-server:processResources NO-SOURCE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:extractProto
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :runners:flink:1.9:job-server-container:dockerClean UP-TO-DATE
> Task :model:job-management:processResources UP-TO-DATE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :model:fn-execution:processResources UP-TO-DATE
> Task :sdks:java:core:generateGrammarSource UP-TO-DATE
> Task :runners:flink:1.9:copySourceOverrides
> Task :runners:flink:1.9:copyTestResourcesOverrides NO-SOURCE
> Task :sdks:java:core:processResources UP-TO-DATE
> Task :runners:flink:1.9:processResources
> Task :model:pipeline:extractIncludeProto UP-TO-DATE
> Task :model:pipeline:extractProto UP-TO-DATE
> Task :sdks:java:build-tools:compileJava FROM-CACHE
> Task :sdks:java:build-tools:processResources
> Task :sdks:java:build-tools:classes
> Task :model:pipeline:generateProto UP-TO-DATE
> Task :sdks:java:build-tools:jar
> Task :model:pipeline:compileJava UP-TO-DATE
> Task :model:pipeline:processResources UP-TO-DATE
> Task :model:pipeline:classes UP-TO-DATE
> Task :model:pipeline:jar UP-TO-DATE
> Task :model:pipeline:shadowJar UP-TO-DATE
> Task :model:job-management:extractIncludeProto UP-TO-DATE
> Task :model:fn-execution:extractIncludeProto UP-TO-DATE
> Task :model:job-management:generateProto UP-TO-DATE
> Task :model:fn-execution:generateProto UP-TO-DATE
> Task :model:job-management:compileJava UP-TO-DATE
> Task :model:job-management:classes UP-TO-DATE
> Task :model:job-management:shadowJar UP-TO-DATE
> Task :model:fn-execution:compileJava UP-TO-DATE
> Task :model:fn-execution:classes UP-TO-DATE
> Task :model:fn-execution:shadowJar UP-TO-DATE
> Task :sdks:java:core:compileJava UP-TO-DATE
> Task :sdks:java:core:classes UP-TO-DATE
> Task :sdks:java:core:shadowJar UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:fn-execution:compileJava UP-TO-DATE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava UP-TO-DATE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar UP-TO-DATE
> Task :sdks:java:io:kafka:jar UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar UP-TO-DATE
> Task :runners:core-construction-java:compileJava UP-TO-DATE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :runners:core-construction-java:jar UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar
> Task :runners:core-java:compileJava UP-TO-DATE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar UP-TO-DATE
> Task :sdks:java:harness:compileJava UP-TO-DATE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :sdks:java:harness:shadowJar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :runners:flink:1.9:compileJava FROM-CACHE
> Task :runners:flink:1.9:classes
> Task :runners:flink:1.9:jar
> Task :runners:flink:1.9:job-server:compileJava NO-SOURCE
> Task :runners:flink:1.9:job-server:classes UP-TO-DATE
> Task :runners:flink:1.9:job-server:shadowJar
> Task :runners:flink:1.9:job-server-container:copyDockerfileDependencies
> Task :runners:flink:1.9:job-server-container:dockerPrepare
> Task :runners:flink:1.9:job-server-container:docker

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD SUCCESSFUL in 1m 2s
58 actionable tasks: 16 executed, 6 from cache, 36 up-to-date

Publishing build scan...
https://gradle.com/s/cwmjpqypriu4u

[beam_LoadTests_Java_Combine_Portable_Flink_Streaming] $ /bin/bash -xe /tmp/jenkins3550957389153132389.sh
+ echo 'Tagging image...'
Tagging image...
[beam_LoadTests_Java_Combine_Portable_Flink_Streaming] $ /bin/bash -xe /tmp/jenkins2849286424700381525.sh
+ docker tag gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest
[beam_LoadTests_Java_Combine_Portable_Flink_Streaming] $ /bin/bash -xe /tmp/jenkins7401784760767062727.sh
+ echo 'Pushing image...'
Pushing image...
[beam_LoadTests_Java_Combine_Portable_Flink_Streaming] $ /bin/bash -xe /tmp/jenkins7233967731295546707.sh
+ docker push gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest
The push refers to repository [gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server]
7aaa42969267: Preparing
ce1826c22629: Preparing
a4ea34e70eb2: Preparing
a6ded049566a: Preparing
e7fe5541de5f: Preparing
03ff63c55220: Preparing
bee1e39d7c3a: Preparing
1f59a4b2e206: Preparing
0ca7f54856c0: Preparing
ebb9ae013834: Preparing
bee1e39d7c3a: Waiting
1f59a4b2e206: Waiting
0ca7f54856c0: Waiting
ebb9ae013834: Waiting
03ff63c55220: Waiting
a6ded049566a: Layer already exists
e7fe5541de5f: Layer already exists
03ff63c55220: Layer already exists
bee1e39d7c3a: Layer already exists
1f59a4b2e206: Layer already exists
0ca7f54856c0: Layer already exists
ebb9ae013834: Layer already exists
7aaa42969267: Pushed
a4ea34e70eb2: Pushed
ce1826c22629: Pushed
latest: digest: sha256:506d390abc1a258d59c29bca95c686b5f2227b48a6ce90c5911708c0423a45ad size: 2427
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content 
JOB_SERVER_IMAGE=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest
CLUSTER_NAME=streaming-3
DETACHED_MODE=true
HARNESS_IMAGES_TO_PULL=gcr.io/apache-beam-testing/beam_portability/java_sdk:latest
FLINK_NUM_WORKERS=16
FLINK_DOWNLOAD_URL=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz
GCS_BUCKET=gs://beam-flink-cluster
HADOOP_DOWNLOAD_URL=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar
FLINK_TASKMANAGER_SLOTS=1
ARTIFACTS_DIR=gs://beam-flink-cluster/streaming-3
GCLOUD_ZONE=us-central1-a

[EnvInject] - Variables injected successfully.
[beam_LoadTests_Java_Combine_Portable_Flink_Streaming] $ /bin/bash -xe /tmp/jenkins5403906095691035055.sh
+ echo Setting up flink cluster
Setting up flink cluster
[beam_LoadTests_Java_Combine_Portable_Flink_Streaming] $ /bin/bash -xe /tmp/jenkins2327614773919597284.sh
+ cd <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Streaming/ws/src/.test-infra/dataproc>
+ ./flink_cluster.sh create
+ GCLOUD_ZONE=us-central1-a
+ DATAPROC_VERSION=1.2
+ MASTER_NAME=streaming-3-m
+ INIT_ACTIONS_FOLDER_NAME=init-actions
+ FLINK_INIT=gs://beam-flink-cluster/init-actions/flink.sh
+ BEAM_INIT=gs://beam-flink-cluster/init-actions/beam.sh
+ DOCKER_INIT=gs://beam-flink-cluster/init-actions/docker.sh
+ FLINK_LOCAL_PORT=8081
+ FLINK_TASKMANAGER_SLOTS=1
+ TASK_MANAGER_MEM=10240
+ YARN_APPLICATION_MASTER=
+ create
+ upload_init_actions
+ echo 'Uploading initialization actions to GCS bucket: gs://beam-flink-cluster'
Uploading initialization actions to GCS bucket: gs://beam-flink-cluster
+ gsutil cp -r init-actions/beam.sh init-actions/docker.sh init-actions/flink.sh gs://beam-flink-cluster/init-actions
Copying file://init-actions/beam.sh [Content-Type=text/x-sh]...
Copying file://init-actions/docker.sh [Content-Type=text/x-sh]...
Copying file://init-actions/flink.sh [Content-Type=text/x-sh]...
Operation completed over 3 objects/13.4 KiB.
+ create_cluster
+ local metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1,
+ metadata+=hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar
+ [[ -n gcr.io/apache-beam-testing/beam_portability/java_sdk:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/java_sdk:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_workers=17
+ gcloud dataproc clusters create streaming-3 --region=global --num-workers=17 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/java_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/75e6e668-082b-3761-8016-47ce6f9f138b].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
...done.
ERROR: Create cluster failed!
ERROR: gcloud crashed (AttributeError): 'Operation' object has no attribute 'details'

If you would like to report this issue, please run the following command:
  gcloud feedback

To check gcloud for common problems, please run the following command:
  gcloud info --run-diagnostics
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org