Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2020/01/29 13:13:58 UTC

Build failed in Jenkins: beam_LoadTests_Java_Combine_Portable_Flink_Batch #5

See <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Batch/5/display/redirect?page=changes>

Changes:

[iambruceactor] added more meetups

[suztomo] Google-cloud-clients to use 2019 versions

[lcwik] [BEAM-8298] Fully specify the necessary details to support side input

[chadrik] [BEAM-7746] Introduce a protocol to handle various types of partitioning

[iemejia] [BEAM-6957] Enable Counter/Distribution metrics tests for Portable Spark

[kcweaver] [BEAM-9200] fix portable jar test version property

[iemejia] [BEAM-9204] Refactor HBaseUtils methods to depend on Ranges

[iemejia] [BEAM-9204] Fix HBase SplitRestriction to be based on provided Range

[echauchot] [BEAM-9205] Add ValidatesRunner annotation to the MetricsPusherTest

[echauchot] [BEAM-9205] Fix validatesRunner tests configuration in spark module

[jbonofre] [BEAM-7427] Refactore JmsCheckpointMark to be usage via Coder

[iemejia] [BEAM-7427] Adjust JmsIO access levels and other minor fixes

[pabloem] Merge pull request #10346 from [BEAM-7926] Data-centric Interactive

[chamikara] Fix Spanner auth endpoints

[chadrik] [BEAM-7746] Stop automatically creating staticmethods in register_urn


------------------------------------------
[...truncated 72.99 KB...]
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1,
+ metadata+=hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar
+ [[ -n gcr.io/apache-beam-testing/beam_portability/java_sdk:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/java_sdk:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_workers=6
+ gcloud dataproc clusters create beam-loadtests-java-portable-flink-batch-5 --region=global --num-workers=6 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/java_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/791401ed-8682-3f59-8223-d95b633d755b].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
......................................................................................................................................................................done.
Created [https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/global/clusters/beam-loadtests-java-portable-flink-batch-5] Cluster placed in zone [us-central1-a].
+ get_leader
+ local i=0
+ local application_ids
+ local application_masters
+ echo 'Yarn Applications'
Yarn Applications
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-5-m '--command=yarn application -list'
++ grep beam-loadtests-java-portable-flink-batch-5
Warning: Permanently added 'compute.4413311484161737698' (ECDSA) to the list of known hosts.
20/01/29 12:42:30 INFO client.RMProxy: Connecting to ResourceManager at beam-loadtests-java-portable-flink-batch-5-m/10.128.0.90:8032
+ read line
+ echo application_1580301682932_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal:36779
application_1580301682932_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal:36779
++ echo application_1580301682932_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal:36779
++ sed 's/ .*//'
+ application_ids[$i]=application_1580301682932_0001
++ echo application_1580301682932_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal:36779
++ sed 's/.*beam-loadtests-java-portable-flink-batch-5/beam-loadtests-java-portable-flink-batch-5/'
++ sed 's/ .*//'
+ application_masters[$i]=beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal:36779
+ i=1
+ read line
+ '[' 1 '!=' 1 ']'
+ YARN_APPLICATION_MASTER=beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal:36779
+ echo 'Using Yarn Application master: beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal:36779'
Using Yarn Application master: beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal:36779
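As an illustrative sketch (not part of the build script), the `get_leader` step above recovers the YARN application id and the application master's host:port from one line of `yarn application -list` output using two `sed` pipelines. The same extraction can be expressed in python; the sample line is taken verbatim from this log.

```python
# Sample line from `yarn application -list`, as echoed in the log above.
line = ("application_1580301682932_0001 flink-dataproc Apache Flink yarn default "
        "RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-batch-5"
        "-w-3.us-central1-a.c.apache-beam-testing.internal:36779")

# `sed 's/ .*//'` keeps everything before the first space: the application id.
application_id = line.split(" ", 1)[0]

# The chained seds strip everything up to the (greedy, i.e. last) occurrence of
# the cluster name inside the tracking URL, then cut at the first space,
# leaving the application master's host:port.
cluster = "beam-loadtests-java-portable-flink-batch-5"
application_master = line[line.rindex(cluster):].split(" ", 1)[0]

print(application_id)
print(application_master)
```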
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
+ start_job_server
+ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-5-m '--command=sudo --user yarn docker run --detach --publish 8099:8099 --publish 8098:8098 --publish 8097:8097 --volume ~/.config/gcloud:/root/.config/gcloud gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest --flink-master=beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal:36779 --artifacts-dir=gs://beam-flink-cluster/beam-loadtests-java-portable-flink-batch-5'
bf6d9a16e805ef405c55d568954c2ab8e381bebac694b82c2f23d54fa0bfa9a6
+ start_tunnel
++ gcloud compute ssh --quiet --zone=us-central1-a yarn@beam-loadtests-java-portable-flink-batch-5-m '--command=curl -s "http://beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal:36779/jobmanager/config"'
+ local 'job_server_config=[{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1580301682932_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"FLINK_PLUGINS_DIR","value":"/usr/lib/flink/plugins"},{"key":"web.tmpdir","value":"/tmp/flink-web-284e6819-944f-4584-a730-42dfae65c3b8"},{"key":"jobmanager.rpc.port","value":"35959"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1580301682932_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local key=jobmanager.rpc.port
++ echo beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal:36779
++ cut -d : -f1
+ local yarn_application_master_host=beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal
++ python -c 'import sys, json; print([e['\''value'\''] for e in json.load(sys.stdin) if e['\''key'\''] == u'\''jobmanager.rpc.port'\''][0])'
++ echo '[{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1580301682932_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"FLINK_PLUGINS_DIR","value":"/usr/lib/flink/plugins"},{"key":"web.tmpdir","value":"/tmp/flink-web-284e6819-944f-4584-a730-42dfae65c3b8"},{"key":"jobmanager.rpc.port","value":"35959"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1580301682932_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local jobmanager_rpc_port=35959
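For readability, the python one-liner above pulls `jobmanager.rpc.port` out of Flink's `/jobmanager/config` response, which is a JSON array of `{"key": ..., "value": ...}` pairs. A sketch of the same selection, using a trimmed version of the config captured in this log:

```python
import json

# Trimmed version of the /jobmanager/config payload shown above.
job_server_config = json.dumps([
    {"key": "jobmanager.rpc.address",
     "value": "beam-loadtests-java-portable-flink-batch-5-w-3."
              "us-central1-a.c.apache-beam-testing.internal"},
    {"key": "jobmanager.rpc.port", "value": "35959"},
    {"key": "taskmanager.numberOfTaskSlots", "value": "1"},
])

# Same logic as the one-liner: first value whose key matches.
port = [e["value"] for e in json.loads(job_server_config)
        if e["key"] == "jobmanager.rpc.port"][0]
print(port)  # 35959
```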
++ [[ true == \t\r\u\e ]]
++ echo ' -Nf >& /dev/null'
+ local 'detached_mode_params= -Nf >& /dev/null'
++ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
++ echo '-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'job_server_ports_forwarding=-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'tunnel_command=gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-5-m -- -L 8081:beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal:36779 -L 35959:beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal:35959 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080  -Nf >& /dev/null'
+ eval gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-5-m -- -L 8081:beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal:36779 -L 35959:beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal:35959 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf '>&' /dev/null
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-5-m -- -L 8081:beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal:36779 -L 35959:beam-loadtests-java-portable-flink-batch-5-w-3.us-central1-a.c.apache-beam-testing.internal:35959 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf
[beam_LoadTests_Java_Combine_Portable_Flink_Batch] $ /bin/bash -xe /tmp/jenkins1026063840166715975.sh
+ echo src Load test: 2GB of 10B records on Flink in Portable mode src
src Load test: 2GB of 10B records on Flink in Portable mode src
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Batch/ws/src/gradlew> -PloadTest.mainClass=org.apache.beam.sdk.loadtests.CombineLoadTest -Prunner=:runners:portability:java '-PloadTest.args=--project=apache-beam-testing --appName=load_tests_Java_Portable_Flink_batch_Combine_1 --tempLocation=gs://temp-storage-for-perf-tests/loadtests --publishToBigQuery=true --bigQueryDataset=load_test --bigQueryTable=java_portable_flink_batch_Combine_1 --sourceOptions={"numRecords":200000000,"keySizeBytes":1,"valueSizeBytes":9} --fanout=1 --iterations=1 --topCount=20 --sdkWorkerParallelism=5 --perKeyCombiner=TOP_LARGEST --streaming=false --jobEndpoint=localhost:8099 --defaultEnvironmentConfig=gcr.io/apache-beam-testing/beam_portability/java_sdk:latest --defaultEnvironmentType=DOCKER --runner=PortableRunner' --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :sdks:java:testing:load-tests:run
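As a sanity check (illustrative, not part of the build), the `--sourceOptions` in the command above account for both the "2GB of 10B records" headline and the `total_bytes_count` reported at the end of the run:

```python
# Values from --sourceOptions={"numRecords":200000000,"keySizeBytes":1,"valueSizeBytes":9}
num_records = 200_000_000
key_size_bytes = 1
value_size_bytes = 9

# Each synthetic record is key + value = 10 bytes.
total_bytes = num_records * (key_size_bytes + value_size_bytes)
print(total_bytes)  # 2000000000, i.e. the 2.0E9 total_bytes_count in the log
```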
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :model:job-management:extractProto UP-TO-DATE
> Task :model:fn-execution:extractProto UP-TO-DATE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :runners:direct-java:processResources NO-SOURCE
> Task :runners:local-java:processResources NO-SOURCE
> Task :runners:portability:java:processResources NO-SOURCE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :model:job-management:processResources UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractProto UP-TO-DATE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :model:fn-execution:processResources UP-TO-DATE
> Task :sdks:java:io:kinesis:processResources NO-SOURCE
> Task :sdks:java:io:synthetic:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :sdks:java:testing:load-tests:processResources NO-SOURCE
> Task :sdks:java:core:generateGrammarSource UP-TO-DATE
> Task :sdks:java:core:processResources UP-TO-DATE
> Task :model:pipeline:extractIncludeProto UP-TO-DATE
> Task :model:pipeline:extractProto UP-TO-DATE
> Task :model:pipeline:generateProto UP-TO-DATE
> Task :model:pipeline:compileJava UP-TO-DATE
> Task :model:pipeline:processResources UP-TO-DATE
> Task :model:pipeline:classes UP-TO-DATE
> Task :model:pipeline:jar UP-TO-DATE
> Task :model:pipeline:shadowJar UP-TO-DATE
> Task :model:fn-execution:extractIncludeProto UP-TO-DATE
> Task :model:job-management:extractIncludeProto UP-TO-DATE
> Task :model:fn-execution:generateProto UP-TO-DATE
> Task :model:job-management:generateProto UP-TO-DATE
> Task :model:job-management:compileJava UP-TO-DATE
> Task :model:fn-execution:compileJava UP-TO-DATE
> Task :model:job-management:classes UP-TO-DATE
> Task :model:fn-execution:classes UP-TO-DATE
> Task :model:job-management:shadowJar UP-TO-DATE
> Task :model:fn-execution:shadowJar UP-TO-DATE
> Task :sdks:java:core:compileJava UP-TO-DATE
> Task :sdks:java:core:classes UP-TO-DATE
> Task :sdks:java:core:shadowJar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractIncludeProto UP-TO-DATE
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:fn-execution:compileJava UP-TO-DATE
> Task :runners:core-construction-java:compileJava UP-TO-DATE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :runners:local-java:compileJava FROM-CACHE
> Task :runners:local-java:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:compileJava UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :runners:core-construction-java:jar UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava UP-TO-DATE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar UP-TO-DATE
> Task :sdks:java:io:kafka:jar UP-TO-DATE
> Task :runners:local-java:jar
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:io:synthetic:compileJava FROM-CACHE
> Task :sdks:java:io:synthetic:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:compileJava UP-TO-DATE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :runners:core-java:compileJava UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar UP-TO-DATE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar UP-TO-DATE
> Task :sdks:java:io:synthetic:jar
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar
> Task :sdks:java:harness:compileJava UP-TO-DATE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar UP-TO-DATE
> Task :sdks:java:harness:shadowJar UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:compileJava UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar UP-TO-DATE
> Task :runners:java-fn-execution:compileJava UP-TO-DATE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar UP-TO-DATE
> Task :runners:portability:java:compileJava FROM-CACHE
> Task :runners:portability:java:classes UP-TO-DATE
> Task :runners:portability:java:jar
> Task :runners:direct-java:compileJava FROM-CACHE
> Task :runners:direct-java:classes UP-TO-DATE
> Task :runners:direct-java:shadowJar
> Task :sdks:java:io:kinesis:compileJava
> Task :sdks:java:io:kinesis:classes
> Task :sdks:java:io:kinesis:jar

> Task :sdks:java:testing:load-tests:compileJava
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:testing:load-tests:classes
> Task :sdks:java:testing:load-tests:jar

> Task :sdks:java:testing:load-tests:run
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
ERROR StatusLogger Log4j2 could not find a logging implementation. Please add log4j-core to the classpath. Using SimpleLogger to log to the console...
Load test results for test (ID): a1f68f7a-7f40-4c8c-959a-86b32f49eab6 and timestamp: 2020-01-29T12:42:53.067000000Z:
                 Metric:                    Value:
             runtime_sec                  1200.452
       total_bytes_count                     2.0E9

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD SUCCESSFUL in 21m 9s
61 actionable tasks: 10 executed, 5 from cache, 46 up-to-date

Publishing build scan...
https://gradle.com/s/3aynkctmo24xw

[beam_LoadTests_Java_Combine_Portable_Flink_Batch] $ /bin/bash -xe /tmp/jenkins3795386177324874177.sh
+ echo Changing number of workers to 16
Changing number of workers to 16
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content 
FLINK_NUM_WORKERS=16

[EnvInject] - Variables injected successfully.
[beam_LoadTests_Java_Combine_Portable_Flink_Batch] $ /bin/bash -xe /tmp/jenkins4596650120239358043.sh
+ cd <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Batch/ws/src/.test-infra/dataproc>
+ ./flink_cluster.sh restart
+ GCLOUD_ZONE=us-central1-a
+ DATAPROC_VERSION=1.2
+ MASTER_NAME=beam-loadtests-java-portable-flink-batch-5-m
+ INIT_ACTIONS_FOLDER_NAME=init-actions
+ FLINK_INIT=gs://beam-flink-cluster/init-actions/flink.sh
+ BEAM_INIT=gs://beam-flink-cluster/init-actions/beam.sh
+ DOCKER_INIT=gs://beam-flink-cluster/init-actions/docker.sh
+ FLINK_LOCAL_PORT=8081
+ FLINK_TASKMANAGER_SLOTS=1
+ TASK_MANAGER_MEM=10240
+ YARN_APPLICATION_MASTER=
+ restart
+ delete
+ gcloud dataproc clusters delete beam-loadtests-java-portable-flink-batch-5 --region=global --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/d8d222ba-f485-344b-8a6c-c5cce2d41d7b].
Waiting for cluster deletion operation...
...........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................done.
ERROR: (gcloud.dataproc.clusters.delete) Operation [projects/apache-beam-testing/regions/global/operations/d8d222ba-f485-344b-8a6c-c5cce2d41d7b] timed out.
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_LoadTests_Java_Combine_Portable_Flink_Batch #11

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Batch/11/display/redirect?page=changes>




Build failed in Jenkins: beam_LoadTests_Java_Combine_Portable_Flink_Batch #10

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Batch/10/display/redirect?page=changes>

Changes:

[git] Remove optionality and add sensible defaults to PubsubIO builders.

[jkai] [BEAM-8331] rewrite the calcite JDBC urls

[boyuanz] Update verify_release_build script to run python tests with dev version.

[robertwb] Supporting infrastructure for dataframes on beam.

[robertwb] Basic deferred data frame implementation.

[robertwb] yapf, py2

[filiperegadas] Add BigQuery useAvroLogicalTypes option

[filiperegadas] fixup! Add BigQuery useAvroLogicalTypes option

[jvilcek] [BEAM-9360] Fix equivalence check for FieldType

[github] typings and docs for expressions.py

[chamikara] Logs BQ insert failures

[iemejia] [BEAM-9384] Add SchemaRegistry.getSchemaCoder to get SchemaCoders for

[lcwik] [BEAM-9397] Pass all but output receiver parameters to start

[kcweaver] [BEAM-9401] bind Flink MiniCluster to localhost

[sunjincheng121] [BEAM-9288] Not bundle conscrypt in gRPC vendor

[mxm] [BEAM-9345] Fix source of test flakiness in FlinkSubmissionTest

[kamil.wasilewski] Add integration test for AnnotateImage transform

[github] Add integration test for AnnotateText transform (#10977)

[chadrik] [BEAM-9405] Fix post-commit error about create_job_service

[chadrik] more typing fixes

[chadrik] Fix typing issue with python 3.5.2

[chadrik] fixes

[chadrik] Address more issues discovered after rebase

[chadrik] Improve the idiom used for conditional imports

[chadrik] Fix more issues

[chadrik] Update to latest mypy version

[amaliujia] Moving to 2.21.0-SNAPSHOT on master branch.

[github] [BEAM-8487] Handle nested forward references (#10932)

[github] [BEAM-9287] Add Postcommit tests for dataflow runner v2  (#10998)

[chadrik] [BEAM-7746] Runtime change to timestamp/duration equality

[github] Adds DisplayData for StateSpecs used by stateful ParDos

[iemejia] Fix non correctly formatted class in sdks/java/core

[iemejia] [BEAM-9342[ Update bytebuddy to version 1.10.8

[aromanenko.dev] [BEAM-8925] Tika version update to 1.23

[12602502+Ardagan] [BEAM-8327] Override Gradle cache for community metrics prober

[ehudm] Reduce warnings in pytest runs.

[heejong] [BEAM-9415] fix postcommit xvr tests

[github] Merge pull request #10968 from [BEAM-9381] Adding display data to

[github] [BEAM-8335] Add PCollection to DataFrame logic for InteractiveRunner.

[robertwb] Remove excessive logging.

[github] [BEAM-2939] Java UnboundedSource SDF wrapper (#10897)

[iemejia] [website] Update link to environment_type (SDK harness configuration)

[iemejia] Fix typo on python code

[kamil.wasilewski] Fix: skip test if GCP dependencies are not installed

[fernandodiaz] [BEAM-9424] Allow grouping by LogicalType

[github] Revert "[BEAM-8335] Add PCollection to DataFrame logic for

[echauchot] Add metrics export to documentation on the website.

[github] [BEAM-8382] Add rate limit policy to KinesisIO.Read (#9765)

[lcwik] [BEAM-9288] Bump version number vendored gRPC build.

[chadrik] [BEAM-9274] Support running yapf in a git pre-commit hook

[rohde.samuel] [BEAM-8335] Add PCollection to Dataframe logic for InteractiveRunner.

[github] [BEAM-8575] Modified trigger test to work for different runners.

[github] [BEAM-9413] fix beam_PostCommit_Py_ValCon (#11023)

[rohde.samuel] ReverseTestStream Implementation

[github] Update lostluck's info on the Go SDK roadmap

[suztomo] Google-cloud-bigquery 1.108.0

[github] [BEAM-9432] Move expansion service into its own project. (#11035)

[ehudm] [BEAM-3713] Remove nosetests from tox.ini

[github] Merge pull request #11025: [BEAM-6428] Improve select performance with

[github] Switch contact email to apache.org.

[github] [BEAM-6374] Emit PCollection metrics from GoSDK (#10942)

[amaliujia] [BEAM-9288] Not bundle conscrypt in gRPC vendor in META-INF/

[kcweaver] [BEAM-9448] Fix log message for job server cache.

[github] Update container image tags used by Dataflow runner for Beam master

[github] [BEAM-8328] Disable community metrics integration test in 'test' task

[iemejia] [BEAM-9450] Update www.apache.org/dist/ links to downloads.apache.org

[iemejia] [BEAM-9450] Convert links available via https to use https

[github] Add integration test for AnnotateVideoWithContext transform (#10986)

[lcwik] [BEAM-9452] Update classgraph to latest version to resolve windows

[hktang] [BEAM-9453] Changed new string creation to use StandardCharsets.UTF_8

[chuck.yang] Use Avro format for file loads to BigQuery

[jkai] [Hotfix] fix rabbitmp spotless check

[kcweaver] Downgrade cache log level from warn->info.

[github] Revert "[BEAM-6374] Emit PCollection metrics from GoSDK (#10942)"

[github] Merge pull request #11032 from [BEAM-8335] Display rather than logging

[github] Fix a bug in performance test for reading data from BigQuery (#11062)

[suztomo] grpc 1.27.2 and gax 1.54.0

[suztomo] bigquerystorage 0.125.0-beta

[apilloud] [BEAM-9463] Bump ZetaSQL to 2020.03.1

[lcwik] [BEAM-2939, BEAM-9458] Add deduplication transform for SplittableDoFns

[lcwik] [BEAM-9464] Fix WithKeys to respect parameterized types

[ankurgoenka] [BEAM-9465] Fire repeatedly in reshuffle

[lcwik] [BEAM-2939, BEAM-9458] Use deduplication transform for UnboundedSources

[echauchot] Fix wrong generated code comment.

[github] [BEAM-9396] Fix Docker image name in CoGBK test for Python on Flink

[lcwik] [BEAM-9288] Update to use vendored gRPC without shaded conscrypt

[github] [BEAM-9319] Clean up start topic in TestPubsubSignal (#11072)

[lcwik] [BEAM-2939] Follow-up on comment in pr/11065

[lcwik] [BEAM-9473] Dont copy over META-INF index/checksum/signing files during

[apilloud] [BEAM-9411] Enable BigQuery DIRECT_READ by default in SQL

[hannahjiang] update CHANGE.md for 2.20

[lcwik] [BEAM-9475] Fix typos and shore up expectations on type

[rohde.samuel] BEAM[8335] TestStreamService integration with DirectRunner

[github] [BEAM-7926] Update Data Visualization (#11020)

[ankurgoenka] [BEAM-9402] Remove options overwrite

[chadrik] Add pre-commit hook for pylint

[github] Additional new Python Katas (#11078)

[github] [BEAM-9478] Update samza runner page to reflect post 1.0 changes

[suztomo] grpc-google-cloud-pubsub-v1 1.85.1

[pabloem] Updating BigQuery client APIs

[github] [BEAM-9481] Exclude signature files from expansion service test

[github] Install typing package only for Python < 3.5.3 (#10821)

[heejong] [BEAM-9056] Staging artifacts from environment

[sunjincheng121] [BEAM-9295] Add Flink 1.10 build target and Make FlinkRunner compatible

[ankurgoenka] [BEAM-9485] Raise error when transform urn is not implemented

[12602502+Ardagan] [BEAM-9431] Remove ReadFromPubSub/Read-out0-ElementCount from the

[github] Update Python roadmap for 2.7 eol

[mxm] [BEAM-9474] Improve robustness of BundleFactory and ProcessEnvironment

[github] [BEAM-7815] update MemoryReporter comments about using guppy3 (#11073)

[rohde.samuel] [BEAM-8335] Modify the StreamingCache to subclass the CacheManager

[sunjincheng121] [BEAM-9298] Drop support for Flink 1.7

[github] Fixing apache_beam.io.gcp.bigquery_test:PubSubBigQueryIT. at head

[mxm] [BEAM-9490] Guard referencing for environment expiration via a lock

[github] Verify schema early in ToJson and JsonToRow (#11105)

[lcwik] [BEAM-9481] fix indentation

[github] Merge pull request #11103 from [BEAM-9494] Reifying outputs from BQ file

[github] [BEAM-8335] Implemented Capture Size limitation (#11050)

[github] [BEAM-9294] Move RowJsonException out of RowJsonSerializer (#11102)

[github] Merge pull request #11046: [BEAM-9442] Properly handle nullable fields

[ankurgoenka] [BEAM-9287] disable validates runner test which uses teststreams for

[sunjincheng121] [BEAM-9299-PR]Upgrade Flink Runner 1.8x to 1.8.3 and 1.9x to 1.9.2

[lcwik] [BEAM-2939] Implement interfaces and concrete watermark estimators

[ankurgoenka] [BEAM-9499] Sickbay test_multi_triggered_gbk_side_input for streaming

[robertwb] Minor cleanup, lint.

[robertwb] [BEAM-9433] Create expansion service artifact for common Java IOs.

[thw] [BEAM-9490] Use the lock that belongs to the cache when bundle load

[github] Update Dataflow py container version (#11120)

[github] [BEAM-7923] Streaming support and pipeline pruning when instrumenting a

[github] Update default value in Java snippet

[ankurgoenka] [BEAM-9504] Sickbay streaming test for batch VR

[rohde.samuel] [BEAM-8335] Final PR to merge the InteractiveBeam feature branch

[github] [BEAM-9477] RowCoder should be hashable and picklable (#11088)

[apilloud] [BEAM-8057] Reject Infinite or NaN literals at parse time

[robertwb] Log in a daemon thread.

[thw] [BEAM-8815] Skip removal of manifest when no artifacts were staged.

[github] [BEAM-9346] Improve the efficiency of TFRecordIO (#11122)

[kawaigin] [BEAM-8335] Refactor IPythonLogHandler

[apilloud] [BEAM-8070] Preserve type for empty array

[github] Merge pull request #10991 [BEAM-3301] Refactor DoFn validation & allow

[github] Update dataflow py container ver to 20200317 (#11145)


------------------------------------------
[...truncated 41.46 KB...]
3663b7fed4c9: Layer already exists
832f129ebea4: Layer already exists
6670e930ed33: Layer already exists
c7f27a4eb870: Layer already exists
e70dfb4c3a48: Layer already exists
1c76bd0dc325: Layer already exists
942cbcd085e7: Pushed
d397daf26739: Pushed
f9e232193edb: Pushed
d6fd43a02bed: Pushed
latest: digest: sha256:c1b85afd596d782ffd31095c7bc8df29ceeedf89263b869d83217f55e9cc8aab size: 3470
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Batch/ws/src/gradlew> --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Pdocker-repository-root=gcr.io/apache-beam-testing/beam_portability -Pdocker-tag=latest :runners:flink:1.10:job-server-container:docker
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :sdks:java:expansion-service:processResources NO-SOURCE
> Task :model:fn-execution:extractProto UP-TO-DATE
> Task :model:job-management:extractProto UP-TO-DATE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :runners:flink:1.10:copyResourcesOverrides NO-SOURCE
> Task :model:job-management:processResources UP-TO-DATE
> Task :model:fn-execution:processResources UP-TO-DATE
> Task :runners:flink:1.10:job-server:processResources NO-SOURCE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :sdks:java:extensions:protobuf:extractProto
> Task :runners:flink:1.10:job-server-container:copyLicenses
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :sdks:java:core:generateGrammarSource UP-TO-DATE
> Task :runners:flink:1.10:job-server-container:dockerClean UP-TO-DATE
> Task :sdks:java:core:processResources UP-TO-DATE
> Task :runners:flink:1.10:copySourceOverrides
> Task :runners:flink:1.10:copyTestResourcesOverrides NO-SOURCE
> Task :model:pipeline:extractIncludeProto UP-TO-DATE
> Task :runners:flink:1.10:processResources
> Task :sdks:java:build-tools:compileJava FROM-CACHE
> Task :model:pipeline:extractProto UP-TO-DATE
> Task :sdks:java:build-tools:processResources
> Task :sdks:java:build-tools:classes
> Task :model:pipeline:generateProto UP-TO-DATE
> Task :sdks:java:build-tools:jar
> Task :model:pipeline:compileJava UP-TO-DATE
> Task :model:pipeline:processResources UP-TO-DATE
> Task :model:pipeline:classes UP-TO-DATE
> Task :model:pipeline:jar UP-TO-DATE
> Task :model:pipeline:shadowJar UP-TO-DATE
> Task :model:fn-execution:extractIncludeProto UP-TO-DATE
> Task :model:fn-execution:generateProto UP-TO-DATE
> Task :model:job-management:extractIncludeProto UP-TO-DATE
> Task :model:job-management:generateProto UP-TO-DATE
> Task :model:job-management:compileJava UP-TO-DATE
> Task :model:job-management:classes UP-TO-DATE
> Task :model:job-management:shadowJar UP-TO-DATE
> Task :model:fn-execution:compileJava UP-TO-DATE
> Task :model:fn-execution:classes UP-TO-DATE
> Task :model:fn-execution:shadowJar UP-TO-DATE
> Task :sdks:java:core:compileJava UP-TO-DATE
> Task :sdks:java:core:classes UP-TO-DATE
> Task :sdks:java:core:shadowJar UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:fn-execution:compileJava UP-TO-DATE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar UP-TO-DATE
> Task :runners:core-construction-java:compileJava UP-TO-DATE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :runners:core-construction-java:jar UP-TO-DATE
> Task :sdks:java:expansion-service:compileJava UP-TO-DATE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar
> Task :sdks:java:io:kafka:compileJava UP-TO-DATE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar UP-TO-DATE
> Task :runners:core-java:compileJava UP-TO-DATE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar UP-TO-DATE
> Task :sdks:java:harness:compileJava UP-TO-DATE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :sdks:java:harness:shadowJar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :runners:flink:1.10:compileJava FROM-CACHE
> Task :runners:flink:1.10:classes
> Task :runners:flink:1.10:jar
> Task :runners:flink:1.10:job-server:compileJava NO-SOURCE
> Task :runners:flink:1.10:job-server:classes UP-TO-DATE
> Task :runners:flink:1.10:job-server:shadowJar
> Task :runners:flink:1.10:job-server-container:copyDockerfileDependencies
> Task :runners:flink:1.10:job-server-container:dockerPrepare
> Task :runners:flink:1.10:job-server-container:docker

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD SUCCESSFUL in 50s
61 actionable tasks: 17 executed, 6 from cache, 38 up-to-date

Publishing build scan...
https://gradle.com/s/c2g3d26psl2ok

[beam_LoadTests_Java_Combine_Portable_Flink_Batch] $ /bin/bash -xe /tmp/jenkins83479482003134226.sh
+ echo 'Tagging image...'
Tagging image...
[beam_LoadTests_Java_Combine_Portable_Flink_Batch] $ /bin/bash -xe /tmp/jenkins7814280496248385176.sh
+ docker tag gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest
[beam_LoadTests_Java_Combine_Portable_Flink_Batch] $ /bin/bash -xe /tmp/jenkins8468716113821446879.sh
+ echo 'Pushing image...'
Pushing image...
[beam_LoadTests_Java_Combine_Portable_Flink_Batch] $ /bin/bash -xe /tmp/jenkins7983500343853047775.sh
+ docker push gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest
The push refers to repository [gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server]
e30da2b9bb4a: Preparing
e8d5e9cfec05: Preparing
613b04c78cd3: Preparing
adf3c5d11e2b: Preparing
e49fb42ba763: Preparing
ac3e2c206c49: Preparing
3663b7fed4c9: Preparing
832f129ebea4: Preparing
6670e930ed33: Preparing
c7f27a4eb870: Preparing
e70dfb4c3a48: Preparing
1c76bd0dc325: Preparing
ac3e2c206c49: Waiting
3663b7fed4c9: Waiting
832f129ebea4: Waiting
6670e930ed33: Waiting
c7f27a4eb870: Waiting
1c76bd0dc325: Waiting
e70dfb4c3a48: Waiting
613b04c78cd3: Pushed
e8d5e9cfec05: Pushed
ac3e2c206c49: Layer already exists
3663b7fed4c9: Layer already exists
e30da2b9bb4a: Pushed
832f129ebea4: Layer already exists
6670e930ed33: Layer already exists
c7f27a4eb870: Layer already exists
e70dfb4c3a48: Layer already exists
1c76bd0dc325: Layer already exists
e49fb42ba763: Pushed
adf3c5d11e2b: Pushed
latest: digest: sha256:a570b675986cff3e03758ce300e54ce56e11952fdc8b34bfabf0148ec163f08f size: 2841
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content 
JOB_SERVER_IMAGE=gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest
CLUSTER_NAME=beam-loadtests-java-portable-flink-batch-10
DETACHED_MODE=true
HARNESS_IMAGES_TO_PULL=gcr.io/apache-beam-testing/beam_portability/beam_java_sdk:latest
FLINK_NUM_WORKERS=5
FLINK_DOWNLOAD_URL=https://archive.apache.org/dist/flink/flink-1.10.0/flink-1.10.0-bin-scala_2.11.tgz
GCS_BUCKET=gs://beam-flink-cluster
HADOOP_DOWNLOAD_URL=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-9.0/flink-shaded-hadoop-2-uber-2.8.3-9.0.jar
FLINK_TASKMANAGER_SLOTS=1
ARTIFACTS_DIR=gs://beam-flink-cluster/beam-loadtests-java-portable-flink-batch-10
GCLOUD_ZONE=us-central1-a

[EnvInject] - Variables injected successfully.
[beam_LoadTests_Java_Combine_Portable_Flink_Batch] $ /bin/bash -xe /tmp/jenkins1890540453373984587.sh
+ echo Setting up flink cluster
Setting up flink cluster
[beam_LoadTests_Java_Combine_Portable_Flink_Batch] $ /bin/bash -xe /tmp/jenkins8945021343943663000.sh
+ cd <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Batch/ws/src/.test-infra/dataproc>
+ ./flink_cluster.sh create
+ GCLOUD_ZONE=us-central1-a
+ DATAPROC_VERSION=1.2
+ MASTER_NAME=beam-loadtests-java-portable-flink-batch-10-m
+ INIT_ACTIONS_FOLDER_NAME=init-actions
+ FLINK_INIT=gs://beam-flink-cluster/init-actions/flink.sh
+ BEAM_INIT=gs://beam-flink-cluster/init-actions/beam.sh
+ DOCKER_INIT=gs://beam-flink-cluster/init-actions/docker.sh
+ FLINK_LOCAL_PORT=8081
+ FLINK_TASKMANAGER_SLOTS=1
+ TASK_MANAGER_MEM=10240
+ YARN_APPLICATION_MASTER=
+ create
+ upload_init_actions
+ echo 'Uploading initialization actions to GCS bucket: gs://beam-flink-cluster'
Uploading initialization actions to GCS bucket: gs://beam-flink-cluster
+ gsutil cp -r init-actions/beam.sh init-actions/docker.sh init-actions/flink.sh gs://beam-flink-cluster/init-actions
Copying file://init-actions/beam.sh [Content-Type=text/x-sh]...
Copying file://init-actions/docker.sh [Content-Type=text/x-sh]...
Copying file://init-actions/flink.sh [Content-Type=text/x-sh]...
Operation completed over 3 objects/13.4 KiB.
+ create_cluster
+ local metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.10.0/flink-1.10.0-bin-scala_2.11.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1,
+ metadata+=hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-9.0/flink-shaded-hadoop-2-uber-2.8.3-9.0.jar
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_java_sdk:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/beam_java_sdk:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_workers=6
+ gcloud dataproc clusters create beam-loadtests-java-portable-flink-batch-10 --region=global --num-workers=6 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.10.0/flink-1.10.0-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-9.0/flink-shaded-hadoop-2-uber-2.8.3-9.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/beam_java_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest, --image-version=1.2 --zone=us-central1-a --quiet
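The `--metadata` value in the command above is assembled by the `metadata+=` lines earlier in the trace, plain comma-joined `key=value` concatenation. A toy Python sketch of that assembly (an illustration only, not part of flink_cluster.sh):

```python
# Sketch: build the Dataproc --metadata flag value the way the
# flink_cluster.sh trace does, by joining key=value pairs with commas.
# The example pairs below are taken from this log.

def build_metadata(pairs):
    """Join (key, value) pairs into the comma-separated metadata string."""
    return ",".join(f"{key}={value}" for key, value in pairs)

pairs = [
    ("flink-start-yarn-session", "true"),
    ("flink-taskmanager-slots", "1"),
    ("beam-sdk-harness-images-to-pull",
     "gcr.io/apache-beam-testing/beam_portability/beam_java_sdk:latest"),
]

metadata = build_metadata(pairs)
```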
Waiting on operation [projects/apache-beam-testing/regions/global/operations/2520a60b-0c1e-312c-84b1-64de6c9a4e84].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
WARNING: Cluster beam-loadtests-java-portable-flink-batch-10 failed to create. Beginning automated resource cleanup process.
done.
ERROR: (gcloud.dataproc.clusters.create) Operation [projects/apache-beam-testing/regions/global/operations/2520a60b-0c1e-312c-84b1-64de6c9a4e84] failed: Initialization action timed out. Failed action 'gs://beam-flink-cluster/init-actions/flink.sh', see output in: gs://dataproc-6c5fbcbb-a2de-406e-9cf7-8c1ce0b6a604-us/google-cloud-dataproc-metainfo/c6e801ab-0b6b-44cf-a6e3-2423dee63634/beam-loadtests-java-portable-flink-batch-10-m/dataproc-initialization-script-2_output.
Build step 'Execute shell' marked build as failure
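The gcloud error above names both the failed init action and the GCS path holding its output. For quicker triage, a small helper (hypothetical, not part of the Beam scripts) could pull those two fields out of such an error line before handing the path to gsutil:

```python
import re

def parse_init_action_failure(error_line):
    """Extract (failed_action, output_path) from a Dataproc
    'Initialization action timed out' error message, or None
    if the line does not match that shape."""
    match = re.search(
        r"Failed action '([^']+)', see output in: (\S+)", error_line)
    if match is None:
        return None
    action, output_path = match.groups()
    # In the log the path ends with the sentence's trailing period;
    # strip it so the path can be passed to gsutil directly.
    return action, output_path.rstrip(".")
```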

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_Combine_Portable_Flink_Batch #9

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Batch/9/display/redirect?page=changes>

Changes:

[amogh.tiwari] lzo-addons

[amogh.tiwari] 3rd dec 2019, 12:43AM

[amogh.tiwari] PR corrections

[amogh.tiwari] PR javaPreCommit update

[amogh.tiwari] PR changes: added testLzopSpilttale()

[amogh.tiwari] updated gradle for supporting optional dependency of lzo- 2:39 AM IST

[iemejia] [BEAM-9162] Upgrade Jackson to version 2.10.2

[veblush] Upgrades gcsio to 2.0.0

[jonathan.warburton09] [BEAM-8916] Rename external_test_it so that it is picked up by pytest

[huangry] Create validation runner test for metrics (limited to user counter in

[millsd] Update Dataflow monitoring URL

[ankurgoenka] [BEAM-9287] Add Python Streaming Validates runner tests for Unified

[robertwb] Add capabilities and requirements to beam protos.

[github] Change static Map fields in ReflectUtils to be concurrent

[iemejia] [BEAM-8561] Add ThriftIO to support IO for Thrift files

[github] [BEAM-9258] Integrate Google Cloud Data loss prevention functionality

[github] [BEAM-9291] Upload graph option in dataflow's python sdk (#10829)

[amogh.tiwari] update 19/02/2020 2:32 AM added static class, removed wrappers, updated

[chlarsen] Removed compile time generation of test Thrift class.

[github] [BEAM-1080] Skip tests that required GCP credentials

[github] Exclude tests that are not passing under currect Avro IO requirements.

[lcwik] [BEAM-5605] Honor the bounded source timestamps timestamp.

[chlarsen] Added ThriftIO to list of supported I/O on website and to change log.

[github] [BEAM-7246] Added Google Spanner Write Transform (#10712)

[github] Apply suggestions from code review

[github] [BEAM-1833] Fixes BEAM-1833

[bhulette] Don't exclude UsesUnboundedPCollections in Dataflow VR tests

[heejong] [BEAM-9335] update hard-coded coder id when translating Java external

[huangry] Fixups.

[github] [BEAM-9146] Integrate GCP Video Intelligence functionality for Python

[iemejia] Mark Test categories as internal and improve categorization

[github] Add DataCatalogPipelineOptionsRegistrar (#10896)

[github] Allow unknown non-merging WindowFns of know window type. (#10875)

[iemejia] [BEAM-9326] Make JsonToRow transform input <String> instead of <?

[github] [BEAM-8575] Removed MAX_TIMESTAMP from testing data (#10835)

[github] Update python sdk container to beam-master-20200219 (#10903)

[heejong] [BEAM-9338] add postcommit XVR spark badges

[github] [BEAM-3545] Fix race condition w/plan metrics. (#10906)

[robertwb] Update go beam runner generated protos.

[heejong] [BEAM-9341] postcommit xvr flink fix

[github] Update

[github] Update

[github] Update

[github] Update

[github] Update

[github] Update

[github] Update

[github] Update

[shubham.srivastava] finishing touch 20/02/2020 6:43PM

[github] [BEAM-9085] Fix performance regression in SyntheticSource (#10885)

[github] Update google-cloud-videointelligence dependency

[robertwb] Add standard protocol capabilities to protos.

[github] [BEAM-8280] no_annotations decorator (#10904)

[kcweaver] [BEAM-9225] Fix Flink uberjar job termination bug.

[kcweaver] Reuse get_state method.

[chamikara] Updates DataflowRunner to support multiple SDK environments.

[github] [BEAM-8280] Enable and improve IOTypeHints debug_str traceback (#10894)

[github] [BEAM-9343]Upgrade ZetaSQL to 2020.02.1 (#10918)

[robertwb] [BEAM-9339] Declare capabilities for Go SDK.

[lcwik] [BEAM-5605] Eagerly close the BoundedReader once we have read everything

[github] [BEAM-9229] Adding dependency information to Environment proto (#10733)

[lcwik] [BEAM-9349] Update joda-time version

[lcwik] fixup! Fix SpotBugs failure

[kcweaver] [BEAM-9022] publish Spark job server Docker image

[drubinstein] Bump google cloud bigquery to 1.24.0

[github] Revert "[BEAM-9085] Fix performance regression in SyntheticSource

[github] [BEAM-8537] Provide WatermarkEstimator to track watermark (#10375)

[github] Make sure calling try_claim(0) more than once also trows exception.

[robertwb] [BEAM-9339] Declare capabilities for Python SDK.

[robertwb] Add some standard requirement URNs to the protos.

[kcweaver] [BEAM-9356] reduce Flink test logs to warn

[github] [BEAM-9063] migrate docker images to apache (#10612)

[github] [BEAM-9252] Exclude jboss's Main and module-info.java (#10930)

[boyuanz] Clean up and add type-hints to SDF API

[robertwb] [BEAM-9340] Populate requirements for Python DoFn properties.

[hannahjiang] fix postcommit failure

[robertwb] [BEAM-8019] Branch on having multiple environments.

[github] [BEAM-9359] Switch to Data Catalog client (#10917)

[github] [BEAM-9344] Add support for bundle finalization execution to the Beam

[iemejia] [BEAM-9342] Upgrade vendored bytebuddy to version 1.10.8

[chadrik] Create a class to encapsulate the work required to submit a pipeline to

[iemejia] Add Dataflow Java11 ValidatesRunner badge to the PR template

[github] Merge pull request #10944: [BEAM-7274] optimize oneOf handling

[github] [BEAM-8280] Fix IOTypeHints origin traceback on partials (#10927)

[relax] Support null fields in rows with ByteBuddy generated code.

[robertwb] Allow metrics update to be tolerant to uninitalized metric containers.

[github] [GoSDK] Fix race condition in statemgr & test (#10941)

[rohde.samuel] Move TestStream implementation to replacement transform

[github] [BEAM-9347] Don't overwrite default runner harness for unified worker

[boyuanz] Update docstring of ManualWatermarkEstimator.set_watermark()

[kcweaver] [BEAM-9373] Spark/Flink tests fix string concat

[boyuanz] Address comments

[boyuanz] Address comments again

[github] [BEAM-9228] Support further partition for FnApi ListBuffer (#10847)

[github] [BEAM-7926] Data-centric Interactive Part3 (#10731)

[boyuanz] Use NoOpWatermarkEstimator in sdf_direct_runner

[chamikara] Updates Dataflow client

[github] [BEAM-9240]: Check for Nullability in typesEqual() method of FieldType

[amogh.tiwari] 25/02/2020 updated imports Amogh Tiwari & Shubham Srivastava

[iemejia] [BEAM-8616] Make hadoop-client a provided dependency on ParquetIO

[mxm] [BEAM-9345] Remove workaround to restore stdout/stderr during JobGraph

[iemejia] [BEAM-9364] Refactor KafkaIO to use DeserializerProviders

[mxm] [BEAM-9345] Add end-to-end Flink job submission test

[iemejia] [BEAM-9352] Align version of transitive jackson dependencies with Beam

[michal.walenia] [BEAM-9258] Add integration test for Cloud DLP

[iemejia] [BEAM-9329] Support request of schemas by version on KafkaIO + CSR

[lcwik] [BEAM-9252] Update to vendored gRPC without problematic

[github] Update

[github] Update

[github] Update

[lcwik] [BEAM-2822, BEAM-2939, BEAM-6189, BEAM-4374] Enable passing completed

[crites] Changes TestStreamTranscriptTest to only emit two elements so that its

[alex] [BEAM-7274] Add DynamicMessage Schema support

[github] [BEAM-9322] Fix tag output names within Dataflow to be consistent with

[iemejia] [BEAM-9342] Exclude module-info.class from vendored Byte Buddy 1.10.8

[iemejia] Add KafkaIO support for Confluent Schema Registry to the CHANGEs file

[github] [BEAM-9247] Integrate GCP Vision API functionality (#10959)

[github] Fix kotlin warnings (#10976)

[github] Update python sdk container version to beam-master-20200225 (#10965)

[github] [BEAM-9248] Integrate Google Cloud Natural Language functionality for

[iemejia] Refine access level for `sdks/java/extensions/protobuf`

[github] [BEAM-9355] Basic support for NewType (#10928)

[github] [BEAM-8979] reintroduce mypy-protobuf stub generation (#10734)

[github] [BEAM-8335] Background Caching job (#10899)

[github] [BEAM-8458] Add option to set temp dataset in BigQueryIO.Read (#9852)

[iemejia] Make logger naming consistent with Apache Beam LOG standard

[kcweaver] [BEAM-9300] convert struct literal in ZetaSQL

[github] fix breakage (#10934)

[github] Merge pull request #10901 from [BEAM-8965] Remove duplicate sideinputs

[pabloem] Fix formatting

[github] [BEAM-8618] Tear down unused DoFns periodically in Python SDK harness.

[alex] [BEAM-9394] DynamicMessage handling of empty map violates schema

[github] Merge pull request #10854: State timers documentation

[lcwik] [BEAM-5524] Fix minor issue in style guide.

[github] [BEAM-8201] Pass all other endpoints through provisioning service.

[suztomo] Linkage Checker 1.1.4

[robinyqiu] Bump Dataflow Java worker container version

[kcweaver] Test schema does not need to be nullable.

[github] [BEAM-9396] Match Docker image names in Jenkins jobs with those

[github] [BEAM-9392] Fix Multi TestStream assertion errors (#10982)


------------------------------------------
[...truncated 74.41 KB...]
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.9_job_server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.9_job_server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_workers=6
+ gcloud dataproc clusters create beam-loadtests-java-portable-flink-batch-9 --region=global --num-workers=6 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/beam_java_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.9_job_server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/e9b25126-0b16-3418-aef0-0fb28347ec5a].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
done.
Created [https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/global/clusters/beam-loadtests-java-portable-flink-batch-9] Cluster placed in zone [us-central1-a].
+ get_leader
+ local i=0
+ local application_ids
+ local application_masters
+ echo 'Yarn Applications'
Yarn Applications
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-9-m '--command=yarn application -list'
++ grep beam-loadtests-java-portable-flink-batch-9
Warning: Permanently added 'compute.4417651939462133680' (ECDSA) to the list of known hosts.
20/02/28 12:39:30 INFO client.RMProxy: Connecting to ResourceManager at beam-loadtests-java-portable-flink-batch-9-m/10.128.0.195:8032
+ read line
+ echo application_1582893511539_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal:39561
application_1582893511539_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal:39561
++ echo application_1582893511539_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal:39561
++ sed 's/ .*//'
+ application_ids[$i]=application_1582893511539_0001
++ echo application_1582893511539_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal:39561
++ sed 's/.*beam-loadtests-java-portable-flink-batch-9/beam-loadtests-java-portable-flink-batch-9/'
++ sed 's/ .*//'
+ application_masters[$i]=beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal:39561
+ i=1
+ read line
+ '[' 1 '!=' 1 ']'
+ YARN_APPLICATION_MASTER=beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal:39561
+ echo 'Using Yarn Application master: beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal:39561'
Using Yarn Application master: beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal:39561
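The sed pipeline above carves the application id and the application-master `host:port` out of one row of `yarn application -list` output. A rough Python equivalent, with field positions assumed from the sample line in this log:

```python
def parse_yarn_application_line(line):
    """Split one 'yarn application -list' row into
    (application_id, master_host_port).

    Assumes the first whitespace-separated field is the application id
    and the last field is the tracking URL; the master host:port is
    that URL without its http:// scheme.
    """
    fields = line.split()
    application_id = fields[0]
    tracking_url = fields[-1]
    master = tracking_url.replace("http://", "", 1)
    return application_id, master
```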
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.9_job_server:latest ]]
+ start_job_server
+ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-9-m '--command=sudo --user yarn docker run --detach --publish 8099:8099 --publish 8098:8098 --publish 8097:8097 --volume ~/.config/gcloud:/root/.config/gcloud gcr.io/apache-beam-testing/beam_portability/beam_flink1.9_job_server:latest --flink-master=beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal:39561 --artifacts-dir=gs://beam-flink-cluster/beam-loadtests-java-portable-flink-batch-9'
5e4161313bb76c5a3dd5dd297efe838acc63be99eec0534037d611402dbfe72e
+ start_tunnel
++ gcloud compute ssh --quiet --zone=us-central1-a yarn@beam-loadtests-java-portable-flink-batch-9-m '--command=curl -s "http://beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal:39561/jobmanager/config"'
+ local 'job_server_config=[{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1582893511539_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"FLINK_PLUGINS_DIR","value":"/usr/lib/flink/plugins"},{"key":"web.tmpdir","value":"/tmp/flink-web-d4fcbc31-cbe1-41b0-a7fd-83782187721b"},{"key":"jobmanager.rpc.port","value":"44311"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1582893511539_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local key=jobmanager.rpc.port
++ echo beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal:39561
++ cut -d : -f1
+ local yarn_application_master_host=beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal
++ python -c 'import sys, json; print([e['\''value'\''] for e in json.load(sys.stdin) if e['\''key'\''] == u'\''jobmanager.rpc.port'\''][0])'
++ echo '[{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1582893511539_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"FLINK_PLUGINS_DIR","value":"/usr/lib/flink/plugins"},{"key":"web.tmpdir","value":"/tmp/flink-web-d4fcbc31-cbe1-41b0-a7fd-83782187721b"},{"key":"jobmanager.rpc.port","value":"44311"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1582893511539_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local jobmanager_rpc_port=44311
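(The inline `python -c` one-liner above pulls `jobmanager.rpc.port` out of the Flink `/jobmanager/config` response, which is a JSON list of key/value pairs. For readability, a stdlib-only sketch of the same extraction; the sample JSON below is a trimmed, illustrative subset of the config echoed in the log, not the full payload.)

```python
import json

def lookup(config_json, key):
    """Return the value for `key` from a Flink /jobmanager/config
    response, which is a JSON array of {"key": ..., "value": ...} pairs."""
    return [e["value"] for e in json.loads(config_json) if e["key"] == key][0]

# Trimmed sample of the config echoed above (illustrative subset only).
sample = '[{"key":"web.port","value":"0"},{"key":"jobmanager.rpc.port","value":"44311"}]'
print(lookup(sample, "jobmanager.rpc.port"))  # prints 44311
```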
++ [[ true == \t\r\u\e ]]
++ echo ' -Nf >& /dev/null'
+ local 'detached_mode_params= -Nf >& /dev/null'
++ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.9_job_server:latest ]]
++ echo '-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'job_server_ports_forwarding=-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'tunnel_command=gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-9-m -- -L 8081:beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal:39561 -L 44311:beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal:44311 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080  -Nf >& /dev/null'
+ eval gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-9-m -- -L 8081:beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal:39561 -L 44311:beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal:44311 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf '>&' /dev/null
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-9-m -- -L 8081:beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal:39561 -L 44311:beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal:44311 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf
[beam_LoadTests_Java_Combine_Portable_Flink_Batch] $ /bin/bash -xe /tmp/jenkins2942965095152362924.sh
+ echo src Load test: 2GB of 10B records on Flink in Portable mode src
src Load test: 2GB of 10B records on Flink in Portable mode src
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Batch/ws/src/gradlew> -PloadTest.mainClass=org.apache.beam.sdk.loadtests.CombineLoadTest -Prunner=:runners:portability:java '-PloadTest.args=--project=apache-beam-testing --appName=load_tests_Java_Portable_Flink_batch_Combine_1 --tempLocation=gs://temp-storage-for-perf-tests/loadtests --publishToBigQuery=true --bigQueryDataset=load_test --bigQueryTable=java_portable_flink_batch_Combine_1 --sourceOptions={"numRecords":200000000,"keySizeBytes":1,"valueSizeBytes":9} --fanout=1 --iterations=1 --topCount=20 --sdkWorkerParallelism=5 --perKeyCombiner=TOP_LARGEST --streaming=false --jobEndpoint=localhost:8099 --defaultEnvironmentConfig=gcr.io/apache-beam-testing/beam_portability/beam_java_sdk:latest --defaultEnvironmentType=DOCKER --runner=PortableRunner' --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :sdks:java:testing:load-tests:run
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :model:fn-execution:extractProto UP-TO-DATE
> Task :model:job-management:extractProto UP-TO-DATE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :runners:local-java:processResources NO-SOURCE
> Task :runners:direct-java:processResources NO-SOURCE
> Task :model:job-management:processResources UP-TO-DATE
> Task :runners:portability:java:processResources NO-SOURCE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:extractProto UP-TO-DATE
> Task :model:fn-execution:processResources UP-TO-DATE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :sdks:java:io:kinesis:processResources NO-SOURCE
> Task :sdks:java:io:synthetic:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :sdks:java:testing:load-tests:processResources NO-SOURCE
> Task :sdks:java:core:generateGrammarSource UP-TO-DATE
> Task :sdks:java:core:processResources UP-TO-DATE
> Task :model:pipeline:extractIncludeProto UP-TO-DATE
> Task :model:pipeline:extractProto UP-TO-DATE
> Task :model:pipeline:generateProto UP-TO-DATE
> Task :model:pipeline:compileJava UP-TO-DATE
> Task :model:pipeline:processResources UP-TO-DATE
> Task :model:pipeline:classes UP-TO-DATE
> Task :model:pipeline:jar UP-TO-DATE
> Task :model:pipeline:shadowJar UP-TO-DATE
> Task :model:fn-execution:extractIncludeProto UP-TO-DATE
> Task :model:job-management:extractIncludeProto UP-TO-DATE
> Task :model:fn-execution:generateProto UP-TO-DATE
> Task :model:job-management:generateProto UP-TO-DATE
> Task :model:job-management:compileJava UP-TO-DATE
> Task :model:job-management:classes UP-TO-DATE
> Task :model:fn-execution:compileJava UP-TO-DATE
> Task :model:fn-execution:classes UP-TO-DATE
> Task :model:job-management:shadowJar UP-TO-DATE
> Task :model:fn-execution:shadowJar UP-TO-DATE
> Task :sdks:java:core:compileJava UP-TO-DATE
> Task :sdks:java:core:classes UP-TO-DATE
> Task :sdks:java:core:shadowJar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractIncludeProto UP-TO-DATE
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:fn-execution:compileJava UP-TO-DATE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :runners:core-construction-java:compileJava UP-TO-DATE
> Task :sdks:java:fn-execution:jar UP-TO-DATE
> Task :runners:local-java:compileJava FROM-CACHE
> Task :runners:local-java:classes UP-TO-DATE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava UP-TO-DATE
> Task :sdks:java:io:synthetic:compileJava FROM-CACHE
> Task :runners:core-construction-java:jar UP-TO-DATE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:synthetic:classes UP-TO-DATE
> Task :runners:local-java:jar
> Task :sdks:java:io:kafka:jar UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:compileJava UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:compileJava UP-TO-DATE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar UP-TO-DATE
> Task :sdks:java:io:synthetic:jar
> Task :sdks:java:io:kinesis:compileJava FROM-CACHE
> Task :runners:core-java:compileJava UP-TO-DATE
> Task :sdks:java:io:kinesis:classes UP-TO-DATE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar UP-TO-DATE
> Task :sdks:java:io:kinesis:jar
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar
> Task :sdks:java:harness:compileJava UP-TO-DATE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar UP-TO-DATE
> Task :sdks:java:harness:shadowJar UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:compileJava UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar UP-TO-DATE
> Task :runners:java-fn-execution:compileJava UP-TO-DATE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar UP-TO-DATE
> Task :runners:portability:java:compileJava FROM-CACHE
> Task :runners:portability:java:classes UP-TO-DATE
> Task :runners:portability:java:jar
> Task :runners:direct-java:compileJava FROM-CACHE
> Task :runners:direct-java:classes UP-TO-DATE
> Task :runners:direct-java:shadowJar
> Task :sdks:java:testing:load-tests:compileJava FROM-CACHE
> Task :sdks:java:testing:load-tests:classes UP-TO-DATE
> Task :sdks:java:testing:load-tests:jar

> Task :sdks:java:testing:load-tests:run
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
ERROR StatusLogger Log4j2 could not find a logging implementation. Please add log4j-core to the classpath. Using SimpleLogger to log to the console...
Exception in thread "main" java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: The Runner experienced the following error during execution:
java.lang.IllegalArgumentException: GreedyPipelineFuser requires all root nodes to be runner-implemented beam:transform:impulse:v1 or beam:transform:read:v1 primitives, but transform Read input executes in environment Optional[urn: "beam:env:docker:v1"
payload: "\n@gcr.io/apache-beam-testing/beam_portability/beam_java_sdk:latest"
]
	at org.apache.beam.runners.portability.JobServicePipelineResult.waitUntilFinish(JobServicePipelineResult.java:98)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:99)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.run(CombineLoadTest.java:66)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.main(CombineLoadTest.java:169)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: The Runner experienced the following error during execution:
java.lang.IllegalArgumentException: GreedyPipelineFuser requires all root nodes to be runner-implemented beam:transform:impulse:v1 or beam:transform:read:v1 primitives, but transform Read input executes in environment Optional[urn: "beam:env:docker:v1"
payload: "\n@gcr.io/apache-beam-testing/beam_portability/beam_java_sdk:latest"
]
	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1928)
	at org.apache.beam.runners.portability.JobServicePipelineResult.waitUntilFinish(JobServicePipelineResult.java:90)
	... 3 more
Caused by: java.lang.RuntimeException: The Runner experienced the following error during execution:
java.lang.IllegalArgumentException: GreedyPipelineFuser requires all root nodes to be runner-implemented beam:transform:impulse:v1 or beam:transform:read:v1 primitives, but transform Read input executes in environment Optional[urn: "beam:env:docker:v1"
payload: "\n@gcr.io/apache-beam-testing/beam_portability/beam_java_sdk:latest"
]
	at org.apache.beam.runners.portability.JobServicePipelineResult.propagateErrors(JobServicePipelineResult.java:165)
	at org.apache.beam.runners.portability.JobServicePipelineResult.waitUntilFinish(JobServicePipelineResult.java:110)
	at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1604)
	at java.util.concurrent.CompletableFuture$AsyncSupply.exec(CompletableFuture.java:1596)
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
	at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)
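(For context on the exception above: the fuser rejects any pipeline whose root transform runs inside an SDK container environment such as `beam:env:docker:v1`, because roots must be runner-implemented `impulse`/`read` primitives. Here the load test's `Read input` was submitted as an SDK-environment transform, so fusion failed before the job ran. A loose, illustrative sketch of that invariant follows — simplified tuples standing in for Beam's proto types, not Beam's actual implementation.)

```python
# Illustrative sketch (NOT Beam's real code) of the root-node invariant
# behind the IllegalArgumentException above: a root transform that carries
# an SDK environment is not runner-implemented and is rejected.
def check_roots(root_transforms):
    """root_transforms: list of (name, urn, environment) tuples, where
    environment is None for runner-implemented primitives and an
    environment URN (e.g. "beam:env:docker:v1") for SDK-side transforms."""
    for name, urn, env in root_transforms:
        if env is not None:
            raise ValueError(
                "GreedyPipelineFuser requires runner-implemented root "
                f"primitives, but transform {name} ({urn}) executes in "
                f"environment {env}")

# A runner-implemented Impulse root passes the check:
check_roots([("Impulse", "beam:transform:impulse:v1", None)])
```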

> Task :sdks:java:testing:load-tests:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 21s
61 actionable tasks: 8 executed, 7 from cache, 46 up-to-date

Publishing build scan...
https://gradle.com/s/cfmvfhrzd6ius

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_Combine_Portable_Flink_Batch #8

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Batch/8/display/redirect?page=changes>

Changes:

[chadrik] Add attributes defined in operations.pxd but missing in operations.py

[robertwb] Minor FnAPI proto cleanups.

[je.ik] [BEAM-9273] Explicitly disable @RequiresTimeSortedInput on unsupported

[je.ik] [BEAM-9273] code review - to be squashed

[kcweaver] [BEAM-9212] fix zetasql struct exception

[kcweaver] [BEAM-9211] Spark reuse Flink portable jar test script

[kcweaver] test_pipeline_jar Use single jar arg for both Flink and Spark.

[iemejia] Pin Avro dependency in Python SDK to be consistent with Avro versioning

[apilloud] [BEAM-9311] ZetaSQL Named Parameters are case-insensitive

[github] Bump dataflow container version (#10861)

[github] [BEAM-8335] Update StreamingCache with new Protos (#10856)

[github] [BEAM-9317] Fix portable test executions to specify the beam_fn_api

[je.ik] [BEAM-9265] @RequiresTimeSortedInput respects allowedLateness

[github] [BEAM-9289] Improve performance for metrics update of samza runner

[github] = instead of -eq

[iemejia] [BEAM-6857] Classify unbounded dynamic timers tests in the

[iemejia] Exclude Unbounded PCollection tests from Flink Portable runner batch

[github] [BEAM-9317] Fix Dataflow tests to not perform SplittableDoFn expansion

[iemejia] [BEAM-9315] Allow multiple paths via HADOOP_CONF_DIR in

[github] Update container images used by Dataflow runner with unreleased SDKs.

[github] [BEAM-9314] Make dot output deterministic (#10864)

[ccy] [BEAM-9277] Fix exception when running in IPython notebook.

[github] Remove experimental parallelization (-j 8) flags from sphinx

[iemejia] [BEAM-9301] Checkout the hash of master instead of the branch in beam

[github] [BEAM-8399] Add --hdfs_full_urls option (#10223)

[iemejia] Fix typo on runners/extensions-java label for github PR autolabeler

[github] Merge pull request #10862: [BEAM-9320] Add AlwaysFetched annotation


------------------------------------------
[...truncated 75.40 KB...]
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
.......................................................................................................................................................done.
Created [https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/global/clusters/beam-loadtests-java-portable-flink-batch-8] Cluster placed in zone [us-central1-a].
+ get_leader
+ local i=0
+ local application_ids
+ local application_masters
+ echo 'Yarn Applications'
Yarn Applications
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-8-m '--command=yarn application -list'
++ grep beam-loadtests-java-portable-flink-batch-8
Warning: Permanently added 'compute.8564509543116519414' (ECDSA) to the list of known hosts.
20/02/17 12:40:18 INFO client.RMProxy: Connecting to ResourceManager at beam-loadtests-java-portable-flink-batch-8-m/10.128.0.19:8032
+ read line
+ echo application_1581943162552_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal:35449
application_1581943162552_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal:35449
++ echo application_1581943162552_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal:35449
++ sed 's/ .*//'
+ application_ids[$i]=application_1581943162552_0001
++ echo application_1581943162552_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal:35449
++ sed 's/.*beam-loadtests-java-portable-flink-batch-8/beam-loadtests-java-portable-flink-batch-8/'
++ sed 's/ .*//'
+ application_masters[$i]=beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal:35449
+ i=1
+ read line
+ '[' 1 '!=' 1 ']'
+ YARN_APPLICATION_MASTER=beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal:35449
+ echo 'Using Yarn Application master: beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal:35449'
Using Yarn Application master: beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal:35449
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
+ start_job_server
+ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-8-m '--command=sudo --user yarn docker run --detach --publish 8099:8099 --publish 8098:8098 --publish 8097:8097 --volume ~/.config/gcloud:/root/.config/gcloud gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest --flink-master=beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal:35449 --artifacts-dir=gs://beam-flink-cluster/beam-loadtests-java-portable-flink-batch-8'
38504cbe34588d820e4f9ffd8fa64e1cccde09f57c964e1d68f700d308074f4e
+ start_tunnel
++ gcloud compute ssh --quiet --zone=us-central1-a yarn@beam-loadtests-java-portable-flink-batch-8-m '--command=curl -s "http://beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal:35449/jobmanager/config"'
+ local 'job_server_config=[{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1581943162552_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"FLINK_PLUGINS_DIR","value":"/usr/lib/flink/plugins"},{"key":"web.tmpdir","value":"/tmp/flink-web-be019c21-1ebe-4485-91f0-3282c77c44ab"},{"key":"jobmanager.rpc.port","value":"40523"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1581943162552_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local key=jobmanager.rpc.port
++ echo beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal:35449
++ cut -d : -f1
+ local yarn_application_master_host=beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal
++ python -c 'import sys, json; print([e['\''value'\''] for e in json.load(sys.stdin) if e['\''key'\''] == u'\''jobmanager.rpc.port'\''][0])'
++ echo '[{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1581943162552_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"FLINK_PLUGINS_DIR","value":"/usr/lib/flink/plugins"},{"key":"web.tmpdir","value":"/tmp/flink-web-be019c21-1ebe-4485-91f0-3282c77c44ab"},{"key":"jobmanager.rpc.port","value":"40523"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1581943162552_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local jobmanager_rpc_port=40523
++ [[ true == \t\r\u\e ]]
++ echo ' -Nf >& /dev/null'
+ local 'detached_mode_params= -Nf >& /dev/null'
++ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
++ echo '-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'job_server_ports_forwarding=-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'tunnel_command=gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-8-m -- -L 8081:beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal:35449 -L 40523:beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal:40523 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080  -Nf >& /dev/null'
+ eval gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-8-m -- -L 8081:beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal:35449 -L 40523:beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal:40523 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf '>&' /dev/null
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-8-m -- -L 8081:beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal:35449 -L 40523:beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal:40523 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf
[beam_LoadTests_Java_Combine_Portable_Flink_Batch] $ /bin/bash -xe /tmp/jenkins5886835649746084439.sh
+ echo src Load test: 2GB of 10B records on Flink in Portable mode src
src Load test: 2GB of 10B records on Flink in Portable mode src
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Batch/ws/src/gradlew> -PloadTest.mainClass=org.apache.beam.sdk.loadtests.CombineLoadTest -Prunner=:runners:portability:java '-PloadTest.args=--project=apache-beam-testing --appName=load_tests_Java_Portable_Flink_batch_Combine_1 --tempLocation=gs://temp-storage-for-perf-tests/loadtests --publishToBigQuery=true --bigQueryDataset=load_test --bigQueryTable=java_portable_flink_batch_Combine_1 --sourceOptions={"numRecords":200000000,"keySizeBytes":1,"valueSizeBytes":9} --fanout=1 --iterations=1 --topCount=20 --sdkWorkerParallelism=5 --perKeyCombiner=TOP_LARGEST --streaming=false --jobEndpoint=localhost:8099 --defaultEnvironmentConfig=gcr.io/apache-beam-testing/beam_portability/java_sdk:latest --defaultEnvironmentType=DOCKER --runner=PortableRunner' --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :sdks:java:testing:load-tests:run
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :model:job-management:extractProto UP-TO-DATE
> Task :model:fn-execution:extractProto UP-TO-DATE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :model:job-management:processResources UP-TO-DATE
> Task :runners:direct-java:processResources NO-SOURCE
> Task :runners:local-java:processResources NO-SOURCE
> Task :runners:portability:java:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:extractProto UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :model:fn-execution:processResources UP-TO-DATE
> Task :sdks:java:io:kinesis:processResources NO-SOURCE
> Task :sdks:java:io:synthetic:processResources NO-SOURCE
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :sdks:java:testing:load-tests:processResources NO-SOURCE
> Task :sdks:java:core:generateGrammarSource UP-TO-DATE
> Task :sdks:java:core:processResources UP-TO-DATE
> Task :model:pipeline:extractIncludeProto UP-TO-DATE
> Task :model:pipeline:extractProto UP-TO-DATE
> Task :model:pipeline:generateProto UP-TO-DATE
> Task :model:pipeline:compileJava UP-TO-DATE
> Task :model:pipeline:processResources UP-TO-DATE
> Task :model:pipeline:classes UP-TO-DATE
> Task :model:pipeline:jar UP-TO-DATE
> Task :model:pipeline:shadowJar UP-TO-DATE
> Task :model:job-management:extractIncludeProto UP-TO-DATE
> Task :model:fn-execution:extractIncludeProto UP-TO-DATE
> Task :model:job-management:generateProto UP-TO-DATE
> Task :model:fn-execution:generateProto UP-TO-DATE
> Task :model:job-management:compileJava UP-TO-DATE
> Task :model:job-management:classes UP-TO-DATE
> Task :model:fn-execution:compileJava UP-TO-DATE
> Task :model:fn-execution:classes UP-TO-DATE
> Task :model:job-management:shadowJar UP-TO-DATE
> Task :model:fn-execution:shadowJar UP-TO-DATE
> Task :sdks:java:core:compileJava UP-TO-DATE
> Task :sdks:java:core:classes UP-TO-DATE
> Task :sdks:java:core:shadowJar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractIncludeProto UP-TO-DATE
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:fn-execution:compileJava UP-TO-DATE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava UP-TO-DATE
> Task :runners:core-construction-java:compileJava UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :runners:local-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :runners:local-java:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:compileJava UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar UP-TO-DATE
> Task :runners:core-construction-java:jar UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:compileJava UP-TO-DATE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :runners:local-java:jar
> Task :sdks:java:extensions:protobuf:jar UP-TO-DATE
> Task :runners:core-java:compileJava UP-TO-DATE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar UP-TO-DATE
> Task :sdks:java:harness:compileJava UP-TO-DATE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar UP-TO-DATE
> Task :sdks:java:harness:shadowJar UP-TO-DATE
> Task :runners:java-fn-execution:compileJava UP-TO-DATE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar UP-TO-DATE
> Task :runners:portability:java:compileJava FROM-CACHE
> Task :runners:portability:java:classes UP-TO-DATE
> Task :runners:portability:java:jar
> Task :runners:direct-java:compileJava FROM-CACHE
> Task :runners:direct-java:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava UP-TO-DATE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar
> Task :sdks:java:io:google-cloud-platform:compileJava UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar UP-TO-DATE
> Task :runners:direct-java:shadowJar

> Task :sdks:java:io:synthetic:compileJava
Note: <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Batch/ws/src/sdks/java/io/synthetic/src/main/java/org/apache/beam/sdk/io/synthetic/SyntheticBoundedSource.java> uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.

> Task :sdks:java:io:synthetic:classes
> Task :sdks:java:io:synthetic:jar
> Task :sdks:java:io:kinesis:compileJava
> Task :sdks:java:io:kinesis:classes
> Task :sdks:java:io:kinesis:jar

> Task :sdks:java:testing:load-tests:compileJava
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:testing:load-tests:classes
> Task :sdks:java:testing:load-tests:jar

> Task :sdks:java:testing:load-tests:run
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
ERROR StatusLogger Log4j2 could not find a logging implementation. Please add log4j-core to the classpath. Using SimpleLogger to log to the console...
Exception in thread "main" java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: The Runner experienced the following error during execution:
java.lang.IllegalArgumentException: GreedyPipelineFuser requires all root nodes to be runner-implemented beam:transform:impulse:v1 or beam:transform:read:v1 primitives, but transform Read input executes in environment Optional[urn: "beam:env:docker:v1"
payload: "\n;gcr.io/apache-beam-testing/beam_portability/java_sdk:latest"
]
	at org.apache.beam.runners.portability.JobServicePipelineResult.waitUntilFinish(JobServicePipelineResult.java:98)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:99)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.run(CombineLoadTest.java:66)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.main(CombineLoadTest.java:169)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: The Runner experienced the following error during execution:
java.lang.IllegalArgumentException: GreedyPipelineFuser requires all root nodes to be runner-implemented beam:transform:impulse:v1 or beam:transform:read:v1 primitives, but transform Read input executes in environment Optional[urn: "beam:env:docker:v1"
payload: "\n;gcr.io/apache-beam-testing/beam_portability/java_sdk:latest"
]
	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1928)
	at org.apache.beam.runners.portability.JobServicePipelineResult.waitUntilFinish(JobServicePipelineResult.java:90)
	... 3 more
Caused by: java.lang.RuntimeException: The Runner experienced the following error during execution:
java.lang.IllegalArgumentException: GreedyPipelineFuser requires all root nodes to be runner-implemented beam:transform:impulse:v1 or beam:transform:read:v1 primitives, but transform Read input executes in environment Optional[urn: "beam:env:docker:v1"
payload: "\n;gcr.io/apache-beam-testing/beam_portability/java_sdk:latest"
]
	at org.apache.beam.runners.portability.JobServicePipelineResult.propagateErrors(JobServicePipelineResult.java:165)
	at org.apache.beam.runners.portability.JobServicePipelineResult.waitUntilFinish(JobServicePipelineResult.java:110)
	at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1604)
	at java.util.concurrent.CompletableFuture$AsyncSupply.exec(CompletableFuture.java:1596)
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
	at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)
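The IllegalArgumentException above is GreedyPipelineFuser's precondition check: every root transform of a portable pipeline must be a runner-implemented primitive (Impulse or the primitive Read), but here the synthetic "Read input" root is bound to a Docker SDK-harness environment, so fusion is rejected before the job runs. A toy sketch of that precondition (the Transform class and check_roots function are illustrative stand-ins, not Beam's actual pipeline-proto API):

```python
# Toy model of the root-node check that produced the error above.
# These names are illustrative; Beam's real check works on pipeline protos.
RUNNER_IMPLEMENTED_ROOT_URNS = {
    "beam:transform:impulse:v1",
    "beam:transform:read:v1",
}

class Transform:
    def __init__(self, name, urn, environment=None):
        self.name = name
        self.urn = urn
        self.environment = environment  # e.g. "beam:env:docker:v1", or None

def check_roots(roots):
    # A root is acceptable only if the runner itself implements it, i.e. it
    # has a known primitive URN and no SDK-harness environment attached.
    for t in roots:
        if t.urn not in RUNNER_IMPLEMENTED_ROOT_URNS or t.environment is not None:
            raise ValueError(
                "GreedyPipelineFuser requires all root nodes to be "
                "runner-implemented primitives, but transform %s executes "
                "in environment %s" % (t.name, t.environment))

# An Impulse root passes; a Read bound to a Docker SDK environment fails,
# which is exactly the shape of the failure in this build.
check_roots([Transform("Impulse", "beam:transform:impulse:v1")])
try:
    check_roots([Transform("Read input", "beam:transform:read:v1",
                           environment="beam:env:docker:v1")])
    rejected = False
except ValueError:
    rejected = True
```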

> Task :sdks:java:testing:load-tests:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 30s
61 actionable tasks: 11 executed, 4 from cache, 46 up-to-date

Publishing build scan...
https://gradle.com/s/wk2sl6swex6u6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Java_Combine_Portable_Flink_Batch #7

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Batch/7/display/redirect?page=changes>

Changes:

[dpcollins] Move external PubsubIO hooks outside of PubsubIO.

[github] [BEAM-9188] CassandraIO split performance improvement - cache size of

[robertwb] Only cache first page of paginated state.

[robertwb] Perform bundle-level caching if no cache token is given.

[robertwb] [BEAM-8298] Support side input cache tokens.

[radoslaws] spotless fixes

[robertwb] fix continuation token iter

[robertwb] lint for side input tokens

[github] Rename "word" to "line" for better readability

[github] Rename "words" to "line" also in docs

[radoslaws] comments and tests

[suztomo] bigtable-client-core 1.13.0 and exclusion and gax

[robinyqiu] Cleanup ZetaSQLQueryPlanner and ExpressionConverter code

[suztomo] Controlling grpc-grpclb and grpc-core

[robertwb] Fix state cache test.

[robertwb] TODO about two-level caching.

[robertwb] CachingStateHandler unit test.

[github] "Upgrade" google-cloud-spanner version to 1.13.0

[github] Removing none instead of bare return

[michal.walenia] [BEAM-9226] Set max age of 3h for Dataproc Flink clusters

[je.ik] [BEAM-8550] @RequiresTimeSortedInput: working with legacy flink and

[kamil.wasilewski] Generate 100kB records in GroupByKey Load test 3

[robertwb] [BEAM-9227] Defer bounded source size estimation to the workers.

[chadrik] [BEAM-8271] Properly encode/decode StateGetRequest/Response

[github] [BEAM-8042] [ZetaSQL] Fix aggregate column reference (#10649)

[robertwb] test lint

[robertwb] Fix extending non-list.

[robertwb] Fix some missing (but unused) output_processor constructor arguments.

[chadrik] [BEAM-7746] Avoid errors about Unsupported operand types for >= ("int"

[robertwb] Fix flink counters test.

[github] [BEAM-8590] Support unsubscripted native types (#10042)

[github] Revert "[BEAM-9226] Set max age of 3h for Dataproc Flink clusters"

[radoslaws] spottless

[mxm] [BEAM-9132] Avoid logging misleading error messages during pipeline

[github] [BEAM-8889] Cleanup Beam to GCS connector interfacing code so it uses

[heejong] [BEAM-7961] Add tests for all runner native transforms and some widely

[github] [BEAM-9233] Support -buildmode=pie -ldflags=-w with unregistered Go

[github] [BEAM-9167] Metrics extraction refactoring. (#10716)

[kenn] Clarify exceptions in SQL modules

[github] Update Beam Python container release

[github] No longer reporting Lulls as errors in the worker.

[iemejia] [BEAM-9236] Mark missing Schema based classes and methods as

[iemejia] [BEAM-9236] Remove unneeded schema related class FieldValueSetterFactory

[iemejia] [BEAM-9236] Remove unused schema related class FieldValueGetterFactory

[iemejia] [BEAM-6857] Recategorize UsesTimerMap tests to ValidatesRunner

[hsuryawirawan] Update Beam Katas Java to use Beam version 2.18.0

[kamil.wasilewski] Remove some tests in Python GBK on Flink suite

[hsuryawirawan] Update Beam Katas Python to use Beam version 2.18.0

[kamil.wasilewski] [BEAM-9234] Avoid using unreleased versions of PerfKitBenchmarker

[github] Adding new source tests for Py BQ source (#10732)

[suztomo] Introducing google-http-client.version

[github] [BEAM-8280][BEAM-8629] Make IOTypeHints immutable (#10735)

[heejong] [BEAM-9230] Enable CrossLanguageValidateRunner test for Spark runner

[suztomo] Property google-api-client

[ehudm] [BEAM-8095] Remove no_xdist for test

[zyichi] Remove managing late data not supported by python sdk note

[echauchot] Embed audio podcasts players to webpage instead of links that play the

[iemejia] [BEAM-9236] Mark missing Schema based classes and methods as

[yoshiki.obata] [BEAM-9163] update sphinx_rtd_theme to newest

[iemejia] [BEAM-7310] Add support of Confluent Schema Registry for KafkaIO

[altay] Add CHANGES.md file

[robinyqiu] Support all ZetaSQL TIMESTAMP functions

[github] [BEAM-4150] Remove fallback case for coder not specified within

[github] [BEAM-9009] Add pytest-timeout plugin, set timeout (#10437)

[github] [BEAM-3221] Expand/clarify timestamp comments within

[boyuanz] Add new release 2.19.0 to beam website.

[boyuanz] Update beam 2.19.0 release blog

[ehudm] Convert repo.spring.io to use https + 1 other

[ehudm] [BEAM-9251] Fix :sdks:java:io:kafka:updateOfflineRepository

[gleb] Fix AvroIO javadoc for deprecated methods

[github] [BEAM-5605] Migrate splittable DoFn methods to use "new" DoFn style

[github] [BEAM-6703] Make Dataflow ValidatesRunner test use Java 11 in test

[daniel.o.programmer] [BEAM-3301] Small cleanup to FullValue code.

[apilloud] [BEAM-8630] Add logical types, make public

[github] [BEAM-9037] Instant and duration as logical type (#10486)

[github] [BEAM-2645] Define the display data model type

[kamil.wasilewski] [BEAM-9175] Add yapf autoformatter

[kamil.wasilewski] [BEAM-9175] Yapf everywhere!

[kamil.wasilewski] [BEAM-9175] Fix pylint issues

[kamil.wasilewski] [BEAM-9175] Add pre-commit Jenkins job

[kamil.wasilewski] [BEAM-9175] Disable bad-continuation check in pylint

[amyrvold] [BEAM-9261] Add LICENSE and NOTICE to Docker images

[github] [BEAM-8951] Stop using nose in load tests (#10435)

[robertwb] [BEAM-7746] Cleanup historical DnFnRunner-as-Receiver cruft.

[robertwb] [BEAM-8976] Initalize logging configuration at a couple of other entry

[chadrik] [BEAM-7746] Add typing for try_split

[zyichi] Fix race exception in python worker status thread dump

[iemejia] [BEAM-9264] Upgrade Spark to version 2.4.5

[hsuryawirawan] Update Beam Katas Java to use Beam version 2.19.0

[hsuryawirawan] Update Beam Katas Python to use Beam version 2.19.0

[hsuryawirawan] Update Beam Katas Python on Stepik

[hsuryawirawan] Update Built-in IOs task type to theory

[hsuryawirawan] Update Beam Katas Java on Stepik

[kamil.wasilewski] Fix method name in Combine and coGBK tests

[github] [BEAM-3453] Use project specified in pipeline_options when creating

[robertwb] [BEAM-9266] Remove unused fields from provisioning API.

[github] [BEAM-9262] Clean-up endpoints.proto to a stable state (#10789)

[lcwik] [BEAM-3595] Migrate to "v1" URNs for standard window fns.

[daniel.o.programmer] [BEAM-3301] (Go SDK) Adding restriction plumbing to graph construction.

[robertwb] Remove one more reference to provision resources.

[github] Merge pull request #10766: [BEAM-4461] Add Selected.flattenedSchema

[robertwb] Reject unsupported WindowFns and Window types.

[github] Merge pull request #10804: [BEAM-2535] Fix timer map

[github] Merge pull request #10627:[BEAM-2535] Support outputTimestamp and

[iemejia] [BEAM-7092] Fix invalid import of Guava coming from transitive Spark dep

[alex] [BEAM-9241] Fix inconsistent proto nullability

[kamil.wasilewski] Move imports and variables out of global namespace

[iemejia] [BEAM-9281] Update commons-csv to version 1.8

[iemejia] [website] Update Java 11 and Spark roadmap

[apilloud] [BEAM-8630] Validate prepared expression on expand

[github] [BEAM-9268] SpannerIO: Add more documentation and warnings for unknown

[iemejia] [BEAM-9231] Add Experimental(Kind.PORTABILITY) and tag related classes

[iemejia] [BEAM-9231] Tag SplittableDoFn related classes/methods as Experimental

[iemejia] [BEAM-9231] Make Experimental annotations homogeneous in

[iemejia] [BEAM-9231] Untag Experimental/Internal classes not needed to write

[iemejia] [BEAM-9231] Tag beam-sdks-java-core internal classes as Internal

[iemejia] [BEAM-9231] Tag DoFn.OnTimerContext as Experimental(Kind.TIMERS)

[iemejia] [BEAM-9231] Tag Experimental/Internal packages in beam-sdks-java-core

[iemejia] [BEAM-9231] Tag Experimental/Internal packages in IOs and extensions

[iemejia] [BEAM-9231] Tag public but internal IOs and extensions classes as

[yoshiki.obata] [BEAM-7198] rename ToStringCoder to ToBytesCoder for proper

[iemejia] [BEAM-9160] Update AWS SDK to support Pod Level Identity

[yoshiki.obata] [BEAM-7198] add comment

[ankurgoenka] [BEAM-9290] Support runner_harness_container_image in released python

[boyuanz] Move ThreadsafeRestrictionTracker and RestrictionTrackerView out from

[github] Remove tables and refer to dependency locations in code (#10745)

[ehudm] fix lint

[valentyn] Cleanup MappingProxy reducer since dill supports it natively now.

[suztomo] beam-linkage-check.sh

[iemejia] Enable probot autolabeler action to label github pull requests

[iemejia] Remove prefixes in autolabeler configuration to improve readability

[iemejia] [BEAM-9160] Removed WebIdentityTokenCredentialsProvider explicit json

[suztomo] copyright

[yoshiki.obata] [BEAM-7198] fixup: reformatted with yapf

[github] [BEAM-3221] Clarify documentation for StandardTransforms.Primitives,

[aromanenko.dev] [BEAM-9292] Provide an ability to specify additional maven repositories

[aromanenko.dev] [BEAM-9292] KafkaIO: add io.confluent repository to published POM

[github] [BEAM-8201] Add other endpoint fields to provision API. (#10839)

[github] [BEAM-9269] Add commit deadline for Spanner writes. (#10752)

[github] [AVRO-2737] Exclude a buggy avro version from requirements spec.

[iemejia] Refine labels/categories for PR autolabeling

[github] Update roadmap page for python 3 support

[iemejia] [BEAM-9160] Removed WebIdentityTokenCredentialsProvider explicit json

[iemejia] Remove unused ReduceFnRunnerHelper class

[iemejia] Do not set options.filesToStage in case of spark local execution in

[iemejia] Do not set options.filesToStage in case of spark local execution in

[github] [BEAM-6522] [BEAM-7455] Unskip Avro IO tests that are now passing.

[github] [BEAM-5605] Convert all BoundedSources to SplittableDoFns when using

[github] [BEAM-8758] Google-cloud-spanner upgrade to 1.49.1 (#10765)

[github] Ensuring appropriate write_disposition and create_disposition for jobs

[github] [BEAM-3545] Return metrics as MonitoringInfos (#10777)

[github] Modify the TestStreamFileRecord to use TestStreamPayload events.

[iemejia] [BEAM-9280] Update commons-compress to version 1.20


------------------------------------------
[...truncated 74.14 KB...]
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_workers=6
+ gcloud dataproc clusters create beam-loadtests-java-portable-flink-batch-7 --region=global --num-workers=6 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/java_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/a36ba782-85d9-3580-b69e-dd4915b47c83].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
..................................................................................................................................................done.
Created [https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/global/clusters/beam-loadtests-java-portable-flink-batch-7] Cluster placed in zone [us-central1-a].
+ get_leader
+ local i=0
+ local application_ids
+ local application_masters
+ echo 'Yarn Applications'
Yarn Applications
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-7-m '--command=yarn application -list'
++ grep beam-loadtests-java-portable-flink-batch-7
Warning: Permanently added 'compute.6917959991117425346' (ECDSA) to the list of known hosts.
20/02/14 12:38:26 INFO client.RMProxy: Connecting to ResourceManager at beam-loadtests-java-portable-flink-batch-7-m/10.128.0.38:8032
+ read line
+ echo application_1581683849865_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal:40161
application_1581683849865_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal:40161
++ echo application_1581683849865_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal:40161
++ sed 's/ .*//'
+ application_ids[$i]=application_1581683849865_0001
++ echo application_1581683849865_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal:40161
++ sed 's/.*beam-loadtests-java-portable-flink-batch-7/beam-loadtests-java-portable-flink-batch-7/'
++ sed 's/ .*//'
+ application_masters[$i]=beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal:40161
+ i=1
+ read line
+ '[' 1 '!=' 1 ']'
+ YARN_APPLICATION_MASTER=beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal:40161
+ echo 'Using Yarn Application master: beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal:40161'
Using Yarn Application master: beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal:40161
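The sed pipeline above pulls two fields out of each `yarn application -list` line: the application id (first whitespace-delimited column) and the application master host:port (the authority of the tracking URL). A hypothetical Python equivalent of that extraction (parse_yarn_application_line is an illustrative helper, not part of the Jenkins job):

```python
import re

def parse_yarn_application_line(line, cluster_prefix):
    # Mirrors the two sed expressions in the trace above:
    #   's/ .*//'   -> take the first column (the application id)
    #   's/.*<cluster-prefix>/<cluster-prefix>/' -> keep the tracking URL's
    #                                               host:port, minus the scheme
    fields = line.split()
    application_id = fields[0]          # e.g. application_1581683849865_0001
    tracking_url = fields[-1]           # e.g. http://<host>:40161
    master = re.sub(r"^https?://", "", tracking_url).rstrip("/")
    assert master.startswith(cluster_prefix)
    return application_id, master

line = ("application_1581683849865_0001 flink-dataproc Apache Flink yarn default "
        "RUNNING UNDEFINED 100% "
        "http://beam-loadtests-java-portable-flink-batch-7-w-3"
        ".c.apache-beam-testing.internal:40161")
app_id, master = parse_yarn_application_line(
    line, "beam-loadtests-java-portable-flink-batch-7")
```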
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
+ start_job_server
+ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-7-m '--command=sudo --user yarn docker run --detach --publish 8099:8099 --publish 8098:8098 --publish 8097:8097 --volume ~/.config/gcloud:/root/.config/gcloud gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest --flink-master=beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal:40161 --artifacts-dir=gs://beam-flink-cluster/beam-loadtests-java-portable-flink-batch-7'
a99791781c123f8b0716bf373e7df3c608ede5fc98f2b98b7246a7c432b71ee7
+ start_tunnel
++ gcloud compute ssh --quiet --zone=us-central1-a yarn@beam-loadtests-java-portable-flink-batch-7-m '--command=curl -s "http://beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal:40161/jobmanager/config"'
+ local 'job_server_config=[{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1581683849865_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"FLINK_PLUGINS_DIR","value":"/usr/lib/flink/plugins"},{"key":"web.tmpdir","value":"/tmp/flink-web-901b7595-b07c-4fa8-917c-2a8f3e5c5f4e"},{"key":"jobmanager.rpc.port","value":"41357"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1581683849865_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local key=jobmanager.rpc.port
++ echo beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal:40161
++ cut -d : -f1
+ local yarn_application_master_host=beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal
++ python -c 'import sys, json; print([e['\''value'\''] for e in json.load(sys.stdin) if e['\''key'\''] == u'\''jobmanager.rpc.port'\''][0])'
++ echo '[{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1581683849865_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"FLINK_PLUGINS_DIR","value":"/usr/lib/flink/plugins"},{"key":"web.tmpdir","value":"/tmp/flink-web-901b7595-b07c-4fa8-917c-2a8f3e5c5f4e"},{"key":"jobmanager.rpc.port","value":"41357"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1581683849865_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local jobmanager_rpc_port=41357
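The inline `python -c` above parses the Flink REST /jobmanager/config response, which is a JSON list of {"key", "value"} pairs, and picks out jobmanager.rpc.port for the SSH tunnel. The same extraction, expanded into a named function for readability (a trimmed sample of the config from this build is used as input):

```python
import json

# Trimmed sample of the /jobmanager/config payload captured above.
config_json = """[
  {"key": "jobmanager.rpc.address",
   "value": "beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal"},
  {"key": "jobmanager.rpc.port", "value": "41357"},
  {"key": "taskmanager.numberOfTaskSlots", "value": "1"}
]"""

def lookup(config, key):
    # Same list comprehension as the one-liner in the log, just named.
    return [e["value"] for e in json.loads(config) if e["key"] == key][0]

jobmanager_rpc_port = lookup(config_json, "jobmanager.rpc.port")
```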
++ [[ true == \t\r\u\e ]]
++ echo ' -Nf >& /dev/null'
+ local 'detached_mode_params= -Nf >& /dev/null'
++ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
++ echo '-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'job_server_ports_forwarding=-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'tunnel_command=gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-7-m -- -L 8081:beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal:40161 -L 41357:beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal:41357 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080  -Nf >& /dev/null'
+ eval gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-7-m -- -L 8081:beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal:40161 -L 41357:beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal:41357 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf '>&' /dev/null
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-7-m -- -L 8081:beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal:40161 -L 41357:beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal:41357 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf
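The tunnel_command assembled above forwards local port 8081 to the Flink web UI on the YARN application master, forwards the JobManager RPC port to itself, adds the job server's three ports (8099/8098/8097) when a job server image is configured, and opens a SOCKS proxy on 1080. A sketch of that assembly with the same inputs (build_tunnel_command is an illustrative helper mirroring the shell script's variables, not part of the Jenkins job):

```python
def build_tunnel_command(master, jobmanager_rpc_port, with_job_server, detached):
    # `master` is the YARN application master host:port from the trace above.
    host = "yarn@beam-loadtests-java-portable-flink-batch-7-m"
    parts = [
        "gcloud compute ssh --zone=us-central1-a --quiet", host, "--",
        "-L 8081:%s" % master,                              # Flink web UI
        "-L %s:%s:%s" % (jobmanager_rpc_port,               # JobManager RPC
                         master.split(":")[0], jobmanager_rpc_port),
    ]
    if with_job_server:
        # Job service, artifact staging, and expansion service ports.
        parts.append("-L 8099:localhost:8099 -L 8098:localhost:8098 "
                     "-L 8097:localhost:8097")
    parts.append("-D 1080")                                 # SOCKS proxy
    if detached:
        parts.append("-Nf >& /dev/null")                    # background, no shell
    return " ".join(parts)

cmd = build_tunnel_command(
    "beam-loadtests-java-portable-flink-batch-7-w-3"
    ".c.apache-beam-testing.internal:40161",
    "41357", with_job_server=True, detached=True)
```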
[beam_LoadTests_Java_Combine_Portable_Flink_Batch] $ /bin/bash -xe /tmp/jenkins6759950651280737447.sh
+ echo src Load test: 2GB of 10B records on Flink in Portable mode src
src Load test: 2GB of 10B records on Flink in Portable mode src
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Batch/ws/src/gradlew> -PloadTest.mainClass=org.apache.beam.sdk.loadtests.CombineLoadTest -Prunner=:runners:portability:java '-PloadTest.args=--project=apache-beam-testing --appName=load_tests_Java_Portable_Flink_batch_Combine_1 --tempLocation=gs://temp-storage-for-perf-tests/loadtests --publishToBigQuery=true --bigQueryDataset=load_test --bigQueryTable=java_portable_flink_batch_Combine_1 --sourceOptions={"numRecords":200000000,"keySizeBytes":1,"valueSizeBytes":9} --fanout=1 --iterations=1 --topCount=20 --sdkWorkerParallelism=5 --perKeyCombiner=TOP_LARGEST --streaming=false --jobEndpoint=localhost:8099 --defaultEnvironmentConfig=gcr.io/apache-beam-testing/beam_portability/java_sdk:latest --defaultEnvironmentType=DOCKER --runner=PortableRunner' --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :sdks:java:testing:load-tests:run
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :model:fn-execution:extractProto UP-TO-DATE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :model:job-management:extractProto UP-TO-DATE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :runners:local-java:processResources NO-SOURCE
> Task :runners:direct-java:processResources NO-SOURCE
> Task :runners:portability:java:processResources NO-SOURCE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :sdks:java:io:kinesis:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:extractProto UP-TO-DATE
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :sdks:java:testing:load-tests:processResources NO-SOURCE
> Task :sdks:java:io:synthetic:processResources NO-SOURCE
> Task :model:job-management:processResources UP-TO-DATE
> Task :model:fn-execution:processResources UP-TO-DATE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :sdks:java:core:generateGrammarSource UP-TO-DATE
> Task :sdks:java:core:processResources UP-TO-DATE
> Task :model:pipeline:extractIncludeProto UP-TO-DATE
> Task :model:pipeline:extractProto UP-TO-DATE
> Task :model:pipeline:generateProto UP-TO-DATE
> Task :model:pipeline:compileJava UP-TO-DATE
> Task :model:pipeline:processResources UP-TO-DATE
> Task :model:pipeline:classes UP-TO-DATE
> Task :model:pipeline:jar UP-TO-DATE
> Task :model:pipeline:shadowJar UP-TO-DATE
> Task :model:fn-execution:extractIncludeProto UP-TO-DATE
> Task :model:job-management:extractIncludeProto UP-TO-DATE
> Task :model:fn-execution:generateProto UP-TO-DATE
> Task :model:job-management:generateProto UP-TO-DATE
> Task :model:job-management:compileJava UP-TO-DATE
> Task :model:job-management:classes UP-TO-DATE
> Task :model:fn-execution:compileJava UP-TO-DATE
> Task :model:fn-execution:classes UP-TO-DATE
> Task :model:job-management:shadowJar UP-TO-DATE
> Task :model:fn-execution:shadowJar UP-TO-DATE
> Task :sdks:java:core:compileJava UP-TO-DATE
> Task :sdks:java:core:classes UP-TO-DATE
> Task :sdks:java:core:shadowJar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractIncludeProto UP-TO-DATE
> Task :runners:local-java:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:compileJava UP-TO-DATE
> Task :runners:local-java:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:io:synthetic:compileJava FROM-CACHE
> Task :sdks:java:io:synthetic:classes UP-TO-DATE
> Task :runners:local-java:jar
> Task :sdks:java:fn-execution:compileJava UP-TO-DATE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar UP-TO-DATE
> Task :sdks:java:fn-execution:jar UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:compileJava UP-TO-DATE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:io:kinesis:compileJava FROM-CACHE
> Task :sdks:java:io:kinesis:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar UP-TO-DATE
> Task :sdks:java:io:synthetic:jar
> Task :sdks:java:io:kafka:compileJava UP-TO-DATE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar UP-TO-DATE
> Task :sdks:java:io:kinesis:jar
> Task :runners:core-construction-java:compileJava UP-TO-DATE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :runners:core-construction-java:jar UP-TO-DATE
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar
> Task :runners:core-java:compileJava UP-TO-DATE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar UP-TO-DATE
> Task :sdks:java:harness:compileJava UP-TO-DATE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar UP-TO-DATE
> Task :sdks:java:harness:shadowJar UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:compileJava UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar UP-TO-DATE
> Task :runners:java-fn-execution:compileJava UP-TO-DATE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar UP-TO-DATE
> Task :runners:portability:java:compileJava FROM-CACHE
> Task :runners:portability:java:classes UP-TO-DATE
> Task :runners:portability:java:jar
> Task :runners:direct-java:compileJava FROM-CACHE
> Task :runners:direct-java:classes UP-TO-DATE
> Task :runners:direct-java:shadowJar
> Task :sdks:java:testing:load-tests:compileJava FROM-CACHE
> Task :sdks:java:testing:load-tests:classes UP-TO-DATE
> Task :sdks:java:testing:load-tests:jar

> Task :sdks:java:testing:load-tests:run
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
ERROR StatusLogger Log4j2 could not find a logging implementation. Please add log4j-core to the classpath. Using SimpleLogger to log to the console...
Exception in thread "main" java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: The Runner experienced the following error during execution:
java.lang.IllegalArgumentException: GreedyPipelineFuser requires all root nodes to be runner-implemented beam:transform:impulse:v1 or beam:transform:read:v1 primitives, but transform Read input executes in environment Optional[urn: "beam:env:docker:v1"
payload: "\n;gcr.io/apache-beam-testing/beam_portability/java_sdk:latest"
]
	at org.apache.beam.runners.portability.JobServicePipelineResult.waitUntilFinish(JobServicePipelineResult.java:98)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:99)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.run(CombineLoadTest.java:66)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.main(CombineLoadTest.java:169)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: The Runner experienced the following error during execution:
java.lang.IllegalArgumentException: GreedyPipelineFuser requires all root nodes to be runner-implemented beam:transform:impulse:v1 or beam:transform:read:v1 primitives, but transform Read input executes in environment Optional[urn: "beam:env:docker:v1"
payload: "\n;gcr.io/apache-beam-testing/beam_portability/java_sdk:latest"
]
	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1928)
	at org.apache.beam.runners.portability.JobServicePipelineResult.waitUntilFinish(JobServicePipelineResult.java:90)
	... 3 more
Caused by: java.lang.RuntimeException: The Runner experienced the following error during execution:
java.lang.IllegalArgumentException: GreedyPipelineFuser requires all root nodes to be runner-implemented beam:transform:impulse:v1 or beam:transform:read:v1 primitives, but transform Read input executes in environment Optional[urn: "beam:env:docker:v1"
payload: "\n;gcr.io/apache-beam-testing/beam_portability/java_sdk:latest"
]
	at org.apache.beam.runners.portability.JobServicePipelineResult.propagateErrors(JobServicePipelineResult.java:165)
	at org.apache.beam.runners.portability.JobServicePipelineResult.waitUntilFinish(JobServicePipelineResult.java:110)
	at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1604)
	at java.util.concurrent.CompletableFuture$AsyncSupply.exec(CompletableFuture.java:1596)
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
	at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 23s
61 actionable tasks: 8 executed, 7 from cache, 46 up-to-date

Publishing build scan...
https://gradle.com/s/637yskmzdqivk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
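For context on the root cause above: the portable runner's GreedyPipelineFuser only accepts pipelines whose root transforms are runner-implemented primitives (`beam:transform:impulse:v1` or `beam:transform:read:v1` with no SDK environment); here the `Read input` root was bound to a Docker SDK-harness environment, so fusion was rejected before the job ran. The precondition can be sketched as follows. This is an illustrative simplification, not Beam's actual implementation; the function name and the `(urn, environment_id)` data shape are hypothetical.

```python
# Hypothetical sketch of the root-node check GreedyPipelineFuser enforces.
# Real Beam models transforms as protobufs; here a root is just a
# (urn, environment_id) pair, with "" meaning runner-implemented.

RUNNER_IMPLEMENTED_ROOT_URNS = {
    "beam:transform:impulse:v1",
    "beam:transform:read:v1",
}

def check_fusable_roots(root_transforms):
    """root_transforms: dict of transform name -> (urn, environment_id).

    Raises ValueError for any root that is not a runner-implemented
    primitive, mirroring the IllegalArgumentException in the log above.
    """
    for name, (urn, environment_id) in root_transforms.items():
        if urn not in RUNNER_IMPLEMENTED_ROOT_URNS or environment_id:
            raise ValueError(
                "GreedyPipelineFuser requires all root nodes to be "
                "runner-implemented beam:transform:impulse:v1 or "
                "beam:transform:read:v1 primitives, but transform "
                f"{name} executes in environment {environment_id!r}"
            )

# A root Impulse with no SDK environment is accepted:
check_fusable_roots({"Impulse": ("beam:transform:impulse:v1", "")})

# A root Read bound to a Docker SDK environment is rejected, as in the log:
try:
    check_fusable_roots(
        {"Read input": ("beam:transform:read:v1", "beam:env:docker:v1")}
    )
except ValueError as e:
    print(e)
```

In practice this error usually means the pipeline translation left a classic Read-based source executing in the SDK harness instead of converting it to a runner-implemented primitive, so the fix belongs in the runner/translation configuration rather than in the load test itself.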

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


beam_LoadTests_Java_Combine_Portable_Flink_Batch - Build # 6 - Aborted

Posted by Apache Jenkins Server <je...@builds.apache.org>.
The Apache Jenkins build system has built beam_LoadTests_Java_Combine_Portable_Flink_Batch (build #6)

Status: Aborted

Check console output at https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Batch/6/ to view the results.