Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/06/22 13:00:16 UTC

Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #10

See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/10/display/redirect?page=changes>

Changes:

[github] Add a timeout to urlopen calls

[dcavazos] Add Python snippet for FlatMap transform

[aaltay] [Beam-6696] GroupIntoBatches transform for Python SDK (#8914)

------------------------------------------
[...truncated 58.06 KB...]
> Task :model:fn-execution:classes
> Task :model:pipeline:shadowJar
> Task :model:job-management:shadowJar
> Task :model:fn-execution:shadowJar
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar
> Task :vendor:sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :runners:core-construction-java:jar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :vendor:sdks-java-extensions-protobuf:shadowJar
> Task :sdks:java:harness:shadowJar
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :runners:flink:1.7:compileJava FROM-CACHE
> Task :runners:flink:1.7:classes
> Task :runners:flink:1.7:jar
> Task :runners:flink:1.7:job-server:compileJava NO-SOURCE
> Task :runners:flink:1.7:job-server:classes UP-TO-DATE
> Task :runners:flink:1.7:job-server:shadowJar
> Task :runners:flink:1.7:job-server-container:copyDockerfileDependencies
> Task :runners:flink:1.7:job-server-container:dockerPrepare
> Task :runners:flink:1.7:job-server-container:docker

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD SUCCESSFUL in 1m 4s
51 actionable tasks: 35 executed, 15 from cache, 1 up-to-date

Publishing build scan...
https://gradle.com/s/ogvpf5szmhug4

[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins1733236549533654445.sh
+ echo 'Tagging Flink Job Server'\''s image...'
Tagging Flink Job Server's image...
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins5592084994158686576.sh
+ docker tag gcr.io/apache-beam-testing/beam_portability/flink-job-server gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins3520524141951902382.sh
+ echo 'Pushing Flink Job Server'\''s image...'
Pushing Flink Job Server's image...
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins4281532832493633141.sh
+ docker push gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest
The push refers to repository [gcr.io/apache-beam-testing/beam_portability/flink-job-server]
2e9b4b1d9fcd: Preparing
b662b8b01ed4: Preparing
358f6350b03d: Preparing
f7d12d471667: Preparing
f350d0146bb3: Preparing
e38df31d449c: Preparing
af5ae4841776: Preparing
b17cc31e431b: Preparing
12cb127eee44: Preparing
604829a174eb: Preparing
fbb641a8b943: Preparing
e38df31d449c: Waiting
af5ae4841776: Waiting
b17cc31e431b: Waiting
12cb127eee44: Waiting
604829a174eb: Waiting
fbb641a8b943: Waiting
f7d12d471667: Layer already exists
f350d0146bb3: Layer already exists
e38df31d449c: Layer already exists
af5ae4841776: Layer already exists
b17cc31e431b: Layer already exists
12cb127eee44: Layer already exists
604829a174eb: Layer already exists
fbb641a8b943: Layer already exists
2e9b4b1d9fcd: Pushed
358f6350b03d: Pushed
b662b8b01ed4: Pushed
latest: digest: sha256:1f87e52dcd729c8e174516c97bf3fbd581e6bd6d9a26dcbb7109a5a8a46e994e size: 2632
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content 
JOB_SERVER_IMAGE=gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest
CLUSTER_NAME=beam-loadtests-python-gbk-flink-batch-10
DETACHED_MODE=true
HARNESS_IMAGES_TO_PULL=gcr.io/apache-beam-testing/beam_portability/python:latest
FLINK_NUM_WORKERS=5
FLINK_DOWNLOAD_URL=https://archive.apache.org/dist/flink/flink-1.7.0/flink-1.7.0-bin-hadoop28-scala_2.11.tgz
GCS_BUCKET=gs://beam-flink-cluster
FLINK_TASKMANAGER_SLOTS=1
ARTIFACTS_DIR=gs://beam-flink-cluster/beam-loadtests-python-gbk-flink-batch-10
GCLOUD_ZONE=us-central1-a

[EnvInject] - Variables injected successfully.
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins8493778507221580143.sh
+ echo Setting up flink cluster
Setting up flink cluster
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins4122524772674242995.sh
+ cd <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/.test-infra/dataproc/>
+ ./create_flink_cluster.sh
+ GCLOUD_ZONE=us-central1-a
+ DATAPROC_VERSION=1.2
+ MASTER_NAME=beam-loadtests-python-gbk-flink-batch-10-m
+ INIT_ACTIONS_FOLDER_NAME=init-actions
+ FLINK_INIT=gs://beam-flink-cluster/init-actions/flink.sh
+ BEAM_INIT=gs://beam-flink-cluster/init-actions/beam.sh
+ DOCKER_INIT=gs://beam-flink-cluster/init-actions/docker.sh
+ FLINK_LOCAL_PORT=8081
+ FLINK_TASKMANAGER_SLOTS=1
+ TASK_MANAGER_MEM=10240
+ YARN_APPLICATION_MASTER=
+ main
+ upload_init_actions
+ echo 'Uploading initialization actions to GCS bucket: gs://beam-flink-cluster'
Uploading initialization actions to GCS bucket: gs://beam-flink-cluster
+ gsutil cp -r init-actions/beam.sh init-actions/docker.sh init-actions/flink.sh gs://beam-flink-cluster/init-actions
Copying file://init-actions/beam.sh [Content-Type=text/x-sh]...
Copying file://init-actions/docker.sh [Content-Type=text/x-sh]...
Copying file://init-actions/flink.sh [Content-Type=text/x-sh]...
Operation completed over 3 objects/13.1 KiB.
+ create_cluster
+ local metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.7.0/flink-1.7.0-bin-hadoop28-scala_2.11.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1
+ [[ -n gcr.io/apache-beam-testing/beam_portability/python:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/python:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_workers=6
+ gcloud dataproc clusters create beam-loadtests-python-gbk-flink-batch-10 --num-workers=6 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.7.0/flink-1.7.0-bin-hadoop28-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/python:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/024eba2a-1677-3060-9e39-43e87e41a870].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
...done.
Created [https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/global/clusters/beam-loadtests-python-gbk-flink-batch-10] Cluster placed in zone [us-central1-a].
+ get_leader
+ local i=0
+ local application_ids
+ local application_masters
+ echo 'Yarn Applications'
Yarn Applications
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-gbk-flink-batch-10-m '--command=yarn application -list'
++ grep beam-loadtests-python-gbk-flink-batch-10
Warning: Permanently added 'compute.6615369701094824912' (ECDSA) to the list of known hosts.
19/06/22 13:00:06 INFO client.RMProxy: Connecting to ResourceManager at beam-loadtests-python-gbk-flink-batch-10-m/10.128.0.54:8032
+ read line
+ echo application_1561208239147_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-python-gbk-flink-batch-10-w-4.c.apache-beam-testing.internal:36501
application_1561208239147_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-python-gbk-flink-batch-10-w-4.c.apache-beam-testing.internal:36501
++ sed 's/ .*//'
++ echo application_1561208239147_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-python-gbk-flink-batch-10-w-4.c.apache-beam-testing.internal:36501
+ application_ids[$i]=application_1561208239147_0001
++ echo application_1561208239147_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-python-gbk-flink-batch-10-w-4.c.apache-beam-testing.internal:36501
++ sed 's/.*beam-loadtests-python-gbk-flink-batch-10/beam-loadtests-python-gbk-flink-batch-10/'
++ sed 's/ .*//'
+ application_masters[$i]=beam-loadtests-python-gbk-flink-batch-10-w-4.c.apache-beam-testing.internal:36501
+ i=1
+ read line
+ '[' 1 '!=' 1 ']'
+ YARN_APPLICATION_MASTER=beam-loadtests-python-gbk-flink-batch-10-w-4.c.apache-beam-testing.internal:36501
+ echo 'Using Yarn Application master: beam-loadtests-python-gbk-flink-batch-10-w-4.c.apache-beam-testing.internal:36501'
Using Yarn Application master: beam-loadtests-python-gbk-flink-batch-10-w-4.c.apache-beam-testing.internal:36501
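
The get_leader step above locates the Flink session that the init action started on YARN: it runs `yarn application -list` on the cluster master and slices the matching line to recover the application id and the application master's host:port. A condensed sketch of that parsing (hypothetical variable handling; the real script drives the same output through a read loop and sed, as the trace shows):

    # Sketch only: reproduces the parsing visible in the trace, not the script's exact code.
    line=$(gcloud compute ssh --zone=us-central1-a --quiet \
        yarn@beam-loadtests-python-gbk-flink-batch-10-m \
        '--command=yarn application -list' | grep beam-loadtests-python-gbk-flink-batch-10)
    application_id=$(echo "${line}" | sed 's/ .*//')             # -> application_1561208239147_0001
    application_master=$(echo "${line}" | sed 's|.*http://||')   # -> ...-w-4.c.apache-beam-testing.internal:36501
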
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest ]]
+ start_job_server
+ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-gbk-flink-batch-10-m '--command=sudo --user yarn docker run --detach --publish 8099:8099 --publish 8098:8098 --publish 8097:8097 --volume ~/.config/gcloud:/root/.config/gcloud gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest --flink-master-url=beam-loadtests-python-gbk-flink-batch-10-w-4.c.apache-beam-testing.internal:36501 --artifacts-dir=gs://beam-flink-cluster/beam-loadtests-python-gbk-flink-batch-10'
28fc8802c035e7310dfa40fe282b1746288a9db72c55d833b96cf58dbced963a
+ start_tunnel
++ gcloud compute ssh --quiet --zone=us-central1-a yarn@beam-loadtests-python-gbk-flink-batch-10-m '--command=curl -s "http://beam-loadtests-python-gbk-flink-batch-10-w-4.c.apache-beam-testing.internal:36501/jobmanager/config"'
+ local 'job_server_config=[{"key":"web.port","value":"0"},{"key":"high-availability.cluster-id","value":"application_1561208239147_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-python-gbk-flink-batch-10-w-4.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"web.tmpdir","value":"/tmp/flink-web-f5b5a7fb-1e3a-4fc6-b3b6-548c179da40b"},{"key":"jobmanager.rpc.port","value":"39897"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"rest.port","value":"0"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1561208239147_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-python-gbk-flink-batch-10-w-4.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local key=jobmanager.rpc.port
++ echo beam-loadtests-python-gbk-flink-batch-10-w-4.c.apache-beam-testing.internal:36501
++ cut -d : -f1
+ local yarn_application_master_host=beam-loadtests-python-gbk-flink-batch-10-w-4.c.apache-beam-testing.internal
++ echo '[{"key":"web.port","value":"0"},{"key":"high-availability.cluster-id","value":"application_1561208239147_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-python-gbk-flink-batch-10-w-4.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"web.tmpdir","value":"/tmp/flink-web-f5b5a7fb-1e3a-4fc6-b3b6-548c179da40b"},{"key":"jobmanager.rpc.port","value":"39897"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"rest.port","value":"0"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1561208239147_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-python-gbk-flink-batch-10-w-4.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
++ python -c 'import sys, json; print [ e['\''value'\''] for e in json.load(sys.stdin) if e['\''key'\''] == u'\''jobmanager.rpc.port'\''][0]'
+ local jobmanager_rpc_port=39897
++ [[ true == \t\r\u\e ]]
++ echo ' -Nf >& /dev/null'
+ local 'detached_mode_params= -Nf >& /dev/null'
++ [[ -n gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest ]]
++ echo '-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'job_server_ports_forwarding=-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'tunnel_command=gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-gbk-flink-batch-10-m -- -L 8081:beam-loadtests-python-gbk-flink-batch-10-w-4.c.apache-beam-testing.internal:36501 -L 39897:beam-loadtests-python-gbk-flink-batch-10-w-4.c.apache-beam-testing.internal:39897 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080  -Nf >& /dev/null'
+ eval gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-gbk-flink-batch-10-m -- -L 8081:beam-loadtests-python-gbk-flink-batch-10-w-4.c.apache-beam-testing.internal:36501 -L 39897:beam-loadtests-python-gbk-flink-batch-10-w-4.c.apache-beam-testing.internal:39897 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf '>&' /dev/null
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-gbk-flink-batch-10-m -- -L 8081:beam-loadtests-python-gbk-flink-batch-10-w-4.c.apache-beam-testing.internal:36501 -L 39897:beam-loadtests-python-gbk-flink-batch-10-w-4.c.apache-beam-testing.internal:39897 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf
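
start_tunnel then reads Flink's /jobmanager/config REST endpoint and pulls jobmanager.rpc.port out of the JSON with an inline Python 2 expression before opening the SSH tunnel. A minimal sketch of the same lookup that also runs under Python 3 (assuming a python3 interpreter is available on the build machine; this is not what create_flink_cluster.sh itself does):

    # Sketch only: fetch the JobManager config and extract the RPC port.
    job_server_config=$(gcloud compute ssh --quiet --zone=us-central1-a \
        yarn@beam-loadtests-python-gbk-flink-batch-10-m \
        '--command=curl -s "http://beam-loadtests-python-gbk-flink-batch-10-w-4.c.apache-beam-testing.internal:36501/jobmanager/config"')
    jobmanager_rpc_port=$(echo "${job_server_config}" | python3 -c \
        'import sys, json; print([e["value"] for e in json.load(sys.stdin) if e["key"] == "jobmanager.rpc.port"][0])')
    echo "${jobmanager_rpc_port}"   # 39897 in this run
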
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins5407818162652708671.sh
+ echo src Load test: 2GB of 10B records src
src Load test: 2GB of 10B records src
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/gradlew> --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -PloadTest.mainClass=apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey -Prunner=PortableRunner '-PloadTest.args=--job_name=load_tests_Python_Flink_Batch_GBK_1_0622100137 --publish_to_big_query=false --project=apache-beam-testing --metrics_dataset=load_test --metrics_table=python_flink_batch_GBK_1 --input_options='{"num_records": 200000000,"key_size": 1,"value_size":9}' --iterations=1 --fanout=1 --parallelism=5 --job_endpoint=localhost:8099 --environment_config=gcr.io/apache-beam-testing/beam_portability/python:latest --environment_type=DOCKER --runner=PortableRunner' :sdks:python:apache_beam:testing:load-tests:run
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.

FAILURE: Build failed with an exception.

* What went wrong:
Project 'load-tests' not found in project ':sdks:python:apache_beam:testing'. Some candidates are: 'load_tests'.

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
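
Gradle's hint points straight at the problem: the invocation above asks for :sdks:python:apache_beam:testing:load-tests:run (hyphen), while the project is registered as load_tests (underscore). A sketch of the corrected task path, assuming nothing else in the job configuration changes:

    # Hypothetical fix: only the task path differs from the command logged above.
    # before: :sdks:python:apache_beam:testing:load-tests:run
    # after:  :sdks:python:apache_beam:testing:load_tests:run
    ./gradlew --continue --max-workers=12 -Prunner=PortableRunner \
        -PloadTest.mainClass=apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey \
        '-PloadTest.args=...' \
        :sdks:python:apache_beam:testing:load_tests:run   # load test args elided; same as in the logged command
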

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 0s

Publishing build scan...
https://gradle.com/s/jmdczz2nhn33a

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Jenkins build is back to normal : beam_LoadTests_Python_GBK_Flink_Batch #12

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/12/display/redirect?page=changes>




Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #11

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/11/display/redirect?page=changes>

Changes:

[hsuryawirawan] Add README files on how to setup the project for both Java and Python

------------------------------------------
[...truncated 58.08 KB...]
> Task :model:fn-execution:classes
> Task :model:pipeline:shadowJar
> Task :model:job-management:shadowJar
> Task :model:fn-execution:shadowJar
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar
> Task :vendor:sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :runners:core-construction-java:jar
> Task :sdks:java:fn-execution:jar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :vendor:sdks-java-extensions-protobuf:shadowJar
> Task :sdks:java:harness:shadowJar
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :runners:flink:1.7:compileJava FROM-CACHE
> Task :runners:flink:1.7:classes
> Task :runners:flink:1.7:jar
> Task :runners:flink:1.7:job-server:compileJava NO-SOURCE
> Task :runners:flink:1.7:job-server:classes UP-TO-DATE
> Task :runners:flink:1.7:job-server:shadowJar
> Task :runners:flink:1.7:job-server-container:copyDockerfileDependencies
> Task :runners:flink:1.7:job-server-container:dockerPrepare
> Task :runners:flink:1.7:job-server-container:docker

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD SUCCESSFUL in 53s
51 actionable tasks: 35 executed, 15 from cache, 1 up-to-date

Publishing build scan...
https://gradle.com/s/2qtexbelfdife

[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins4415632668223118886.sh
+ echo 'Tagging Flink Job Server'\''s image...'
Tagging Flink Job Server's image...
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins6275734230057591254.sh
+ docker tag gcr.io/apache-beam-testing/beam_portability/flink-job-server gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins7745661181701273630.sh
+ echo 'Pushing Flink Job Server'\''s image...'
Pushing Flink Job Server's image...
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins6288517583259992829.sh
+ docker push gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest
The push refers to repository [gcr.io/apache-beam-testing/beam_portability/flink-job-server]
0d35dd5bccd6: Preparing
6348e2a4bdf7: Preparing
3a6df669ed25: Preparing
f7d12d471667: Preparing
f350d0146bb3: Preparing
e38df31d449c: Preparing
af5ae4841776: Preparing
b17cc31e431b: Preparing
12cb127eee44: Preparing
604829a174eb: Preparing
fbb641a8b943: Preparing
e38df31d449c: Waiting
af5ae4841776: Waiting
b17cc31e431b: Waiting
12cb127eee44: Waiting
604829a174eb: Waiting
fbb641a8b943: Waiting
f7d12d471667: Layer already exists
f350d0146bb3: Layer already exists
e38df31d449c: Layer already exists
af5ae4841776: Layer already exists
b17cc31e431b: Layer already exists
12cb127eee44: Layer already exists
604829a174eb: Layer already exists
fbb641a8b943: Layer already exists
0d35dd5bccd6: Pushed
3a6df669ed25: Pushed
6348e2a4bdf7: Pushed
latest: digest: sha256:9a393e7755f37123147f996490e14bc1cdf08fbf156877a5de1adf7c2fa5bbf8 size: 2632
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content 
JOB_SERVER_IMAGE=gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest
CLUSTER_NAME=beam-loadtests-python-gbk-flink-batch-11
DETACHED_MODE=true
HARNESS_IMAGES_TO_PULL=gcr.io/apache-beam-testing/beam_portability/python:latest
FLINK_NUM_WORKERS=5
FLINK_DOWNLOAD_URL=https://archive.apache.org/dist/flink/flink-1.7.0/flink-1.7.0-bin-hadoop28-scala_2.11.tgz
GCS_BUCKET=gs://beam-flink-cluster
FLINK_TASKMANAGER_SLOTS=1
ARTIFACTS_DIR=gs://beam-flink-cluster/beam-loadtests-python-gbk-flink-batch-11
GCLOUD_ZONE=us-central1-a

[EnvInject] - Variables injected successfully.
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins6217211456551548791.sh
+ echo Setting up flink cluster
Setting up flink cluster
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins1199360838988925289.sh
+ cd <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/.test-infra/dataproc/>
+ ./create_flink_cluster.sh
+ GCLOUD_ZONE=us-central1-a
+ DATAPROC_VERSION=1.2
+ MASTER_NAME=beam-loadtests-python-gbk-flink-batch-11-m
+ INIT_ACTIONS_FOLDER_NAME=init-actions
+ FLINK_INIT=gs://beam-flink-cluster/init-actions/flink.sh
+ BEAM_INIT=gs://beam-flink-cluster/init-actions/beam.sh
+ DOCKER_INIT=gs://beam-flink-cluster/init-actions/docker.sh
+ FLINK_LOCAL_PORT=8081
+ FLINK_TASKMANAGER_SLOTS=1
+ TASK_MANAGER_MEM=10240
+ YARN_APPLICATION_MASTER=
+ main
+ upload_init_actions
+ echo 'Uploading initialization actions to GCS bucket: gs://beam-flink-cluster'
Uploading initialization actions to GCS bucket: gs://beam-flink-cluster
+ gsutil cp -r init-actions/beam.sh init-actions/docker.sh init-actions/flink.sh gs://beam-flink-cluster/init-actions
Copying file://init-actions/beam.sh [Content-Type=text/x-sh]...
Copying file://init-actions/docker.sh [Content-Type=text/x-sh]...
Copying file://init-actions/flink.sh [Content-Type=text/x-sh]...
Operation completed over 3 objects/13.1 KiB.
+ create_cluster
+ local metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.7.0/flink-1.7.0-bin-hadoop28-scala_2.11.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1
+ [[ -n gcr.io/apache-beam-testing/beam_portability/python:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/python:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_workers=6
+ gcloud dataproc clusters create beam-loadtests-python-gbk-flink-batch-11 --num-workers=6 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.7.0/flink-1.7.0-bin-hadoop28-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/python:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/c59c74be-6c74-3b94-a421-450c05358dea].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
...done.
Created [https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/global/clusters/beam-loadtests-python-gbk-flink-batch-11] Cluster placed in zone [us-central1-a].
+ get_leader
+ local i=0
+ local application_ids
+ local application_masters
+ echo 'Yarn Applications'
Yarn Applications
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-gbk-flink-batch-11-m '--command=yarn application -list'
++ grep beam-loadtests-python-gbk-flink-batch-11
Warning: Permanently added 'compute.2275112537599166585' (ECDSA) to the list of known hosts.
19/06/23 12:59:04 INFO client.RMProxy: Connecting to ResourceManager at beam-loadtests-python-gbk-flink-batch-11-m/10.128.0.50:8032
+ read line
+ echo application_1561294615077_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-python-gbk-flink-batch-11-w-4.c.apache-beam-testing.internal:42233
application_1561294615077_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-python-gbk-flink-batch-11-w-4.c.apache-beam-testing.internal:42233
++ echo application_1561294615077_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-python-gbk-flink-batch-11-w-4.c.apache-beam-testing.internal:42233
++ sed 's/ .*//'
+ application_ids[$i]=application_1561294615077_0001
++ echo application_1561294615077_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-python-gbk-flink-batch-11-w-4.c.apache-beam-testing.internal:42233
++ sed 's/.*beam-loadtests-python-gbk-flink-batch-11/beam-loadtests-python-gbk-flink-batch-11/'
++ sed 's/ .*//'
+ application_masters[$i]=beam-loadtests-python-gbk-flink-batch-11-w-4.c.apache-beam-testing.internal:42233
+ i=1
+ read line
+ '[' 1 '!=' 1 ']'
+ YARN_APPLICATION_MASTER=beam-loadtests-python-gbk-flink-batch-11-w-4.c.apache-beam-testing.internal:42233
+ echo 'Using Yarn Application master: beam-loadtests-python-gbk-flink-batch-11-w-4.c.apache-beam-testing.internal:42233'
Using Yarn Application master: beam-loadtests-python-gbk-flink-batch-11-w-4.c.apache-beam-testing.internal:42233
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest ]]
+ start_job_server
+ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-gbk-flink-batch-11-m '--command=sudo --user yarn docker run --detach --publish 8099:8099 --publish 8098:8098 --publish 8097:8097 --volume ~/.config/gcloud:/root/.config/gcloud gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest --flink-master-url=beam-loadtests-python-gbk-flink-batch-11-w-4.c.apache-beam-testing.internal:42233 --artifacts-dir=gs://beam-flink-cluster/beam-loadtests-python-gbk-flink-batch-11'
18cbb3572f68276de1cc708b9670d2871fac2ec3be94383992302fde95c659ef
+ start_tunnel
++ gcloud compute ssh --quiet --zone=us-central1-a yarn@beam-loadtests-python-gbk-flink-batch-11-m '--command=curl -s "http://beam-loadtests-python-gbk-flink-batch-11-w-4.c.apache-beam-testing.internal:42233/jobmanager/config"'
+ local 'job_server_config=[{"key":"web.port","value":"0"},{"key":"high-availability.cluster-id","value":"application_1561294615077_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-python-gbk-flink-batch-11-w-4.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"web.tmpdir","value":"/tmp/flink-web-d27daf15-b9b1-41b4-b4e5-6ee40af875de"},{"key":"jobmanager.rpc.port","value":"45643"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"rest.port","value":"0"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1561294615077_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-python-gbk-flink-batch-11-w-4.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local key=jobmanager.rpc.port
++ echo beam-loadtests-python-gbk-flink-batch-11-w-4.c.apache-beam-testing.internal:42233
++ cut -d : -f1
+ local yarn_application_master_host=beam-loadtests-python-gbk-flink-batch-11-w-4.c.apache-beam-testing.internal
++ echo '[{"key":"web.port","value":"0"},{"key":"high-availability.cluster-id","value":"application_1561294615077_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-python-gbk-flink-batch-11-w-4.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"web.tmpdir","value":"/tmp/flink-web-d27daf15-b9b1-41b4-b4e5-6ee40af875de"},{"key":"jobmanager.rpc.port","value":"45643"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"rest.port","value":"0"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1561294615077_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-python-gbk-flink-batch-11-w-4.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
++ python -c 'import sys, json; print [ e['\''value'\''] for e in json.load(sys.stdin) if e['\''key'\''] == u'\''jobmanager.rpc.port'\''][0]'
+ local jobmanager_rpc_port=45643
++ [[ true == \t\r\u\e ]]
++ echo ' -Nf >& /dev/null'
+ local 'detached_mode_params= -Nf >& /dev/null'
++ [[ -n gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest ]]
++ echo '-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'job_server_ports_forwarding=-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'tunnel_command=gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-gbk-flink-batch-11-m -- -L 8081:beam-loadtests-python-gbk-flink-batch-11-w-4.c.apache-beam-testing.internal:42233 -L 45643:beam-loadtests-python-gbk-flink-batch-11-w-4.c.apache-beam-testing.internal:45643 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080  -Nf >& /dev/null'
+ eval gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-gbk-flink-batch-11-m -- -L 8081:beam-loadtests-python-gbk-flink-batch-11-w-4.c.apache-beam-testing.internal:42233 -L 45643:beam-loadtests-python-gbk-flink-batch-11-w-4.c.apache-beam-testing.internal:45643 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf '>&' /dev/null
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-gbk-flink-batch-11-m -- -L 8081:beam-loadtests-python-gbk-flink-batch-11-w-4.c.apache-beam-testing.internal:42233 -L 45643:beam-loadtests-python-gbk-flink-batch-11-w-4.c.apache-beam-testing.internal:45643 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins1702713479438168611.sh
+ echo src Load test: 2GB of 10B records src
src Load test: 2GB of 10B records src
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/gradlew> --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -PloadTest.mainClass=apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey -Prunner=PortableRunner '-PloadTest.args=--job_name=load_tests_Python_Flink_Batch_GBK_1_0623100140 --publish_to_big_query=false --project=apache-beam-testing --metrics_dataset=load_test --metrics_table=python_flink_batch_GBK_1 --input_options='{"num_records": 200000000,"key_size": 1,"value_size":9}' --iterations=1 --fanout=1 --parallelism=5 --job_endpoint=localhost:8099 --environment_config=gcr.io/apache-beam-testing/beam_portability/python:latest --environment_type=DOCKER --runner=PortableRunner' :sdks:python:apache_beam:testing:load-tests:run
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.

FAILURE: Build failed with an exception.

* What went wrong:
Project 'load-tests' not found in project ':sdks:python:apache_beam:testing'. Some candidates are: 'load_tests'.

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1s

Publishing build scan...
https://gradle.com/s/4eqpqluqvzzki

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org