Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/08/04 16:17:09 UTC

Build failed in Jenkins: beam_LoadTests_Python_coGBK_Flink_Batch #12

See <https://builds.apache.org/job/beam_LoadTests_Python_coGBK_Flink_Batch/12/display/redirect>

------------------------------------------
[...truncated 59.47 KB...]
> Task :model:pipeline:shadowJar
> Task :model:job-management:shadowJar
> Task :model:fn-execution:shadowJar
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar
> Task :vendor:sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar
> Task :runners:core-construction-java:jar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :sdks:java:harness:shadowJar
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :runners:flink:1.7:compileJava FROM-CACHE
> Task :runners:flink:1.7:classes
> Task :runners:flink:1.7:jar
> Task :runners:flink:1.7:job-server:compileJava NO-SOURCE
> Task :runners:flink:1.7:job-server:classes UP-TO-DATE
> Task :runners:flink:1.7:job-server:shadowJar
> Task :runners:flink:1.7:job-server-container:copyDockerfileDependencies
> Task :runners:flink:1.7:job-server-container:dockerPrepare
> Task :runners:flink:1.7:job-server-container:docker

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD SUCCESSFUL in 53s
51 actionable tasks: 35 executed, 15 from cache, 1 up-to-date

Publishing build scan...
https://gradle.com/s/wszcnnvg6uti6

[beam_LoadTests_Python_coGBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins4875698410465915100.sh
+ echo 'Tagging Flink Job Server'\''s image...'
Tagging Flink Job Server's image...
[beam_LoadTests_Python_coGBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins6938888934517528592.sh
+ docker tag gcr.io/apache-beam-testing/beam_portability/flink-job-server gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest
[beam_LoadTests_Python_coGBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins6232505140282706308.sh
+ echo 'Pushing Flink Job Server'\''s image...'
Pushing Flink Job Server's image...
[beam_LoadTests_Python_coGBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins6628265157978356234.sh
+ docker push gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest
The push refers to repository [gcr.io/apache-beam-testing/beam_portability/flink-job-server]
42fd9683c0b4: Preparing
c27b527de40a: Preparing
a514369b8d1a: Preparing
b575bb71cbb6: Preparing
7955da51da82: Preparing
c64652873162: Preparing
a4e797bc3f15: Preparing
392f356944ff: Preparing
15210a41d4ee: Preparing
e2a8a00a83b2: Preparing
c64652873162: Waiting
a4e797bc3f15: Waiting
15210a41d4ee: Waiting
392f356944ff: Waiting
e2a8a00a83b2: Waiting
7955da51da82: Layer already exists
b575bb71cbb6: Layer already exists
c64652873162: Layer already exists
a4e797bc3f15: Layer already exists
392f356944ff: Layer already exists
15210a41d4ee: Layer already exists
e2a8a00a83b2: Layer already exists
42fd9683c0b4: Pushed
a514369b8d1a: Pushed
c27b527de40a: Pushed
latest: digest: sha256:bcb92b4e76dbea4b3d21171a7b23138fca77d68e8ead72bb3f3bf4f596d599e8 size: 2426
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content 
JOB_SERVER_IMAGE=gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest
CLUSTER_NAME=beam-loadtests-python-cogbk-flink-batch-12
DETACHED_MODE=true
HARNESS_IMAGES_TO_PULL=gcr.io/apache-beam-testing/beam_portability/python:latest
FLINK_NUM_WORKERS=5
FLINK_DOWNLOAD_URL=https://archive.apache.org/dist/flink/flink-1.7.0/flink-1.7.0-bin-hadoop28-scala_2.11.tgz
GCS_BUCKET=gs://beam-flink-cluster
FLINK_TASKMANAGER_SLOTS=1
ARTIFACTS_DIR=gs://beam-flink-cluster/beam-loadtests-python-cogbk-flink-batch-12
GCLOUD_ZONE=us-central1-a

[EnvInject] - Variables injected successfully.
[beam_LoadTests_Python_coGBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins1580304556334057550.sh
+ echo Setting up flink cluster
Setting up flink cluster
[beam_LoadTests_Python_coGBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins8693752001616707969.sh
+ cd <https://builds.apache.org/job/beam_LoadTests_Python_coGBK_Flink_Batch/ws/src/.test-infra/dataproc/>
+ ./create_flink_cluster.sh
+ GCLOUD_ZONE=us-central1-a
+ DATAPROC_VERSION=1.2
+ MASTER_NAME=beam-loadtests-python-cogbk-flink-batch-12-m
+ INIT_ACTIONS_FOLDER_NAME=init-actions
+ FLINK_INIT=gs://beam-flink-cluster/init-actions/flink.sh
+ BEAM_INIT=gs://beam-flink-cluster/init-actions/beam.sh
+ DOCKER_INIT=gs://beam-flink-cluster/init-actions/docker.sh
+ FLINK_LOCAL_PORT=8081
+ FLINK_TASKMANAGER_SLOTS=1
+ TASK_MANAGER_MEM=10240
+ YARN_APPLICATION_MASTER=
+ main
+ upload_init_actions
+ echo 'Uploading initialization actions to GCS bucket: gs://beam-flink-cluster'
Uploading initialization actions to GCS bucket: gs://beam-flink-cluster
+ gsutil cp -r init-actions/beam.sh init-actions/docker.sh init-actions/flink.sh gs://beam-flink-cluster/init-actions
Copying file://init-actions/beam.sh [Content-Type=text/x-sh]...
Copying file://init-actions/docker.sh [Content-Type=text/x-sh]...
Copying file://init-actions/flink.sh [Content-Type=text/x-sh]...
Operation completed over 3 objects/13.1 KiB.
+ create_cluster
+ local metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.7.0/flink-1.7.0-bin-hadoop28-scala_2.11.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1
+ [[ -n gcr.io/apache-beam-testing/beam_portability/python:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/python:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_workers=6
+ gcloud dataproc clusters create beam-loadtests-python-cogbk-flink-batch-12 --num-workers=6 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.7.0/flink-1.7.0-bin-hadoop28-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/python:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/e420016d-d800-393a-90cc-cfa881220d1d].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
...............................................................................................................................................................................................done.
Created [https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/global/clusters/beam-loadtests-python-cogbk-flink-batch-12] Cluster placed in zone [us-central1-a].
+ get_leader
+ local i=0
+ local application_ids
+ local application_masters
+ echo 'Yarn Applications'
Yarn Applications
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-cogbk-flink-batch-12-m '--command=yarn application -list'
++ grep beam-loadtests-python-cogbk-flink-batch-12
Warning: Permanently added 'compute.7997065804118670037' (ECDSA) to the list of known hosts.
19/08/04 16:16:59 INFO client.RMProxy: Connecting to ResourceManager at beam-loadtests-python-cogbk-flink-batch-12-m/10.128.0.48:8032
+ read line
+ echo application_1564935347252_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-python-cogbk-flink-batch-12-w-3.c.apache-beam-testing.internal:35419
application_1564935347252_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-python-cogbk-flink-batch-12-w-3.c.apache-beam-testing.internal:35419
++ echo application_1564935347252_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-python-cogbk-flink-batch-12-w-3.c.apache-beam-testing.internal:35419
++ sed 's/ .*//'
+ application_ids[$i]=application_1564935347252_0001
++ echo application_1564935347252_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-python-cogbk-flink-batch-12-w-3.c.apache-beam-testing.internal:35419
++ sed 's/.*beam-loadtests-python-cogbk-flink-batch-12/beam-loadtests-python-cogbk-flink-batch-12/'
++ sed 's/ .*//'
+ application_masters[$i]=beam-loadtests-python-cogbk-flink-batch-12-w-3.c.apache-beam-testing.internal:35419
+ i=1
+ read line
+ '[' 1 '!=' 1 ']'
+ YARN_APPLICATION_MASTER=beam-loadtests-python-cogbk-flink-batch-12-w-3.c.apache-beam-testing.internal:35419
+ echo 'Using Yarn Application master: beam-loadtests-python-cogbk-flink-batch-12-w-3.c.apache-beam-testing.internal:35419'
Using Yarn Application master: beam-loadtests-python-cogbk-flink-batch-12-w-3.c.apache-beam-testing.internal:35419
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest ]]
+ start_job_server
+ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-cogbk-flink-batch-12-m '--command=sudo --user yarn docker run --detach --publish 8099:8099 --publish 8098:8098 --publish 8097:8097 --volume ~/.config/gcloud:/root/.config/gcloud gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest --flink-master-url=beam-loadtests-python-cogbk-flink-batch-12-w-3.c.apache-beam-testing.internal:35419 --artifacts-dir=gs://beam-flink-cluster/beam-loadtests-python-cogbk-flink-batch-12'
a7e7c883d73540fb37c33c53f37377a985e5f353579a0490b76002bfc06795c6
+ start_tunnel
++ gcloud compute ssh --quiet --zone=us-central1-a yarn@beam-loadtests-python-cogbk-flink-batch-12-m '--command=curl -s "http://beam-loadtests-python-cogbk-flink-batch-12-w-3.c.apache-beam-testing.internal:35419/jobmanager/config"'
+ local 'job_server_config=[{"key":"web.port","value":"0"},{"key":"high-availability.cluster-id","value":"application_1564935347252_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-python-cogbk-flink-batch-12-w-3.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"web.tmpdir","value":"/tmp/flink-web-389403aa-d0c9-4cf9-a376-09e80a8c1d4c"},{"key":"jobmanager.rpc.port","value":"44379"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"rest.port","value":"0"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1564935347252_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-python-cogbk-flink-batch-12-w-3.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local key=jobmanager.rpc.port
++ echo beam-loadtests-python-cogbk-flink-batch-12-w-3.c.apache-beam-testing.internal:35419
++ cut -d : -f1
+ local yarn_application_master_host=beam-loadtests-python-cogbk-flink-batch-12-w-3.c.apache-beam-testing.internal
++ python -c 'import sys, json; print [ e['\''value'\''] for e in json.load(sys.stdin) if e['\''key'\''] == u'\''jobmanager.rpc.port'\''][0]'
++ echo '[{"key":"web.port","value":"0"},{"key":"high-availability.cluster-id","value":"application_1564935347252_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-python-cogbk-flink-batch-12-w-3.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"web.tmpdir","value":"/tmp/flink-web-389403aa-d0c9-4cf9-a376-09e80a8c1d4c"},{"key":"jobmanager.rpc.port","value":"44379"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"rest.port","value":"0"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1564935347252_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-python-cogbk-flink-batch-12-w-3.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local jobmanager_rpc_port=44379
++ [[ true == \t\r\u\e ]]
++ echo ' -Nf >& /dev/null'
+ local 'detached_mode_params= -Nf >& /dev/null'
++ [[ -n gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest ]]
++ echo '-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'job_server_ports_forwarding=-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'tunnel_command=gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-cogbk-flink-batch-12-m -- -L 8081:beam-loadtests-python-cogbk-flink-batch-12-w-3.c.apache-beam-testing.internal:35419 -L 44379:beam-loadtests-python-cogbk-flink-batch-12-w-3.c.apache-beam-testing.internal:44379 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080  -Nf >& /dev/null'
+ eval gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-cogbk-flink-batch-12-m -- -L 8081:beam-loadtests-python-cogbk-flink-batch-12-w-3.c.apache-beam-testing.internal:35419 -L 44379:beam-loadtests-python-cogbk-flink-batch-12-w-3.c.apache-beam-testing.internal:44379 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf '>&' /dev/null
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-cogbk-flink-batch-12-m -- -L 8081:beam-loadtests-python-cogbk-flink-batch-12-w-3.c.apache-beam-testing.internal:35419 -L 44379:beam-loadtests-python-cogbk-flink-batch-12-w-3.c.apache-beam-testing.internal:44379 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf
[beam_LoadTests_Python_coGBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins3105579646954398175.sh
+ echo src CoGroupByKey Python Load test: 2GB of 100B records with a single key src
src CoGroupByKey Python Load test: 2GB of 100B records with a single key src
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_LoadTests_Python_coGBK_Flink_Batch/ws/src/gradlew> --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -PloadTest.mainClass=apache_beam.testing.load_tests.co_group_by_key_test:CoGroupByKeyTest.testCoGroupByKey -Prunner=PortableRunner '-PloadTest.args=--project=apache-beam-testing --job_name=load-tests-python-flink-batch-cogbk-1-0804150410 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=load_test --metrics_table=python_flink_batch_cogbk_1 --input_options='{"num_records": 20000000,"key_size": 10,"value_size": 90,"num_hot_keys": 1,"hot_key_fraction": 1}' --co_input_options='{"num_records": 20000000,"key_size": 10,"value_size": 90,"num_hot_keys": 1,"hot_key_fraction": 1}' --iterations=1 --parallelism=5 --job_endpoint=localhost:8099 --environment_config=gcr.io/apache-beam-testing/beam_portability/python:latest --environment_type=DOCKER --runner=PortableRunner' :sdks:python:apache_beam:testing:load_tests:run
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_coGBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 44

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'files' for task ':sdks:python:apache_beam:testing:load_tests:run' of type org.gradle.api.tasks.Exec.

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
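
The error above means that line 44 of the load_tests build.gradle reads a property named 'files' that resolves against the ':sdks:python:apache_beam:testing:load_tests:run' task, but that task is declared with type org.gradle.api.tasks.Exec, which defines no such property, so Gradle aborts during project evaluation. A minimal, hypothetical build.gradle sketch (not the actual Beam script) that fails with the same kind of message:

    // Hypothetical sketch: an Exec-typed task plus a read of a property
    // that the Exec type does not declare.
    task run(type: Exec) {
        executable 'sh'
        args '-c', 'echo "launching load test"'
    }

    // Fails at configuration time with:
    //   Could not get unknown property 'files' for task ':run'
    //   of type org.gradle.api.tasks.Exec.
    println run.files

The log does not show the eventual fix; the "#14" message later in this thread reports the job back to normal.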

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1s

Publishing build scan...
https://gradle.com/s/qpjeoloiwoyf6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_LoadTests_Python_coGBK_Flink_Batch #14

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_coGBK_Flink_Batch/14/display/redirect?page=changes>


---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_coGBK_Flink_Batch #13

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_coGBK_Flink_Batch/13/display/redirect?page=changes>

Changes:

[github] Update design-documents.md

------------------------------------------
[...truncated 59.41 KB...]
> Task :model:pipeline:shadowJar
> Task :model:job-management:shadowJar
> Task :model:fn-execution:shadowJar
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar
> Task :vendor:sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar
> Task :runners:core-construction-java:jar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :sdks:java:harness:shadowJar
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :runners:flink:1.7:compileJava FROM-CACHE
> Task :runners:flink:1.7:classes
> Task :runners:flink:1.7:jar
> Task :runners:flink:1.7:job-server:compileJava NO-SOURCE
> Task :runners:flink:1.7:job-server:classes UP-TO-DATE
> Task :runners:flink:1.7:job-server:shadowJar
> Task :runners:flink:1.7:job-server-container:copyDockerfileDependencies
> Task :runners:flink:1.7:job-server-container:dockerPrepare
> Task :runners:flink:1.7:job-server-container:docker

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD SUCCESSFUL in 57s
51 actionable tasks: 35 executed, 15 from cache, 1 up-to-date

Publishing build scan...
https://gradle.com/s/oqejbev3e3kpm

[beam_LoadTests_Python_coGBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins1076156037655043151.sh
+ echo 'Tagging Flink Job Server'\''s image...'
Tagging Flink Job Server's image...
[beam_LoadTests_Python_coGBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins7889722546069523880.sh
+ docker tag gcr.io/apache-beam-testing/beam_portability/flink-job-server gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest
[beam_LoadTests_Python_coGBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins8917468682381877198.sh
+ echo 'Pushing Flink Job Server'\''s image...'
Pushing Flink Job Server's image...
[beam_LoadTests_Python_coGBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins8032631984051379119.sh
+ docker push gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest
The push refers to repository [gcr.io/apache-beam-testing/beam_portability/flink-job-server]
3f1f92cb063d: Preparing
2242c41c0a53: Preparing
64326e25291d: Preparing
b575bb71cbb6: Preparing
7955da51da82: Preparing
c64652873162: Preparing
a4e797bc3f15: Preparing
392f356944ff: Preparing
15210a41d4ee: Preparing
e2a8a00a83b2: Preparing
15210a41d4ee: Waiting
a4e797bc3f15: Waiting
e2a8a00a83b2: Waiting
392f356944ff: Waiting
c64652873162: Waiting
b575bb71cbb6: Layer already exists
7955da51da82: Layer already exists
c64652873162: Layer already exists
a4e797bc3f15: Layer already exists
392f356944ff: Layer already exists
15210a41d4ee: Layer already exists
e2a8a00a83b2: Layer already exists
3f1f92cb063d: Pushed
64326e25291d: Pushed
2242c41c0a53: Pushed
latest: digest: sha256:e52e53813e1aa45b6b5910dacd62724084cc39245d789f4c92b04235051e11f3 size: 2426
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content 
JOB_SERVER_IMAGE=gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest
CLUSTER_NAME=beam-loadtests-python-cogbk-flink-batch-13
DETACHED_MODE=true
HARNESS_IMAGES_TO_PULL=gcr.io/apache-beam-testing/beam_portability/python:latest
FLINK_NUM_WORKERS=5
FLINK_DOWNLOAD_URL=https://archive.apache.org/dist/flink/flink-1.7.0/flink-1.7.0-bin-hadoop28-scala_2.11.tgz
GCS_BUCKET=gs://beam-flink-cluster
FLINK_TASKMANAGER_SLOTS=1
ARTIFACTS_DIR=gs://beam-flink-cluster/beam-loadtests-python-cogbk-flink-batch-13
GCLOUD_ZONE=us-central1-a

[EnvInject] - Variables injected successfully.
[beam_LoadTests_Python_coGBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins8460905373378483767.sh
+ echo Setting up flink cluster
Setting up flink cluster
[beam_LoadTests_Python_coGBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins7551071015972111761.sh
+ cd <https://builds.apache.org/job/beam_LoadTests_Python_coGBK_Flink_Batch/ws/src/.test-infra/dataproc/>
+ ./create_flink_cluster.sh
+ GCLOUD_ZONE=us-central1-a
+ DATAPROC_VERSION=1.2
+ MASTER_NAME=beam-loadtests-python-cogbk-flink-batch-13-m
+ INIT_ACTIONS_FOLDER_NAME=init-actions
+ FLINK_INIT=gs://beam-flink-cluster/init-actions/flink.sh
+ BEAM_INIT=gs://beam-flink-cluster/init-actions/beam.sh
+ DOCKER_INIT=gs://beam-flink-cluster/init-actions/docker.sh
+ FLINK_LOCAL_PORT=8081
+ FLINK_TASKMANAGER_SLOTS=1
+ TASK_MANAGER_MEM=10240
+ YARN_APPLICATION_MASTER=
+ main
+ upload_init_actions
+ echo 'Uploading initialization actions to GCS bucket: gs://beam-flink-cluster'
Uploading initialization actions to GCS bucket: gs://beam-flink-cluster
+ gsutil cp -r init-actions/beam.sh init-actions/docker.sh init-actions/flink.sh gs://beam-flink-cluster/init-actions
Copying file://init-actions/beam.sh [Content-Type=text/x-sh]...
Copying file://init-actions/docker.sh [Content-Type=text/x-sh]...
Copying file://init-actions/flink.sh [Content-Type=text/x-sh]...
Operation completed over 3 objects/13.1 KiB.
+ create_cluster
+ local metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.7.0/flink-1.7.0-bin-hadoop28-scala_2.11.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1
+ [[ -n gcr.io/apache-beam-testing/beam_portability/python:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/python:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_workers=6
+ gcloud dataproc clusters create beam-loadtests-python-cogbk-flink-batch-13 --num-workers=6 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.7.0/flink-1.7.0-bin-hadoop28-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/python:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/8e64f8d8-ad99-39d6-84fe-bc4b752f497e].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
..............................................................................................................................................................................................................done.
Created [https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/global/clusters/beam-loadtests-python-cogbk-flink-batch-13] Cluster placed in zone [us-central1-a].
+ get_leader
+ local i=0
+ local application_ids
+ local application_masters
+ echo 'Yarn Applications'
Yarn Applications
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-cogbk-flink-batch-13-m '--command=yarn application -list'
++ grep beam-loadtests-python-cogbk-flink-batch-13
Warning: Permanently added 'compute.4576026161312033541' (ECDSA) to the list of known hosts.
19/08/05 16:18:02 INFO client.RMProxy: Connecting to ResourceManager at beam-loadtests-python-cogbk-flink-batch-13-m/10.128.0.8:8032
+ read line
+ echo application_1565021802991_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-python-cogbk-flink-batch-13-w-3.c.apache-beam-testing.internal:46465
application_1565021802991_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-python-cogbk-flink-batch-13-w-3.c.apache-beam-testing.internal:46465
++ echo application_1565021802991_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-python-cogbk-flink-batch-13-w-3.c.apache-beam-testing.internal:46465
++ sed 's/ .*//'
+ application_ids[$i]=application_1565021802991_0001
++ echo application_1565021802991_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-python-cogbk-flink-batch-13-w-3.c.apache-beam-testing.internal:46465
++ sed 's/.*beam-loadtests-python-cogbk-flink-batch-13/beam-loadtests-python-cogbk-flink-batch-13/'
++ sed 's/ .*//'
+ application_masters[$i]=beam-loadtests-python-cogbk-flink-batch-13-w-3.c.apache-beam-testing.internal:46465
+ i=1
+ read line
+ '[' 1 '!=' 1 ']'
+ YARN_APPLICATION_MASTER=beam-loadtests-python-cogbk-flink-batch-13-w-3.c.apache-beam-testing.internal:46465
+ echo 'Using Yarn Application master: beam-loadtests-python-cogbk-flink-batch-13-w-3.c.apache-beam-testing.internal:46465'
Using Yarn Application master: beam-loadtests-python-cogbk-flink-batch-13-w-3.c.apache-beam-testing.internal:46465
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest ]]
+ start_job_server
+ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-cogbk-flink-batch-13-m '--command=sudo --user yarn docker run --detach --publish 8099:8099 --publish 8098:8098 --publish 8097:8097 --volume ~/.config/gcloud:/root/.config/gcloud gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest --flink-master-url=beam-loadtests-python-cogbk-flink-batch-13-w-3.c.apache-beam-testing.internal:46465 --artifacts-dir=gs://beam-flink-cluster/beam-loadtests-python-cogbk-flink-batch-13'
294a620d4f24cb5e4231036224968a8485cdec92e50ef3e47553dd017d81b6fc
+ start_tunnel
++ gcloud compute ssh --quiet --zone=us-central1-a yarn@beam-loadtests-python-cogbk-flink-batch-13-m '--command=curl -s "http://beam-loadtests-python-cogbk-flink-batch-13-w-3.c.apache-beam-testing.internal:46465/jobmanager/config"'
+ local 'job_server_config=[{"key":"web.port","value":"0"},{"key":"high-availability.cluster-id","value":"application_1565021802991_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-python-cogbk-flink-batch-13-w-3.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"web.tmpdir","value":"/tmp/flink-web-7ee3c607-13fb-42fd-a87e-1497faf86d69"},{"key":"jobmanager.rpc.port","value":"42117"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"rest.port","value":"0"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1565021802991_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-python-cogbk-flink-batch-13-w-3.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local key=jobmanager.rpc.port
++ echo beam-loadtests-python-cogbk-flink-batch-13-w-3.c.apache-beam-testing.internal:46465
++ cut -d : -f1
+ local yarn_application_master_host=beam-loadtests-python-cogbk-flink-batch-13-w-3.c.apache-beam-testing.internal
++ echo '[{"key":"web.port","value":"0"},{"key":"high-availability.cluster-id","value":"application_1565021802991_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-python-cogbk-flink-batch-13-w-3.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"web.tmpdir","value":"/tmp/flink-web-7ee3c607-13fb-42fd-a87e-1497faf86d69"},{"key":"jobmanager.rpc.port","value":"42117"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"rest.port","value":"0"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1565021802991_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-python-cogbk-flink-batch-13-w-3.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
++ python -c 'import sys, json; print [ e['\''value'\''] for e in json.load(sys.stdin) if e['\''key'\''] == u'\''jobmanager.rpc.port'\''][0]'
+ local jobmanager_rpc_port=42117
++ [[ true == \t\r\u\e ]]
++ echo ' -Nf >& /dev/null'
+ local 'detached_mode_params= -Nf >& /dev/null'
++ [[ -n gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest ]]
++ echo '-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'job_server_ports_forwarding=-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'tunnel_command=gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-cogbk-flink-batch-13-m -- -L 8081:beam-loadtests-python-cogbk-flink-batch-13-w-3.c.apache-beam-testing.internal:46465 -L 42117:beam-loadtests-python-cogbk-flink-batch-13-w-3.c.apache-beam-testing.internal:42117 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080  -Nf >& /dev/null'
+ eval gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-cogbk-flink-batch-13-m -- -L 8081:beam-loadtests-python-cogbk-flink-batch-13-w-3.c.apache-beam-testing.internal:46465 -L 42117:beam-loadtests-python-cogbk-flink-batch-13-w-3.c.apache-beam-testing.internal:42117 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf '>&' /dev/null
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-cogbk-flink-batch-13-m -- -L 8081:beam-loadtests-python-cogbk-flink-batch-13-w-3.c.apache-beam-testing.internal:46465 -L 42117:beam-loadtests-python-cogbk-flink-batch-13-w-3.c.apache-beam-testing.internal:42117 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf
[beam_LoadTests_Python_coGBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins5527666238829184230.sh
+ echo src CoGroupByKey Python Load test: 2GB of 100B records with a single key src
src CoGroupByKey Python Load test: 2GB of 100B records with a single key src
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_LoadTests_Python_coGBK_Flink_Batch/ws/src/gradlew> --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -PloadTest.mainClass=apache_beam.testing.load_tests.co_group_by_key_test:CoGroupByKeyTest.testCoGroupByKey -Prunner=PortableRunner '-PloadTest.args=--project=apache-beam-testing --job_name=load-tests-python-flink-batch-cogbk-1-0805150421 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=load_test --metrics_table=python_flink_batch_cogbk_1 --input_options='{"num_records": 20000000,"key_size": 10,"value_size": 90,"num_hot_keys": 1,"hot_key_fraction": 1}' --co_input_options='{"num_records": 20000000,"key_size": 10,"value_size": 90,"num_hot_keys": 1,"hot_key_fraction": 1}' --iterations=1 --parallelism=5 --job_endpoint=localhost:8099 --environment_config=gcr.io/apache-beam-testing/beam_portability/python:latest --environment_type=DOCKER --runner=PortableRunner' :sdks:python:apache_beam:testing:load_tests:run
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_coGBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 44

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'files' for task ':sdks:python:apache_beam:testing:load_tests:run' of type org.gradle.api.tasks.Exec.

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1s

Publishing build scan...
https://gradle.com/s/2ycmb5evzqhom

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org