Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/06/18 13:06:05 UTC

Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #6

See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/6/display/redirect?page=changes>

Changes:

[github] Use code.jquery.com for jQuery CDN

[millsd] Use StateTags.ID_EQUIVALENCE when using comparing StateTags

[valentyn] Add Python 3.6, 3.7 to the list of supported version classifiers.

[chamikara] Fixes filesystem_test for Windows.

[github] use https

[alireza4263] [BEAM-7513] Implements row estimation for BigQuery.

[aaltay] [BEAM-6777] Let HealthzServlet respond actual health information of SDK

[melissapa] Merge pull request #8836: Add Beam Katas to the website's "Learning

[amaliujia] [BEAM-7461] disable flaky tests due to flakiness

[github] Revert "[BEAM-7513] Adding Row Count for Bigquery Table"

[github] [BEAM-7467] Add dependency classifier to published pom (#8868)

[github] Fixing file naming for windows (#8870)

------------------------------------------
[...truncated 44.86 KB...]
604829a174eb: Preparing
fbb641a8b943: Preparing
20c68c80db0c: Waiting
604829a174eb: Waiting
55fab7dcb871: Waiting
b636a60da28a: Waiting
fbb641a8b943: Waiting
d4a2e9be4373: Waiting
12cb127eee44: Waiting
b17cc31e431b: Waiting
5822a1e89c31: Waiting
6d1e926aa552: Pushed
38563a5e2e1f: Pushed
20c68c80db0c: Layer already exists
b636a60da28a: Layer already exists
55fab7dcb871: Layer already exists
d4a2e9be4373: Layer already exists
0fe19df8b8f8: Layer already exists
b17cc31e431b: Layer already exists
12cb127eee44: Layer already exists
604829a174eb: Layer already exists
fbb641a8b943: Layer already exists
7a5be8d3f893: Pushed
5822a1e89c31: Pushed
eed6574689c2: Pushed
2901f141149b: Pushed
latest: digest: sha256:92d6fb3b7621d35c40139673b802fd6a619cd611ba71cc45de0b51e60833b600 size: 3486
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins2599624886234190511.sh
+ echo 'Building Flink job Server'
Building Flink job Server
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/gradlew> --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Pdocker-repository-root=gcr.io/apache-beam-testing/beam_portability -Pdocker-tag=latest :runners:flink:1.7:job-server-container:docker
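Note: the gradlew invocation above passes -Dorg.gradle.jvmargs twice; when the same system property is repeated on a command line, typically only the last value is kept, so the -Xms2g setting likely has no effect here. A minimal sketch of a single combined property (hypothetical, not what this job actually ran):

./gradlew --continue --max-workers=12 "-Dorg.gradle.jvmargs=-Xms2g -Xmx4g" \
    -Pdocker-repository-root=gcr.io/apache-beam-testing/beam_portability -Pdocker-tag=latest \
    :runners:flink:1.7:job-server-container:docker
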
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :runners:flink:1.7:job-server:processResources NO-SOURCE
> Task :runners:flink:1.7:job-server-container:dockerClean UP-TO-DATE
> Task :model:fn-execution:extractProto
> Task :model:job-management:extractProto
> Task :model:job-management:processResources
> Task :model:fn-execution:processResources
> Task :runners:flink:1.7:processResources
> Task :sdks:java:build-tools:compileJava FROM-CACHE
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :sdks:java:build-tools:processResources
> Task :sdks:java:build-tools:classes
> Task :sdks:java:build-tools:jar
> Task :sdks:java:core:processResources
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :model:pipeline:generateProto
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:extractIncludeProto
> Task :model:job-management:generateProto
> Task :model:fn-execution:generateProto
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:pipeline:shadowJar
> Task :model:job-management:shadowJar
> Task :model:fn-execution:shadowJar
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar
> Task :vendor:sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :sdks:java:fn-execution:jar
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :runners:core-construction-java:jar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :vendor:sdks-java-extensions-protobuf:shadowJar
> Task :sdks:java:harness:shadowJar
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar

> Task :runners:flink:1.7:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:flink:1.7:classes
> Task :runners:flink:1.7:jar
> Task :runners:flink:1.7:job-server:compileJava NO-SOURCE
> Task :runners:flink:1.7:job-server:classes UP-TO-DATE
> Task :runners:flink:1.7:job-server:shadowJar
> Task :runners:flink:1.7:job-server-container:copyDockerfileDependencies
> Task :runners:flink:1.7:job-server-container:dockerPrepare
> Task :runners:flink:1.7:job-server-container:docker

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD SUCCESSFUL in 1m 12s
51 actionable tasks: 36 executed, 14 from cache, 1 up-to-date

Publishing build scan...
https://gradle.com/s/vei3itzmf35vk

[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins4932891223022681506.sh
+ echo 'Tagging Flink Job Server'\''s image...'
Tagging Flink Job Server's image...
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins3074348528989479345.sh
+ docker tag gcr.io/apache-beam-testing/beam_portability/flink-job-server gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins4681241078290322714.sh
+ echo 'Pushing Flink Job Server'\''s image...'
Pushing Flink Job Server's image...
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins4119794298124858382.sh
+ docker push gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest
The push refers to repository [gcr.io/apache-beam-testing/beam_portability/flink-job-server]
5dbdf2f59b65: Preparing
9030d30719ba: Preparing
8eb3897a1e6d: Preparing
f7d12d471667: Preparing
f350d0146bb3: Preparing
e38df31d449c: Preparing
af5ae4841776: Preparing
b17cc31e431b: Preparing
12cb127eee44: Preparing
604829a174eb: Preparing
fbb641a8b943: Preparing
e38df31d449c: Waiting
af5ae4841776: Waiting
604829a174eb: Waiting
b17cc31e431b: Waiting
fbb641a8b943: Waiting
12cb127eee44: Waiting
f350d0146bb3: Layer already exists
f7d12d471667: Layer already exists
e38df31d449c: Layer already exists
af5ae4841776: Layer already exists
12cb127eee44: Layer already exists
b17cc31e431b: Layer already exists
fbb641a8b943: Layer already exists
604829a174eb: Layer already exists
5dbdf2f59b65: Pushed
8eb3897a1e6d: Pushed
9030d30719ba: Pushed
latest: digest: sha256:20f424c4757e4760c04405aa685888a9d8ca52f0fdf38c153eb5ac7ce5c5383f size: 2632
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content 
JOB_SERVER_IMAGE=gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest
CLUSTER_NAME=beam-loadtests-python-gbk-flink-batch-6
DETACHED_MODE=true
HARNESS_IMAGES_TO_PULL=gcr.io/apache-beam-testing/beam_portability/python:latest
FLINK_NUM_WORKERS=5
FLINK_DOWNLOAD_URL=https://archive.apache.org/dist/flink/flink-1.7.0/flink-1.7.0-bin-hadoop28-scala_2.11.tgz
GCS_BUCKET=gs://beam-flink-cluster
FLINK_TASKMANAGER_SLOTS=1
ARTIFACTS_DIR=gs://beam-flink-cluster/beam-loadtests-python-gbk-flink-batch-6
GCLOUD_ZONE=us-central1-a

[EnvInject] - Variables injected successfully.
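
For reference, the injected properties above can be reproduced in a local shell before running the cluster script; a minimal sketch, assuming create_flink_cluster.sh picks these names up from the environment (exactly how the script consumes each variable is an assumption here):

# from a Beam checkout, in src/.test-infra/dataproc/
export JOB_SERVER_IMAGE=gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest
export CLUSTER_NAME=beam-loadtests-python-gbk-flink-batch-6
export DETACHED_MODE=true
export HARNESS_IMAGES_TO_PULL=gcr.io/apache-beam-testing/beam_portability/python:latest
export FLINK_NUM_WORKERS=5
export FLINK_DOWNLOAD_URL=https://archive.apache.org/dist/flink/flink-1.7.0/flink-1.7.0-bin-hadoop28-scala_2.11.tgz
export GCS_BUCKET=gs://beam-flink-cluster
export FLINK_TASKMANAGER_SLOTS=1
export ARTIFACTS_DIR=gs://beam-flink-cluster/beam-loadtests-python-gbk-flink-batch-6
export GCLOUD_ZONE=us-central1-a
./create_flink_cluster.sh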
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins5682679076038970556.sh
+ echo Setting up flink cluster
Setting up flink cluster
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins3643620179752773783.sh
+ cd <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/.test-infra/dataproc/>
+ ./create_flink_cluster.sh
+ GCLOUD_ZONE=us-central1-a
+ DATAPROC_VERSION=1.2
+ MASTER_NAME=beam-loadtests-python-gbk-flink-batch-6-m
+ INIT_ACTIONS_FOLDER_NAME=init-actions
+ FLINK_INIT=gs://beam-flink-cluster/init-actions/flink.sh
+ BEAM_INIT=gs://beam-flink-cluster/init-actions/beam.sh
+ DOCKER_INIT=gs://beam-flink-cluster/init-actions/docker.sh
+ FLINK_LOCAL_PORT=8081
+ FLINK_TASKMANAGER_SLOTS=1
+ TASK_MANAGER_MEM=10240
+ YARN_APPLICATION_MASTER=
+ main
+ upload_init_actions
+ echo 'Uploading initialization actions to GCS bucket: gs://beam-flink-cluster'
Uploading initialization actions to GCS bucket: gs://beam-flink-cluster
+ gsutil cp -r init-actions/beam.sh init-actions/docker.sh init-actions/flink.sh gs://beam-flink-cluster/init-actions
Copying file://init-actions/beam.sh [Content-Type=text/x-sh]...
Copying file://init-actions/docker.sh [Content-Type=text/x-sh]...
Copying file://init-actions/flink.sh [Content-Type=text/x-sh]...
Operation completed over 3 objects/13.1 KiB.
+ create_cluster
+ local metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.7.0/flink-1.7.0-bin-hadoop28-scala_2.11.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1
+ [[ -n gcr.io/apache-beam-testing/beam_portability/python:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/python:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_workers=6
+ gcloud dataproc clusters create beam-loadtests-python-gbk-flink-batch-6 --num-workers=6 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.7.0/flink-1.7.0-bin-hadoop28-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/python:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/44dd4f66-7966-362e-b418-164dcf3c0f91].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
...done.
ERROR: (gcloud.dataproc.clusters.create) Operation [projects/apache-beam-testing/regions/global/operations/44dd4f66-7966-362e-b418-164dcf3c0f91] failed: Initialization action failed. Failed action 'gs://beam-flink-cluster/init-actions/docker.sh', see output in: gs://dataproc-6c5fbcbb-a2de-406e-9cf7-8c1ce0b6a604-us/google-cloud-dataproc-metainfo/19bb949c-b088-4ecf-8a17-e1205c3eeba7/beam-loadtests-python-gbk-flink-batch-6-w-5/dataproc-initialization-script-0_output.
Build step 'Execute shell' marked build as failure
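
To see why the docker.sh initialization action failed, the output object named in the error above can be read directly (a small sketch using the path from the error message; it assumes the reader has read access to the apache-beam-testing Dataproc staging bucket):

gsutil cat gs://dataproc-6c5fbcbb-a2de-406e-9cf7-8c1ce0b6a604-us/google-cloud-dataproc-metainfo/19bb949c-b088-4ecf-8a17-e1205c3eeba7/beam-loadtests-python-gbk-flink-batch-6-w-5/dataproc-initialization-script-0_output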

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_LoadTests_Python_GBK_Flink_Batch #7

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/7/display/redirect?page=changes>

