Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2021/04/07 08:52:34 UTC

Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #133

See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/133/display/redirect?page=changes>

Changes:

[heejong] [BEAM-12051] fix target/go-licenses: no such file or directory

[MATTHEW.Ouyang] [BEAM-12059] include literal T in DATETIME format

[Kyle Weaver] [BEAM-11483] Ignore windowed GBK tests in Spark portable streaming.

[suztomo] protobuf-java to be in-line with libraries-bom 16.3.0

[suztomo] Removing unused variable google_auth_version

[Kyle Weaver] [BEAM-12095] Fix Spark job server in uber jar path as well.

[kawaigin] [BEAM-12096] Attempt to fix flaky test

[kawaigin] Added logging of potential ImportError

[kawaigin] Use PropertyMock to replace the global singleton current_env()

[noreply] [BEAM-7372] Remove dead py<3.6 paths (#14436)

[kawaigin] Changed warning logs about not in REPL env to error level and fixed a

[noreply] [BEAM-9547] Raise WontImplementError for a few more operations (#14330)

[noreply] [BEAM-11544] BQML pattern (#13644)

[noreply] [BEAM-11574] Enable cross-language integration tests on Dataflow

[noreply] [BEAM-11585] Select.flattenedSchema doesn't flatten nested array fields

[noreply] Updating Go tests on PR template. (#14442)

[noreply] [BEAM-7372] cleanup codes for py2 compatibility from

[noreply] Merge pull request #14388 from [BEAM-7372] remove codes for py2

[noreply] Merge pull request #14365 from [BEAM-10884] - Adding tests to


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-1 (beam) in workspace <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.7.4'
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 3216fcb25287448dca3e78a2fd48aee9ac6422a3 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 3216fcb25287448dca3e78a2fd48aee9ac6422a3 # timeout=10
Commit message: "Merge pull request #14412 from [BEAM-8696] protobuf-java 3.14.0 in line with libraries BOM 16.3.0"
 > git rev-list --no-walk bcced0cf3202829eed2152a8eeafaa0e159645e6 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content 
JOB_SERVER_IMAGE=gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest
CLUSTER_NAME=beam-loadtests-go-combine-flink-batch-133
DETACHED_MODE=true
HARNESS_IMAGES_TO_PULL=gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest
FLINK_NUM_WORKERS=5
FLINK_DOWNLOAD_URL=https://archive.apache.org/dist/flink/flink-1.10.1/flink-1.10.1-bin-scala_2.11.tgz
GCS_BUCKET=gs://beam-flink-cluster
HADOOP_DOWNLOAD_URL=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-9.0/flink-shaded-hadoop-2-uber-2.8.3-9.0.jar
FLINK_TASKMANAGER_SLOTS=1
ARTIFACTS_DIR=gs://beam-flink-cluster/beam-loadtests-go-combine-flink-batch-133
GCLOUD_ZONE=us-central1-a

[EnvInject] - Variables injected successfully.
[beam_LoadTests_Go_Combine_Flink_Batch] $ /bin/bash -xe /tmp/jenkins632093707593018239.sh
+ echo Setting up flink cluster
Setting up flink cluster
[beam_LoadTests_Go_Combine_Flink_Batch] $ /bin/bash -xe /tmp/jenkins237918862635725594.sh
+ cd <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/.test-infra/dataproc>
+ ./flink_cluster.sh create
+ GCLOUD_ZONE=us-central1-a
+ DATAPROC_VERSION=1.2
+ MASTER_NAME=beam-loadtests-go-combine-flink-batch-133-m
+ INIT_ACTIONS_FOLDER_NAME=init-actions
+ FLINK_INIT=gs://beam-flink-cluster/init-actions/flink.sh
+ BEAM_INIT=gs://beam-flink-cluster/init-actions/beam.sh
+ DOCKER_INIT=gs://beam-flink-cluster/init-actions/docker.sh
+ FLINK_LOCAL_PORT=8081
+ FLINK_TASKMANAGER_SLOTS=1
+ YARN_APPLICATION_MASTER=
+ create
+ upload_init_actions
+ echo 'Uploading initialization actions to GCS bucket: gs://beam-flink-cluster'
Uploading initialization actions to GCS bucket: gs://beam-flink-cluster
+ gsutil cp -r init-actions/beam.sh init-actions/docker.sh init-actions/flink.sh gs://beam-flink-cluster/init-actions
Copying file://init-actions/beam.sh [Content-Type=text/x-sh]...
Copying file://init-actions/docker.sh [Content-Type=text/x-sh]...
Copying file://init-actions/flink.sh [Content-Type=text/x-sh]...
/ [3 files][ 13.5 KiB/ 13.5 KiB]
Operation completed over 3 objects/13.5 KiB.                                     
+ create_cluster
+ local metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.10.1/flink-1.10.1-bin-scala_2.11.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1,
+ metadata+=hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-9.0/flink-shaded-hadoop-2-uber-2.8.3-9.0.jar
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_****s=6
+ gcloud dataproc clusters create beam-loadtests-go-combine-flink-batch-133 --region=global --num-****s=6 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.10.1/flink-1.10.1-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-9.0/flink-shaded-hadoop-2-uber-2.8.3-9.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/8b11d5d7-2120-3849-ad49-16d01c827b00].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
....................................done.
ERROR: (gcloud.dataproc.clusters.create) Operation [projects/apache-beam-testing/regions/global/operations/8b11d5d7-2120-3849-ad49-16d01c827b00] failed: Multiple Errors:
 - Initialization action timed out. Failed action 'gs://beam-flink-cluster/init-actions/flink.sh', see output in: gs://dataproc-6c5fbcbb-a2de-406e-9cf7-8c1ce0b6a604-us/google-cloud-dataproc-metainfo/5b1a5cf7-69db-4082-b7ed-6a2c7cb8fc9e/beam-loadtests-go-combine-flink-batch-133-m/dataproc-initialization-script-2_output
 - Initialization action timed out. Failed action 'gs://beam-flink-cluster/init-actions/flink.sh', see output in: gs://dataproc-6c5fbcbb-a2de-406e-9cf7-8c1ce0b6a604-us/google-cloud-dataproc-metainfo/5b1a5cf7-69db-4082-b7ed-6a2c7cb8fc9e/beam-loadtests-go-combine-flink-batch-133-w-0/dataproc-initialization-script-2_output
 - Initialization action timed out. Failed action 'gs://beam-flink-cluster/init-actions/flink.sh', see output in: gs://dataproc-6c5fbcbb-a2de-406e-9cf7-8c1ce0b6a604-us/google-cloud-dataproc-metainfo/5b1a5cf7-69db-4082-b7ed-6a2c7cb8fc9e/beam-loadtests-go-combine-flink-batch-133-w-1/dataproc-initialization-script-2_output
 - Initialization action timed out. Failed action 'gs://beam-flink-cluster/init-actions/flink.sh', see output in: gs://dataproc-6c5fbcbb-a2de-406e-9cf7-8c1ce0b6a604-us/google-cloud-dataproc-metainfo/5b1a5cf7-69db-4082-b7ed-6a2c7cb8fc9e/beam-loadtests-go-combine-flink-batch-133-w-5/dataproc-initialization-script-2_output.
Build step 'Execute shell' marked build as failure
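
Triage note: the initialization-action output referenced in the errors above is a plain text object in GCS, so it can be read directly with gsutil (assuming read access to the apache-beam-testing staging bucket). A minimal sketch for the master node of this run, using the path reported in the first error:

# read the flink.sh init-action log for the master node of build #133
gsutil cat gs://dataproc-6c5fbcbb-a2de-406e-9cf7-8c1ce0b6a604-us/google-cloud-dataproc-metainfo/5b1a5cf7-69db-4082-b7ed-6a2c7cb8fc9e/beam-loadtests-go-combine-flink-batch-133-m/dataproc-initialization-script-2_output

The same command against the -w-* paths listed above covers the worker nodes.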

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_LoadTests_Go_Combine_Flink_Batch #136

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/136/display/redirect?page=changes>


---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #135

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/135/display/redirect?page=changes>

Changes:

[fabien.caylus] [BEAM-12012] Add API key & token authentication in ElasticsearchIO

[fabien.caylus] Simplify arguments checks

[Boyuan Zhang] Change PubSubSource and PubSubSink translation to avoid special

[Andrew Pilloud] Complex Type Passthrough Test

[Andrew Pilloud] Don't use base types in BeamCalcRel

[Kyle Weaver] [BEAM-10925] Refactor ZetaSqlJavaUdfTypeTest.

[Andrew Pilloud] Use correct schema geters, enforce types

[Boyuan Zhang] SDF bounded wrapper returns None when any exception happen in the

[Steve Niemitz] [BEAM-12126] Fix DirectRunner not respecting use_deprecated_reads

[randomstep] [BEAM-12092] Bump jedis to 3.5.2

[noreply] [BEAM-11227] Try reverting #14295: Moving from vendored gRPC 1.26 to

[noreply] Merge pull request #14446 from [BEAM-10854] Fix PeriodicImpulse for

[noreply] Turn on mpyp checks for filesystem (#14425)

[Andrew Pilloud] Rename functions, add comments

[noreply] [BEAM-12112] Disable streaming mode for PORTABILITY_BATCH (#14452)

[noreply] [BEAM-9547] Implementations for a few more DataFrame operations (#14362)

[heejong] [BEAM-12141] Print sha256 and size when downloading artifacts via

[noreply] [BEAM-12128] replace usage of snippets_test_py3.py to snippets_test.py


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-1 (beam) in workspace <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.7.4'
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 974c2de8a45fbe6dae9cf3ad4b1d6a2327f0b9a3 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 974c2de8a45fbe6dae9cf3ad4b1d6a2327f0b9a3 # timeout=10
Commit message: "Merge pull request #14492 from ihji/BEAM-12141"
 > git rev-list --no-walk 572a99bab07e53e043887243e2b1e69120563be5 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content 
JOB_SERVER_IMAGE=gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest
CLUSTER_NAME=beam-loadtests-go-combine-flink-batch-135
DETACHED_MODE=true
HARNESS_IMAGES_TO_PULL=gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest
FLINK_NUM_WORKERS=5
FLINK_DOWNLOAD_URL=https://archive.apache.org/dist/flink/flink-1.10.1/flink-1.10.1-bin-scala_2.11.tgz
GCS_BUCKET=gs://beam-flink-cluster
HADOOP_DOWNLOAD_URL=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-9.0/flink-shaded-hadoop-2-uber-2.8.3-9.0.jar
FLINK_TASKMANAGER_SLOTS=1
ARTIFACTS_DIR=gs://beam-flink-cluster/beam-loadtests-go-combine-flink-batch-135
GCLOUD_ZONE=us-central1-a

[EnvInject] - Variables injected successfully.
[beam_LoadTests_Go_Combine_Flink_Batch] $ /bin/bash -xe /tmp/jenkins794200582965880255.sh
+ echo Setting up flink cluster
Setting up flink cluster
[beam_LoadTests_Go_Combine_Flink_Batch] $ /bin/bash -xe /tmp/jenkins1793963260170852462.sh
+ cd <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/.test-infra/dataproc>
+ ./flink_cluster.sh create
+ GCLOUD_ZONE=us-central1-a
+ DATAPROC_VERSION=1.2
+ MASTER_NAME=beam-loadtests-go-combine-flink-batch-135-m
+ INIT_ACTIONS_FOLDER_NAME=init-actions
+ FLINK_INIT=gs://beam-flink-cluster/init-actions/flink.sh
+ BEAM_INIT=gs://beam-flink-cluster/init-actions/beam.sh
+ DOCKER_INIT=gs://beam-flink-cluster/init-actions/docker.sh
+ FLINK_LOCAL_PORT=8081
+ FLINK_TASKMANAGER_SLOTS=1
+ YARN_APPLICATION_MASTER=
+ create
+ upload_init_actions
+ echo 'Uploading initialization actions to GCS bucket: gs://beam-flink-cluster'
Uploading initialization actions to GCS bucket: gs://beam-flink-cluster
+ gsutil cp -r init-actions/beam.sh init-actions/docker.sh init-actions/flink.sh gs://beam-flink-cluster/init-actions
Copying file://init-actions/beam.sh [Content-Type=text/x-sh]...
Copying file://init-actions/docker.sh [Content-Type=text/x-sh]...
Copying file://init-actions/flink.sh [Content-Type=text/x-sh]...
/ [3 files][ 13.5 KiB/ 13.5 KiB]
Operation completed over 3 objects/13.5 KiB.                                     
+ create_cluster
+ local metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.10.1/flink-1.10.1-bin-scala_2.11.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1,
+ metadata+=hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-9.0/flink-shaded-hadoop-2-uber-2.8.3-9.0.jar
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_****s=6
+ gcloud dataproc clusters create beam-loadtests-go-combine-flink-batch-135 --region=global --num-****s=6 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.10.1/flink-1.10.1-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-9.0/flink-shaded-hadoop-2-uber-2.8.3-9.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/97fa3ea6-10e1-3d03-a675-baa9946c9652].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
....................................done.
ERROR: (gcloud.dataproc.clusters.create) Operation [projects/apache-beam-testing/regions/global/operations/97fa3ea6-10e1-3d03-a675-baa9946c9652] failed: Initialization action timed out. Failed action 'gs://beam-flink-cluster/init-actions/flink.sh', see output in: gs://dataproc-6c5fbcbb-a2de-406e-9cf7-8c1ce0b6a604-us/google-cloud-dataproc-metainfo/4e2cb3a6-de2d-4b41-8a15-58f677a3acf3/beam-loadtests-go-combine-flink-batch-135-w-5/dataproc-initialization-script-2_output.
Build step 'Execute shell' marked build as failure
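
Triage note: the create call above failed because flink.sh exceeded Dataproc's default initialization-action timeout (10 minutes at the time of writing). If the installed gcloud supports it, the timeout can be raised on the create call; a hedged sketch, reusing the flags already shown in the command above (the 30m value is an illustrative assumption, not a tested setting):

gcloud dataproc clusters create beam-loadtests-go-combine-flink-batch-135 \
  --region=global \
  --initialization-action-timeout=30m \
  # ...remaining --initialization-actions/--metadata/--image-version/--zone flags unchanged from the command above

flink_cluster.sh would need to pass this flag through, which is an assumption about that script rather than something shown in this log.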

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #134

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/134/display/redirect?page=changes>

Changes:

[noreply] Update WriteToBigQuery multiple destinations doc

[Ismaël Mejía] [BEAM-11948] Drop support for Flink 1.8 and 1.9

[randomstep] [BEAM-11900] Bump libthrift to 0.14.0

[randomstep] [BEAM-11900] Bump libthrift to 0.14.1

[Kyle Weaver] [BEAM-10925] Java UDF type tests for input refs.

[randomstep] [BEAM-12066] Bump classgraph to 4.8.104

[Kyle Weaver] [BEAM-12102] Catch and rethrow Calcite CannotPlanException.

[Kyle Weaver] [BEAM-12095] Add unit tests for path_to_beam_jar(artifact_id).

[kawaigin] [BEAM-10708] Read/Write Intermediate PCollections

[kawaigin] Fix lint

[kawaigin] Fix based on comments

[kawaigin] Added clear method to InMemoryCache because tests might be flaky when a

[noreply] [BEAM-11961] InfluxDBIOIT failing with unauthorized error (#14215)

[noreply] Add DataFrame API changes to CHANGES.md (#14454)

[noreply] Fix: Allow BigQuery tableIds with hyphens (#14125)

[noreply] Merge pull request #14394 from [BEAM-11277] Add method to check if two

[kawaigin] Avoid using interactive_environment module in the test because

[noreply] [BEAM-449] Support PCollectionList in PAssert (#14322)

[kawaigin] [BEAM-11045] Avoid broken deps

[kawaigin] Added back the setUp as additional cleanup routine before each test.

[noreply] [BEAM-11742] Use pyarrow schema instead column names when creating

[noreply] [BEAM-7372] remove usage of future package and unnecessary builtins

[noreply] [BEAM-7372] cleanup codes for py2 compatibility from


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-1 (beam) in workspace <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.7.4'
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 572a99bab07e53e043887243e2b1e69120563be5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 572a99bab07e53e043887243e2b1e69120563be5 # timeout=10
Commit message: "Merge pull request #14203: [BEAM-11948] Drop support for Flink 1.8 and 1.9"
 > git rev-list --no-walk 3216fcb25287448dca3e78a2fd48aee9ac6422a3 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content 
JOB_SERVER_IMAGE=gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest
CLUSTER_NAME=beam-loadtests-go-combine-flink-batch-134
DETACHED_MODE=true
HARNESS_IMAGES_TO_PULL=gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest
FLINK_NUM_WORKERS=5
FLINK_DOWNLOAD_URL=https://archive.apache.org/dist/flink/flink-1.10.1/flink-1.10.1-bin-scala_2.11.tgz
GCS_BUCKET=gs://beam-flink-cluster
HADOOP_DOWNLOAD_URL=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-9.0/flink-shaded-hadoop-2-uber-2.8.3-9.0.jar
FLINK_TASKMANAGER_SLOTS=1
ARTIFACTS_DIR=gs://beam-flink-cluster/beam-loadtests-go-combine-flink-batch-134
GCLOUD_ZONE=us-central1-a

[EnvInject] - Variables injected successfully.
[beam_LoadTests_Go_Combine_Flink_Batch] $ /bin/bash -xe /tmp/jenkins2983638859365455649.sh
+ echo Setting up flink cluster
Setting up flink cluster
[beam_LoadTests_Go_Combine_Flink_Batch] $ /bin/bash -xe /tmp/jenkins2880989016862136895.sh
+ cd <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/.test-infra/dataproc>
+ ./flink_cluster.sh create
+ GCLOUD_ZONE=us-central1-a
+ DATAPROC_VERSION=1.2
+ MASTER_NAME=beam-loadtests-go-combine-flink-batch-134-m
+ INIT_ACTIONS_FOLDER_NAME=init-actions
+ FLINK_INIT=gs://beam-flink-cluster/init-actions/flink.sh
+ BEAM_INIT=gs://beam-flink-cluster/init-actions/beam.sh
+ DOCKER_INIT=gs://beam-flink-cluster/init-actions/docker.sh
+ FLINK_LOCAL_PORT=8081
+ FLINK_TASKMANAGER_SLOTS=1
+ YARN_APPLICATION_MASTER=
+ create
+ upload_init_actions
+ echo 'Uploading initialization actions to GCS bucket: gs://beam-flink-cluster'
Uploading initialization actions to GCS bucket: gs://beam-flink-cluster
+ gsutil cp -r init-actions/beam.sh init-actions/docker.sh init-actions/flink.sh gs://beam-flink-cluster/init-actions
Copying file://init-actions/beam.sh [Content-Type=text/x-sh]...
Copying file://init-actions/docker.sh [Content-Type=text/x-sh]...
Copying file://init-actions/flink.sh [Content-Type=text/x-sh]...
/ [3 files][ 13.5 KiB/ 13.5 KiB]
Operation completed over 3 objects/13.5 KiB.                                     
+ create_cluster
+ local metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.10.1/flink-1.10.1-bin-scala_2.11.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1,
+ metadata+=hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-9.0/flink-shaded-hadoop-2-uber-2.8.3-9.0.jar
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_****s=6
+ gcloud dataproc clusters create beam-loadtests-go-combine-flink-batch-134 --region=global --num-****s=6 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.10.1/flink-1.10.1-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-9.0/flink-shaded-hadoop-2-uber-2.8.3-9.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/eeb9cb08-f5ca-328e-992b-b6cfb3a97a2d].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
....................................done.
ERROR: (gcloud.dataproc.clusters.create) Operation [projects/apache-beam-testing/regions/global/operations/eeb9cb08-f5ca-328e-992b-b6cfb3a97a2d] failed: Multiple Errors:
 - Initialization action timed out. Failed action 'gs://beam-flink-cluster/init-actions/flink.sh', see output in: gs://dataproc-6c5fbcbb-a2de-406e-9cf7-8c1ce0b6a604-us/google-cloud-dataproc-metainfo/d9bcc558-b89c-4098-bbe7-d0276bff10a2/beam-loadtests-go-combine-flink-batch-134-m/dataproc-initialization-script-2_output
 - Initialization action timed out. Failed action 'gs://beam-flink-cluster/init-actions/flink.sh', see output in: gs://dataproc-6c5fbcbb-a2de-406e-9cf7-8c1ce0b6a604-us/google-cloud-dataproc-metainfo/d9bcc558-b89c-4098-bbe7-d0276bff10a2/beam-loadtests-go-combine-flink-batch-134-w-0/dataproc-initialization-script-2_output
 - Initialization action timed out. Failed action 'gs://beam-flink-cluster/init-actions/flink.sh', see output in: gs://dataproc-6c5fbcbb-a2de-406e-9cf7-8c1ce0b6a604-us/google-cloud-dataproc-metainfo/d9bcc558-b89c-4098-bbe7-d0276bff10a2/beam-loadtests-go-combine-flink-batch-134-w-1/dataproc-initialization-script-2_output
 - Initialization action timed out. Failed action 'gs://beam-flink-cluster/init-actions/flink.sh', see output in: gs://dataproc-6c5fbcbb-a2de-406e-9cf7-8c1ce0b6a604-us/google-cloud-dataproc-metainfo/d9bcc558-b89c-4098-bbe7-d0276bff10a2/beam-loadtests-go-combine-flink-batch-134-w-5/dataproc-initialization-script-2_output.
Build step 'Execute shell' marked build as failure
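
Triage note: beyond the per-node script output, the failed operation itself can be inspected for timing and error detail. A minimal sketch, assuming the reader has gcloud access to the apache-beam-testing project:

# show details of the failed cluster-create operation for build #134
gcloud dataproc operations describe \
  projects/apache-beam-testing/regions/global/operations/eeb9cb08-f5ca-328e-992b-b6cfb3a97a2d

If a half-created cluster was left behind, it would also need manual cleanup, e.g. gcloud dataproc clusters delete beam-loadtests-go-combine-flink-batch-134 --region=global --quiet; whether flink_cluster.sh provides a matching delete path is an assumption here.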

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org