Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/06/25 13:11:55 UTC

Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #13

See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/13/display/redirect?page=changes>

Changes:

[zyichi] Fix errors in py sdk io utils and add unit tests

[gunnar.schulze] [BEAM-7572] ApproximateUnique.Globally and ApproximateUnique.PerKey

[lukasz.gajowy] [BEAM-7307] Fix load tests failures due to mistakes in PR 8881

[markliu] [BEAM-7598] Do not build Python tar file in run_integration_test.sh

[dcavazos] Add Python snippet for ParDo transform

[aaltay] BEAM-7141: add key value timer callback (#8739)

[iemejia] [BEAM-7606] Fix JDBC time conversion tests

[robertwb] Update portable schema representation and java SchemaTranslation (#8853)

------------------------------------------
[...truncated 51.22 KB...]
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :model:fn-execution:extractProto
> Task :model:job-management:extractProto
> Task :runners:flink:1.7:job-server:processResources NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :runners:flink:1.7:job-server-container:dockerClean UP-TO-DATE
> Task :runners:flink:1.7:processResources
> Task :model:job-management:processResources
> Task :model:fn-execution:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :sdks:java:build-tools:compileJava FROM-CACHE
> Task :sdks:java:build-tools:processResources
> Task :sdks:java:build-tools:classes
> Task :sdks:java:core:processResources
> Task :sdks:java:build-tools:jar
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :model:pipeline:generateProto
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:extractIncludeProto
> Task :model:job-management:generateProto
> Task :model:fn-execution:generateProto
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :model:pipeline:shadowJar
> Task :model:job-management:shadowJar
> Task :model:fn-execution:shadowJar
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :vendor:sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar
> Task :runners:core-construction-java:jar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :vendor:sdks-java-extensions-protobuf:shadowJar
> Task :sdks:java:harness:shadowJar
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar

> Task :runners:flink:1.7:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:flink:1.7:classes
> Task :runners:flink:1.7:jar
> Task :runners:flink:1.7:job-server:compileJava NO-SOURCE
> Task :runners:flink:1.7:job-server:classes UP-TO-DATE
> Task :runners:flink:1.7:job-server:shadowJar
> Task :runners:flink:1.7:job-server-container:copyDockerfileDependencies
> Task :runners:flink:1.7:job-server-container:dockerPrepare
> Task :runners:flink:1.7:job-server-container:docker

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD SUCCESSFUL in 1m 9s
51 actionable tasks: 36 executed, 14 from cache, 1 up-to-date

Publishing build scan...
https://gradle.com/s/p2dx4y4qny3jw

[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins4043150821393621796.sh
+ echo 'Tagging Flink Job Server'\''s image...'
Tagging Flink Job Server's image...
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins214588678489836445.sh
+ docker tag gcr.io/apache-beam-testing/beam_portability/flink-job-server gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins8358274717040240210.sh
+ echo 'Pushing Flink Job Server'\''s image...'
Pushing Flink Job Server's image...
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins5127801992561314868.sh
+ docker push gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest
The push refers to repository [gcr.io/apache-beam-testing/beam_portability/flink-job-server]
76c0949ed127: Preparing
e0ea9044d1d4: Preparing
ffcf63adeb5d: Preparing
f7d12d471667: Preparing
f350d0146bb3: Preparing
e38df31d449c: Preparing
af5ae4841776: Preparing
b17cc31e431b: Preparing
12cb127eee44: Preparing
604829a174eb: Preparing
fbb641a8b943: Preparing
12cb127eee44: Waiting
af5ae4841776: Waiting
604829a174eb: Waiting
b17cc31e431b: Waiting
fbb641a8b943: Waiting
e38df31d449c: Waiting
f350d0146bb3: Layer already exists
f7d12d471667: Layer already exists
e38df31d449c: Layer already exists
af5ae4841776: Layer already exists
b17cc31e431b: Layer already exists
12cb127eee44: Layer already exists
604829a174eb: Layer already exists
fbb641a8b943: Layer already exists
76c0949ed127: Pushed
ffcf63adeb5d: Pushed
e0ea9044d1d4: Pushed
latest: digest: sha256:5a1529357dd68d017bdd846d5961a784ed8e5b2604e3a1eab4e12458524d2784 size: 2632
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content 
JOB_SERVER_IMAGE=gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest
CLUSTER_NAME=beam-loadtests-python-gbk-flink-batch-13
DETACHED_MODE=true
HARNESS_IMAGES_TO_PULL=gcr.io/apache-beam-testing/beam_portability/python:latest
FLINK_NUM_WORKERS=5
FLINK_DOWNLOAD_URL=https://archive.apache.org/dist/flink/flink-1.7.0/flink-1.7.0-bin-hadoop28-scala_2.11.tgz
GCS_BUCKET=gs://beam-flink-cluster
FLINK_TASKMANAGER_SLOTS=1
ARTIFACTS_DIR=gs://beam-flink-cluster/beam-loadtests-python-gbk-flink-batch-13
GCLOUD_ZONE=us-central1-a

[EnvInject] - Variables injected successfully.
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins8998007878906450872.sh
+ echo Setting up flink cluster
Setting up flink cluster
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins11794166681641648.sh
+ cd <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/.test-infra/dataproc/>
+ ./create_flink_cluster.sh
+ GCLOUD_ZONE=us-central1-a
+ DATAPROC_VERSION=1.2
+ MASTER_NAME=beam-loadtests-python-gbk-flink-batch-13-m
+ INIT_ACTIONS_FOLDER_NAME=init-actions
+ FLINK_INIT=gs://beam-flink-cluster/init-actions/flink.sh
+ BEAM_INIT=gs://beam-flink-cluster/init-actions/beam.sh
+ DOCKER_INIT=gs://beam-flink-cluster/init-actions/docker.sh
+ FLINK_LOCAL_PORT=8081
+ FLINK_TASKMANAGER_SLOTS=1
+ TASK_MANAGER_MEM=10240
+ YARN_APPLICATION_MASTER=
+ main
+ upload_init_actions
+ echo 'Uploading initialization actions to GCS bucket: gs://beam-flink-cluster'
Uploading initialization actions to GCS bucket: gs://beam-flink-cluster
+ gsutil cp -r init-actions/beam.sh init-actions/docker.sh init-actions/flink.sh gs://beam-flink-cluster/init-actions
Copying file://init-actions/beam.sh [Content-Type=text/x-sh]...
Copying file://init-actions/beam.sh [Content-Type=text/x-sh]...
Copying file://init-actions/docker.sh [Content-Type=text/x-sh]...
Copying file://init-actions/flink.sh [Content-Type=text/x-sh]...
Operation completed over 3 objects/13.1 KiB.
+ create_cluster
+ local metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.7.0/flink-1.7.0-bin-hadoop28-scala_2.11.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1
+ [[ -n gcr.io/apache-beam-testing/beam_portability/python:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/python:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_workers=6
+ gcloud dataproc clusters create beam-loadtests-python-gbk-flink-batch-13 --num-workers=6 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.7.0/flink-1.7.0-bin-hadoop28-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/python:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/1172fd5c-0ab0-38c6-8eaa-ddd983e01be7].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
...done.
Created [https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/global/clusters/beam-loadtests-python-gbk-flink-batch-13] Cluster placed in zone [us-central1-a].
+ get_leader
+ local i=0
+ local application_ids
+ local application_masters
+ echo 'Yarn Applications'
Yarn Applications
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-gbk-flink-batch-13-m '--command=yarn application -list'
++ grep beam-loadtests-python-gbk-flink-batch-13
Warning: Permanently added 'compute.886665643640726171' (ECDSA) to the list of known hosts.
19/06/25 13:11:49 INFO client.RMProxy: Connecting to ResourceManager at beam-loadtests-python-gbk-flink-batch-13-m/10.128.0.179:8032
+ read line
+ echo

++ echo
++ sed 's/ .*//'
+ application_ids[$i]=
++ echo
++ sed 's/.*beam-loadtests-python-gbk-flink-batch-13/beam-loadtests-python-gbk-flink-batch-13/'
++ sed 's/ .*//'
+ application_masters[$i]=
+ i=1
+ read line
+ '[' 1 '!=' 1 ']'
+ YARN_APPLICATION_MASTER=
+ echo 'Using Yarn Application master: '
Using Yarn Application master: 
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest ]]
+ start_job_server
+ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-gbk-flink-batch-13-m '--command=sudo --user yarn docker run --detach --publish 8099:8099 --publish 8098:8098 --publish 8097:8097 --volume ~/.config/gcloud:/root/.config/gcloud gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest --flink-master-url= --artifacts-dir=gs://beam-flink-cluster/beam-loadtests-python-gbk-flink-batch-13'
1476c3515f39241a776047f9939bb81a29749376188360c14f3d7cae3ffd3251
+ start_tunnel
++ gcloud compute ssh --quiet --zone=us-central1-a yarn@beam-loadtests-python-gbk-flink-batch-13-m '--command=curl -s "http:///jobmanager/config"'
+ local job_server_config=
+ local key=jobmanager.rpc.port
++ echo
++ cut -d : -f1
+ local yarn_application_master_host=
++ echo
++ python -c 'import sys, json; print [ e['\''value'\''] for e in json.load(sys.stdin) if e['\''key'\''] == u'\''jobmanager.rpc.port'\''][0]'
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/usr/lib/python2.7/json/__init__.py", line 291, in load
    **kw)
  File "/usr/lib/python2.7/json/__init__.py", line 339, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python2.7/json/decoder.py", line 364, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python2.7/json/decoder.py", line 382, in raw_decode
    raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded
+ local jobmanager_rpc_port=
++ [[ true == \t\r\u\e ]]
++ echo ' -Nf >& /dev/null'
+ local 'detached_mode_params= -Nf >& /dev/null'
++ [[ -n gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest ]]
++ echo '-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'job_server_ports_forwarding=-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'tunnel_command=gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-gbk-flink-batch-13-m -- -L 8081: -L :: -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080  -Nf >& /dev/null'
+ eval gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-gbk-flink-batch-13-m -- -L 8081: -L :: -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf '>&' /dev/null
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-gbk-flink-batch-13-m -- -L 8081: -L :: -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf
Build step 'Execute shell' marked build as failure
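The failure point is the Python traceback above: the YARN application master was never resolved (note the empty `application_masters` entry and the resulting `http:///jobmanager/config` URL), so an empty string was piped into the Python 2 one-liner that looks up `jobmanager.rpc.port`, and `json.load` raised `ValueError` on empty input. A minimal Python 3 sketch of the same lookup, hardened to return `None` on an empty or key-less response instead of raising (the function name and fallback behavior are illustrative, not from `create_flink_cluster.sh`):

```python
import json


def jobmanager_rpc_port(config_json, key="jobmanager.rpc.port"):
    """Extract a value from Flink's /jobmanager/config REST response.

    The response is a JSON array of {"key": ..., "value": ...} entries.
    Returns None when the response is empty or the key is absent,
    instead of raising the way the in-script one-liner did when curl
    produced no output.
    """
    if not config_json.strip():
        return None
    entries = json.loads(config_json)
    for entry in entries:
        if entry.get("key") == key:
            return entry.get("value")
    return None


# An empty response (as in this failed build) yields None, not a traceback.
print(jobmanager_rpc_port(""))
print(jobmanager_rpc_port('[{"key": "jobmanager.rpc.port", "value": "6123"}]'))
```

With a guard like this the script could fail fast with a clear message ("job server config unavailable") rather than continuing on to build a tunnel command with empty host and port substitutions, as happened above.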

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_LoadTests_Python_GBK_Flink_Batch #14

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/14/display/redirect?page=changes>

