Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/12/09 13:03:48 UTC

Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #178

See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/178/display/redirect?page=changes>

Changes:

[michal.walenia] [BEAM-8895] Add BigQuery table name sanitization to BigQueryIOIT

[michal.walenia] [BEAM-8918] Split batch BQIOIT into avro and json using tests


------------------------------------------
[...truncated 44.90 KB...]
5ab1d57a3e86: Waiting
e4b20fcc48f4: Preparing
cbdd44c5cbac: Waiting
6f029f7fa589: Waiting
d7e1d9fd3a06: Waiting
5f3a5adb8e97: Waiting
73bfa217d66f: Waiting
91ecdd7165d3: Waiting
62df42ca4929: Pushed
83129c84d198: Pushed
009676ba5dcf: Pushed
5ab1d57a3e86: Pushed
4e8dc3d5a8aa: Pushed
c28383f3aac4: Layer already exists
22b7de8c8281: Layer already exists
6f029f7fa589: Layer already exists
cbdd44c5cbac: Layer already exists
2e517d68c391: Layer already exists
5f3a5adb8e97: Layer already exists
73bfa217d66f: Layer already exists
91ecdd7165d3: Layer already exists
e4b20fcc48f4: Layer already exists
d8d19cc2b50c: Pushed
dcacd5b10841: Pushed
01b0bcbff033: Pushed
d7e1d9fd3a06: Pushed
latest: digest: sha256:fdf0605113ff01561c5c16cbdc3902caec13625d9e46122dc1876604c8308fef size: 4110
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/gradlew> --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Pdocker-repository-root=gcr.io/apache-beam-testing/beam_portability -Pdocker-tag=latest :runners:flink:1.9:job-server-container:docker
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :runners:flink:1.9:copyResourcesOverrides NO-SOURCE
> Task :runners:flink:1.9:job-server:processResources NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :model:fn-execution:extractProto
> Task :model:job-management:extractProto
> Task :sdks:java:extensions:protobuf:extractProto
> Task :runners:flink:1.9:job-server-container:dockerClean UP-TO-DATE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :model:fn-execution:processResources
> Task :model:job-management:processResources
> Task :runners:flink:1.9:copySourceOverrides
> Task :runners:flink:1.9:copyTestResourcesOverrides NO-SOURCE
> Task :sdks:java:build-tools:compileJava FROM-CACHE
> Task :runners:flink:1.9:processResources
> Task :sdks:java:build-tools:processResources
> Task :sdks:java:build-tools:classes
> Task :sdks:java:core:processResources
> Task :sdks:java:build-tools:jar
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :model:pipeline:generateProto
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:extractIncludeProto
> Task :model:job-management:generateProto
> Task :model:fn-execution:generateProto
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :model:pipeline:shadowJar
> Task :model:job-management:shadowJar
> Task :model:fn-execution:shadowJar
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :vendor:sdks-java-extensions-protobuf:shadowJar
> Task :sdks:java:io:kafka:jar
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :runners:core-construction-java:jar
> Task :sdks:java:fn-execution:jar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar
> Task :runners:core-java:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :sdks:java:io:google-cloud-platform:jar
> Task :sdks:java:harness:shadowJar
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :runners:flink:1.9:compileJava FROM-CACHE
> Task :runners:flink:1.9:classes
> Task :runners:flink:1.9:jar
> Task :runners:flink:1.9:job-server:compileJava NO-SOURCE
> Task :runners:flink:1.9:job-server:classes UP-TO-DATE
> Task :runners:flink:1.9:job-server:shadowJar
> Task :runners:flink:1.9:job-server-container:copyDockerfileDependencies
> Task :runners:flink:1.9:job-server-container:dockerPrepare
> Task :runners:flink:1.9:job-server-container:docker

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD SUCCESSFUL in 1m 2s
58 actionable tasks: 40 executed, 17 from cache, 1 up-to-date

Publishing build scan...
https://scans.gradle.com/s/ds6e6kub25dla

[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins7538049303587255745.sh
+ echo 'Tagging image...'
Tagging image...
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins6033442197788811125.sh
+ docker tag gcr.io/apache-beam-testing/beam_portability/flink-job-server gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins323221892394726055.sh
+ echo 'Pushing image...'
Pushing image...
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins1981447329472621070.sh
+ docker push gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest
The push refers to repository [gcr.io/apache-beam-testing/beam_portability/flink-job-server]
ba3a6aafc351: Preparing
0cd1349412c5: Preparing
df26ae0e04ad: Preparing
2ee490fbc316: Preparing
b18043518924: Preparing
9a11244a7e74: Preparing
5f3a5adb8e97: Preparing
73bfa217d66f: Preparing
91ecdd7165d3: Preparing
e4b20fcc48f4: Preparing
9a11244a7e74: Waiting
e4b20fcc48f4: Waiting
91ecdd7165d3: Waiting
5f3a5adb8e97: Waiting
73bfa217d66f: Waiting
2ee490fbc316: Layer already exists
b18043518924: Layer already exists
5f3a5adb8e97: Layer already exists
9a11244a7e74: Layer already exists
73bfa217d66f: Layer already exists
91ecdd7165d3: Layer already exists
e4b20fcc48f4: Layer already exists
ba3a6aafc351: Pushed
df26ae0e04ad: Pushed
0cd1349412c5: Pushed
latest: digest: sha256:58a15b687c1085ab9aa1cad242d6bdaf93fe1b6463936610f5153c02bb40703a size: 2427
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content 
JOB_SERVER_IMAGE=gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest
CLUSTER_NAME=beam-loadtests-python-gbk-flink-batch-178
DETACHED_MODE=true
HARNESS_IMAGES_TO_PULL=gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest
FLINK_NUM_WORKERS=16
FLINK_DOWNLOAD_URL=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz
GCS_BUCKET=gs://beam-flink-cluster
HADOOP_DOWNLOAD_URL=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar
FLINK_TASKMANAGER_SLOTS=1
ARTIFACTS_DIR=gs://beam-flink-cluster/beam-loadtests-python-gbk-flink-batch-178
GCLOUD_ZONE=us-central1-a

[EnvInject] - Variables injected successfully.
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins2516560867547813214.sh
+ echo Setting up flink cluster
Setting up flink cluster
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins4612399426403944217.sh
+ cd <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/.test-infra/dataproc>
+ ./flink_cluster.sh create
+ GCLOUD_ZONE=us-central1-a
+ DATAPROC_VERSION=1.2
+ MASTER_NAME=beam-loadtests-python-gbk-flink-batch-178-m
+ INIT_ACTIONS_FOLDER_NAME=init-actions
+ FLINK_INIT=gs://beam-flink-cluster/init-actions/flink.sh
+ BEAM_INIT=gs://beam-flink-cluster/init-actions/beam.sh
+ DOCKER_INIT=gs://beam-flink-cluster/init-actions/docker.sh
+ FLINK_LOCAL_PORT=8081
+ FLINK_TASKMANAGER_SLOTS=1
+ TASK_MANAGER_MEM=10240
+ YARN_APPLICATION_MASTER=
+ create
+ upload_init_actions
+ echo 'Uploading initialization actions to GCS bucket: gs://beam-flink-cluster'
Uploading initialization actions to GCS bucket: gs://beam-flink-cluster
+ gsutil cp -r init-actions/beam.sh init-actions/docker.sh init-actions/flink.sh gs://beam-flink-cluster/init-actions
Copying file://init-actions/beam.sh [Content-Type=text/x-sh]...
Copying file://init-actions/docker.sh [Content-Type=text/x-sh]...
Copying file://init-actions/flink.sh [Content-Type=text/x-sh]...
Operation completed over 3 objects/13.4 KiB.
+ create_cluster
+ local metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1,
+ metadata+=hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar
+ [[ -n gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_workers=17
+ gcloud dataproc clusters create beam-loadtests-python-gbk-flink-batch-178 --region=global --num-workers=17 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest, --image-version=1.2 --zone=us-central1-a --quiet
ERROR: (gcloud.dataproc.clusters.create) INVALID_ARGUMENT: Insufficient 'CPUS' quota. Requested 72.0, available 2.0.
Build step 'Execute shell' marked build as failure
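
The 72 CPUs requested here follow from the cluster sizing earlier in this log: flink_cluster.sh asked for 17 Dataproc workers (FLINK_NUM_WORKERS=16 plus one) and a master node. The machine type is not shown in this log, so assuming the Dataproc default n1-standard-4 (4 vCPUs per node), a rough back-of-the-envelope check in Python:

    # Rough sizing check; the 4 vCPUs/node figure is an assumption
    # (n1-standard-4, the Dataproc default machine type).
    flink_workers = 16                    # FLINK_NUM_WORKERS
    dataproc_workers = flink_workers + 1  # --num-workers=17 in the gcloud command
    nodes = dataproc_workers + 1          # plus the Dataproc master node
    vcpus_per_node = 4                    # assumed n1-standard-4
    print(nodes * vcpus_per_node)         # 72, matching "Requested 72.0"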

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_LoadTests_Python_GBK_Flink_Batch #235

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/235/display/redirect?page=changes>




Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #234

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/234/display/redirect?page=changes>

Changes:

[iemejia] [BEAM-9236] Mark missing Schema based classes and methods as

[iemejia] [BEAM-9236] Remove unneeded schema related class FieldValueSetterFactory

[iemejia] [BEAM-9236] Remove unused schema related class FieldValueGetterFactory


------------------------------------------
[...truncated 268.96 KB...]
    value {
      string_value: "localhost:8099"
    }
  }
  fields {
    key: "beam:option:job_name:v1"
    value {
      string_value: "load_tests_Python_Flink_Batch_GBK_3_0203124159"
    }
  }
  fields {
    key: "beam:option:job_port:v1"
    value {
      string_value: "0"
    }
  }
  fields {
    key: "beam:option:job_server_timeout:v1"
    value {
      string_value: "60"
    }
  }
  fields {
    key: "beam:option:load_balance_bundles:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:no_auth:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:object_reuse:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:parallelism:v1"
    value {
      string_value: "5"
    }
  }
  fields {
    key: "beam:option:pipeline_type_check:v1"
    value {
      bool_value: true
    }
  }
  fields {
    key: "beam:option:profile_cpu:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_memory:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_sample_rate:v1"
    value {
      number_value: 1.0
    }
  }
  fields {
    key: "beam:option:project:v1"
    value {
      string_value: "apache-beam-testing"
    }
  }
  fields {
    key: "beam:option:retain_docker_containers:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:runtime_type_check:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:save_main_session:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:sdk_location:v1"
    value {
      string_value: "container"
    }
  }
  fields {
    key: "beam:option:sdk_worker_parallelism:v1"
    value {
      string_value: "1"
    }
  }
  fields {
    key: "beam:option:shutdown_sources_on_final_watermark:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:spark_master_url:v1"
    value {
      string_value: "local[4]"
    }
  }
  fields {
    key: "beam:option:spark_submit_uber_jar:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:streaming:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:type_check_strictness:v1"
    value {
      string_value: "DEFAULT_TO_ANY"
    }
  }
  fields {
    key: "beam:option:update:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:use_grpc_for_gcs:v1"
    value {
      bool_value: false
    }
  }
}
job_name: "job"
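
The fields above are the serialized portable pipeline options handed to the Flink job server. For orientation only (the load test's actual invocation is not shown in this excerpt), options of this shape are normally built in the Python SDK roughly like this:

    # Minimal sketch, not the load test's own code: a PipelineOptions object
    # mirroring a few of the serialized fields shown above.
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:8099',  # beam:option:job_endpoint:v1
        '--environment_type=DOCKER',
        '--parallelism=5',                # beam:option:parallelism:v1
        '--job_name=job',
    ])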

apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: b17ef5b6da8f9b77014ca279fc0bfb94)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:301)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:209)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:186)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:116)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:84)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:83)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
	... 16 more
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_e01_1580736655061_0001_01_000004 timed out.
	at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1149)
	at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:109)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:397)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:190)
	at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:74)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:152)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
	at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
	at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:170)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at akka.actor.Actor$class.aroundReceive(Actor.scala:517)
	at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
	at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
	at akka.actor.ActorCell.invoke(ActorCell.scala:561)
	at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
	at akka.dispatch.Mailbox.run(Mailbox.scala:225)
	at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
	at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

root: ERROR: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_e01_1580736655061_0001_01_000004 timed out.
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 76.201s

FAILED (errors=1)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 53

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 25s
5 actionable tasks: 3 executed, 2 up-to-date

Publishing build scan...
https://gradle.com/s/3y4b45rlxuxrq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



beam_LoadTests_Python_GBK_Flink_Batch - Build # 233 - Aborted

Posted by Apache Jenkins Server <je...@builds.apache.org>.
The Apache Jenkins build system has built beam_LoadTests_Python_GBK_Flink_Batch (build #233)

Status: Aborted

Check console output at https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/233/ to view the results.

beam_LoadTests_Python_GBK_Flink_Batch - Build # 232 - Aborted

Posted by Apache Jenkins Server <je...@builds.apache.org>.
The Apache Jenkins build system has built beam_LoadTests_Python_GBK_Flink_Batch (build #232)

Status: Aborted

Check console output at https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/232/ to view the results.

beam_LoadTests_Python_GBK_Flink_Batch - Build # 231 - Aborted

Posted by Apache Jenkins Server <je...@builds.apache.org>.
The Apache Jenkins build system has built beam_LoadTests_Python_GBK_Flink_Batch (build #231)

Status: Aborted

Check console output at https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/231/ to view the results.

Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #230

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/230/display/redirect?page=changes>

Changes:

[kmj] Update BQ Storage API documentation

[chadrik] [BEAM-7746] Silence a bunch of errors about "Cannot instantiate abstract

[mxm] [BEAM-9161] Ensure non-volatile access of field variables by processing

[github] Merge pull request #10680 from Indefinite retries to wait for a BQ Load

[chamikara] Fixes an issue where FileBasedSink may suppress exceptions.

[github] [BEAM-7847] enabled to generate SDK docs with Python3 (#10141)

[ankurgoenka] [BEAM-9220] Adding argument use_runner_v2 for dataflow unified worker

[suztomo] Linkage Checker 1.1.3


------------------------------------------
[...truncated 269.15 KB...]
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:job_endpoint:v1"
    value {
      string_value: "localhost:8099"
    }
  }
  fields {
    key: "beam:option:job_name:v1"
    value {
      string_value: "load_tests_Python_Flink_Batch_GBK_3_0130105745"
    }
  }
  fields {
    key: "beam:option:job_port:v1"
    value {
      string_value: "0"
    }
  }
  fields {
    key: "beam:option:job_server_timeout:v1"
    value {
      string_value: "60"
    }
  }
  fields {
    key: "beam:option:load_balance_bundles:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:no_auth:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:object_reuse:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:parallelism:v1"
    value {
      string_value: "5"
    }
  }
  fields {
    key: "beam:option:pipeline_type_check:v1"
    value {
      bool_value: true
    }
  }
  fields {
    key: "beam:option:profile_cpu:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_memory:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_sample_rate:v1"
    value {
      number_value: 1.0
    }
  }
  fields {
    key: "beam:option:project:v1"
    value {
      string_value: "apache-beam-testing"
    }
  }
  fields {
    key: "beam:option:retain_docker_containers:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:runtime_type_check:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:save_main_session:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:sdk_location:v1"
    value {
      string_value: "container"
    }
  }
  fields {
    key: "beam:option:sdk_worker_parallelism:v1"
    value {
      string_value: "1"
    }
  }
  fields {
    key: "beam:option:shutdown_sources_on_final_watermark:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:spark_master_url:v1"
    value {
      string_value: "local[4]"
    }
  }
  fields {
    key: "beam:option:spark_submit_uber_jar:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:streaming:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:type_check_strictness:v1"
    value {
      string_value: "DEFAULT_TO_ANY"
    }
  }
  fields {
    key: "beam:option:update:v1"
    value {
      bool_value: false
    }
  }
}
job_name: "job"

apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: 0af9ccb3901f30c61ce79c9caa48933e)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:301)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:209)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:186)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:116)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:84)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:83)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
	... 16 more
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_e01_1580391228040_0001_01_000006 timed out.
	at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1149)
	at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:109)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:397)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:190)
	at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:74)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:152)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
	at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
	at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:170)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at akka.actor.Actor$class.aroundReceive(Actor.scala:517)
	at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
	at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
	at akka.actor.ActorCell.invoke(ActorCell.scala:561)
	at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
	at akka.dispatch.Mailbox.run(Mailbox.scala:225)
	at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
	at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

root: ERROR: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_e01_1580391228040_0001_01_000006 timed out.
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 64.277s

FAILED (errors=1)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 53

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 12s
5 actionable tasks: 3 executed, 2 up-to-date

Publishing build scan...
https://gradle.com/s/snmjyd3awhsza

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #229

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/229/display/redirect?page=changes>

Changes:

[iambruceactor] added more meetups

[suztomo] Google-cloud-clients to use 2019 versions

[lcwik] [BEAM-8298] Fully specify the necessary details to support side input

[chadrik] [BEAM-7746] Introduce a protocol to handle various types of partitioning

[iemejia] [BEAM-6957] Enable Counter/Distribution metrics tests for Portable Spark

[kcweaver] [BEAM-9200] fix portable jar test version property

[iemejia] [BEAM-9204] Refactor HBaseUtils methods to depend on Ranges

[iemejia] [BEAM-9204] Fix HBase SplitRestriction to be based on provided Range

[echauchot] [BEAM-9205] Add ValidatesRunner annotation to the MetricsPusherTest

[echauchot] [BEAM-9205] Fix validatesRunner tests configuration in spark module

[jbonofre] [BEAM-7427] Refactore JmsCheckpointMark to be usage via Coder

[iemejia] [BEAM-7427] Adjust JmsIO access levels and other minor fixes

[pabloem] Merge pull request #10346 from [BEAM-7926] Data-centric Interactive

[chamikara] Fix Spanner auth endpoints

[chadrik] [BEAM-7746] Stop automatically creating staticmethods in register_urn


------------------------------------------
[...truncated 109.67 KB...]

> Task :sdks:python:apache_beam:testing:load_tests:run
setup.py:244: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/local/lib/python2.7/site-packages/setuptools/dist.py>:476: UserWarning: Normalizing '2.20.0.dev' to '2.20.0.dev0'
  normalized_version,
running nosetests
running egg_info
Skipping proto regeneration: all files up to date
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
testGroupByKey (apache_beam.testing.load_tests.group_by_key_test.GroupByKeyTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 270.184s

OK
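
The passing test here is the Python GroupByKey load test. For readers unfamiliar with it, a minimal illustrative sketch of the core primitive it exercises in the Beam Python SDK (this is not the body of group_by_key_test.GroupByKeyTest):

    # Illustrative GroupByKey pipeline using the public Beam Python API.
    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | 'Create pairs' >> beam.Create([('a', 1), ('a', 2), ('b', 3)])
         | 'GroupByKey' >> beam.GroupByKey()
         | 'Count values per key' >> beam.Map(lambda kv: (kv[0], len(list(kv[1])))))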

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD SUCCESSFUL in 7m 4s
5 actionable tasks: 4 executed, 1 up-to-date

Publishing build scan...
https://gradle.com/s/aakjgkkdqmniw

[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins1900685713506604888.sh
+ echo src Load test: fanout 8 times with 2GB 10-byte records total src
src Load test: fanout 8 times with 2GB 10-byte records total src
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/gradlew> -PloadTest.mainClass=apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey -Prunner=PortableRunner '-PloadTest.args=--job_name=load_tests_Python_Flink_Batch_GBK_5_0129100404 --publish_to_big_query=true --project=apache-beam-testing --metrics_dataset=load_test --metrics_table=python_flink_batch_GBK_5 --input_options='{"num_records": 2500000,"key_size": 10,"value_size":90}' --iterations=1 --fanout=8 --parallelism=16 --job_endpoint=localhost:8099 --environment_config=gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest --environment_type=DOCKER --runner=PortableRunner' --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :sdks:python:apache_beam:testing:load_tests:run
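
The step title ("fanout 8 times with 2GB 10-byte records total") and the --input_options JSON reconcile roughly as follows, assuming each synthetic record occupies key_size + value_size bytes (an approximation that ignores coder overhead):

    # Approximate sizing for the fanout run above; record size is assumed to be
    # key_size + value_size from --input_options, ignoring coder overhead.
    num_records = 2500000          # "num_records" in --input_options
    record_bytes = 10 + 90         # "key_size" + "value_size"
    fanout = 8
    per_branch_gb = num_records * record_bytes / 1e9   # ~0.25 GB per branch
    total_gb = per_branch_gb * fanout                  # ~2 GB across all branches
    print(per_branch_gb, total_gb)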
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.
> Task :sdks:python:setupVirtualenv UP-TO-DATE
> Task :sdks:python:apache_beam:testing:load_tests:setupVirtualenv UP-TO-DATE

> Task :sdks:python:sdist
setup.py:244: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1922375555/local/lib/python2.7/site-packages/setuptools/dist.py>:476: UserWarning: Normalizing '2.20.0.dev' to '2.20.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
warning: sdist: standard file not found: should have one of README, README.rst, README.txt, README.md


> Task :sdks:python:apache_beam:testing:load_tests:installGcpTest
DEPRECATION: Python 2.7 reached the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 is no longer maintained. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Processing <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/build/apache-beam.tar.gz>
Requirement already satisfied: crcmod<2.0,>=1.7 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.7)
Requirement already satisfied: dill<0.3.2,>=0.3.1.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (0.3.1.1)
Requirement already satisfied: fastavro<0.22,>=0.21.4 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (0.21.24)
Requirement already satisfied: future<1.0.0,>=0.16.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (0.16.0)
Requirement already satisfied: grpcio<2,>=1.12.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.26.0)
Requirement already satisfied: hdfs<3.0.0,>=2.1.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (2.5.8)
Requirement already satisfied: httplib2<=0.12.0,>=0.8 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (0.12.0)
Requirement already satisfied: mock<3.0.0,>=1.0.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (2.0.0)
Requirement already satisfied: numpy<2,>=1.14.3 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.16.6)
Requirement already satisfied: pymongo<4.0.0,>=3.8.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (3.10.1)
Requirement already satisfied: oauth2client<4,>=2.0.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (3.0.0)
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (3.11.2)
Requirement already satisfied: pydot<2,>=1.2.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.4.1)
Requirement already satisfied: python-dateutil<3,>=2.8.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (2.8.1)
Requirement already satisfied: pytz>=2018.3 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (2019.3)
Requirement already satisfied: avro<2.0.0,>=1.8.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.9.1)
Requirement already satisfied: funcsigs<2,>=1.0.2 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.0.2)
Requirement already satisfied: futures<4.0.0,>=3.2.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (3.3.0)
Requirement already satisfied: pyvcf<0.7.0,>=0.6.8 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (0.6.8)
Requirement already satisfied: pyarrow<0.16.0,>=0.15.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (0.15.1)
Requirement already satisfied: typing<3.8.0,>=3.7.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (3.7.4.1)
Requirement already satisfied: typing-extensions<3.8.0,>=3.7.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (3.7.4.1)
Requirement already satisfied: cachetools<4,>=3.1.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (3.1.1)
Requirement already satisfied: google-apitools<0.5.29,>=0.5.28 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (0.5.28)
Requirement already satisfied: google-cloud-datastore<1.8.0,>=1.7.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.7.4)
Requirement already satisfied: google-cloud-pubsub<1.1.0,>=0.39.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.0.2)
Requirement already satisfied: google-cloud-bigquery<1.18.0,>=1.6.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.17.1)
Requirement already satisfied: google-cloud-core<2,>=0.28.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.2.0)
Requirement already satisfied: google-cloud-bigtable<1.1.0,>=0.31.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.0.0)
Requirement already satisfied: google-cloud-spanner>=1.7.1<1.8.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.13.0)
Requirement already satisfied: grpcio-gcp<1,>=0.2.2 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (0.2.2)
Requirement already satisfied: googledatastore<7.1,>=7.0.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (7.0.2)
Requirement already satisfied: proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (0.90.4)
Requirement already satisfied: freezegun>=0.3.12 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (0.3.14)
Requirement already satisfied: nose>=1.3.7 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.3.7)
Requirement already satisfied: nose_xunitmp>=0.4.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (0.4.1)
Requirement already satisfied: pandas<0.25,>=0.23.4 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (0.24.2)
Requirement already satisfied: parameterized<0.8.0,>=0.6.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (0.7.1)
Requirement already satisfied: pyhamcrest!=1.10.0,<2.0.0,>=1.9 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.10.1)
Requirement already satisfied: pyyaml<6.0.0,>=3.12 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (5.3)
Requirement already satisfied: requests_mock<2.0,>=1.7 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.7.0)
Requirement already satisfied: tenacity<6.0,>=5.0.2 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (5.1.5)
Requirement already satisfied: pytest<5.0,>=4.4.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (4.6.9)
Requirement already satisfied: pytest-xdist<2,>=1.29.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.31.0)
Requirement already satisfied: six>=1.5.2 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from grpcio<2,>=1.12.1->apache-beam==2.20.0.dev0) (1.14.0)
Requirement already satisfied: enum34>=1.0.4; python_version < "3.4" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from grpcio<2,>=1.12.1->apache-beam==2.20.0.dev0) (1.1.6)
Requirement already satisfied: docopt in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from hdfs<3.0.0,>=2.1.0->apache-beam==2.20.0.dev0) (0.6.2)
Requirement already satisfied: requests>=2.7.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from hdfs<3.0.0,>=2.1.0->apache-beam==2.20.0.dev0) (2.22.0)
Requirement already satisfied: pbr>=0.11 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from mock<3.0.0,>=1.0.1->apache-beam==2.20.0.dev0) (5.4.4)
Requirement already satisfied: rsa>=3.1.4 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from oauth2client<4,>=2.0.1->apache-beam==2.20.0.dev0) (4.0)
Requirement already satisfied: pyasn1>=0.1.7 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from oauth2client<4,>=2.0.1->apache-beam==2.20.0.dev0) (0.4.8)
Requirement already satisfied: pyasn1-modules>=0.0.5 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from oauth2client<4,>=2.0.1->apache-beam==2.20.0.dev0) (0.2.8)
Requirement already satisfied: setuptools in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from protobuf<4,>=3.5.0.post1->apache-beam==2.20.0.dev0) (44.0.0)
Requirement already satisfied: pyparsing>=2.1.4 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pydot<2,>=1.2.0->apache-beam==2.20.0.dev0) (2.4.6)
Requirement already satisfied: fasteners>=0.14 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from google-apitools<0.5.29,>=0.5.28->apache-beam==2.20.0.dev0) (0.15)
Requirement already satisfied: google-api-core[grpc]<2.0.0dev,>=1.6.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from google-cloud-datastore<1.8.0,>=1.7.1->apache-beam==2.20.0.dev0) (1.16.0)
Requirement already satisfied: grpc-google-iam-v1<0.13dev,>=0.12.3 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from google-cloud-pubsub<1.1.0,>=0.39.0->apache-beam==2.20.0.dev0) (0.12.3)
Requirement already satisfied: google-resumable-media<0.5.0dev,>=0.3.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from google-cloud-bigquery<1.18.0,>=1.6.0->apache-beam==2.20.0.dev0) (0.4.1)
Requirement already satisfied: googleapis-common-protos<2.0dev,>=1.5.2 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0->apache-beam==2.20.0.dev0) (1.51.0)
Requirement already satisfied: monotonic>=0.6; python_version == "2.7" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from tenacity<6.0,>=5.0.2->apache-beam==2.20.0.dev0) (1.5)
Requirement already satisfied: atomicwrites>=1.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (1.3.0)
Requirement already satisfied: pluggy<1.0,>=0.12 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (0.13.1)
Requirement already satisfied: packaging in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (20.1)
Requirement already satisfied: attrs>=17.4.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (19.3.0)
Requirement already satisfied: importlib-metadata>=0.12; python_version < "3.8" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (1.5.0)
Requirement already satisfied: wcwidth in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (0.1.8)
Requirement already satisfied: more-itertools<6.0.0,>=4.0.0; python_version <= "2.7" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (5.0.0)
Requirement already satisfied: pathlib2>=2.2.0; python_version < "3.6" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (2.3.5)
Requirement already satisfied: py>=1.5.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (1.8.1)
Requirement already satisfied: pytest-forked in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest-xdist<2,>=1.29.0->apache-beam==2.20.0.dev0) (1.1.3)
Requirement already satisfied: execnet>=1.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest-xdist<2,>=1.29.0->apache-beam==2.20.0.dev0) (1.7.1)
Requirement already satisfied: certifi>=2017.4.17 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.20.0.dev0) (2019.11.28)
Requirement already satisfied: chardet<3.1.0,>=3.0.2 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.20.0.dev0) (3.0.4)
Requirement already satisfied: idna<2.9,>=2.5 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.20.0.dev0) (2.8)
Requirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.20.0.dev0) (1.25.8)
Requirement already satisfied: google-auth<2.0dev,>=0.4.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from google-api-core[grpc]<2.0.0dev,>=1.6.0->google-cloud-datastore<1.8.0,>=1.7.1->apache-beam==2.20.0.dev0) (1.11.0)
Requirement already satisfied: contextlib2; python_version < "3" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (0.6.0.post1)
Requirement already satisfied: zipp>=0.5 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (1.1.0)
Requirement already satisfied: configparser>=3.5; python_version < "3" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (4.0.2)
Requirement already satisfied: scandir; python_version < "3.5" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pathlib2>=2.2.0; python_version < "3.6"->pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (1.10.0)
Requirement already satisfied: apipkg>=1.4 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from execnet>=1.1->pytest-xdist<2,>=1.29.0->apache-beam==2.20.0.dev0) (1.5)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.20.0.dev0-py2-none-any.whl size=1884808 sha256=02667d4435fa4a0626cca59ee750fb109ac33a5c08c36a01eabb6a874ebe46e6
  Stored in directory: /home/jenkins/.cache/pip/wheels/8f/2c/77/dfc39e2134b1dcf200c3972c98ea93a5f510bd86e811dc94bc
Successfully built apache-beam
Installing collected packages: apache-beam
  Attempting uninstall: apache-beam
    Found existing installation: apache-beam 2.20.0.dev0
    Uninstalling apache-beam-2.20.0.dev0:
      Successfully uninstalled apache-beam-2.20.0.dev0
Successfully installed apache-beam-2.20.0.dev0

> Task :sdks:python:apache_beam:testing:load_tests:run
setup.py:244: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/local/lib/python2.7/site-packages/setuptools/dist.py>:476: UserWarning: Normalizing '2.20.0.dev' to '2.20.0.dev0'
  normalized_version,
running nosetests
running egg_info
Skipping proto regeneration: all files up to date
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
testGroupByKey (apache_beam.testing.load_tests.group_by_key_test.GroupByKeyTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 210.457s

OK

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD SUCCESSFUL in 3m 43s
5 actionable tasks: 3 executed, 2 up-to-date

Publishing build scan...
https://gradle.com/s/j4embpb75ocw6

[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins9141843443658375720.sh
+ echo Changing number of workers to 5
Changing number of workers to 5
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content 
FLINK_NUM_WORKERS=5

[EnvInject] - Variables injected successfully.
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins7463152772772434302.sh
+ cd <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/.test-infra/dataproc>
+ ./flink_cluster.sh restart
+ GCLOUD_ZONE=us-central1-a
+ DATAPROC_VERSION=1.2
+ MASTER_NAME=beam-loadtests-python-gbk-flink-batch-229-m
+ INIT_ACTIONS_FOLDER_NAME=init-actions
+ FLINK_INIT=gs://beam-flink-cluster/init-actions/flink.sh
+ BEAM_INIT=gs://beam-flink-cluster/init-actions/beam.sh
+ DOCKER_INIT=gs://beam-flink-cluster/init-actions/docker.sh
+ FLINK_LOCAL_PORT=8081
+ FLINK_TASKMANAGER_SLOTS=1
+ TASK_MANAGER_MEM=10240
+ YARN_APPLICATION_MASTER=
+ restart
+ delete
+ gcloud dataproc clusters delete beam-loadtests-python-gbk-flink-batch-229 --region=global --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/0e54f40d-45ff-3456-ada0-e47fba933edb].
Waiting for cluster deletion operation...
...........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................done.
ERROR: (gcloud.dataproc.clusters.delete) Operation [projects/apache-beam-testing/regions/global/operations/0e54f40d-45ff-3456-ada0-e47fba933edb] timed out.
Build step 'Execute shell' marked build as failure
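The failure above comes from the Dataproc cluster teardown, not from the load test itself: the delete operation ran past gcloud's polling deadline. A minimal sketch, assuming gcloud is authenticated on the Jenkins worker, that retries the same delete command a couple of times before giving up (the retry budget and back-off are hypothetical and not part of flink_cluster.sh):

import subprocess
import time

# Same command the build step ran; the cluster name matches this job run.
DELETE_CMD = [
    "gcloud", "dataproc", "clusters", "delete",
    "beam-loadtests-python-gbk-flink-batch-229",
    "--region=global", "--quiet",
]

for attempt in range(3):          # hypothetical retry budget
    if subprocess.call(DELETE_CMD) == 0:
        break
    time.sleep(60)                # hypothetical back-off before retrying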

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #228

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/228/display/redirect?page=changes>

Changes:

[kcweaver] [DO NOT MERGE][BEAM-9177] Update Flink runner webpage for 2.18

[kcweaver] Update Beam version chart

[robinyqiu] Turn on BeamZetaSqlCalcRel

[pawel.pasterz] [BEAM-8941] Implement simple DSL for load tests

[iemejia] [website] Add warning on Beam 2.18.0 blog post for Avro 1.9.0 users

[github] [BEAM-9183, BEAM-9026] Initialize and cleanup the state of

[tvalentyn] [BEAM-9184] Add ToSet combiner (#10636)

[github] Fixing Lint

[github] [BEAM-9201] Release scripts fixes: run_rc_validation.sh,

[altay] Change Dataflow Python containers

[angoenka] [BEAM-8626] Implement status fn api handler in python sdk (#10598)

[tvalentyn] [BEAM-9186] Allow injection of custom equality function. (#10637)


------------------------------------------
[...truncated 145.42 KB...]
    }
  }
  fields {
    key: "beam:option:job_port:v1"
    value {
      string_value: "0"
    }
  }
  fields {
    key: "beam:option:job_server_timeout:v1"
    value {
      string_value: "60"
    }
  }
  fields {
    key: "beam:option:load_balance_bundles:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:no_auth:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:object_reuse:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:parallelism:v1"
    value {
      string_value: "16"
    }
  }
  fields {
    key: "beam:option:pipeline_type_check:v1"
    value {
      bool_value: true
    }
  }
  fields {
    key: "beam:option:profile_cpu:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_memory:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_sample_rate:v1"
    value {
      number_value: 1.0
    }
  }
  fields {
    key: "beam:option:project:v1"
    value {
      string_value: "apache-beam-testing"
    }
  }
  fields {
    key: "beam:option:retain_docker_containers:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:runtime_type_check:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:save_main_session:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:sdk_location:v1"
    value {
      string_value: "container"
    }
  }
  fields {
    key: "beam:option:sdk_worker_parallelism:v1"
    value {
      string_value: "1"
    }
  }
  fields {
    key: "beam:option:shutdown_sources_on_final_watermark:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:spark_master_url:v1"
    value {
      string_value: "local[4]"
    }
  }
  fields {
    key: "beam:option:spark_submit_uber_jar:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:streaming:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:type_check_strictness:v1"
    value {
      string_value: "DEFAULT_TO_ANY"
    }
  }
  fields {
    key: "beam:option:update:v1"
    value {
      bool_value: false
    }
  }
}
job_name: "job"

apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: 03506ee54b93c431672508fee7214c8f)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:301)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:209)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:186)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:116)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:84)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:83)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
	... 16 more
Caused by: org.apache.flink.runtime.jobmanager.scheduler.NoResourceAvailableException: Could not allocate enough slots to run the job. Please make sure that the cluster has enough resources.
	at org.apache.flink.runtime.executiongraph.Execution.lambda$scheduleForExecution$0(Execution.java:460)
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
	at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
	at java.util.concurrent.CompletableFuture.completeExceptionally(CompletableFuture.java:1990)
	at org.apache.flink.runtime.jobmaster.slotpool.SchedulerImpl.lambda$internalAllocateSlot$0(SchedulerImpl.java:190)
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
	at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
	at java.util.concurrent.CompletableFuture.completeExceptionally(CompletableFuture.java:1990)
	at org.apache.flink.runtime.jobmaster.slotpool.SlotSharingManager$SingleTaskSlot.release(SlotSharingManager.java:700)
	at org.apache.flink.runtime.jobmaster.slotpool.SlotSharingManager$MultiTaskSlot.release(SlotSharingManager.java:484)
	at org.apache.flink.runtime.jobmaster.slotpool.SlotSharingManager$MultiTaskSlot.lambda$new$0(SlotSharingManager.java:380)
	at java.util.concurrent.CompletableFuture.uniHandle(CompletableFuture.java:836)
	at java.util.concurrent.CompletableFuture$UniHandle.tryFire(CompletableFuture.java:811)
	at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
	at java.util.concurrent.CompletableFuture.completeExceptionally(CompletableFuture.java:1990)
	at org.apache.flink.runtime.concurrent.FutureUtils$Timeout.run(FutureUtils.java:998)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:397)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:190)
	at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:74)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:152)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
	at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
	at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:170)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at akka.actor.Actor$class.aroundReceive(Actor.scala:517)
	at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
	at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
	at akka.actor.ActorCell.invoke(ActorCell.scala:561)
	at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
	at akka.dispatch.Mailbox.run(Mailbox.scala:225)
	at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
	at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

root: ERROR: org.apache.flink.runtime.jobmanager.scheduler.NoResourceAvailableException: Could not allocate enough slots to run the job. Please make sure that the cluster has enough resources.
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------
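The NoResourceAvailableException above means the submitted job asked for more task slots than the Flink session could hand out: the options dump for this run requests parallelism 16, while the cluster setup step above configures FLINK_TASKMANAGER_SLOTS=1 per worker. A minimal sketch, assuming the Flink REST API is reachable on the locally forwarded port 8081 these jobs use, that compares free slots with the intended parallelism before submitting (requests is already in the test virtualenv as a transitive dependency of hdfs, per the pip output above):

import requests

FLINK_REST = "http://localhost:8081"   # FLINK_LOCAL_PORT used by flink_cluster.sh
REQUESTED_PARALLELISM = 16             # beam:option:parallelism from the options dump above

overview = requests.get(FLINK_REST + "/overview").json()
print("slots available: %s / %s total" % (overview["slots-available"],
                                          overview["slots-total"]))
if overview["slots-available"] < REQUESTED_PARALLELISM:
    print("not enough free slots for parallelism %d" % REQUESTED_PARALLELISM)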

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 309.500s

FAILED (errors=1)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 53

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 49s
5 actionable tasks: 4 executed, 1 up-to-date

Publishing build scan...
https://gradle.com/s/w4r52e5rcyz2g

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #227

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/227/display/redirect>

Changes:


------------------------------------------
[...truncated 268.56 KB...]
      bool_value: false
    }
  }
  fields {
    key: "beam:option:job_endpoint:v1"
    value {
      string_value: "localhost:8099"
    }
  }
  fields {
    key: "beam:option:job_name:v1"
    value {
      string_value: "load_tests_Python_Flink_Batch_GBK_3_0127110333"
    }
  }
  fields {
    key: "beam:option:job_port:v1"
    value {
      string_value: "0"
    }
  }
  fields {
    key: "beam:option:job_server_timeout:v1"
    value {
      string_value: "60"
    }
  }
  fields {
    key: "beam:option:load_balance_bundles:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:no_auth:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:object_reuse:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:parallelism:v1"
    value {
      string_value: "5"
    }
  }
  fields {
    key: "beam:option:pipeline_type_check:v1"
    value {
      bool_value: true
    }
  }
  fields {
    key: "beam:option:profile_cpu:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_memory:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_sample_rate:v1"
    value {
      number_value: 1.0
    }
  }
  fields {
    key: "beam:option:project:v1"
    value {
      string_value: "apache-beam-testing"
    }
  }
  fields {
    key: "beam:option:retain_docker_containers:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:runtime_type_check:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:save_main_session:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:sdk_location:v1"
    value {
      string_value: "container"
    }
  }
  fields {
    key: "beam:option:sdk_worker_parallelism:v1"
    value {
      string_value: "1"
    }
  }
  fields {
    key: "beam:option:shutdown_sources_on_final_watermark:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:spark_master_url:v1"
    value {
      string_value: "local[4]"
    }
  }
  fields {
    key: "beam:option:spark_submit_uber_jar:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:streaming:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:type_check_strictness:v1"
    value {
      string_value: "DEFAULT_TO_ANY"
    }
  }
  fields {
    key: "beam:option:update:v1"
    value {
      bool_value: false
    }
  }
}
job_name: "job"

apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: 5edfd159e31d751662226733cd045654)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:301)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:209)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:186)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:116)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:84)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:83)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
	... 16 more
Caused by: java.util.concurrent.ExecutionException: org.apache.beam.vendor.grpc.v1p26p0.io.grpc.StatusRuntimeException: CANCELLED: cancelled before receiving half close
	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
	at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.$closeResource(FlinkExecutableStageFunction.java:204)
	at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.mapPartition(FlinkExecutableStageFunction.java:204)
	at org.apache.flink.runtime.operators.MapPartitionDriver.run(MapPartitionDriver.java:103)
	at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:504)
	at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:369)
	at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
	... 1 more
Caused by: org.apache.beam.vendor.grpc.v1p26p0.io.grpc.StatusRuntimeException: CANCELLED: cancelled before receiving half close
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Status.asRuntimeException(Status.java:524)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onCancel(ServerCalls.java:275)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.PartialForwardingServerCallListener.onCancel(PartialForwardingServerCallListener.java:40)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener.onCancel(ForwardingServerCallListener.java:23)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener$SimpleForwardingServerCallListener.onCancel(ForwardingServerCallListener.java:40)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Contexts$ContextualizedServerCallListener.onCancel(Contexts.java:96)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.closedInternal(ServerCallImpl.java:353)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.closed(ServerCallImpl.java:341)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1Closed.runInContext(ServerImpl.java:867)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	... 1 more

root: ERROR: org.apache.beam.vendor.grpc.v1p26p0.io.grpc.StatusRuntimeException: CANCELLED: cancelled before receiving half close
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------
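The CANCELLED status above is what the runner reports when its connection to the Python SDK harness is torn down before the bundle finishes; the underlying error usually sits in the harness container's own logs, since these jobs run the SDK with environment_type=DOCKER. A minimal sketch, assuming shell access to a Dataproc worker node and the harness image these load tests pass via --environment_config, that dumps the logs of any such containers:

import subprocess

# SDK harness image used by these load tests (passed via --environment_config).
IMAGE = "gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest"

container_ids = subprocess.check_output(
    ["docker", "ps", "-a", "-q", "--filter", "ancestor=" + IMAGE]).split()
for container_id in container_ids:
    subprocess.call(["docker", "logs", container_id])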

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 52.112s

FAILED (errors=1)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 53

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 1s
5 actionable tasks: 3 executed, 2 up-to-date

Publishing build scan...
https://gradle.com/s/rcifa2se4wvny

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #226

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/226/display/redirect>

Changes:


------------------------------------------
[...truncated 268.46 KB...]
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:job_endpoint:v1"
    value {
      string_value: "localhost:8099"
    }
  }
  fields {
    key: "beam:option:job_name:v1"
    value {
      string_value: "load_tests_Python_Flink_Batch_GBK_3_0126100254"
    }
  }
  fields {
    key: "beam:option:job_port:v1"
    value {
      string_value: "0"
    }
  }
  fields {
    key: "beam:option:job_server_timeout:v1"
    value {
      string_value: "60"
    }
  }
  fields {
    key: "beam:option:load_balance_bundles:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:no_auth:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:object_reuse:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:parallelism:v1"
    value {
      string_value: "5"
    }
  }
  fields {
    key: "beam:option:pipeline_type_check:v1"
    value {
      bool_value: true
    }
  }
  fields {
    key: "beam:option:profile_cpu:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_memory:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_sample_rate:v1"
    value {
      number_value: 1.0
    }
  }
  fields {
    key: "beam:option:project:v1"
    value {
      string_value: "apache-beam-testing"
    }
  }
  fields {
    key: "beam:option:retain_docker_containers:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:runtime_type_check:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:save_main_session:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:sdk_location:v1"
    value {
      string_value: "container"
    }
  }
  fields {
    key: "beam:option:sdk_worker_parallelism:v1"
    value {
      string_value: "1"
    }
  }
  fields {
    key: "beam:option:shutdown_sources_on_final_watermark:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:spark_master_url:v1"
    value {
      string_value: "local[4]"
    }
  }
  fields {
    key: "beam:option:spark_submit_uber_jar:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:streaming:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:type_check_strictness:v1"
    value {
      string_value: "DEFAULT_TO_ANY"
    }
  }
  fields {
    key: "beam:option:update:v1"
    value {
      bool_value: false
    }
  }
}
job_name: "job"

apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: 08bfe4e82e3a8105d712c7efc542107d)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:301)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:209)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:186)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:116)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:84)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:83)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
	... 16 more
Caused by: java.util.concurrent.TimeoutException: The heartbeat of TaskManager with id container_e01_1580044522340_0001_01_000002  timed out.
	at org.apache.flink.runtime.resourcemanager.ResourceManager$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(ResourceManager.java:1146)
	at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:109)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:397)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:190)
	at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:74)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:152)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
	at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
	at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:170)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at akka.actor.Actor$class.aroundReceive(Actor.scala:517)
	at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
	at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
	at akka.actor.ActorCell.invoke(ActorCell.scala:561)
	at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
	at akka.dispatch.Mailbox.run(Mailbox.scala:225)
	at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
	at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

root: ERROR: java.util.concurrent.TimeoutException: The heartbeat of TaskManager with id container_e01_1580044522340_0001_01_000002  timed out.
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------
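The heartbeat timeout above means one TaskManager stopped answering the ResourceManager, which typically shows up when its YARN container has been killed or has become unresponsive. A minimal sketch, against the same port-forwarded REST endpoint as before, that lists the registered TaskManagers together with their slot counts and the heartbeat value the API reports:

import requests

FLINK_REST = "http://localhost:8081"   # FLINK_LOCAL_PORT used by flink_cluster.sh

for tm in requests.get(FLINK_REST + "/taskmanagers").json()["taskmanagers"]:
    print("%s slots=%s free=%s lastHeartbeat=%s" % (
        tm["id"], tm["slotsNumber"], tm["freeSlots"], tm["timeSinceLastHeartbeat"]))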

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 120.128s

FAILED (errors=1)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 53

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2m 8s
5 actionable tasks: 3 executed, 2 up-to-date

Publishing build scan...
https://gradle.com/s/jtwi75fn7ltia

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #225

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/225/display/redirect?page=changes>

Changes:

[keijiy] Fix comments on bigquery.py * beam.io.gcp.WriteToBigQuery ->

[kirillkozlov] Use custom escape method

[kirillkozlov] Inline, link JIRA

[elias.djurfeldt] Added valueprovider support for Datastore query namespaces

[iemejia] [BEAM-9170] Unify Jenkins job names related to Java 11 testing

[github] Add a minor comment.

[aaltay] Bump tensorflow from 1.14.0 to 1.15.0 in /sdks/python/container (#10392)

[github] [BEAM-9093] Log invalid overwrites in pipeline options (#10613)

[github] [BEAM-8492] Allow None, Optional return hints for DoFn.process and


------------------------------------------
[...truncated 268.81 KB...]
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:job_endpoint:v1"
    value {
      string_value: "localhost:8099"
    }
  }
  fields {
    key: "beam:option:job_name:v1"
    value {
      string_value: "load_tests_Python_Flink_Batch_GBK_3_0125100253"
    }
  }
  fields {
    key: "beam:option:job_port:v1"
    value {
      string_value: "0"
    }
  }
  fields {
    key: "beam:option:job_server_timeout:v1"
    value {
      string_value: "60"
    }
  }
  fields {
    key: "beam:option:load_balance_bundles:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:no_auth:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:object_reuse:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:parallelism:v1"
    value {
      string_value: "5"
    }
  }
  fields {
    key: "beam:option:pipeline_type_check:v1"
    value {
      bool_value: true
    }
  }
  fields {
    key: "beam:option:profile_cpu:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_memory:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_sample_rate:v1"
    value {
      number_value: 1.0
    }
  }
  fields {
    key: "beam:option:project:v1"
    value {
      string_value: "apache-beam-testing"
    }
  }
  fields {
    key: "beam:option:retain_docker_containers:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:runtime_type_check:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:save_main_session:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:sdk_location:v1"
    value {
      string_value: "container"
    }
  }
  fields {
    key: "beam:option:sdk_worker_parallelism:v1"
    value {
      string_value: "1"
    }
  }
  fields {
    key: "beam:option:shutdown_sources_on_final_watermark:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:spark_master_url:v1"
    value {
      string_value: "local[4]"
    }
  }
  fields {
    key: "beam:option:spark_submit_uber_jar:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:streaming:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:type_check_strictness:v1"
    value {
      string_value: "DEFAULT_TO_ANY"
    }
  }
  fields {
    key: "beam:option:update:v1"
    value {
      bool_value: false
    }
  }
}
job_name: "job"

apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: 31d995ad4e048470ed3e7ab2c7685714)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:301)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:209)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:186)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:116)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:84)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:83)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
	... 16 more
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_e01_1579958040974_0001_01_000006 timed out.
	at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1149)
	at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:109)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:397)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:190)
	at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:74)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:152)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
	at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
	at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:170)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at akka.actor.Actor$class.aroundReceive(Actor.scala:517)
	at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
	at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
	at akka.actor.ActorCell.invoke(ActorCell.scala:561)
	at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
	at akka.dispatch.Mailbox.run(Mailbox.scala:225)
	at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
	at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

root: ERROR: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_e01_1579958040974_0001_01_000006 timed out.
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 78.327s

FAILED (errors=1)
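Each of these runs writes an xunit-style report to the nosetests.xml path shown above. A minimal sketch that pulls the summary counts out of it when triaging a batch of red builds locally (the attribute names follow the xunit format nose emits):

import xml.etree.ElementTree as ET

# Path as printed in the log lines above, relative to the workspace checkout.
suite = ET.parse("sdks/python/nosetests.xml").getroot()
print("tests=%s errors=%s failures=%s skip=%s" % (
    suite.get("tests"), suite.get("errors"),
    suite.get("failures"), suite.get("skip")))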

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 53

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 27s
5 actionable tasks: 3 executed, 2 up-to-date

Publishing build scan...
https://gradle.com/s/c7usy5zad6myy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #224

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/224/display/redirect?page=changes>

Changes:

[echauchot] [BEAM-9065] Reset MetricsContainerStepMapAccumulator upon initialization

[github] Update release guide with extended information.

[github] Add link to mass comment script

[github] Fix typo

[github] Address comments

[github] Update twine install details.

[kirillkozlov] Create initial DataStoreV1Table

[kirillkozlov] Added tests

[kirillkozlov] Implement getTableStatistics

[kirillkozlov] buildIOWriter

[kirillkozlov] Read other types

[kirillkozlov] Better conversion, support complex types

[kirillkozlov] Store DataStore key as VARBINARY

[kirillkozlov] Wrap DoFns in PTransforms

[kirillkozlov] Table property for specifying key field

[kirillkozlov] JavaDoc

[kirillkozlov] Infer schema for RowToEntity

[kirillkozlov] Better conversion performance

[kirillkozlov] Mark Table as `Internal` and `Experimental`

[kirillkozlov] Review changes

[kirillkozlov] Add IT that does not rely on SQL

[kirillkozlov] fix style

[github] cleanup typo

[github] [BEAM-3419, BEAM-9173] Add TODO comment

[iemejia] [BEAM-9172] Add missing parameter to Nexmark CI execution for Flink

[github] Change dataflow container version

[hannahjiang] [BEAM-9084] cleaning up docker image tag

[iemejia] Make Spark use of batch based invocation in Nexmark CI jobs cleaner

[ehudm] Update beam website for 2.18.0

[apilloud] [BEAM-9027] Unparse LIKE as binary operator

[github] Blog post for release 2.18.0 (#10575)

[kcweaver] upgrade auto-value to version 1.7

[kcweaver] [BEAM-9149] Add SQL query parameters to public API and enable positional

[bhulette] [BEAM-9072] [SQL] Primitive types should fall through (#10681)


------------------------------------------
[...truncated 112.72 KB...]

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD SUCCESSFUL in 5m 40s
5 actionable tasks: 4 executed, 1 up-to-date

Publishing build scan...
https://gradle.com/s/xhg6b3daoirhy

[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins2823664162809835056.sh
+ echo src Load test: fanout 8 times with 2GB 10-byte records total src
src Load test: fanout 8 times with 2GB 10-byte records total src
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/gradlew> -PloadTest.mainClass=apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey -Prunner=PortableRunner '-PloadTest.args=--job_name=load_tests_Python_Flink_Batch_GBK_5_0124100258 --publish_to_big_query=true --project=apache-beam-testing --metrics_dataset=load_test --metrics_table=python_flink_batch_GBK_5 --input_options='{"num_records": 2500000,"key_size": 10,"value_size":90}' --iterations=1 --fanout=8 --parallelism=16 --job_endpoint=localhost:8099 --environment_config=gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest --environment_type=DOCKER --runner=PortableRunner' --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :sdks:python:apache_beam:testing:load_tests:run
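The invocation above runs apache_beam.testing.load_tests.group_by_key_test.GroupByKeyTest against the portable Flink runner with synthetic input (num_records 2,500,000, 10-byte keys, 90-byte values, fanout 8). For a rough local approximation of the same pipeline shape, a minimal sketch on the default DirectRunner with a much smaller, hand-rolled input; the fanout and metrics publishing of the real test are omitted, and the record and key counts here are illustrative only:

from __future__ import print_function  # keeps the sketch usable on the Python 2.7 env above

import apache_beam as beam

NUM_RECORDS = 1000   # the Jenkins job uses 2,500,000
NUM_KEYS = 100       # illustrative key space


def synthetic_kv(i):
    # 10-byte key and 90-byte value, mirroring the job's input_options.
    return ("k%09d" % (i % NUM_KEYS), "v" * 90)


with beam.Pipeline() as p:   # defaults to the DirectRunner
    (p
     | "Generate" >> beam.Create(range(NUM_RECORDS))
     | "MakeKV" >> beam.Map(synthetic_kv)
     | "GroupByKey" >> beam.GroupByKey()
     | "CountGroups" >> beam.combiners.Count.Globally()
     | "Print" >> beam.Map(print))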
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.
> Task :sdks:python:setupVirtualenv UP-TO-DATE
> Task :sdks:python:apache_beam:testing:load_tests:setupVirtualenv UP-TO-DATE

> Task :sdks:python:sdist
setup.py:244: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1922375555/local/lib/python2.7/site-packages/setuptools/dist.py>:476: UserWarning: Normalizing '2.20.0.dev' to '2.20.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
warning: sdist: standard file not found: should have one of README, README.rst, README.txt, README.md


> Task :sdks:python:apache_beam:testing:load_tests:installGcpTest
DEPRECATION: Python 2.7 reached the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 is no longer maintained. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Processing <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/build/apache-beam.tar.gz>
Requirement already satisfied: crcmod<2.0,>=1.7 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.7)
Requirement already satisfied: dill<0.3.2,>=0.3.1.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (0.3.1.1)
Requirement already satisfied: fastavro<0.22,>=0.21.4 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (0.21.24)
Requirement already satisfied: future<1.0.0,>=0.16.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (0.16.0)
Requirement already satisfied: grpcio<2,>=1.12.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.26.0)
Requirement already satisfied: hdfs<3.0.0,>=2.1.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (2.5.8)
Requirement already satisfied: httplib2<=0.12.0,>=0.8 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (0.12.0)
Requirement already satisfied: mock<3.0.0,>=1.0.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (2.0.0)
Requirement already satisfied: numpy<2,>=1.14.3 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.16.6)
Requirement already satisfied: pymongo<4.0.0,>=3.8.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (3.10.1)
Requirement already satisfied: oauth2client<4,>=2.0.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (3.0.0)
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (3.11.2)
Requirement already satisfied: pydot<2,>=1.2.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.4.1)
Requirement already satisfied: python-dateutil<3,>=2.8.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (2.8.1)
Requirement already satisfied: pytz>=2018.3 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (2019.3)
Requirement already satisfied: avro<2.0.0,>=1.8.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.9.1)
Requirement already satisfied: funcsigs<2,>=1.0.2 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.0.2)
Requirement already satisfied: futures<4.0.0,>=3.2.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (3.3.0)
Requirement already satisfied: pyvcf<0.7.0,>=0.6.8 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (0.6.8)
Requirement already satisfied: pyarrow<0.16.0,>=0.15.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (0.15.1)
Requirement already satisfied: typing<3.8.0,>=3.7.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (3.7.4.1)
Requirement already satisfied: typing-extensions<3.8.0,>=3.7.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (3.7.4.1)
Requirement already satisfied: cachetools<4,>=3.1.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (3.1.1)
Requirement already satisfied: google-apitools<0.5.29,>=0.5.28 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (0.5.28)
Requirement already satisfied: google-cloud-datastore<1.8.0,>=1.7.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.7.4)
Requirement already satisfied: google-cloud-pubsub<1.1.0,>=0.39.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.0.2)
Requirement already satisfied: google-cloud-bigquery<1.18.0,>=1.6.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.17.1)
Requirement already satisfied: google-cloud-core<2,>=0.28.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.2.0)
Requirement already satisfied: google-cloud-bigtable<1.1.0,>=0.31.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.0.0)
Requirement already satisfied: google-cloud-spanner>=1.7.1<1.8.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.13.0)
Requirement already satisfied: grpcio-gcp<1,>=0.2.2 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (0.2.2)
Requirement already satisfied: googledatastore<7.1,>=7.0.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (7.0.2)
Requirement already satisfied: proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (0.90.4)
Requirement already satisfied: freezegun>=0.3.12 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (0.3.14)
Requirement already satisfied: nose>=1.3.7 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.3.7)
Requirement already satisfied: nose_xunitmp>=0.4.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (0.4.1)
Requirement already satisfied: pandas<0.25,>=0.23.4 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (0.24.2)
Requirement already satisfied: parameterized<0.8.0,>=0.6.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (0.7.1)
Requirement already satisfied: pyhamcrest!=1.10.0,<2.0.0,>=1.9 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.10.1)
Requirement already satisfied: pyyaml<6.0.0,>=3.12 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (5.3)
Requirement already satisfied: requests_mock<2.0,>=1.7 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.7.0)
Requirement already satisfied: tenacity<6.0,>=5.0.2 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (5.1.5)
Requirement already satisfied: pytest<5.0,>=4.4.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (4.6.9)
Requirement already satisfied: pytest-xdist<2,>=1.29.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.31.0)
Requirement already satisfied: six>=1.5.2 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from grpcio<2,>=1.12.1->apache-beam==2.20.0.dev0) (1.14.0)
Requirement already satisfied: enum34>=1.0.4; python_version < "3.4" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from grpcio<2,>=1.12.1->apache-beam==2.20.0.dev0) (1.1.6)
Requirement already satisfied: docopt in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from hdfs<3.0.0,>=2.1.0->apache-beam==2.20.0.dev0) (0.6.2)
Requirement already satisfied: requests>=2.7.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from hdfs<3.0.0,>=2.1.0->apache-beam==2.20.0.dev0) (2.22.0)
Requirement already satisfied: pbr>=0.11 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from mock<3.0.0,>=1.0.1->apache-beam==2.20.0.dev0) (5.4.4)
Requirement already satisfied: rsa>=3.1.4 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from oauth2client<4,>=2.0.1->apache-beam==2.20.0.dev0) (4.0)
Requirement already satisfied: pyasn1>=0.1.7 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from oauth2client<4,>=2.0.1->apache-beam==2.20.0.dev0) (0.4.8)
Requirement already satisfied: pyasn1-modules>=0.0.5 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from oauth2client<4,>=2.0.1->apache-beam==2.20.0.dev0) (0.2.8)
Requirement already satisfied: setuptools in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from protobuf<4,>=3.5.0.post1->apache-beam==2.20.0.dev0) (44.0.0)
Requirement already satisfied: pyparsing>=2.1.4 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pydot<2,>=1.2.0->apache-beam==2.20.0.dev0) (2.4.6)
Requirement already satisfied: fasteners>=0.14 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from google-apitools<0.5.29,>=0.5.28->apache-beam==2.20.0.dev0) (0.15)
Requirement already satisfied: google-api-core[grpc]<2.0.0dev,>=1.6.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from google-cloud-datastore<1.8.0,>=1.7.1->apache-beam==2.20.0.dev0) (1.16.0)
Requirement already satisfied: grpc-google-iam-v1<0.13dev,>=0.12.3 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from google-cloud-pubsub<1.1.0,>=0.39.0->apache-beam==2.20.0.dev0) (0.12.3)
Requirement already satisfied: google-resumable-media<0.5.0dev,>=0.3.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from google-cloud-bigquery<1.18.0,>=1.6.0->apache-beam==2.20.0.dev0) (0.4.1)
Requirement already satisfied: googleapis-common-protos<2.0dev,>=1.5.2 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0->apache-beam==2.20.0.dev0) (1.51.0)
Requirement already satisfied: monotonic>=0.6; python_version == "2.7" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from tenacity<6.0,>=5.0.2->apache-beam==2.20.0.dev0) (1.5)
Requirement already satisfied: atomicwrites>=1.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (1.3.0)
Requirement already satisfied: pluggy<1.0,>=0.12 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (0.13.1)
Requirement already satisfied: packaging in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (20.1)
Requirement already satisfied: attrs>=17.4.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (19.3.0)
Requirement already satisfied: importlib-metadata>=0.12; python_version < "3.8" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (1.4.0)
Requirement already satisfied: wcwidth in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (0.1.8)
Requirement already satisfied: more-itertools<6.0.0,>=4.0.0; python_version <= "2.7" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (5.0.0)
Requirement already satisfied: pathlib2>=2.2.0; python_version < "3.6" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (2.3.5)
Requirement already satisfied: py>=1.5.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (1.8.1)
Requirement already satisfied: pytest-forked in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest-xdist<2,>=1.29.0->apache-beam==2.20.0.dev0) (1.1.3)
Requirement already satisfied: execnet>=1.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest-xdist<2,>=1.29.0->apache-beam==2.20.0.dev0) (1.7.1)
Requirement already satisfied: certifi>=2017.4.17 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.20.0.dev0) (2019.11.28)
Requirement already satisfied: chardet<3.1.0,>=3.0.2 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.20.0.dev0) (3.0.4)
Requirement already satisfied: idna<2.9,>=2.5 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.20.0.dev0) (2.8)
Requirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.20.0.dev0) (1.25.8)
Requirement already satisfied: google-auth<2.0dev,>=0.4.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from google-api-core[grpc]<2.0.0dev,>=1.6.0->google-cloud-datastore<1.8.0,>=1.7.1->apache-beam==2.20.0.dev0) (1.11.0)
Requirement already satisfied: contextlib2; python_version < "3" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (0.6.0.post1)
Requirement already satisfied: zipp>=0.5 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (1.0.0)
Requirement already satisfied: configparser>=3.5; python_version < "3" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (4.0.2)
Requirement already satisfied: scandir; python_version < "3.5" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pathlib2>=2.2.0; python_version < "3.6"->pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (1.10.0)
Requirement already satisfied: apipkg>=1.4 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from execnet>=1.1->pytest-xdist<2,>=1.29.0->apache-beam==2.20.0.dev0) (1.5)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.20.0.dev0-py2-none-any.whl size=1874089 sha256=33f539c27ff030b0e47781bb70eaaa7b2036bc5c9b84f5f05fd4ea3e0cd8c8fb
  Stored in directory: /home/jenkins/.cache/pip/wheels/8f/2c/77/dfc39e2134b1dcf200c3972c98ea93a5f510bd86e811dc94bc
Successfully built apache-beam
Installing collected packages: apache-beam
  Attempting uninstall: apache-beam
    Found existing installation: apache-beam 2.20.0.dev0
    Uninstalling apache-beam-2.20.0.dev0:
      Successfully uninstalled apache-beam-2.20.0.dev0
Successfully installed apache-beam-2.20.0.dev0

> Task :sdks:python:apache_beam:testing:load_tests:run
setup.py:244: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/local/lib/python2.7/site-packages/setuptools/dist.py>:476: UserWarning: Normalizing '2.20.0.dev' to '2.20.0.dev0'
  normalized_version,
running nosetests
running egg_info
Skipping proto regeneration: all files up to date
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
testGroupByKey (apache_beam.testing.load_tests.group_by_key_test.GroupByKeyTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 210.529s

OK

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD SUCCESSFUL in 3m 42s
5 actionable tasks: 3 executed, 2 up-to-date

Publishing build scan...
https://gradle.com/s/qukgbzaq2my54

[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins3383807054582989967.sh
+ echo Changing number of workers to 5
Changing number of workers to 5
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content 
FLINK_NUM_WORKERS=5

[EnvInject] - Variables injected successfully.
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins4405177933407680151.sh
+ cd <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/.test-infra/dataproc>
+ ./flink_cluster.sh restart
+ GCLOUD_ZONE=us-central1-a
+ DATAPROC_VERSION=1.2
+ MASTER_NAME=beam-loadtests-python-gbk-flink-batch-224-m
+ INIT_ACTIONS_FOLDER_NAME=init-actions
+ FLINK_INIT=gs://beam-flink-cluster/init-actions/flink.sh
+ BEAM_INIT=gs://beam-flink-cluster/init-actions/beam.sh
+ DOCKER_INIT=gs://beam-flink-cluster/init-actions/docker.sh
+ FLINK_LOCAL_PORT=8081
+ FLINK_TASKMANAGER_SLOTS=1
+ TASK_MANAGER_MEM=10240
+ YARN_APPLICATION_MASTER=
+ restart
+ delete
+ gcloud dataproc clusters delete beam-loadtests-python-gbk-flink-batch-224 --region=global --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/729ca54d-8af4-3d16-812b-96e39a0e24a9].
Waiting for cluster deletion operation...
......................................................................done.
Deleted [https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/global/clusters/beam-loadtests-python-gbk-flink-batch-224].
+ create
+ upload_init_actions
+ echo 'Uploading initialization actions to GCS bucket: gs://beam-flink-cluster'
Uploading initialization actions to GCS bucket: gs://beam-flink-cluster
+ gsutil cp -r init-actions/beam.sh init-actions/docker.sh init-actions/flink.sh gs://beam-flink-cluster/init-actions
Copying file://init-actions/beam.sh [Content-Type=text/x-sh]...
Copying file://init-actions/docker.sh [Content-Type=text/x-sh]...
Copying file://init-actions/flink.sh [Content-Type=text/x-sh]...
Operation completed over 3 objects/13.4 KiB.                                     
+ create_cluster
+ local metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1,
+ metadata+=hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar
+ [[ -n gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_workers=6
+ gcloud dataproc clusters create beam-loadtests-python-gbk-flink-batch-224 --region=global --num-workers=6 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/f581e27e-726a-3f5f-8f23-a89316eaf371].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
......................................................................
WARNING: Cluster beam-loadtests-python-gbk-flink-batch-224 failed to create. Beginning automated resource cleanup process.
done.
ERROR: (gcloud.dataproc.clusters.create) Operation [projects/apache-beam-testing/regions/global/operations/f581e27e-726a-3f5f-8f23-a89316eaf371] failed: Initialization action failed. Failed action 'gs://beam-flink-cluster/init-actions/docker.sh', see output in: gs://dataproc-6c5fbcbb-a2de-406e-9cf7-8c1ce0b6a604-us/google-cloud-dataproc-metainfo/c204f06a-86bf-4d71-b5df-ba8a710940d0/beam-loadtests-python-gbk-flink-batch-224-w-5/dataproc-initialization-script-0_output.
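The failing init action writes its console output to the GCS path named in the error above. A minimal sketch (assuming gsutil is on PATH and the caller can read the Dataproc staging bucket) of pulling that log for inspection:

    import subprocess

    # Path copied from the gcloud error message above.
    OUTPUT = ('gs://dataproc-6c5fbcbb-a2de-406e-9cf7-8c1ce0b6a604-us/'
              'google-cloud-dataproc-metainfo/c204f06a-86bf-4d71-b5df-ba8a710940d0/'
              'beam-loadtests-python-gbk-flink-batch-224-w-5/'
              'dataproc-initialization-script-0_output')

    # Print the init-action log that caused the cluster creation failure.
    print(subprocess.check_output(['gsutil', 'cat', OUTPUT]))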
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


beam_LoadTests_Python_GBK_Flink_Batch - Build # 223 - Aborted

Posted by Apache Jenkins Server <je...@builds.apache.org>.
The Apache Jenkins build system has built beam_LoadTests_Python_GBK_Flink_Batch (build #223)

Status: Aborted

Check console output at https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/223/ to view the results.

beam_LoadTests_Python_GBK_Flink_Batch - Build # 222 - Aborted

Posted by Apache Jenkins Server <je...@builds.apache.org>.
The Apache Jenkins build system has built beam_LoadTests_Python_GBK_Flink_Batch (build #222)

Status: Aborted

Check console output at https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/222/ to view the results.

Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #221

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/221/display/redirect?page=changes>

Changes:

[sunjincheng121] [BEAM-9153] Fix release guide heading level


------------------------------------------
[...truncated 268.96 KB...]
      bool_value: false
    }
  }
  fields {
    key: "beam:option:job_endpoint:v1"
    value {
      string_value: "localhost:8099"
    }
  }
  fields {
    key: "beam:option:job_name:v1"
    value {
      string_value: "load_tests_Python_Flink_Batch_GBK_3_0121100249"
    }
  }
  fields {
    key: "beam:option:job_port:v1"
    value {
      string_value: "0"
    }
  }
  fields {
    key: "beam:option:job_server_timeout:v1"
    value {
      string_value: "60"
    }
  }
  fields {
    key: "beam:option:load_balance_bundles:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:no_auth:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:object_reuse:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:parallelism:v1"
    value {
      string_value: "5"
    }
  }
  fields {
    key: "beam:option:pipeline_type_check:v1"
    value {
      bool_value: true
    }
  }
  fields {
    key: "beam:option:profile_cpu:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_memory:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_sample_rate:v1"
    value {
      number_value: 1.0
    }
  }
  fields {
    key: "beam:option:project:v1"
    value {
      string_value: "apache-beam-testing"
    }
  }
  fields {
    key: "beam:option:retain_docker_containers:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:runtime_type_check:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:save_main_session:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:sdk_location:v1"
    value {
      string_value: "container"
    }
  }
  fields {
    key: "beam:option:sdk_worker_parallelism:v1"
    value {
      string_value: "1"
    }
  }
  fields {
    key: "beam:option:shutdown_sources_on_final_watermark:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:spark_master_url:v1"
    value {
      string_value: "local[4]"
    }
  }
  fields {
    key: "beam:option:spark_submit_uber_jar:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:streaming:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:type_check_strictness:v1"
    value {
      string_value: "DEFAULT_TO_ANY"
    }
  }
  fields {
    key: "beam:option:update:v1"
    value {
      bool_value: false
    }
  }
}
job_name: "job"
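The fields dumped above are the Struct form of the pipeline options submitted to the portable job service. A rough sketch (Beam Python SDK, 2.20-era; only a subset of the fields shown above, with values copied from the dump) of building the same options directly:

    from apache_beam.options.pipeline_options import PipelineOptions, PortableOptions

    # A subset of the options from the dump above, expressed as command-line flags.
    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:8099',
        '--environment_type=DOCKER',
        '--environment_config=gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest',
        '--sdk_worker_parallelism=1',
        '--job_name=load_tests_Python_Flink_Batch_GBK_3_0121100249',
    ])
    portable = options.view_as(PortableOptions)
    print(portable.job_endpoint)  # localhost:8099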

apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: 32235cfa3d985bb95a926324f62a2c6f)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:301)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:209)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:186)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:116)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:84)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:84)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
	... 16 more
Caused by: java.util.concurrent.ExecutionException: org.apache.beam.vendor.grpc.v1p26p0.io.grpc.StatusRuntimeException: CANCELLED: cancelled before receiving half close
	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
	at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.$closeResource(FlinkExecutableStageFunction.java:204)
	at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.mapPartition(FlinkExecutableStageFunction.java:204)
	at org.apache.flink.runtime.operators.MapPartitionDriver.run(MapPartitionDriver.java:103)
	at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:504)
	at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:369)
	at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
	... 1 more
Caused by: org.apache.beam.vendor.grpc.v1p26p0.io.grpc.StatusRuntimeException: CANCELLED: cancelled before receiving half close
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Status.asRuntimeException(Status.java:524)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onCancel(ServerCalls.java:275)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.PartialForwardingServerCallListener.onCancel(PartialForwardingServerCallListener.java:40)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener.onCancel(ForwardingServerCallListener.java:23)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener$SimpleForwardingServerCallListener.onCancel(ForwardingServerCallListener.java:40)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Contexts$ContextualizedServerCallListener.onCancel(Contexts.java:96)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.closedInternal(ServerCallImpl.java:353)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.closed(ServerCallImpl.java:341)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1Closed.runInContext(ServerImpl.java:867)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	... 1 more

root: ERROR: org.apache.beam.vendor.grpc.v1p26p0.io.grpc.StatusRuntimeException: CANCELLED: cancelled before receiving half close
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 48.298s

FAILED (errors=1)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 53

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 56s
5 actionable tasks: 3 executed, 2 up-to-date

Publishing build scan...
https://gradle.com/s/vu5pq7hepib6e

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #220

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/220/display/redirect?page=changes>

Changes:

[kamil.wasilewski] Report status code 0 when no stale jobs are found


------------------------------------------
[...truncated 215.68 KB...]
    value {
      string_value: "localhost:8099"
    }
  }
  fields {
    key: "beam:option:job_name:v1"
    value {
      string_value: "load_tests_Python_Flink_Batch_GBK_1_0120105135"
    }
  }
  fields {
    key: "beam:option:job_port:v1"
    value {
      string_value: "0"
    }
  }
  fields {
    key: "beam:option:job_server_timeout:v1"
    value {
      string_value: "60"
    }
  }
  fields {
    key: "beam:option:load_balance_bundles:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:no_auth:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:object_reuse:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:parallelism:v1"
    value {
      string_value: "5"
    }
  }
  fields {
    key: "beam:option:pipeline_type_check:v1"
    value {
      bool_value: true
    }
  }
  fields {
    key: "beam:option:profile_cpu:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_memory:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_sample_rate:v1"
    value {
      number_value: 1.0
    }
  }
  fields {
    key: "beam:option:project:v1"
    value {
      string_value: "apache-beam-testing"
    }
  }
  fields {
    key: "beam:option:retain_docker_containers:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:runtime_type_check:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:save_main_session:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:sdk_location:v1"
    value {
      string_value: "container"
    }
  }
  fields {
    key: "beam:option:sdk_worker_parallelism:v1"
    value {
      string_value: "1"
    }
  }
  fields {
    key: "beam:option:shutdown_sources_on_final_watermark:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:spark_master_url:v1"
    value {
      string_value: "local[4]"
    }
  }
  fields {
    key: "beam:option:spark_submit_uber_jar:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:streaming:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:type_check_strictness:v1"
    value {
      string_value: "DEFAULT_TO_ANY"
    }
  }
  fields {
    key: "beam:option:update:v1"
    value {
      bool_value: false
    }
  }
}
job_name: "job"

apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: b8b64454b89572cf9485f10696ebc55a)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:301)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:209)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:186)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:116)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:84)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:84)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
	... 16 more
Caused by: java.lang.Exception: The user defined 'open()' method caused an exception: java.io.IOException: Received exit code 126 for command 'docker run -d --network=host --env=DOCKER_MAC_CONTAINER=null --rm gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest --id=1-1 --logging_endpoint=localhost:42083 --artifact_endpoint=localhost:33957 --provision_endpoint=localhost:42023 --control_endpoint=localhost:40871'. stderr: docker: Got permission denied while trying to connect to the Docker daemon socket at unix:///var/run/docker.sock: Post http://%2Fvar%2Frun%2Fdocker.sock/v1.38/containers/create: dial unix /var/run/docker.sock: connect: permission denied.See 'docker run --help'.
	at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:499)
	at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:369)
	at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
	... 1 more
Caused by: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException: java.io.IOException: Received exit code 126 for command 'docker run -d --network=host --env=DOCKER_MAC_CONTAINER=null --rm gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest --id=1-1 --logging_endpoint=localhost:42083 --artifact_endpoint=localhost:33957 --provision_endpoint=localhost:42023 --control_endpoint=localhost:40871'. stderr: docker: Got permission denied while trying to connect to the Docker daemon socket at unix:///var/run/docker.sock: Post http://%2Fvar%2Frun%2Fdocker.sock/v1.38/containers/create: dial unix /var/run/docker.sock: connect: permission denied.See 'docker run --help'.
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:4966)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory.<init>(DefaultJobBundleFactory.java:331)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory.<init>(DefaultJobBundleFactory.java:320)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.forStage(DefaultJobBundleFactory.java:250)
	at org.apache.beam.runners.fnexecution.control.DefaultExecutableStageContext.getStageBundleFactory(DefaultExecutableStageContext.java:38)
	at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.getStageBundleFactory(ReferenceCountingExecutableStageContextFactory.java:198)
	at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.open(FlinkExecutableStageFunction.java:137)
	at org.apache.flink.api.common.functions.util.FunctionUtils.openFunction(FunctionUtils.java:36)
	at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:495)
	... 4 more
Caused by: java.io.IOException: Received exit code 126 for command 'docker run -d --network=host --env=DOCKER_MAC_CONTAINER=null --rm gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest --id=1-1 --logging_endpoint=localhost:42083 --artifact_endpoint=localhost:33957 --provision_endpoint=localhost:42023 --control_endpoint=localhost:40871'. stderr: docker: Got permission denied while trying to connect to the Docker daemon socket at unix:///var/run/docker.sock: Post http://%2Fvar%2Frun%2Fdocker.sock/v1.38/containers/create: dial unix /var/run/docker.sock: connect: permission denied.See 'docker run --help'.
	at org.apache.beam.runners.fnexecution.environment.DockerCommand.runShortCommand(DockerCommand.java:234)
	at org.apache.beam.runners.fnexecution.environment.DockerCommand.runShortCommand(DockerCommand.java:168)
	at org.apache.beam.runners.fnexecution.environment.DockerCommand.runImage(DockerCommand.java:92)
	at org.apache.beam.runners.fnexecution.environment.DockerEnvironmentFactory.createEnvironment(DockerEnvironmentFactory.java:159)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$1.load(DefaultJobBundleFactory.java:200)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$1.load(DefaultJobBundleFactory.java:184)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3528)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2277)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2154)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2044)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:4964)
	... 12 more

root: ERROR: java.io.IOException: Received exit code 126 for command 'docker run -d --network=host --env=DOCKER_MAC_CONTAINER=null --rm gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest --id=1-1 --logging_endpoint=localhost:42083 --artifact_endpoint=localhost:33957 --provision_endpoint=localhost:42023 --control_endpoint=localhost:40871'. stderr: docker: Got permission denied while trying to connect to the Docker daemon socket at unix:///var/run/docker.sock: Post http://%2Fvar%2Frun%2Fdocker.sock/v1.38/containers/create: dial unix /var/run/docker.sock: connect: permission denied.See 'docker run --help'.
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------
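The exit code 126 above comes from the SDK harness container failing to start: the user running the Flink task manager on the Dataproc worker cannot reach /var/run/docker.sock. A small diagnostic sketch (plain Python, standard library only, meant to be run on the affected worker; assumes a 'docker' group exists on the node) that reproduces the check:

    import grp
    import os
    import pwd

    SOCK = '/var/run/docker.sock'

    user = pwd.getpwuid(os.geteuid()).pw_name
    docker_group = grp.getgrnam('docker')  # raises KeyError if the group is missing

    # The docker CLI needs read/write access to the daemon socket; in practice
    # that usually means the invoking user must be a member of the 'docker' group.
    print('user:', user)
    print('in docker group:', user in docker_group.gr_mem)
    print('socket writable:', os.access(SOCK, os.R_OK | os.W_OK))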

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 19.685s

FAILED (errors=1)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 53

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 27s
5 actionable tasks: 3 executed, 2 up-to-date

Publishing build scan...
https://gradle.com/s/vwiqf4cx3lgfo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #219

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/219/display/redirect>

Changes:


------------------------------------------
[...truncated 158.92 KB...]
+ echo 'Using Yarn Application master: beam-loadtests-python-gbk-flink-batch-219-w-0.c.apache-beam-testing.internal:35797'
Using Yarn Application master: beam-loadtests-python-gbk-flink-batch-219-w-0.c.apache-beam-testing.internal:35797
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
+ start_job_server
+ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-gbk-flink-batch-219-m '--command=sudo --user yarn docker run --detach --publish 8099:8099 --publish 8098:8098 --publish 8097:8097 --volume ~/.config/gcloud:/root/.config/gcloud gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest --flink-master=beam-loadtests-python-gbk-flink-batch-219-w-0.c.apache-beam-testing.internal:35797 --artifacts-dir=gs://beam-flink-cluster/beam-loadtests-python-gbk-flink-batch-219'
3d6e67a7fb3599edb1c39b746d1ee408b688f67d60d342d56f09c7806450c58b
+ start_tunnel
++ gcloud compute ssh --quiet --zone=us-central1-a yarn@beam-loadtests-python-gbk-flink-batch-219-m '--command=curl -s "http://beam-loadtests-python-gbk-flink-batch-219-w-0.c.apache-beam-testing.internal:35797/jobmanager/config"'
+ local 'job_server_config=[{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1579439547254_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-python-gbk-flink-batch-219-w-0.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"FLINK_PLUGINS_DIR","value":"/usr/lib/flink/plugins"},{"key":"web.tmpdir","value":"/tmp/flink-web-d51bc4cf-0237-4bc5-b5a9-c50b88f4f199"},{"key":"jobmanager.rpc.port","value":"41117"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1579439547254_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-python-gbk-flink-batch-219-w-0.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local key=jobmanager.rpc.port
++ echo beam-loadtests-python-gbk-flink-batch-219-w-0.c.apache-beam-testing.internal:35797
++ cut -d : -f1
+ local yarn_application_master_host=beam-loadtests-python-gbk-flink-batch-219-w-0.c.apache-beam-testing.internal
++ echo '[{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1579439547254_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-python-gbk-flink-batch-219-w-0.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"FLINK_PLUGINS_DIR","value":"/usr/lib/flink/plugins"},{"key":"web.tmpdir","value":"/tmp/flink-web-d51bc4cf-0237-4bc5-b5a9-c50b88f4f199"},{"key":"jobmanager.rpc.port","value":"41117"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1579439547254_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address++ python -c 'import sys, json; print [ e['\''value'\''] for e in json.load(sys.stdin) if e['\''key'\''] == u'\''jobmanager.rpc.port'\''][0]'
","value":"beam-loadtests-python-gbk-flink-batch-219-w-0.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local jobmanager_rpc_port=41117
++ [[ true == \t\r\u\e ]]
++ echo ' -Nf >& /dev/null'
+ local 'detached_mode_params= -Nf >& /dev/null'
++ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
++ echo '-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'job_server_ports_forwarding=-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'tunnel_command=gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-gbk-flink-batch-219-m -- -L 8081:beam-loadtests-python-gbk-flink-batch-219-w-0.c.apache-beam-testing.internal:35797 -L 41117:beam-loadtests-python-gbk-flink-batch-219-w-0.c.apache-beam-testing.internal:41117 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080  -Nf >& /dev/null'
+ eval gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-gbk-flink-batch-219-m -- -L 8081:beam-loadtests-python-gbk-flink-batch-219-w-0.c.apache-beam-testing.internal:35797 -L 41117:beam-loadtests-python-gbk-flink-batch-219-w-0.c.apache-beam-testing.internal:41117 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf '>&' /dev/null
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-gbk-flink-batch-219-m -- -L 8081:beam-loadtests-python-gbk-flink-batch-219-w-0.c.apache-beam-testing.internal:35797 -L 41117:beam-loadtests-python-gbk-flink-batch-219-w-0.c.apache-beam-testing.internal:41117 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf
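For readability: the traced python -c one-liner above only pulls jobmanager.rpc.port out of the Flink properties JSON shown earlier, and the result feeds the SSH port forwards below. A minimal standalone sketch of the same lookup; the JSON is shortened for illustration, and only the key name and the 41117 value come from the trace:

    import json

    # Shortened stand-in for the job_server_config JSON captured in the trace above.
    job_server_config = '[{"key": "jobmanager.rpc.port", "value": "41117"}]'

    properties = json.loads(job_server_config)
    jobmanager_rpc_port = [e["value"] for e in properties if e["key"] == "jobmanager.rpc.port"][0]
    print(jobmanager_rpc_port)  # -> 41117, matching jobmanager_rpc_port in the trace above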
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins5194729264703001397.sh
+ echo src Load test: 2GB of 10B records src
src Load test: 2GB of 10B records src
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/gradlew> -PloadTest.mainClass=apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey -Prunner=PortableRunner '-PloadTest.args=--job_name=load_tests_Python_Flink_Batch_GBK_1_0119100244 --publish_to_big_query=true --project=apache-beam-testing --metrics_dataset=load_test --metrics_table=python_flink_batch_GBK_1 --input_options='{"num_records": 200000000,"key_size": 1,"value_size":9}' --iterations=1 --fanout=1 --parallelism=5 --job_endpoint=localhost:8099 --environment_config=gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest --environment_type=DOCKER --runner=PortableRunner' --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :sdks:python:apache_beam:testing:load_tests:run
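The "2GB of 10B records" label corresponds to the --input_options above: 200,000,000 synthetic records, each a 1-byte key plus a 9-byte value. A quick illustrative check of the raw payload size (not part of the build output):

    num_records = 200000000
    record_bytes = 1 + 9  # key_size + value_size from --input_options
    print(num_records * record_bytes / 1e9)  # 2.0 GB of raw key/value payload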
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.
> Task :sdks:python:setupVirtualenv UP-TO-DATE
> Task :sdks:python:apache_beam:testing:load_tests:setupVirtualenv UP-TO-DATE

> Task :sdks:python:sdist
setup.py:244: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1922375555/local/lib/python2.7/site-packages/setuptools/dist.py>:476: UserWarning: Normalizing '2.20.0.dev' to '2.20.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
warning: sdist: standard file not found: should have one of README, README.rst, README.txt, README.md


> Task :sdks:python:apache_beam:testing:load_tests:installGcpTest
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Processing <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/build/apache-beam.tar.gz>
Requirement already satisfied: crcmod<2.0,>=1.7 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.7)
Requirement already satisfied: dill<0.3.2,>=0.3.1.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (0.3.1.1)
Requirement already satisfied: fastavro<0.22,>=0.21.4 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (0.21.24)
Requirement already satisfied: future<1.0.0,>=0.16.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (0.16.0)
Requirement already satisfied: grpcio<2,>=1.12.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.26.0)
Requirement already satisfied: hdfs<3.0.0,>=2.1.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (2.5.8)
Requirement already satisfied: httplib2<=0.12.0,>=0.8 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (0.12.0)
Requirement already satisfied: mock<3.0.0,>=1.0.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (2.0.0)
Requirement already satisfied: numpy<2,>=1.14.3 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.16.6)
Requirement already satisfied: pymongo<4.0.0,>=3.8.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (3.10.1)
Requirement already satisfied: oauth2client<4,>=2.0.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (3.0.0)
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (3.11.2)
Requirement already satisfied: pydot<2,>=1.2.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.4.1)
Requirement already satisfied: python-dateutil<3,>=2.8.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (2.8.1)
Requirement already satisfied: pytz>=2018.3 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (2019.3)
Requirement already satisfied: avro<2.0.0,>=1.8.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.9.1)
Requirement already satisfied: funcsigs<2,>=1.0.2 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.0.2)
Requirement already satisfied: futures<4.0.0,>=3.2.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (3.3.0)
Requirement already satisfied: pyvcf<0.7.0,>=0.6.8 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (0.6.8)
Requirement already satisfied: pyarrow<0.16.0,>=0.15.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (0.15.1)
Requirement already satisfied: typing<3.8.0,>=3.7.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (3.7.4.1)
Requirement already satisfied: typing-extensions<3.8.0,>=3.7.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (3.7.4.1)
Requirement already satisfied: cachetools<4,>=3.1.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (3.1.1)
Requirement already satisfied: google-apitools<0.5.29,>=0.5.28 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (0.5.28)
Requirement already satisfied: google-cloud-datastore<1.8.0,>=1.7.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.7.4)
Requirement already satisfied: google-cloud-pubsub<1.1.0,>=0.39.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.0.2)
Requirement already satisfied: google-cloud-bigquery<1.18.0,>=1.6.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.17.1)
Requirement already satisfied: google-cloud-core<2,>=0.28.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.2.0)
Requirement already satisfied: google-cloud-bigtable<1.1.0,>=0.31.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.0.0)
Requirement already satisfied: google-cloud-spanner>=1.7.1<1.8.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.13.0)
Requirement already satisfied: grpcio-gcp<1,>=0.2.2 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (0.2.2)
Requirement already satisfied: googledatastore<7.1,>=7.0.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (7.0.2)
Requirement already satisfied: proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (0.90.4)
Requirement already satisfied: freezegun>=0.3.12 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (0.3.13)
Requirement already satisfied: nose>=1.3.7 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.3.7)
Requirement already satisfied: nose_xunitmp>=0.4.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (0.4.1)
Requirement already satisfied: pandas<0.25,>=0.23.4 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (0.24.2)
Requirement already satisfied: parameterized<0.8.0,>=0.6.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (0.7.1)
Requirement already satisfied: pyhamcrest!=1.10.0,<2.0.0,>=1.9 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.10.1)
Requirement already satisfied: pyyaml<6.0.0,>=3.12 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (5.3)
Requirement already satisfied: requests_mock<2.0,>=1.7 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.7.0)
Requirement already satisfied: tenacity<6.0,>=5.0.2 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (5.1.5)
Requirement already satisfied: pytest<5.0,>=4.4.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (4.6.9)
Requirement already satisfied: pytest-xdist<2,>=1.29.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.20.0.dev0) (1.31.0)
Requirement already satisfied: six>=1.5.2 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from grpcio<2,>=1.12.1->apache-beam==2.20.0.dev0) (1.14.0)
Requirement already satisfied: enum34>=1.0.4; python_version < "3.4" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from grpcio<2,>=1.12.1->apache-beam==2.20.0.dev0) (1.1.6)
Requirement already satisfied: docopt in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from hdfs<3.0.0,>=2.1.0->apache-beam==2.20.0.dev0) (0.6.2)
Requirement already satisfied: requests>=2.7.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from hdfs<3.0.0,>=2.1.0->apache-beam==2.20.0.dev0) (2.22.0)
Requirement already satisfied: pbr>=0.11 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from mock<3.0.0,>=1.0.1->apache-beam==2.20.0.dev0) (5.4.4)
Requirement already satisfied: pyasn1>=0.1.7 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from oauth2client<4,>=2.0.1->apache-beam==2.20.0.dev0) (0.4.8)
Requirement already satisfied: pyasn1-modules>=0.0.5 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from oauth2client<4,>=2.0.1->apache-beam==2.20.0.dev0) (0.2.8)
Requirement already satisfied: rsa>=3.1.4 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from oauth2client<4,>=2.0.1->apache-beam==2.20.0.dev0) (4.0)
Requirement already satisfied: setuptools in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from protobuf<4,>=3.5.0.post1->apache-beam==2.20.0.dev0) (44.0.0)
Requirement already satisfied: pyparsing>=2.1.4 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pydot<2,>=1.2.0->apache-beam==2.20.0.dev0) (2.4.6)
Requirement already satisfied: fasteners>=0.14 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from google-apitools<0.5.29,>=0.5.28->apache-beam==2.20.0.dev0) (0.15)
Requirement already satisfied: google-api-core[grpc]<2.0.0dev,>=1.6.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from google-cloud-datastore<1.8.0,>=1.7.1->apache-beam==2.20.0.dev0) (1.16.0)
Requirement already satisfied: grpc-google-iam-v1<0.13dev,>=0.12.3 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from google-cloud-pubsub<1.1.0,>=0.39.0->apache-beam==2.20.0.dev0) (0.12.3)
Requirement already satisfied: google-resumable-media<0.5.0dev,>=0.3.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from google-cloud-bigquery<1.18.0,>=1.6.0->apache-beam==2.20.0.dev0) (0.4.1)
Requirement already satisfied: googleapis-common-protos<2.0dev,>=1.5.2 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0->apache-beam==2.20.0.dev0) (1.51.0)
Requirement already satisfied: monotonic>=0.6; python_version == "2.7" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from tenacity<6.0,>=5.0.2->apache-beam==2.20.0.dev0) (1.5)
Requirement already satisfied: atomicwrites>=1.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (1.3.0)
Requirement already satisfied: packaging in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (20.0)
Requirement already satisfied: wcwidth in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (0.1.8)
Requirement already satisfied: importlib-metadata>=0.12; python_version < "3.8" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (1.4.0)
Requirement already satisfied: py>=1.5.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (1.8.1)
Requirement already satisfied: pathlib2>=2.2.0; python_version < "3.6" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (2.3.5)
Requirement already satisfied: pluggy<1.0,>=0.12 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (0.13.1)
Requirement already satisfied: attrs>=17.4.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (19.3.0)
Requirement already satisfied: more-itertools<6.0.0,>=4.0.0; python_version <= "2.7" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (5.0.0)
Requirement already satisfied: pytest-forked in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest-xdist<2,>=1.29.0->apache-beam==2.20.0.dev0) (1.1.3)
Requirement already satisfied: execnet>=1.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest-xdist<2,>=1.29.0->apache-beam==2.20.0.dev0) (1.7.1)
Requirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.20.0.dev0) (1.25.7)
Requirement already satisfied: certifi>=2017.4.17 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.20.0.dev0) (2019.11.28)
Requirement already satisfied: chardet<3.1.0,>=3.0.2 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.20.0.dev0) (3.0.4)
Requirement already satisfied: idna<2.9,>=2.5 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.20.0.dev0) (2.8)
Requirement already satisfied: google-auth<2.0dev,>=0.4.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from google-api-core[grpc]<2.0.0dev,>=1.6.0->google-cloud-datastore<1.8.0,>=1.7.1->apache-beam==2.20.0.dev0) (1.10.1)
Requirement already satisfied: contextlib2; python_version < "3" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (0.6.0.post1)
Requirement already satisfied: zipp>=0.5 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (1.0.0)
Requirement already satisfied: configparser>=3.5; python_version < "3" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (4.0.2)
Requirement already satisfied: scandir; python_version < "3.5" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pathlib2>=2.2.0; python_version < "3.6"->pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (1.10.0)
Requirement already satisfied: apipkg>=1.4 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from execnet>=1.1->pytest-xdist<2,>=1.29.0->apache-beam==2.20.0.dev0) (1.5)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.20.0.dev0-cp27-none-any.whl size=1873404 sha256=479910d860efbe0e4699e3b63f6c82a6b495863fb422d198f50ffe47477cb08d
  Stored in directory: /home/jenkins/.cache/pip/wheels/cc/83/a8/84eec4642678434c71d7c32582fda394dd6b649d42079a46b0
Successfully built apache-beam
Installing collected packages: apache-beam
  Found existing installation: apache-beam 2.20.0.dev0
    Uninstalling apache-beam-2.20.0.dev0:
      Successfully uninstalled apache-beam-2.20.0.dev0
Successfully installed apache-beam-2.20.0.dev0

> Task :sdks:python:apache_beam:testing:load_tests:run
setup.py:244: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/local/lib/python2.7/site-packages/setuptools/dist.py>:476: UserWarning: Normalizing '2.20.0.dev' to '2.20.0.dev0'
  normalized_version,
running nosetests
running egg_info
Skipping proto regeneration: all files up to date
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
testGroupByKey (apache_beam.testing.load_tests.group_by_key_test.GroupByKeyTest) ... Terminated

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 53

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 25m 19s
5 actionable tasks: 3 executed, 2 up-to-date

Publishing build scan...
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=de2804a7-d936-4340-88fb-3fb69ac578cf, currentDir=<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src}>
Attempting to read last messages from the daemon log...
Daemon pid: 12457
  log file: /home/jenkins/.gradle/daemon/5.2.1/daemon-12457.out.log
----- Last  20 lines from daemon log file - daemon-12457.out.log -----
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 53

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 25m 19s
5 actionable tasks: 3 executed, 2 up-to-date

Publishing build scan...
Daemon vm is shutting down... The daemon has exited normally or was terminated in response to a user interrupt.
----- End of the daemon log -----


FAILURE: Build failed with an exception.

* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may have crashed)

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
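The daemon crash reported above (together with the test ending in "Terminated") is consistent with the build being killed externally, for example by the executor running out of memory or hitting a job timeout, rather than a Gradle defect. When that is suspected, a common experiment is to cap the daemon heap or disable the daemon entirely; an illustrative gradle.properties override (example values only, not what this job used):

    # gradle.properties (illustrative)
    org.gradle.daemon=false
    org.gradle.jvmargs=-Xmx2g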
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


beam_LoadTests_Python_GBK_Flink_Batch - Build # 218 - Aborted

Posted by Apache Jenkins Server <je...@builds.apache.org>.
The Apache Jenkins build system has built beam_LoadTests_Python_GBK_Flink_Batch (build #218)

Status: Aborted

Check console output at https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/218/ to view the results.

Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #217

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/217/display/redirect?page=changes>

Changes:

[robertwb] Automatically convert to with pipeline syntax.

[robertwb] Quick pass through failed auto-conversions.

[robertwb] Automatic conversion of more pipelines.

[robertwb] Fix lint and tests due to autoconversion.

[robertwb] A couple more conversions.

[robertwb] Fix lint and tests due to autoconversion.

[robertwb] Return non-None result for Dataflow dry run.

[robertwb] Fix lint and tests due to autoconversion.

[kamil.wasilewski] [BEAM-8939] A bash script that cancels stale dataflow jobs

[marek.simunek] [BEAM-9123] HadoopResourceId returns wrong directoryName bugfix

[mxm] [BEAM-9116] Limit the number of past invocations stored in JobService

[lukecwik] [BEAM-9124] Linkage Checker 1.1.2 to use Maven Central HTTPS URL

[robertwb] lint, reviewer comments

[relax] Merge pull request #10316: [BEAM-6857] Support Dynamic Timers

[robertwb] lint

[github] Merge pull request #10577 from Adding Python test for ReadFromBigQuery

[github] [BEAM-9127] Fix output type declaration in xlang wordcount. (#10605)

[robertwb] fix merge

[angoenka] [BEAM-8625] Implement servlet for exposing sdk harness statuses in Da…


------------------------------------------
[...truncated 274.49 KB...]
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:job_endpoint:v1"
    value {
      string_value: "localhost:8099"
    }
  }
  fields {
    key: "beam:option:job_name:v1"
    value {
      string_value: "load_tests_Python_Flink_Batch_GBK_3_0117110238"
    }
  }
  fields {
    key: "beam:option:job_port:v1"
    value {
      string_value: "0"
    }
  }
  fields {
    key: "beam:option:job_server_timeout:v1"
    value {
      string_value: "60"
    }
  }
  fields {
    key: "beam:option:load_balance_bundles:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:no_auth:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:object_reuse:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:parallelism:v1"
    value {
      string_value: "5"
    }
  }
  fields {
    key: "beam:option:pipeline_type_check:v1"
    value {
      bool_value: true
    }
  }
  fields {
    key: "beam:option:profile_cpu:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_memory:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_sample_rate:v1"
    value {
      number_value: 1.0
    }
  }
  fields {
    key: "beam:option:project:v1"
    value {
      string_value: "apache-beam-testing"
    }
  }
  fields {
    key: "beam:option:retain_docker_containers:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:runtime_type_check:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:save_main_session:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:sdk_location:v1"
    value {
      string_value: "container"
    }
  }
  fields {
    key: "beam:option:sdk_worker_parallelism:v1"
    value {
      string_value: "1"
    }
  }
  fields {
    key: "beam:option:shutdown_sources_on_final_watermark:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:spark_master_url:v1"
    value {
      string_value: "local[4]"
    }
  }
  fields {
    key: "beam:option:spark_submit_uber_jar:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:streaming:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:type_check_strictness:v1"
    value {
      string_value: "DEFAULT_TO_ANY"
    }
  }
  fields {
    key: "beam:option:update:v1"
    value {
      bool_value: false
    }
  }
}
job_name: "job"

apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: 5f9dddadcb6535c6d00959dca458884b)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:301)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:209)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:186)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:116)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:84)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:84)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
	... 16 more
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_e01_1579266911804_0001_01_000009 timed out.
	at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1149)
	at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:109)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:397)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:190)
	at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:74)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:152)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
	at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
	at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:170)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at akka.actor.Actor$class.aroundReceive(Actor.scala:517)
	at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
	at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
	at akka.actor.ActorCell.invoke(ActorCell.scala:561)
	at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
	at akka.dispatch.Mailbox.run(Mailbox.scala:225)
	at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
	at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

root: ERROR: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_e01_1579266911804_0001_01_000009 timed out.
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 116.514s

FAILED (errors=1)
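The captured error above bottoms out in a Flink TaskManager heartbeat timeout rather than a pipeline error, which usually points at an overloaded or lost worker. If this keeps recurring, one knob worth experimenting with is Flink's heartbeat configuration; an illustrative flink-conf.yaml override (example values, not what this cluster ran with):

    # Flink defaults: heartbeat.interval: 10000, heartbeat.timeout: 50000 (milliseconds)
    heartbeat.interval: 10000
    heartbeat.timeout: 120000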

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 53

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2m 5s
5 actionable tasks: 3 executed, 2 up-to-date

Publishing build scan...
https://gradle.com/s/c76lcvgucqnow

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #216

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/216/display/redirect?page=changes>

Changes:

[ehudm] [BEAM-8525] Support Const base in binary_subscr

[ehudm] Do not perform test on Py2.7

[ankurgoenka] [BEAM-9002] Add test_flatten_same_pcollections to fnapi runner

[hannahjiang] [BEAM-9084] cleaning up docker image tag

[valentyn] Add version guards to requirements file for integration tests.

[hannahjiang] [BEAM-9084] fix Java spotless

[iemejia] [website] Added security page

[iemejia] [website] Update the 2.17.0 release blog post to include security issues

[boyuanz] Moving to 2.20.0-SNAPSHOT on master branch.

[crites] Changes watermark advance from 1001 to 1000 since Dataflow TestStream

[lukecwik] [BEAM-8676] sdks/java: gax and grpc upgrades (#10554)

[lukecwik] [BEAM-9030] Migrate Beam to use beam-vendor-grpc-1_26_0 (#10578)

[lcwik] [BEAM-9030] Align version of protoc/protoc-gen-grpc-java to vendored

[boyuanz] Exclude testOutputTimestamp from flink streaming tests.

[lukecwik] [BEAM-7951] Supports multiple inputs/outputs for wire coder settings.


------------------------------------------
[...truncated 274.47 KB...]
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:job_endpoint:v1"
    value {
      string_value: "localhost:8099"
    }
  }
  fields {
    key: "beam:option:job_name:v1"
    value {
      string_value: "load_tests_Python_Flink_Batch_GBK_3_0116110813"
    }
  }
  fields {
    key: "beam:option:job_port:v1"
    value {
      string_value: "0"
    }
  }
  fields {
    key: "beam:option:job_server_timeout:v1"
    value {
      string_value: "60"
    }
  }
  fields {
    key: "beam:option:load_balance_bundles:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:no_auth:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:object_reuse:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:parallelism:v1"
    value {
      string_value: "5"
    }
  }
  fields {
    key: "beam:option:pipeline_type_check:v1"
    value {
      bool_value: true
    }
  }
  fields {
    key: "beam:option:profile_cpu:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_memory:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_sample_rate:v1"
    value {
      number_value: 1.0
    }
  }
  fields {
    key: "beam:option:project:v1"
    value {
      string_value: "apache-beam-testing"
    }
  }
  fields {
    key: "beam:option:retain_docker_containers:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:runtime_type_check:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:save_main_session:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:sdk_location:v1"
    value {
      string_value: "container"
    }
  }
  fields {
    key: "beam:option:sdk_worker_parallelism:v1"
    value {
      string_value: "1"
    }
  }
  fields {
    key: "beam:option:shutdown_sources_on_final_watermark:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:spark_master_url:v1"
    value {
      string_value: "local[4]"
    }
  }
  fields {
    key: "beam:option:spark_submit_uber_jar:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:streaming:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:type_check_strictness:v1"
    value {
      string_value: "DEFAULT_TO_ANY"
    }
  }
  fields {
    key: "beam:option:update:v1"
    value {
      bool_value: false
    }
  }
}
job_name: "job"

apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: edc9558580e33fb63a6d26c3d14934df)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:301)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:209)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:186)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:116)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:84)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:84)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
	... 16 more
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_e01_1579180340299_0001_01_000002 timed out.
	at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1149)
	at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:109)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:397)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:190)
	at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:74)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:152)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
	at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
	at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:170)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at akka.actor.Actor$class.aroundReceive(Actor.scala:517)
	at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
	at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
	at akka.actor.ActorCell.invoke(ActorCell.scala:561)
	at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
	at akka.dispatch.Mailbox.run(Mailbox.scala:225)
	at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
	at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

root: ERROR: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_e01_1579180340299_0001_01_000002 timed out.
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 116.362s

FAILED (errors=1)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 53

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2m 5s
5 actionable tasks: 3 executed, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ftczukkmoxtp6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #215

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/215/display/redirect?page=changes>

Changes:

[robertwb] [BEAM-6587] Remove hacks due to missing common string coder.

[kirillkozlov] Update data source for SQL performance tests

[chadrik] [BEAM-7746] Address changes in code since annotations were introduced

[chadrik] [BEAM-7746]  Typing fixes that require runtime code changes

[chadrik] [BEAM-7746] Avoid creating attributes dynamically, so that they can be

[chadrik] [BEAM-7746] Bugfix: coder id is expected to be str in python3

[chadrik] [BEAM-7746] Explicitly unpack tuple to avoid inferring unbounded tuple

[chadrik] [BEAM-7746] Generate files with protobuf urns as part of gen_protos

[chadrik] [BEAM-7746] Move name and coder to base StateSpec class

[chadrik] [BEAM-7746] Remove reference to missing attribute in

[chadrik] [BEAM-7746] Non-Optional arguments cannot default to None

[chadrik] [BEAM-7746] Avoid reusing variables with different data types

[chadrik] [BEAM-7746] Add StateHandler abstract base class

[chadrik] [BEAM-7746] Add TODO about fixing assignment to

[chadrik] [BEAM-7746] Fix functions that were defined twice

[chadrik] [BEAM-7746] Fix tests that have the same name

[iemejia] [BEAM-9040] Add skipQueries option to skip queries in a Nexmark suite

[iemejia] [BEAM-9040] Add Spark Structured Streaming Runner to Nexmark PostCommit

[valentyn] Switch to unittest.SkipTest instead of using nose.

[mxm] [BEAM-6008] Make sure to end stream only after sending all messages and

[chamikara] Sets the correct coder when clustering is enabled for the

[robertwb] Always initalize output processor on construction.

[github] [Go SDK Doc] Update Dead Container Link (#10585)

[github] Merge pull request #10582 for [INFRA-19670] Add .asf.yaml for Github


------------------------------------------
[...truncated 274.62 KB...]
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:job_endpoint:v1"
    value {
      string_value: "localhost:8099"
    }
  }
  fields {
    key: "beam:option:job_name:v1"
    value {
      string_value: "load_tests_Python_Flink_Batch_GBK_3_0115100243"
    }
  }
  fields {
    key: "beam:option:job_port:v1"
    value {
      string_value: "0"
    }
  }
  fields {
    key: "beam:option:job_server_timeout:v1"
    value {
      string_value: "60"
    }
  }
  fields {
    key: "beam:option:load_balance_bundles:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:no_auth:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:object_reuse:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:parallelism:v1"
    value {
      string_value: "5"
    }
  }
  fields {
    key: "beam:option:pipeline_type_check:v1"
    value {
      bool_value: true
    }
  }
  fields {
    key: "beam:option:profile_cpu:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_memory:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_sample_rate:v1"
    value {
      number_value: 1.0
    }
  }
  fields {
    key: "beam:option:project:v1"
    value {
      string_value: "apache-beam-testing"
    }
  }
  fields {
    key: "beam:option:retain_docker_containers:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:runtime_type_check:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:save_main_session:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:sdk_location:v1"
    value {
      string_value: "container"
    }
  }
  fields {
    key: "beam:option:sdk_worker_parallelism:v1"
    value {
      string_value: "1"
    }
  }
  fields {
    key: "beam:option:shutdown_sources_on_final_watermark:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:spark_master_url:v1"
    value {
      string_value: "local[4]"
    }
  }
  fields {
    key: "beam:option:spark_submit_uber_jar:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:streaming:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:type_check_strictness:v1"
    value {
      string_value: "DEFAULT_TO_ANY"
    }
  }
  fields {
    key: "beam:option:update:v1"
    value {
      bool_value: false
    }
  }
}
job_name: "job"

apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: 5499c081e0d62b74686ae1fc0fabba12)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:301)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:209)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:186)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:116)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:84)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:84)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
	... 16 more
Caused by: java.util.concurrent.ExecutionException: org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: CANCELLED: cancelled before receiving half close
	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
	at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.$closeResource(FlinkExecutableStageFunction.java:204)
	at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.mapPartition(FlinkExecutableStageFunction.java:204)
	at org.apache.flink.runtime.operators.MapPartitionDriver.run(MapPartitionDriver.java:103)
	at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:504)
	at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:369)
	at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
	... 1 more
Caused by: org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: CANCELLED: cancelled before receiving half close
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.Status.asRuntimeException(Status.java:524)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onCancel(ServerCalls.java:273)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.PartialForwardingServerCallListener.onCancel(PartialForwardingServerCallListener.java:40)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.ForwardingServerCallListener.onCancel(ForwardingServerCallListener.java:23)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.ForwardingServerCallListener$SimpleForwardingServerCallListener.onCancel(ForwardingServerCallListener.java:40)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.Contexts$ContextualizedServerCallListener.onCancel(Contexts.java:96)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.closed(ServerCallImpl.java:337)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1Closed.runInContext(ServerImpl.java:793)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	... 1 more

root: ERROR: org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: CANCELLED: cancelled before receiving half close
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 48.191s

FAILED (errors=1)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 53

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 55s
5 actionable tasks: 3 executed, 2 up-to-date

Publishing build scan...
https://gradle.com/s/txcn3nvp4o3cu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #214

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/214/display/redirect?page=changes>

Changes:

[dcavazos] [BEAM-7390] Add code snippet for Min

[dcavazos] [BEAM-7390] Add code snippet for Sum

[ehudm] Light cleanup of opcodes.py

[apilloud] [BEAM-8630] Use column numbers for BeamZetaSqlCalRel

[pawel.pasterz] [BEAM-7115] Fix metrics being incorrectly gathered

[mxm] Remove incorrectly tagged test annotation from test case

[mxm] [BEAM-6008] Propagate errors during pipeline execution in Java's

[github] Tighten language and remove distracting link

[pabloem] [BEAM-7390] Add code snippet for Top (#10179)

[bhulette] [BEAM-8993] [SQL] MongoDB predicate push down. (#10417)

[lukecwik] [BEAM-8740] Remove unused dependency from Spark runner (#10564)

[github] [BEAM-5605] Add support for channel splitting to the gRPC read "source"

[github] [BEAM-5605] Add support for additional parameters to SplittableDofn


------------------------------------------
[...truncated 280.28 KB...]
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:job_endpoint:v1"
    value {
      string_value: "localhost:8099"
    }
  }
  fields {
    key: "beam:option:job_name:v1"
    value {
      string_value: "load_tests_Python_Flink_Batch_GBK_3_0114115033"
    }
  }
  fields {
    key: "beam:option:job_port:v1"
    value {
      string_value: "0"
    }
  }
  fields {
    key: "beam:option:job_server_timeout:v1"
    value {
      string_value: "60"
    }
  }
  fields {
    key: "beam:option:load_balance_bundles:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:no_auth:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:object_reuse:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:parallelism:v1"
    value {
      string_value: "5"
    }
  }
  fields {
    key: "beam:option:pipeline_type_check:v1"
    value {
      bool_value: true
    }
  }
  fields {
    key: "beam:option:profile_cpu:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_memory:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_sample_rate:v1"
    value {
      number_value: 1.0
    }
  }
  fields {
    key: "beam:option:project:v1"
    value {
      string_value: "apache-beam-testing"
    }
  }
  fields {
    key: "beam:option:retain_docker_containers:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:runtime_type_check:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:save_main_session:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:sdk_location:v1"
    value {
      string_value: "container"
    }
  }
  fields {
    key: "beam:option:sdk_worker_parallelism:v1"
    value {
      string_value: "1"
    }
  }
  fields {
    key: "beam:option:shutdown_sources_on_final_watermark:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:spark_master_url:v1"
    value {
      string_value: "local[4]"
    }
  }
  fields {
    key: "beam:option:spark_submit_uber_jar:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:streaming:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:type_check_strictness:v1"
    value {
      string_value: "DEFAULT_TO_ANY"
    }
  }
  fields {
    key: "beam:option:update:v1"
    value {
      bool_value: false
    }
  }
}
job_name: "job"
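
For orientation: the text-proto block above is the rendering of the pipeline options that the load test handed to the Flink job server. A minimal sketch of roughly equivalent options in the Python SDK follows. It is hypothetical, not the actual load-test code: the option names are taken from the keys above, while the runner flag, the environment type and the GroupByKey payload are assumptions, and running it would require a job server listening on localhost:8099.

    # Hypothetical reconstruction of the submitted options; this is not the
    # actual load-test code under apache_beam/testing/load_tests.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=PortableRunner',        # assumption: not shown in the dump
        '--job_endpoint=localhost:8099',  # beam:option:job_endpoint:v1
        '--environment_type=DOCKER',      # assumption: not shown in the dump
        '--parallelism=5',                # beam:option:parallelism:v1
        '--sdk_worker_parallelism=1',     # beam:option:sdk_worker_parallelism:v1
    ])

    # Toy GroupByKey stand-in for the synthetic-source GBK load test.
    with beam.Pipeline(options=options) as pipeline:
        _ = (pipeline
             | 'Create' >> beam.Create([(i % 5, i) for i in range(100)])
             | 'GBK' >> beam.GroupByKey())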

apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: 5c13e299fccf5b9acee19d4decb2d7bd)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:301)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:209)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:186)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:116)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:84)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:84)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
	... 16 more
Caused by: java.util.concurrent.TimeoutException: The heartbeat of TaskManager with id container_e01_1579007937934_0001_01_000003  timed out.
	at org.apache.flink.runtime.resourcemanager.ResourceManager$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(ResourceManager.java:1146)
	at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:109)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:397)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:190)
	at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:74)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:152)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
	at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
	at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:170)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at akka.actor.Actor$class.aroundReceive(Actor.scala:517)
	at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
	at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
	at akka.actor.ActorCell.invoke(ActorCell.scala:561)
	at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
	at akka.dispatch.Mailbox.run(Mailbox.scala:225)
	at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
	at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

root: ERROR: java.util.concurrent.TimeoutException: The heartbeat of TaskManager with id container_e01_1579007937934_0001_01_000003  timed out.
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 104.361s

FAILED (errors=1)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 53

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 52s
5 actionable tasks: 3 executed, 2 up-to-date

Publishing build scan...
https://gradle.com/s/mgzfuup5bek3s

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #213

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/213/display/redirect?page=changes>

Changes:

[iemejia] [BEAM-8956] Begin unifying contributor instructions into a single


------------------------------------------
[...truncated 153.53 KB...]
    }
  }
  fields {
    key: "beam:option:job_port:v1"
    value {
      string_value: "0"
    }
  }
  fields {
    key: "beam:option:job_server_timeout:v1"
    value {
      string_value: "60"
    }
  }
  fields {
    key: "beam:option:load_balance_bundles:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:no_auth:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:object_reuse:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:parallelism:v1"
    value {
      string_value: "16"
    }
  }
  fields {
    key: "beam:option:pipeline_type_check:v1"
    value {
      bool_value: true
    }
  }
  fields {
    key: "beam:option:profile_cpu:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_memory:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_sample_rate:v1"
    value {
      number_value: 1.0
    }
  }
  fields {
    key: "beam:option:project:v1"
    value {
      string_value: "apache-beam-testing"
    }
  }
  fields {
    key: "beam:option:retain_docker_containers:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:runtime_type_check:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:save_main_session:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:sdk_location:v1"
    value {
      string_value: "container"
    }
  }
  fields {
    key: "beam:option:sdk_worker_parallelism:v1"
    value {
      string_value: "1"
    }
  }
  fields {
    key: "beam:option:shutdown_sources_on_final_watermark:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:spark_master_url:v1"
    value {
      string_value: "local[4]"
    }
  }
  fields {
    key: "beam:option:spark_submit_uber_jar:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:streaming:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:type_check_strictness:v1"
    value {
      string_value: "DEFAULT_TO_ANY"
    }
  }
  fields {
    key: "beam:option:update:v1"
    value {
      bool_value: false
    }
  }
}
job_name: "job"

apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: b4ae1a05ece403e5e48ab0eef6e89bd6)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:301)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:209)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:186)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:116)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:84)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:81)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
	... 16 more
Caused by: org.apache.flink.runtime.jobmanager.scheduler.NoResourceAvailableException: Could not allocate enough slots to run the job. Please make sure that the cluster has enough resources.
	at org.apache.flink.runtime.executiongraph.Execution.lambda$scheduleForExecution$0(Execution.java:460)
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
	at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
	at java.util.concurrent.CompletableFuture.completeExceptionally(CompletableFuture.java:1990)
	at org.apache.flink.runtime.jobmaster.slotpool.SchedulerImpl.lambda$internalAllocateSlot$0(SchedulerImpl.java:190)
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
	at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
	at java.util.concurrent.CompletableFuture.completeExceptionally(CompletableFuture.java:1990)
	at org.apache.flink.runtime.jobmaster.slotpool.SlotSharingManager$SingleTaskSlot.release(SlotSharingManager.java:700)
	at org.apache.flink.runtime.jobmaster.slotpool.SlotSharingManager$MultiTaskSlot.release(SlotSharingManager.java:484)
	at org.apache.flink.runtime.jobmaster.slotpool.SlotSharingManager$MultiTaskSlot.lambda$new$0(SlotSharingManager.java:380)
	at java.util.concurrent.CompletableFuture.uniHandle(CompletableFuture.java:836)
	at java.util.concurrent.CompletableFuture$UniHandle.tryFire(CompletableFuture.java:811)
	at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
	at java.util.concurrent.CompletableFuture.completeExceptionally(CompletableFuture.java:1990)
	at org.apache.flink.runtime.concurrent.FutureUtils$Timeout.run(FutureUtils.java:998)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:397)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:190)
	at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:74)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:152)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
	at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
	at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:170)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at akka.actor.Actor$class.aroundReceive(Actor.scala:517)
	at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
	at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
	at akka.actor.ActorCell.invoke(ActorCell.scala:561)
	at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
	at akka.dispatch.Mailbox.run(Mailbox.scala:225)
	at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
	at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

root: ERROR: org.apache.flink.runtime.jobmanager.scheduler.NoResourceAvailableException: Could not allocate enough slots to run the job. Please make sure that the cluster has enough resources.
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------
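
A note on the failure above: the NoResourceAvailableException typically means that the requested parallelism (16 in the options dump for this run) exceeds the number of task slots the Flink cluster can provide, so the scheduler gives up waiting for slots. A back-of-the-envelope check, with the cluster sizing as an assumption since it is not recorded in this log:

    # Illustrative arithmetic only; the task-manager count and the slots per
    # task manager are assumptions, not values taken from this log.
    requested_parallelism = 16      # beam:option:parallelism:v1 for this run
    task_managers = 15              # assumption
    slots_per_task_manager = 1      # assumption (Flink taskmanager.numberOfTaskSlots)

    available_slots = task_managers * slots_per_task_manager
    if available_slots < requested_parallelism:
        print('Not enough slots: %d available < %d requested'
              % (available_slots, requested_parallelism))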

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 309.273s

FAILED (errors=1)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 53

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 41s
5 actionable tasks: 4 executed, 1 up-to-date

Publishing build scan...
https://gradle.com/s/hp2ba3stvqh6m

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #212

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/212/display/redirect>

Changes:


------------------------------------------
[...truncated 274.66 KB...]
      string_value: "localhost:8099"
    }
  }
  fields {
    key: "beam:option:job_name:v1"
    value {
      string_value: "load_tests_Python_Flink_Batch_GBK_3_0112100243"
    }
  }
  fields {
    key: "beam:option:job_port:v1"
    value {
      string_value: "0"
    }
  }
  fields {
    key: "beam:option:job_server_timeout:v1"
    value {
      string_value: "60"
    }
  }
  fields {
    key: "beam:option:load_balance_bundles:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:no_auth:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:object_reuse:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:parallelism:v1"
    value {
      string_value: "5"
    }
  }
  fields {
    key: "beam:option:pipeline_type_check:v1"
    value {
      bool_value: true
    }
  }
  fields {
    key: "beam:option:profile_cpu:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_memory:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_sample_rate:v1"
    value {
      number_value: 1.0
    }
  }
  fields {
    key: "beam:option:project:v1"
    value {
      string_value: "apache-beam-testing"
    }
  }
  fields {
    key: "beam:option:retain_docker_containers:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:runtime_type_check:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:save_main_session:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:sdk_location:v1"
    value {
      string_value: "container"
    }
  }
  fields {
    key: "beam:option:sdk_worker_parallelism:v1"
    value {
      string_value: "1"
    }
  }
  fields {
    key: "beam:option:shutdown_sources_on_final_watermark:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:spark_master_url:v1"
    value {
      string_value: "local[4]"
    }
  }
  fields {
    key: "beam:option:spark_submit_uber_jar:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:streaming:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:type_check_strictness:v1"
    value {
      string_value: "DEFAULT_TO_ANY"
    }
  }
  fields {
    key: "beam:option:update:v1"
    value {
      bool_value: false
    }
  }
}
job_name: "job"

apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: 473ea478f2100189ec42080d32240ad1)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:301)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:209)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:186)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:116)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:84)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:81)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
	... 16 more
Caused by: java.lang.Exception: The data preparation for task 'GroupReduce (GroupReduce at GroupByKey 0)' , caused an error: Error obtaining the sorted input: Thread 'SortMerger Reading Thread' terminated due to an exception: Connection unexpectedly closed by remote task manager 'beam-loadtests-python-gbk-flink-batch-212-w-0.c.apache-beam-testing.internal/10.128.0.49:44601'. This might indicate that the remote task manager was lost.
	at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:480)
	at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:369)
	at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
	... 1 more
Caused by: java.lang.RuntimeException: Error obtaining the sorted input: Thread 'SortMerger Reading Thread' terminated due to an exception: Connection unexpectedly closed by remote task manager 'beam-loadtests-python-gbk-flink-batch-212-w-0.c.apache-beam-testing.internal/10.128.0.49:44601'. This might indicate that the remote task manager was lost.
	at org.apache.flink.runtime.operators.sort.UnilateralSortMerger.getIterator(UnilateralSortMerger.java:650)
	at org.apache.flink.runtime.operators.BatchTask.getInput(BatchTask.java:1109)
	at org.apache.flink.runtime.operators.GroupReduceDriver.prepare(GroupReduceDriver.java:99)
	at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:474)
	... 4 more
Caused by: java.io.IOException: Thread 'SortMerger Reading Thread' terminated due to an exception: Connection unexpectedly closed by remote task manager 'beam-loadtests-python-gbk-flink-batch-212-w-0.c.apache-beam-testing.internal/10.128.0.49:44601'. This might indicate that the remote task manager was lost.
	at org.apache.flink.runtime.operators.sort.UnilateralSortMerger$ThreadBase.run(UnilateralSortMerger.java:831)
Caused by: org.apache.flink.runtime.io.network.netty.exception.RemoteTransportException: Connection unexpectedly closed by remote task manager 'beam-loadtests-python-gbk-flink-batch-212-w-0.c.apache-beam-testing.internal/10.128.0.49:44601'. This might indicate that the remote task manager was lost.
	at org.apache.flink.runtime.io.network.netty.CreditBasedPartitionRequestClientHandler.channelInactive(CreditBasedPartitionRequestClientHandler.java:136)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
	at org.apache.flink.shaded.netty4.io.netty.handler.codec.ByteToMessageDecoder.channelInputClosed(ByteToMessageDecoder.java:390)
	at org.apache.flink.shaded.netty4.io.netty.handler.codec.ByteToMessageDecoder.channelInactive(ByteToMessageDecoder.java:355)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
	at org.apache.flink.shaded.netty4.io.netty.channel.DefaultChannelPipeline$HeadContext.channelInactive(DefaultChannelPipeline.java:1429)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
	at org.apache.flink.shaded.netty4.io.netty.channel.DefaultChannelPipeline.fireChannelInactive(DefaultChannelPipeline.java:947)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannel$AbstractUnsafe$8.run(AbstractChannel.java:826)
	at org.apache.flink.shaded.netty4.io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
	at org.apache.flink.shaded.netty4.io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:404)
	at org.apache.flink.shaded.netty4.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:474)
	at org.apache.flink.shaded.netty4.io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:909)
	at java.lang.Thread.run(Thread.java:748)

root: ERROR: org.apache.flink.runtime.io.network.netty.exception.RemoteTransportException: Connection unexpectedly closed by remote task manager 'beam-loadtests-python-gbk-flink-batch-212-w-0.c.apache-beam-testing.internal/10.128.0.49:44601'. This might indicate that the remote task manager was lost.
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 24.133s

FAILED (errors=1)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 53

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 32s
5 actionable tasks: 3 executed, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ftnl74nsty2ik

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #211

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/211/display/redirect?page=changes>

Changes:

[valentyn] Install SDK after tarball is generated to avoid a race in proto stubs

[ehudm] junitxml_report: Add failure tag support

[jeff] BEAM-8745 More fine-grained controls for the size of a BigQuery Load job

[suztomo] google_auth_version 0.19.0

[kcweaver] [BEAM-9070] tests use absolute paths for job server jars

[apilloud] [BEAM-9027] Unparse DOY/DOW/WEEK Enums properly for ZetaSQL

[kcweaver] [BEAM-8337] Hard-code Flink versions.

[echauchot] [BEAM-9019] Remove BeamCoderWrapper to avoid extra object allocation and

[lukecwik] [BEAM-8624] Implement Worker Status FnService in Dataflow runner

[github] [BEAM-5605] Add support for executing pair with restriction, split

[kcweaver] fix indentation

[kcweaver] Update release guide

[lostluck] [BEAM-9080] Support KVs in the Go SDK's Partition

[github] Rephrasing lull logging to avoid alarming users (#10446)

[robertwb] [BEAM-8575] Added counter tests for CombineFn (#10190)

[github] [BEAM-8490] Fix instance_to_type for empty containers (#9894)

[apilloud] [BEAM-9027] Backport BigQuerySqlDialect fixes

[robertwb] [BEAM-8575] Test hot-key fanout with accumulation modes. (#10159)

[github] [BEAM-9059] Use string constants in PTransformTranslation instead of


------------------------------------------
[...truncated 274.21 KB...]
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:job_endpoint:v1"
    value {
      string_value: "localhost:8099"
    }
  }
  fields {
    key: "beam:option:job_name:v1"
    value {
      string_value: "load_tests_Python_Flink_Batch_GBK_3_0111100245"
    }
  }
  fields {
    key: "beam:option:job_port:v1"
    value {
      string_value: "0"
    }
  }
  fields {
    key: "beam:option:job_server_timeout:v1"
    value {
      string_value: "60"
    }
  }
  fields {
    key: "beam:option:load_balance_bundles:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:no_auth:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:object_reuse:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:parallelism:v1"
    value {
      string_value: "5"
    }
  }
  fields {
    key: "beam:option:pipeline_type_check:v1"
    value {
      bool_value: true
    }
  }
  fields {
    key: "beam:option:profile_cpu:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_memory:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_sample_rate:v1"
    value {
      number_value: 1.0
    }
  }
  fields {
    key: "beam:option:project:v1"
    value {
      string_value: "apache-beam-testing"
    }
  }
  fields {
    key: "beam:option:retain_docker_containers:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:runtime_type_check:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:save_main_session:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:sdk_location:v1"
    value {
      string_value: "container"
    }
  }
  fields {
    key: "beam:option:sdk_worker_parallelism:v1"
    value {
      string_value: "1"
    }
  }
  fields {
    key: "beam:option:shutdown_sources_on_final_watermark:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:spark_master_url:v1"
    value {
      string_value: "local[4]"
    }
  }
  fields {
    key: "beam:option:spark_submit_uber_jar:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:streaming:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:type_check_strictness:v1"
    value {
      string_value: "DEFAULT_TO_ANY"
    }
  }
  fields {
    key: "beam:option:update:v1"
    value {
      bool_value: false
    }
  }
}
job_name: "job"

apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: e87fbc02178af84fed0d295fca910f84)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:301)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:209)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:186)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:116)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:84)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:81)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
	... 16 more
Caused by: java.util.concurrent.ExecutionException: org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: CANCELLED: cancelled before receiving half close
	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
	at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.$closeResource(FlinkExecutableStageFunction.java:204)
	at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.mapPartition(FlinkExecutableStageFunction.java:204)
	at org.apache.flink.runtime.operators.MapPartitionDriver.run(MapPartitionDriver.java:103)
	at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:504)
	at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:369)
	at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
	... 1 more
Caused by: org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: CANCELLED: cancelled before receiving half close
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.Status.asRuntimeException(Status.java:524)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onCancel(ServerCalls.java:273)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.PartialForwardingServerCallListener.onCancel(PartialForwardingServerCallListener.java:40)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.ForwardingServerCallListener.onCancel(ForwardingServerCallListener.java:23)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.ForwardingServerCallListener$SimpleForwardingServerCallListener.onCancel(ForwardingServerCallListener.java:40)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.Contexts$ContextualizedServerCallListener.onCancel(Contexts.java:96)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.closed(ServerCallImpl.java:337)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1Closed.runInContext(ServerImpl.java:793)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	... 1 more

root: ERROR: org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: CANCELLED: cancelled before receiving half close
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 60.205s

FAILED (errors=1)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 53

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 9s
5 actionable tasks: 3 executed, 2 up-to-date

Publishing build scan...
https://gradle.com/s/udykecg74ihnq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #210

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/210/display/redirect?page=changes>

Changes:

[kirillkozlov] Add a new Jenkins job for SQL perf tests

[kirillkozlov] Test boilerplate

[kirillkozlov] Table proxy to add TimeMonitor after the IO

[kirillkozlov] Tests for direct_read w/o push-down and default methods

[kirillkozlov] Cleanup

[kirillkozlov] Monitor total number of fields read from an IO

[echauchot] Fix link in javadoc to accumulators

[iemejia] [BEAM-8716] Update commons-csv to version 1.7

[ehudm] Small fixes to verify_release_build.sh

[kirillkozlov] Metric name should not be constant

[33895511+aromanenko-dev] [BEAM-8953] Extend ParquetIO read builders for AvroParquetReader

[brad.g.west] [BEAM-9078] Pass total_size to storage.Upload

[hannahjiang] BEAM-7861 add direct_running_mode option

[github] [BEAM-9075] Disable JoinCommuteRule for ZetaSQL planner (#10542)

[bhulette] [BEAM-9075] add a test case. (#10545)

[12602502+Ardagan] [BEAM-8821] Document Python SDK 2.17.0 deps (#10212)

[kirillkozlov] Missing commit

[hannahjiang] [BEAM-7861] rephrase direct_running_mode option checking


------------------------------------------
[...truncated 267.52 KB...]
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:job_endpoint:v1"
    value {
      string_value: "localhost:8099"
    }
  }
  fields {
    key: "beam:option:job_name:v1"
    value {
      string_value: "load_tests_Python_Flink_Batch_GBK_3_0110100254"
    }
  }
  fields {
    key: "beam:option:job_port:v1"
    value {
      string_value: "0"
    }
  }
  fields {
    key: "beam:option:job_server_timeout:v1"
    value {
      string_value: "60"
    }
  }
  fields {
    key: "beam:option:load_balance_bundles:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:no_auth:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:object_reuse:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:parallelism:v1"
    value {
      string_value: "5"
    }
  }
  fields {
    key: "beam:option:pipeline_type_check:v1"
    value {
      bool_value: true
    }
  }
  fields {
    key: "beam:option:profile_cpu:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_memory:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_sample_rate:v1"
    value {
      number_value: 1.0
    }
  }
  fields {
    key: "beam:option:project:v1"
    value {
      string_value: "apache-beam-testing"
    }
  }
  fields {
    key: "beam:option:retain_docker_containers:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:runtime_type_check:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:save_main_session:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:sdk_location:v1"
    value {
      string_value: "container"
    }
  }
  fields {
    key: "beam:option:sdk_worker_parallelism:v1"
    value {
      string_value: "1"
    }
  }
  fields {
    key: "beam:option:shutdown_sources_on_final_watermark:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:spark_master_url:v1"
    value {
      string_value: "local[4]"
    }
  }
  fields {
    key: "beam:option:spark_submit_uber_jar:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:streaming:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:type_check_strictness:v1"
    value {
      string_value: "DEFAULT_TO_ANY"
    }
  }
  fields {
    key: "beam:option:update:v1"
    value {
      bool_value: false
    }
  }
}
job_name: "job"

apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: eb682aacc82e6c91406095af4da43413)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:301)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:209)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:186)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:116)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:84)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:81)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
	... 16 more
Caused by: java.util.concurrent.TimeoutException: The heartbeat of TaskManager with id container_e01_1578661845906_0001_01_000005  timed out.
	at org.apache.flink.runtime.resourcemanager.ResourceManager$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(ResourceManager.java:1146)
	at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:109)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:397)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:190)
	at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:74)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:152)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
	at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
	at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:170)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at akka.actor.Actor$class.aroundReceive(Actor.scala:517)
	at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
	at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
	at akka.actor.ActorCell.invoke(ActorCell.scala:561)
	at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
	at akka.dispatch.Mailbox.run(Mailbox.scala:225)
	at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
	at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

root: ERROR: java.util.concurrent.TimeoutException: The heartbeat of TaskManager with id container_e01_1578661845906_0001_01_000005  timed out.
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 84.240s

FAILED (errors=1)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 53

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 28s
3 actionable tasks: 2 executed, 1 up-to-date

Publishing build scan...
https://gradle.com/s/ftdokuemn2yqm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
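
For reference, a minimal sketch of how a failure like the TaskManager heartbeat
timeout above surfaces on the Python side (assumptions noted in the comments;
the pipeline body is illustrative, not the load-test code itself):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Assumptions: a portable Flink job server on localhost:8099, as in the
    # captured load-test options, and a LOOPBACK SDK environment purely to keep
    # the sketch local; the load tests run containerized SDK workers instead.
    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:8099',
        '--environment_type=LOOPBACK',
    ])

    pipeline = beam.Pipeline(options=options)
    _ = pipeline | beam.Create([1, 2, 3]) | beam.Map(lambda x: x * x)

    result = pipeline.run()
    try:
        # Blocks until a terminal state; raises once the job service reports
        # FAILED, carrying the runner-side error (here, the heartbeat timeout).
        result.wait_until_finish()
    except Exception as exc:
        print('pipeline failed: %s' % exc)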


Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #209

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/209/display/redirect?page=changes>

Changes:

[mzobii.baig] Beam-2535 : Pursue pull request 4700 with manual apply changes from

[mzobii.baig] Beam-2535 : Pursue pull request 4700 with manual apply changes from

[mzobii.baig] Beam-2535 : Pursue pull request 4700 with manual apply changes from

[mzobii.baig] Beam-2535 : Replace timeStamp with outputTimeStamp

[mzobii.baig] Beam-2535 : Apply Spotless

[mzobii.baig] Beam-2535 : Pass outputTimestamp param in onTimer method

[mzobii.baig] Beam-2535 : Minor changed

[rehman.muradali] [BEAM-2535] : Add Commit State in ParDoEvaluator

[rehman.muradali] [BEAM-2535] : Add outputTimestamp in compare method, Revert

[mzobii.baig] Beam-2535 : Modifying default minimum target and GC time

[rehman.muradali] BEAM-2535 : Removal of extra lines

[mzobii.baig] Beam-2535 : Proposed changes

[mzobii.baig] Beam-2535 : Added original PR watermark hold functionality.

[rehman.muradali] [BEAM-2535] Apply Spotless

[mzobii.baig] [Beam-2535] Variable renaming and added output timestamp in

[mzobii.baig] Beam-2535 : Apply Spotless

[mzobii.baig] [Beam-2535] Modify test case

[mzobii.baig] [Beam-2535] Added comments

[mzobii.baig] [Beam-2535] Apply Spotless

[mzobii.baig] [Beam-2535] Set Processing Time with outputTimestamp

[mzobii.baig] [Beam-2535] Minor renaming

[rehman.muradali] [BEAM-2535] Revert Processing Time, Addition of OutputTimestamp

[rehman.muradali] [BEAM-2535] Revert TimerReceiver outputTimestamp

[rehman.muradali] [BEAM-2535] Revert TimerReceiver outputTimestamp

[rehman.muradali] [BEAM-2535] Making OnTimer compatible

[rehman.muradali] [BEAM-2535] Making OnTimer compatible

[rehman.muradali] Adding OutputTimestamp in Timer Object

[rehman.muradali] Apply Spotless and checkstyle

[mzobii.baig] [Beam-2535] Added watermark functionality for the dataflow runner

[mzobii.baig] [Beam-2535] Used boolean instead boxed type

[mzobii.baig] [Beam-2535] Modify required watermark hold functionality

[rehman.muradali] EarliestTimestamp Fix for outputTimestamp

[sunjincheng121] [BEAM-9030] Bump grpc to 1.26.0

[sunjincheng121] [BEAM-9030] Update the dependencies to make sure the dependency linkage

[rehman.muradali] Rebase TimerData PR

[sunjincheng121] fixup

[mxm] Rename FlinkClassloading to Workarounds

[mxm] [BEAM-9060] Restore stdout/stderr in case Flink's

[iemejia] [BEAM-8717] Update commons-lang3 to version 3.9

[iemejia] [BEAM-8717] Make non core modules use only the repackaged commons-lang3

[sunjincheng121] fixup

[github] Update ParDoTest.java

[rehman.muradali] Apply spotless

[rehman.muradali] Compilation Fix PardoTest

[rehman.muradali] Reverting outputTimestamp validation

[rehman.muradali] CheckStyle Fix

[rehman.muradali] Adding Category to exclude Flink Runner

[jkai] [BEAM-8496] remove SDF translators in flink streaming transform

[github] Fix blogpost typo (#10532)

[12602502+Ardagan] Fix headings in downloads.md

[github] Add # pytype: skip-file before first import statement in each py file


------------------------------------------
[...truncated 267.38 KB...]
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:job_endpoint:v1"
    value {
      string_value: "localhost:8099"
    }
  }
  fields {
    key: "beam:option:job_name:v1"
    value {
      string_value: "load_tests_Python_Flink_Batch_GBK_3_0109110031"
    }
  }
  fields {
    key: "beam:option:job_port:v1"
    value {
      string_value: "0"
    }
  }
  fields {
    key: "beam:option:job_server_timeout:v1"
    value {
      string_value: "60"
    }
  }
  fields {
    key: "beam:option:load_balance_bundles:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:no_auth:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:object_reuse:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:parallelism:v1"
    value {
      string_value: "5"
    }
  }
  fields {
    key: "beam:option:pipeline_type_check:v1"
    value {
      bool_value: true
    }
  }
  fields {
    key: "beam:option:profile_cpu:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_memory:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_sample_rate:v1"
    value {
      number_value: 1.0
    }
  }
  fields {
    key: "beam:option:project:v1"
    value {
      string_value: "apache-beam-testing"
    }
  }
  fields {
    key: "beam:option:retain_docker_containers:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:runtime_type_check:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:save_main_session:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:sdk_location:v1"
    value {
      string_value: "container"
    }
  }
  fields {
    key: "beam:option:sdk_worker_parallelism:v1"
    value {
      string_value: "1"
    }
  }
  fields {
    key: "beam:option:shutdown_sources_on_final_watermark:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:spark_master_url:v1"
    value {
      string_value: "local[4]"
    }
  }
  fields {
    key: "beam:option:spark_submit_uber_jar:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:streaming:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:type_check_strictness:v1"
    value {
      string_value: "DEFAULT_TO_ANY"
    }
  }
  fields {
    key: "beam:option:update:v1"
    value {
      bool_value: false
    }
  }
}
job_name: "job"

apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: 408aa49e71690b992a1eef9064037a41)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:301)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:209)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:186)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:116)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:84)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:81)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
	... 16 more
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_e01_1578575615166_0001_01_000002 timed out.
	at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1149)
	at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:109)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:397)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:190)
	at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:74)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:152)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
	at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
	at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:170)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at akka.actor.Actor$class.aroundReceive(Actor.scala:517)
	at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
	at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
	at akka.actor.ActorCell.invoke(ActorCell.scala:561)
	at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
	at akka.dispatch.Mailbox.run(Mailbox.scala:225)
	at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
	at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

root: ERROR: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_e01_1578575615166_0001_01_000002 timed out.
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 144.297s

FAILED (errors=1)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 53

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2m 29s
3 actionable tasks: 2 executed, 1 up-to-date

Publishing build scan...
https://gradle.com/s/ygzmbsh4cmaik

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
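
The beam:option fields dumped in the captured logging above are just the
structured form of ordinary pipeline options. A sketch of the corresponding
flags on the Python side, with values taken from this build's dump and
everything else left at its defaults:

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=PortableRunner',        # implied by the portable_runner log lines
        '--job_endpoint=localhost:8099',  # beam:option:job_endpoint:v1
        '--job_name=load_tests_Python_Flink_Batch_GBK_3_0109110031',
        '--parallelism=5',                # beam:option:parallelism:v1
        '--sdk_worker_parallelism=1',     # beam:option:sdk_worker_parallelism:v1
        '--job_server_timeout=60',        # beam:option:job_server_timeout:v1
        '--project=apache-beam-testing',  # beam:option:project:v1
    ])
    # PipelineOptions parses the flags it knows and carries the rest along, so
    # this constructs cleanly even if a given options class is not registered.
    print(options.get_all_options(drop_default=True))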


Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #208

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/208/display/redirect?page=changes>

Changes:

[rehman.muradali] onTimer/setTimer signature updates

[lcwik] [BEAM-9059] Migrate PTransformTranslation to use string constants

[iemejia] [BEAM-8701] Remove unused commons-io_1x dependency

[iemejia] [BEAM-8701] Update commons-io to version 2.6

[github] Restrict the upper bound for pyhamcrest, since new version does not work

[apilloud] [BEAM-9027] [SQL] Fix ZetaSQL Byte Literals

[github] [BEAM-9058] Fix line-too-long exclusion regex and re-enable

[altay] Readability/Lint fixes

[hannahjiang] BEAM-8780 reuse RC images instead of recreate images

[iemejia] [BEAM-9041] Add missing equals methods for GenericRecord <-> Row

[iemejia] [BEAM-9042] Fix RowToGenericRecordFn Avro schema serialization

[iemejia] [BEAM-9042] Update SchemaCoder doc with info about functions requiring

[iemejia] [BEAM-9042] Test serializability and equality of Row<->GenericRecord

[tvalentyn] [BEAM-9062] Improve assertion error for equal_to (#10504)

[chamikara] [BEAM-8960]: Add an option for user to opt out of using insert id for

[36090911+boyuanzz] [BEAM-8932] [BEAM-9036] Revert reverted commit to use PubsubMessage as


------------------------------------------
[...truncated 267.31 KB...]
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:job_endpoint:v1"
    value {
      string_value: "localhost:8099"
    }
  }
  fields {
    key: "beam:option:job_name:v1"
    value {
      string_value: "load_tests_Python_Flink_Batch_GBK_3_0108100256"
    }
  }
  fields {
    key: "beam:option:job_port:v1"
    value {
      string_value: "0"
    }
  }
  fields {
    key: "beam:option:job_server_timeout:v1"
    value {
      string_value: "60"
    }
  }
  fields {
    key: "beam:option:load_balance_bundles:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:no_auth:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:object_reuse:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:parallelism:v1"
    value {
      string_value: "5"
    }
  }
  fields {
    key: "beam:option:pipeline_type_check:v1"
    value {
      bool_value: true
    }
  }
  fields {
    key: "beam:option:profile_cpu:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_memory:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_sample_rate:v1"
    value {
      number_value: 1.0
    }
  }
  fields {
    key: "beam:option:project:v1"
    value {
      string_value: "apache-beam-testing"
    }
  }
  fields {
    key: "beam:option:retain_docker_containers:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:runtime_type_check:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:save_main_session:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:sdk_location:v1"
    value {
      string_value: "container"
    }
  }
  fields {
    key: "beam:option:sdk_worker_parallelism:v1"
    value {
      string_value: "1"
    }
  }
  fields {
    key: "beam:option:shutdown_sources_on_final_watermark:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:spark_master_url:v1"
    value {
      string_value: "local[4]"
    }
  }
  fields {
    key: "beam:option:spark_submit_uber_jar:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:streaming:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:type_check_strictness:v1"
    value {
      string_value: "DEFAULT_TO_ANY"
    }
  }
  fields {
    key: "beam:option:update:v1"
    value {
      bool_value: false
    }
  }
}
job_name: "job"

apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: ff25a7ac4c1a04c66f5890656f88df2a)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:301)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:209)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:186)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:116)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:84)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:81)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
	... 16 more
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_e01_1578489733294_0001_01_000005 timed out.
	at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1149)
	at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:109)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:397)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:190)
	at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:74)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:152)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
	at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
	at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:170)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at akka.actor.Actor$class.aroundReceive(Actor.scala:517)
	at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
	at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
	at akka.actor.ActorCell.invoke(ActorCell.scala:561)
	at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
	at akka.dispatch.Mailbox.run(Mailbox.scala:225)
	at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
	at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

root: ERROR: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_e01_1578489733294_0001_01_000005 timed out.
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 64.025s

FAILED (errors=1)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 53

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 8s
3 actionable tasks: 2 executed, 1 up-to-date

Publishing build scan...
https://gradle.com/s/2x4thv7su43vk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #207

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/207/display/redirect?page=changes>

Changes:

[kirillkozlov] Modify AggregateProjectMergeRule to have a condition

[kirillkozlov] SpotlesApply

[kirillkozlov] Test for a query with a predicate

[kirillkozlov] A list of visited nodes should be unique per onMatch invocation

[kirillkozlov] Make sure all nodes are explored

[mikhail] Update release docs

[mikhail] Blogpost stub

[mikhail] Add blogpost file

[mikhail] Add blogpost highlights

[github] Update release notes version to correct one.

[kirillkozlov] Fix BytesValue unparsing

[kirillkozlov] Fix floating point literals

[kirillkozlov] Fix string literals

[kirillkozlov] Add null check for SqlTypeFamily

[kirillkozlov] ZetaSqlCalcRule should be disaled by defualt

[kirillkozlov] spotles

[suztomo] protobuf 3.11.1

[kirillkozlov] Address comments

[sunjincheng121] [BEAM-9055] Unify the config names of Fn Data API across languages.

[davidsabater] [BEAM-9053] Improve error message when unable to get the correct

[mxm] [BEAM-8577] Initialize FileSystems during Coder deserialization in

[github] Update _posts_2019-12-16-beam-2.17.0.md

[github] Cleanup formatting.

[github] Update release date.

[iemejia] [BEAM-5546] Update commons-codec to version 1.14

[iemejia] [BEAM-5544] Update cassandra-all dependency to version 3.11.5

[iemejia] [BEAM-8749] Update cassandra-driver-mapping to version 3.8.0


------------------------------------------
[...truncated 267.52 KB...]
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:job_endpoint:v1"
    value {
      string_value: "localhost:8099"
    }
  }
  fields {
    key: "beam:option:job_name:v1"
    value {
      string_value: "load_tests_Python_Flink_Batch_GBK_3_0107100255"
    }
  }
  fields {
    key: "beam:option:job_port:v1"
    value {
      string_value: "0"
    }
  }
  fields {
    key: "beam:option:job_server_timeout:v1"
    value {
      string_value: "60"
    }
  }
  fields {
    key: "beam:option:load_balance_bundles:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:no_auth:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:object_reuse:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:parallelism:v1"
    value {
      string_value: "5"
    }
  }
  fields {
    key: "beam:option:pipeline_type_check:v1"
    value {
      bool_value: true
    }
  }
  fields {
    key: "beam:option:profile_cpu:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_memory:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_sample_rate:v1"
    value {
      number_value: 1.0
    }
  }
  fields {
    key: "beam:option:project:v1"
    value {
      string_value: "apache-beam-testing"
    }
  }
  fields {
    key: "beam:option:retain_docker_containers:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:runtime_type_check:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:save_main_session:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:sdk_location:v1"
    value {
      string_value: "container"
    }
  }
  fields {
    key: "beam:option:sdk_worker_parallelism:v1"
    value {
      string_value: "1"
    }
  }
  fields {
    key: "beam:option:shutdown_sources_on_final_watermark:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:spark_master_url:v1"
    value {
      string_value: "local[4]"
    }
  }
  fields {
    key: "beam:option:spark_submit_uber_jar:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:streaming:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:type_check_strictness:v1"
    value {
      string_value: "DEFAULT_TO_ANY"
    }
  }
  fields {
    key: "beam:option:update:v1"
    value {
      bool_value: false
    }
  }
}
job_name: "job"

apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: 50d78e6705725a188ea98d9dc9b4388a)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:301)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:209)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:186)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:116)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:84)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:81)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
	... 16 more
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_e01_1578402941351_0001_01_000005 timed out.
	at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1149)
	at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:109)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:397)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:190)
	at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:74)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:152)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
	at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
	at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:170)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at akka.actor.Actor$class.aroundReceive(Actor.scala:517)
	at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
	at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
	at akka.actor.ActorCell.invoke(ActorCell.scala:561)
	at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
	at akka.dispatch.Mailbox.run(Mailbox.scala:225)
	at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
	at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

root: ERROR: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_e01_1578402941351_0001_01_000005 timed out.
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 86.203s

FAILED (errors=1)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 53

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 31s
3 actionable tasks: 2 executed, 1 up-to-date

Publishing build scan...
https://gradle.com/s/y3e66wy4d5s7a

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #206

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/206/display/redirect>

Changes:


------------------------------------------
[...truncated 125.35 KB...]
> Task :sdks:python:apache_beam:testing:load_tests:setupVirtualenv UP-TO-DATE

> Task :sdks:python:apache_beam:testing:load_tests:installGcpTest
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Obtaining file://<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python>
Requirement already satisfied: crcmod<2.0,>=1.7 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.7)
Requirement already satisfied: dill<0.3.2,>=0.3.1.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.3.1.1)
Requirement already satisfied: fastavro<0.22,>=0.21.4 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.21.24)
Requirement already satisfied: future<1.0.0,>=0.16.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.16.0)
Requirement already satisfied: grpcio<2,>=1.12.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.26.0)
Requirement already satisfied: hdfs<3.0.0,>=2.1.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (2.5.8)
Requirement already satisfied: httplib2<=0.12.0,>=0.8 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.12.0)
Requirement already satisfied: mock<3.0.0,>=1.0.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (2.0.0)
Requirement already satisfied: numpy<2,>=1.14.3 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.16.6)
Requirement already satisfied: pymongo<4.0.0,>=3.8.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (3.10.0)
Requirement already satisfied: oauth2client<4,>=2.0.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (3.0.0)
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (3.11.2)
Requirement already satisfied: pydot<2,>=1.2.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.4.1)
Requirement already satisfied: python-dateutil<3,>=2.8.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (2.8.1)
Requirement already satisfied: pytz>=2018.3 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (2019.3)
Requirement already satisfied: avro<2.0.0,>=1.8.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.9.1)
Requirement already satisfied: funcsigs<2,>=1.0.2 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.0.2)
Requirement already satisfied: futures<4.0.0,>=3.2.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (3.3.0)
Requirement already satisfied: pyvcf<0.7.0,>=0.6.8 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.6.8)
Requirement already satisfied: pyarrow<0.16.0,>=0.15.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.15.1)
Requirement already satisfied: typing<3.8.0,>=3.7.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (3.7.4.1)
Requirement already satisfied: typing-extensions<3.8.0,>=3.7.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (3.7.4.1)
Requirement already satisfied: cachetools<4,>=3.1.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (3.1.1)
Requirement already satisfied: google-apitools<0.5.29,>=0.5.28 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.5.28)
Requirement already satisfied: google-cloud-datastore<1.8.0,>=1.7.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.7.4)
Requirement already satisfied: google-cloud-pubsub<1.1.0,>=0.39.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.0.2)
Requirement already satisfied: google-cloud-bigquery<1.18.0,>=1.6.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.17.1)
Requirement already satisfied: google-cloud-core<2,>=0.28.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.1.0)
Requirement already satisfied: google-cloud-bigtable<1.1.0,>=0.31.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.0.0)
Requirement already satisfied: googledatastore<7.1,>=7.0.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (7.0.2)
Requirement already satisfied: proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.90.4)
Requirement already satisfied: freezegun>=0.3.12 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.3.12)
Requirement already satisfied: nose>=1.3.7 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.3.7)
Requirement already satisfied: nose_xunitmp>=0.4.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.4.1)
Requirement already satisfied: pandas<0.25,>=0.23.4 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.24.2)
Requirement already satisfied: parameterized<0.8.0,>=0.6.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.7.1)
Requirement already satisfied: pyhamcrest<2.0,>=1.9 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.9.0)
Requirement already satisfied: pyyaml<6.0.0,>=3.12 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (5.2)
Requirement already satisfied: requests_mock<2.0,>=1.7 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.7.0)
Requirement already satisfied: tenacity<6.0,>=5.0.2 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (5.1.5)
Requirement already satisfied: pytest<5.0,>=4.4.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (4.6.9)
Requirement already satisfied: pytest-xdist<2,>=1.29.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.31.0)
Requirement already satisfied: six>=1.5.2 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from grpcio<2,>=1.12.1->apache-beam==2.19.0.dev0) (1.13.0)
Requirement already satisfied: enum34>=1.0.4; python_version < "3.4" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from grpcio<2,>=1.12.1->apache-beam==2.19.0.dev0) (1.1.6)
Requirement already satisfied: docopt in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from hdfs<3.0.0,>=2.1.0->apache-beam==2.19.0.dev0) (0.6.2)
Requirement already satisfied: requests>=2.7.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from hdfs<3.0.0,>=2.1.0->apache-beam==2.19.0.dev0) (2.22.0)
Requirement already satisfied: pbr>=0.11 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from mock<3.0.0,>=1.0.1->apache-beam==2.19.0.dev0) (5.4.4)
Requirement already satisfied: pyasn1>=0.1.7 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from oauth2client<4,>=2.0.1->apache-beam==2.19.0.dev0) (0.4.8)
Requirement already satisfied: pyasn1-modules>=0.0.5 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from oauth2client<4,>=2.0.1->apache-beam==2.19.0.dev0) (0.2.7)
Requirement already satisfied: rsa>=3.1.4 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from oauth2client<4,>=2.0.1->apache-beam==2.19.0.dev0) (4.0)
Requirement already satisfied: setuptools in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from protobuf<4,>=3.5.0.post1->apache-beam==2.19.0.dev0) (44.0.0)
Requirement already satisfied: pyparsing>=2.1.4 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pydot<2,>=1.2.0->apache-beam==2.19.0.dev0) (2.4.6)
Requirement already satisfied: fasteners>=0.14 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from google-apitools<0.5.29,>=0.5.28->apache-beam==2.19.0.dev0) (0.15)
Requirement already satisfied: google-api-core[grpc]<2.0.0dev,>=1.6.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from google-cloud-datastore<1.8.0,>=1.7.1->apache-beam==2.19.0.dev0) (1.15.0)
Requirement already satisfied: grpc-google-iam-v1<0.13dev,>=0.12.3 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from google-cloud-pubsub<1.1.0,>=0.39.0->apache-beam==2.19.0.dev0) (0.12.3)
Requirement already satisfied: google-resumable-media<0.5.0dev,>=0.3.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from google-cloud-bigquery<1.18.0,>=1.6.0->apache-beam==2.19.0.dev0) (0.4.1)
Requirement already satisfied: googleapis-common-protos<2.0dev,>=1.5.2 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0->apache-beam==2.19.0.dev0) (1.6.0)
Requirement already satisfied: monotonic>=0.6; python_version == "2.7" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from tenacity<6.0,>=5.0.2->apache-beam==2.19.0.dev0) (1.5)
Requirement already satisfied: atomicwrites>=1.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (1.3.0)
Requirement already satisfied: packaging in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (20.0)
Requirement already satisfied: wcwidth in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (0.1.8)
Requirement already satisfied: importlib-metadata>=0.12; python_version < "3.8" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (1.3.0)
Requirement already satisfied: py>=1.5.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (1.8.1)
Requirement already satisfied: pathlib2>=2.2.0; python_version < "3.6" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (2.3.5)
Requirement already satisfied: pluggy<1.0,>=0.12 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (0.13.1)
Requirement already satisfied: attrs>=17.4.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (19.3.0)
Requirement already satisfied: more-itertools<6.0.0,>=4.0.0; python_version <= "2.7" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (5.0.0)
Requirement already satisfied: pytest-forked in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest-xdist<2,>=1.29.0->apache-beam==2.19.0.dev0) (1.1.3)
Requirement already satisfied: execnet>=1.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest-xdist<2,>=1.29.0->apache-beam==2.19.0.dev0) (1.7.1)
Requirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.19.0.dev0) (1.25.7)
Requirement already satisfied: certifi>=2017.4.17 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.19.0.dev0) (2019.11.28)
Requirement already satisfied: chardet<3.1.0,>=3.0.2 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.19.0.dev0) (3.0.4)
Requirement already satisfied: idna<2.9,>=2.5 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.19.0.dev0) (2.8)
Requirement already satisfied: google-auth<2.0dev,>=0.4.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from google-api-core[grpc]<2.0.0dev,>=1.6.0->google-cloud-datastore<1.8.0,>=1.7.1->apache-beam==2.19.0.dev0) (1.10.0)
Requirement already satisfied: contextlib2; python_version < "3" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (0.6.0.post1)
Requirement already satisfied: zipp>=0.5 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (0.6.0)
Requirement already satisfied: configparser>=3.5; python_version < "3" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (4.0.2)
Requirement already satisfied: scandir; python_version < "3.5" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pathlib2>=2.2.0; python_version < "3.6"->pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (1.10.0)
Requirement already satisfied: apipkg>=1.4 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from execnet>=1.1->pytest-xdist<2,>=1.29.0->apache-beam==2.19.0.dev0) (1.5)
Installing collected packages: apache-beam
  Found existing installation: apache-beam 2.19.0.dev0
    Not uninstalling apache-beam at <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python,> outside environment <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227>
    Can't uninstall 'apache-beam'. No files were found to uninstall.
  Running setup.py develop for apache-beam
Successfully installed apache-beam
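
As a quick sanity check (a sketch, not part of the build), the editable install
above should expose the in-tree SDK as the dev version pip reported:

    import apache_beam
    print(apache_beam.__version__)  # expected: 2.19.0.dev0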

> Task :sdks:python:apache_beam:testing:load_tests:run
setup.py:237: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/local/lib/python2.7/site-packages/setuptools/dist.py>:476: UserWarning: Normalizing '2.19.0.dev' to '2.19.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
testGroupByKey (apache_beam.testing.load_tests.group_by_key_test.GroupByKeyTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 223.129s

OK

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD SUCCESSFUL in 3m 51s
3 actionable tasks: 2 executed, 1 up-to-date

Publishing build scan...
https://gradle.com/s/k5fhw2c4uvdjy

[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins708083698236655376.sh
+ echo Changing number of workers to 5
Changing number of workers to 5
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content 
FLINK_NUM_WORKERS=5

[EnvInject] - Variables injected successfully.
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins5932296109692179242.sh
+ cd <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/.test-infra/dataproc>
+ ./flink_cluster.sh restart
+ GCLOUD_ZONE=us-central1-a
+ DATAPROC_VERSION=1.2
+ MASTER_NAME=beam-loadtests-python-gbk-flink-batch-206-m
+ INIT_ACTIONS_FOLDER_NAME=init-actions
+ FLINK_INIT=gs://beam-flink-cluster/init-actions/flink.sh
+ BEAM_INIT=gs://beam-flink-cluster/init-actions/beam.sh
+ DOCKER_INIT=gs://beam-flink-cluster/init-actions/docker.sh
+ FLINK_LOCAL_PORT=8081
+ FLINK_TASKMANAGER_SLOTS=1
+ TASK_MANAGER_MEM=10240
+ YARN_APPLICATION_MASTER=
+ restart
+ delete
+ gcloud dataproc clusters delete beam-loadtests-python-gbk-flink-batch-206 --region=global --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/9340d202-fa26-3b6c-a92a-e3c5e834c01b].
Waiting for cluster deletion operation...
............................................................done.
Deleted [https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/global/clusters/beam-loadtests-python-gbk-flink-batch-206].
+ create
+ upload_init_actions
+ echo 'Uploading initialization actions to GCS bucket: gs://beam-flink-cluster'
Uploading initialization actions to GCS bucket: gs://beam-flink-cluster
+ gsutil cp -r init-actions/beam.sh init-actions/docker.sh init-actions/flink.sh gs://beam-flink-cluster/init-actions
Copying file://init-actions/beam.sh [Content-Type=text/x-sh]...
Copying file://init-actions/docker.sh [Content-Type=text/x-sh]...
Copying file://init-actions/flink.sh [Content-Type=text/x-sh]...
Operation completed over 3 objects/13.4 KiB.                                     
+ create_cluster
+ local metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1,
+ metadata+=hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar
+ [[ -n gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_workers=6
+ gcloud dataproc clusters create beam-loadtests-python-gbk-flink-batch-206 --region=global --num-workers=6 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/8c8a6bce-037b-339f-8220-3dbc30a56615].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
..............................................................................................................................................................................................................................................................................................................................................................................................done.
Created [https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/global/clusters/beam-loadtests-python-gbk-flink-batch-206] Cluster placed in zone [us-central1-a].
+ get_leader
+ local i=0
+ local application_ids
+ local application_masters
+ echo 'Yarn Applications'
Yarn Applications
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-gbk-flink-batch-206-m '--command=yarn application -list'
++ grep beam-loadtests-python-gbk-flink-batch-206
Warning: Permanently added 'compute.7829732761521786103' (ECDSA) to the list of known hosts.
20/01/06 13:18:16 INFO client.RMProxy: Connecting to ResourceManager at beam-loadtests-python-gbk-flink-batch-206-m/10.128.0.53:8032
+ read line
+ echo

++ echo
++ sed 's/ .*//'
+ application_ids[$i]=
++ echo
++ sed 's/.*beam-loadtests-python-gbk-flink-batch-206/beam-loadtests-python-gbk-flink-batch-206/'
++ sed 's/ .*//'
+ application_masters[$i]=
+ i=1
+ read line
+ '[' 1 '!=' 1 ']'
+ YARN_APPLICATION_MASTER=
+ echo 'Using Yarn Application master: '
Using Yarn Application master: 
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
+ start_job_server
+ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-gbk-flink-batch-206-m '--command=sudo --user yarn docker run --detach --publish 8099:8099 --publish 8098:8098 --publish 8097:8097 --volume ~/.config/gcloud:/root/.config/gcloud gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest --flink-master= --artifacts-dir=gs://beam-flink-cluster/beam-loadtests-python-gbk-flink-batch-206'
a6233943850984c35c4de99a96dd0a84ee915e25302a6ac70a8a4846fa7aad8e
+ start_tunnel
++ gcloud compute ssh --quiet --zone=us-central1-a yarn@beam-loadtests-python-gbk-flink-batch-206-m '--command=curl -s "http:///jobmanager/config"'
+ local job_server_config=
+ local key=jobmanager.rpc.port
++ echo
++ cut -d : -f1
+ local yarn_application_master_host=
++ echo
++ python -c 'import sys, json; print [ e['\''value'\''] for e in json.load(sys.stdin) if e['\''key'\''] == u'\''jobmanager.rpc.port'\''][0]'
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/usr/lib/python2.7/json/__init__.py", line 291, in load
    **kw)
  File "/usr/lib/python2.7/json/__init__.py", line 339, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python2.7/json/decoder.py", line 364, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python2.7/json/decoder.py", line 382, in raw_decode
    raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded
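
The ValueError above is a downstream symptom, not the root problem: no entry in the YARN application listing matched the cluster name, so YARN_APPLICATION_MASTER stayed empty, the curl was issued against "http:///jobmanager/config", and an empty response was piped into json.load. A minimal sketch of a more defensive version of that inline lookup (Python 2.7, matching the cluster environment, standing in for the one-liner) could look like this:

    import json
    import sys

    # Read whatever curl produced; in the failing run above this is empty.
    raw = sys.stdin.read().strip()
    if not raw:
        sys.stderr.write('empty jobmanager config; was the YARN application master resolved?\n')
        sys.exit(1)

    # /jobmanager/config returns a JSON list of {"key": ..., "value": ...} entries.
    entries = json.loads(raw)
    ports = [e['value'] for e in entries if e['key'] == 'jobmanager.rpc.port']
    print(ports[0] if ports else '')

With a guard like this the script would stop here with a clear message instead of continuing to assemble the ssh tunnel command with empty host and port placeholders, as happens a few lines below.
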
+ local jobmanager_rpc_port=
++ [[ true == \t\r\u\e ]]
++ echo ' -Nf >& /dev/null'
+ local 'detached_mode_params= -Nf >& /dev/null'
++ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
++ echo '-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'job_server_ports_forwarding=-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'tunnel_command=gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-gbk-flink-batch-206-m -- -L 8081: -L :: -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080  -Nf >& /dev/null'
+ eval gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-gbk-flink-batch-206-m -- -L 8081: -L :: -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf '>&' /dev/null
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-gbk-flink-batch-206-m -- -L 8081: -L :: -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #205

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/205/display/redirect>

Changes:


------------------------------------------
[...truncated 267.29 KB...]
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:job_endpoint:v1"
    value {
      string_value: "localhost:8099"
    }
  }
  fields {
    key: "beam:option:job_name:v1"
    value {
      string_value: "load_tests_Python_Flink_Batch_GBK_3_0105100241"
    }
  }
  fields {
    key: "beam:option:job_port:v1"
    value {
      string_value: "0"
    }
  }
  fields {
    key: "beam:option:job_server_timeout:v1"
    value {
      string_value: "60"
    }
  }
  fields {
    key: "beam:option:load_balance_bundles:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:no_auth:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:object_reuse:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:parallelism:v1"
    value {
      string_value: "5"
    }
  }
  fields {
    key: "beam:option:pipeline_type_check:v1"
    value {
      bool_value: true
    }
  }
  fields {
    key: "beam:option:profile_cpu:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_memory:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_sample_rate:v1"
    value {
      number_value: 1.0
    }
  }
  fields {
    key: "beam:option:project:v1"
    value {
      string_value: "apache-beam-testing"
    }
  }
  fields {
    key: "beam:option:retain_docker_containers:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:runtime_type_check:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:save_main_session:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:sdk_location:v1"
    value {
      string_value: "container"
    }
  }
  fields {
    key: "beam:option:sdk_worker_parallelism:v1"
    value {
      string_value: "1"
    }
  }
  fields {
    key: "beam:option:shutdown_sources_on_final_watermark:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:spark_master_url:v1"
    value {
      string_value: "local[4]"
    }
  }
  fields {
    key: "beam:option:spark_submit_uber_jar:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:streaming:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:type_check_strictness:v1"
    value {
      string_value: "DEFAULT_TO_ANY"
    }
  }
  fields {
    key: "beam:option:update:v1"
    value {
      bool_value: false
    }
  }
}
job_name: "job"

apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: 0644535f1d97a82e811d1fd97d7cb1b2)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:301)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:209)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:186)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:116)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:84)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:81)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
	... 16 more
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_e01_1578229796619_0001_01_000003 timed out.
	at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1149)
	at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:109)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:397)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:190)
	at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:74)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:152)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
	at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
	at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:170)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at akka.actor.Actor$class.aroundReceive(Actor.scala:517)
	at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
	at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
	at akka.actor.ActorCell.invoke(ActorCell.scala:561)
	at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
	at akka.dispatch.Mailbox.run(Mailbox.scala:225)
	at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
	at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

root: ERROR: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_e01_1578229796619_0001_01_000003 timed out.
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------
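
This run and several of the older runs quoted below fail the same way: the Flink JobMaster stops receiving heartbeats from one TaskManager container and tears the job down. With the ssh tunnel from the cluster setup in place, a quick way to see which TaskManagers are still registered, and how stale their heartbeats are, is the Flink REST API on the forwarded local port (8081 above). A minimal sketch, assuming the tunnel is up and using the path and field names served by Flink 1.9:

    # Python 2.7, matching the test environment above, so the stdlib client is urllib2.
    import json
    import urllib2

    resp = urllib2.urlopen('http://localhost:8081/taskmanagers', timeout=10)
    data = json.load(resp)
    for tm in data.get('taskmanagers', []):
        # timeSinceLastHeartbeat is reported in milliseconds.
        print('%s slots=%s heartbeat_ms_ago=%s' % (
            tm.get('id'), tm.get('slotsNumber'), tm.get('timeSinceLastHeartbeat')))

A TaskManager missing from that list, or one showing a very large timeSinceLastHeartbeat, usually means its YARN container was killed or stalled, for example under memory pressure, which would match the TimeoutException above.
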

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 64.122s

FAILED (errors=1)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 53

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 8s
3 actionable tasks: 2 executed, 1 up-to-date

Publishing build scan...
https://gradle.com/s/fesmaxc4ew5lo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #204

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/204/display/redirect?page=changes>

Changes:

[kcweaver] Import freezegun for Python time testing.

[kcweaver] Allow message stream to yield duplicates.

[kcweaver] [BEAM-8891] Create and submit Spark portable jar in Python.

[kawaigin] [BEAM-8977] Resolve test flakiness

[kcweaver] Refactor shared uber jar generation code into common subclass.

[kamil.wasilewski] [BEAM-8671] Fix Python 3.7 ParDo test job name

[kcweaver] Make Spark REST URL a separate pipeline option.

[aaltay] [BEAM-8335] On Unbounded Source change (#10442)

[aaltay] [BEAM-9013] TestStream fix for DataflowRunner (#10445)

[angoenka] [BEAM-8575] Refactor test_do_fn_with_windowing_in_finish_bundle to work


------------------------------------------
[...truncated 267.27 KB...]
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:job_endpoint:v1"
    value {
      string_value: "localhost:8099"
    }
  }
  fields {
    key: "beam:option:job_name:v1"
    value {
      string_value: "load_tests_Python_Flink_Batch_GBK_3_0104100240"
    }
  }
  fields {
    key: "beam:option:job_port:v1"
    value {
      string_value: "0"
    }
  }
  fields {
    key: "beam:option:job_server_timeout:v1"
    value {
      string_value: "60"
    }
  }
  fields {
    key: "beam:option:load_balance_bundles:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:no_auth:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:object_reuse:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:parallelism:v1"
    value {
      string_value: "5"
    }
  }
  fields {
    key: "beam:option:pipeline_type_check:v1"
    value {
      bool_value: true
    }
  }
  fields {
    key: "beam:option:profile_cpu:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_memory:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_sample_rate:v1"
    value {
      number_value: 1.0
    }
  }
  fields {
    key: "beam:option:project:v1"
    value {
      string_value: "apache-beam-testing"
    }
  }
  fields {
    key: "beam:option:retain_docker_containers:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:runtime_type_check:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:save_main_session:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:sdk_location:v1"
    value {
      string_value: "container"
    }
  }
  fields {
    key: "beam:option:sdk_worker_parallelism:v1"
    value {
      string_value: "1"
    }
  }
  fields {
    key: "beam:option:shutdown_sources_on_final_watermark:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:spark_master_url:v1"
    value {
      string_value: "local[4]"
    }
  }
  fields {
    key: "beam:option:spark_submit_uber_jar:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:streaming:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:type_check_strictness:v1"
    value {
      string_value: "DEFAULT_TO_ANY"
    }
  }
  fields {
    key: "beam:option:update:v1"
    value {
      bool_value: false
    }
  }
}
job_name: "job"

apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: 658e8e25c98b33de74ae06781cbfff1d)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:301)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:209)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:186)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:116)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:84)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:81)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
	... 16 more
Caused by: java.util.concurrent.ExecutionException: org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: CANCELLED: cancelled before receiving half close
	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
	at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.$closeResource(FlinkExecutableStageFunction.java:204)
	at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.mapPartition(FlinkExecutableStageFunction.java:204)
	at org.apache.flink.runtime.operators.MapPartitionDriver.run(MapPartitionDriver.java:103)
	at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:504)
	at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:369)
	at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
	... 1 more
Caused by: org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: CANCELLED: cancelled before receiving half close
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.Status.asRuntimeException(Status.java:524)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onCancel(ServerCalls.java:273)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.PartialForwardingServerCallListener.onCancel(PartialForwardingServerCallListener.java:40)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.ForwardingServerCallListener.onCancel(ForwardingServerCallListener.java:23)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.ForwardingServerCallListener$SimpleForwardingServerCallListener.onCancel(ForwardingServerCallListener.java:40)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.Contexts$ContextualizedServerCallListener.onCancel(Contexts.java:96)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.closed(ServerCallImpl.java:337)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1Closed.runInContext(ServerImpl.java:793)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	... 1 more

root: ERROR: org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: CANCELLED: cancelled before receiving half close
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------
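
This run fails differently from the others in the thread: instead of a heartbeat timeout, the bundle is aborted with "CANCELLED: cancelled before receiving half close", which typically indicates that the SDK harness side of the control/data streams went away mid-bundle, for example because the harness container died or was restarted. A hedged sketch of one way to check for that; the gcloud invocation mirrors the ones earlier in this thread, and the worker host name is hypothetical (Dataproc workers normally follow the "<cluster>-w-<n>" pattern):

    import subprocess

    # Hypothetical worker name; substitute the actual Dataproc worker for this run.
    worker = 'yarn@beam-loadtests-python-gbk-flink-batch-204-w-0'

    # List all containers created from the SDK harness image, including exited ones,
    # to see whether a harness died while a bundle was still open.
    subprocess.check_call([
        'gcloud', 'compute', 'ssh', '--zone=us-central1-a', '--quiet', worker,
        '--command=sudo docker ps -a --filter ancestor='
        'gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest',
    ])
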

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 52.109s

FAILED (errors=1)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 53

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 56s
3 actionable tasks: 2 executed, 1 up-to-date

Publishing build scan...
https://gradle.com/s/kzdnaniv2sjdq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #203

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/203/display/redirect?page=changes>

Changes:

[sunjincheng121] [BEAM-8935] Fail fast if sdk harness startup failed.

[heejong] [BEAM-9034] Update environment_id for ExternalTransform in Python SDK

[github] Catch __module__ is None.

[github] [BEAM-5600] Add unimplemented split API to Runner side SDF libraries.

[github] [BEAM-5605] Fix type used to describe channel splits to match type used

[github] [BEAM-5605] Ensure that split calls are routed to the active bundle


------------------------------------------
[...truncated 266.21 KB...]
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:flink_version:v1"
    value {
      string_value: "1.9"
    }
  }
  fields {
    key: "beam:option:gcs_performance_metrics:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:job_endpoint:v1"
    value {
      string_value: "localhost:8099"
    }
  }
  fields {
    key: "beam:option:job_name:v1"
    value {
      string_value: "load_tests_Python_Flink_Batch_GBK_3_0103100251"
    }
  }
  fields {
    key: "beam:option:job_port:v1"
    value {
      string_value: "0"
    }
  }
  fields {
    key: "beam:option:job_server_timeout:v1"
    value {
      string_value: "60"
    }
  }
  fields {
    key: "beam:option:load_balance_bundles:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:no_auth:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:object_reuse:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:parallelism:v1"
    value {
      string_value: "5"
    }
  }
  fields {
    key: "beam:option:pipeline_type_check:v1"
    value {
      bool_value: true
    }
  }
  fields {
    key: "beam:option:profile_cpu:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_memory:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_sample_rate:v1"
    value {
      number_value: 1.0
    }
  }
  fields {
    key: "beam:option:project:v1"
    value {
      string_value: "apache-beam-testing"
    }
  }
  fields {
    key: "beam:option:retain_docker_containers:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:runtime_type_check:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:save_main_session:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:sdk_location:v1"
    value {
      string_value: "container"
    }
  }
  fields {
    key: "beam:option:sdk_worker_parallelism:v1"
    value {
      string_value: "1"
    }
  }
  fields {
    key: "beam:option:shutdown_sources_on_final_watermark:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:streaming:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:type_check_strictness:v1"
    value {
      string_value: "DEFAULT_TO_ANY"
    }
  }
  fields {
    key: "beam:option:update:v1"
    value {
      bool_value: false
    }
  }
}
job_name: "job"

apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: 8131662a25e44bfee5938b12a48028be)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:301)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:209)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:186)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:116)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:84)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:81)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
	... 16 more
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_e01_1578057186583_0001_01_000003 timed out.
	at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1149)
	at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:109)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:397)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:190)
	at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:74)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:152)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
	at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
	at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:170)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at akka.actor.Actor$class.aroundReceive(Actor.scala:517)
	at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
	at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
	at akka.actor.ActorCell.invoke(ActorCell.scala:561)
	at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
	at akka.dispatch.Mailbox.run(Mailbox.scala:225)
	at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
	at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

root: ERROR: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_e01_1578057186583_0001_01_000003 timed out.
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 64.237s

FAILED (errors=1)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 53

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 8s
3 actionable tasks: 2 executed, 1 up-to-date

Publishing build scan...
https://gradle.com/s/a5z2jnekkbqj6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #202

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/202/display/redirect?page=changes>

Changes:

[sunjincheng121] [BEAM-9006] Improve ProcessManager for shutdown hook handling.


------------------------------------------
[...truncated 266.04 KB...]
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:flink_version:v1"
    value {
      string_value: "1.9"
    }
  }
  fields {
    key: "beam:option:gcs_performance_metrics:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:job_endpoint:v1"
    value {
      string_value: "localhost:8099"
    }
  }
  fields {
    key: "beam:option:job_name:v1"
    value {
      string_value: "load_tests_Python_Flink_Batch_GBK_3_0102100239"
    }
  }
  fields {
    key: "beam:option:job_port:v1"
    value {
      string_value: "0"
    }
  }
  fields {
    key: "beam:option:job_server_timeout:v1"
    value {
      string_value: "60"
    }
  }
  fields {
    key: "beam:option:load_balance_bundles:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:no_auth:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:object_reuse:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:parallelism:v1"
    value {
      string_value: "5"
    }
  }
  fields {
    key: "beam:option:pipeline_type_check:v1"
    value {
      bool_value: true
    }
  }
  fields {
    key: "beam:option:profile_cpu:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_memory:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_sample_rate:v1"
    value {
      number_value: 1.0
    }
  }
  fields {
    key: "beam:option:project:v1"
    value {
      string_value: "apache-beam-testing"
    }
  }
  fields {
    key: "beam:option:retain_docker_containers:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:runtime_type_check:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:save_main_session:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:sdk_location:v1"
    value {
      string_value: "container"
    }
  }
  fields {
    key: "beam:option:sdk_worker_parallelism:v1"
    value {
      string_value: "1"
    }
  }
  fields {
    key: "beam:option:shutdown_sources_on_final_watermark:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:streaming:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:type_check_strictness:v1"
    value {
      string_value: "DEFAULT_TO_ANY"
    }
  }
  fields {
    key: "beam:option:update:v1"
    value {
      bool_value: false
    }
  }
}
job_name: "job"

apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: 214ebbc9160a7138c39c340f9d9a167f)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:301)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:209)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:186)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:116)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:84)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:81)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
	... 16 more
Caused by: java.util.concurrent.TimeoutException: The heartbeat of TaskManager with id container_e01_1577970600774_0001_01_000004  timed out.
	at org.apache.flink.runtime.resourcemanager.ResourceManager$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(ResourceManager.java:1146)
	at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:109)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:397)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:190)
	at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:74)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:152)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
	at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
	at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:170)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at akka.actor.Actor$class.aroundReceive(Actor.scala:517)
	at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
	at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
	at akka.actor.ActorCell.invoke(ActorCell.scala:561)
	at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
	at akka.dispatch.Mailbox.run(Mailbox.scala:225)
	at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
	at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

root: ERROR: java.util.concurrent.TimeoutException: The heartbeat of TaskManager with id container_e01_1577970600774_0001_01_000004  timed out.
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 92.166s

FAILED (errors=1)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 53

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 36s
3 actionable tasks: 2 executed, 1 up-to-date

Publishing build scan...
https://gradle.com/s/ivhnuh2txymdu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #201

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/201/display/redirect>

Changes:


------------------------------------------
[...truncated 266.04 KB...]
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:flink_version:v1"
    value {
      string_value: "1.9"
    }
  }
  fields {
    key: "beam:option:gcs_performance_metrics:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:job_endpoint:v1"
    value {
      string_value: "localhost:8099"
    }
  }
  fields {
    key: "beam:option:job_name:v1"
    value {
      string_value: "load_tests_Python_Flink_Batch_GBK_3_0101114453"
    }
  }
  fields {
    key: "beam:option:job_port:v1"
    value {
      string_value: "0"
    }
  }
  fields {
    key: "beam:option:job_server_timeout:v1"
    value {
      string_value: "60"
    }
  }
  fields {
    key: "beam:option:load_balance_bundles:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:no_auth:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:object_reuse:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:parallelism:v1"
    value {
      string_value: "5"
    }
  }
  fields {
    key: "beam:option:pipeline_type_check:v1"
    value {
      bool_value: true
    }
  }
  fields {
    key: "beam:option:profile_cpu:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_memory:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_sample_rate:v1"
    value {
      number_value: 1.0
    }
  }
  fields {
    key: "beam:option:project:v1"
    value {
      string_value: "apache-beam-testing"
    }
  }
  fields {
    key: "beam:option:retain_docker_containers:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:runtime_type_check:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:save_main_session:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:sdk_location:v1"
    value {
      string_value: "container"
    }
  }
  fields {
    key: "beam:option:sdk_worker_parallelism:v1"
    value {
      string_value: "1"
    }
  }
  fields {
    key: "beam:option:shutdown_sources_on_final_watermark:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:streaming:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:type_check_strictness:v1"
    value {
      string_value: "DEFAULT_TO_ANY"
    }
  }
  fields {
    key: "beam:option:update:v1"
    value {
      bool_value: false
    }
  }
}
job_name: "job"

apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: 599501c73b655ad168dd3fbc2120c59e)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:301)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:209)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:186)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:116)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:84)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:81)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
	... 16 more
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_e01_1577884216584_0001_01_000006 timed out.
	at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1149)
	at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:109)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:397)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:190)
	at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:74)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:152)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
	at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
	at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:170)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at akka.actor.Actor$class.aroundReceive(Actor.scala:517)
	at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
	at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
	at akka.actor.ActorCell.invoke(ActorCell.scala:561)
	at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
	at akka.dispatch.Mailbox.run(Mailbox.scala:225)
	at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
	at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

root: ERROR: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_e01_1577884216584_0001_01_000006 timed out.
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 63.996s

FAILED (errors=1)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 53

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 8s
3 actionable tasks: 2 executed, 1 up-to-date

Publishing build scan...
https://gradle.com/s/jftsumxshk6re

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #200

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/200/display/redirect?page=changes>

Changes:

[github] Python example parameters fix

[udim] [BEAM-9012] Change __init__ hints so they work with pytype (#10466)

[github] [BEAM-9039] Fix race on reading channel readErr. (#10456)

[lcwik] [BEAM-5605] Increase precision of fraction used during splitting.

[github] [BEAM-8487] Convert forward references to Any (#9888)

[lukecwik] [BEAM-9020] LengthPrefixUnknownCodersTest to avoid relying on

[lukecwik] [BEAM-7951] Improve the docs for beam_runner_api.proto and


------------------------------------------
[...truncated 266.20 KB...]
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:flink_version:v1"
    value {
      string_value: "1.9"
    }
  }
  fields {
    key: "beam:option:gcs_performance_metrics:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:job_endpoint:v1"
    value {
      string_value: "localhost:8099"
    }
  }
  fields {
    key: "beam:option:job_name:v1"
    value {
      string_value: "load_tests_Python_Flink_Batch_GBK_3_1231100251"
    }
  }
  fields {
    key: "beam:option:job_port:v1"
    value {
      string_value: "0"
    }
  }
  fields {
    key: "beam:option:job_server_timeout:v1"
    value {
      string_value: "60"
    }
  }
  fields {
    key: "beam:option:load_balance_bundles:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:no_auth:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:object_reuse:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:parallelism:v1"
    value {
      string_value: "5"
    }
  }
  fields {
    key: "beam:option:pipeline_type_check:v1"
    value {
      bool_value: true
    }
  }
  fields {
    key: "beam:option:profile_cpu:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_memory:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_sample_rate:v1"
    value {
      number_value: 1.0
    }
  }
  fields {
    key: "beam:option:project:v1"
    value {
      string_value: "apache-beam-testing"
    }
  }
  fields {
    key: "beam:option:retain_docker_containers:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:runtime_type_check:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:save_main_session:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:sdk_location:v1"
    value {
      string_value: "container"
    }
  }
  fields {
    key: "beam:option:sdk_worker_parallelism:v1"
    value {
      string_value: "1"
    }
  }
  fields {
    key: "beam:option:shutdown_sources_on_final_watermark:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:streaming:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:type_check_strictness:v1"
    value {
      string_value: "DEFAULT_TO_ANY"
    }
  }
  fields {
    key: "beam:option:update:v1"
    value {
      bool_value: false
    }
  }
}
job_name: "job"

apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: 0d15259884a81a74620b5aee9af7536d)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:301)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:209)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:186)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:116)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:84)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:81)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
	... 16 more
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_e01_1577797787204_0001_01_000002 timed out.
	at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1149)
	at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:109)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:397)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:190)
	at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:74)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:152)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
	at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
	at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:170)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at akka.actor.Actor$class.aroundReceive(Actor.scala:517)
	at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
	at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
	at akka.actor.ActorCell.invoke(ActorCell.scala:561)
	at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
	at akka.dispatch.Mailbox.run(Mailbox.scala:225)
	at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
	at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

root: ERROR: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_e01_1577797787204_0001_01_000002 timed out.
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 64.066s

FAILED (errors=1)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 53

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 8s
3 actionable tasks: 2 executed, 1 up-to-date

Publishing build scan...
https://gradle.com/s/4vxpf7thmcrtm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #199

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/199/display/redirect>

Changes:


------------------------------------------
[...truncated 266.81 KB...]
      string_value: "0"
    }
  }
  fields {
    key: "beam:option:job_server_timeout:v1"
    value {
      string_value: "60"
    }
  }
  fields {
    key: "beam:option:load_balance_bundles:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:no_auth:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:object_reuse:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:parallelism:v1"
    value {
      string_value: "5"
    }
  }
  fields {
    key: "beam:option:pipeline_type_check:v1"
    value {
      bool_value: true
    }
  }
  fields {
    key: "beam:option:profile_cpu:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_memory:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_sample_rate:v1"
    value {
      number_value: 1.0
    }
  }
  fields {
    key: "beam:option:project:v1"
    value {
      string_value: "apache-beam-testing"
    }
  }
  fields {
    key: "beam:option:retain_docker_containers:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:runtime_type_check:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:save_main_session:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:sdk_location:v1"
    value {
      string_value: "container"
    }
  }
  fields {
    key: "beam:option:sdk_worker_parallelism:v1"
    value {
      string_value: "1"
    }
  }
  fields {
    key: "beam:option:shutdown_sources_on_final_watermark:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:streaming:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:type_check_strictness:v1"
    value {
      string_value: "DEFAULT_TO_ANY"
    }
  }
  fields {
    key: "beam:option:update:v1"
    value {
      bool_value: false
    }
  }
}
job_name: "job"

apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: fe58cb0a5fc7145355d6020c92eebf02)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:301)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:209)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:186)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:116)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:84)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:81)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
	... 16 more
Caused by: java.lang.IllegalStateException: Update to task [GroupReduce (GroupReduce at GroupByKey 0) (1/5) - execution #0] on TaskManager container_e01_1577719446517_0001_01_000002 @ beam-loadtests-python-gbk-flink-batch-199-w-0.c.apache-beam-testing.internal (dataPort=43867) failed
	at org.apache.flink.runtime.executiongraph.Execution.lambda$sendUpdatePartitionInfoRpcCall$14(Execution.java:1395)
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
	at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:397)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:190)
	at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:74)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:152)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
	at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
	at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:170)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at akka.actor.Actor$class.aroundReceive(Actor.scala:517)
	at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
	at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
	at akka.actor.ActorCell.invoke(ActorCell.scala:561)
	at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
	at akka.dispatch.Mailbox.run(Mailbox.scala:225)
	at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
	at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.util.concurrent.CompletionException: akka.pattern.AskTimeoutException: Ask timed out on [Actor[akka.tcp://flink@beam-loadtests-python-gbk-flink-batch-199-w-0.c.apache-beam-testing.internal:39803/user/taskmanager_0#1422972667]] after [10000 ms]. Message of type [org.apache.flink.runtime.rpc.messages.RemoteRpcInvocation]. A typical reason for `AskTimeoutException` is that the recipient actor didn't send a reply.
	at java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:292)
	at java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:308)
	at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:607)
	at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
	at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
	at java.util.concurrent.CompletableFuture.completeExceptionally(CompletableFuture.java:1990)
	at org.apache.flink.runtime.concurrent.FutureUtils$1.onComplete(FutureUtils.java:871)
	at akka.dispatch.OnComplete.internal(Future.scala:263)
	at akka.dispatch.OnComplete.internal(Future.scala:261)
	at akka.dispatch.japi$CallbackBridge.apply(Future.scala:191)
	at akka.dispatch.japi$CallbackBridge.apply(Future.scala:188)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:36)
	at org.apache.flink.runtime.concurrent.Executors$DirectExecutionContext.execute(Executors.java:74)
	at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:44)
	at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:252)
	at akka.pattern.PromiseActorRef$$anonfun$1.apply$mcV$sp(AskSupport.scala:644)
	at akka.actor.Scheduler$$anon$4.run(Scheduler.scala:205)
	at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:601)
	at scala.concurrent.BatchingExecutor$class.execute(BatchingExecutor.scala:109)
	at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:599)
	at akka.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(LightArrayRevolverScheduler.scala:328)
	at akka.actor.LightArrayRevolverScheduler$$anon$4.executeBucket$1(LightArrayRevolverScheduler.scala:279)
	at akka.actor.LightArrayRevolverScheduler$$anon$4.nextTick(LightArrayRevolverScheduler.scala:283)
	at akka.actor.LightArrayRevolverScheduler$$anon$4.run(LightArrayRevolverScheduler.scala:235)
	at java.lang.Thread.run(Thread.java:748)
Caused by: akka.pattern.AskTimeoutException: Ask timed out on [Actor[akka.tcp://flink@beam-loadtests-python-gbk-flink-batch-199-w-0.c.apache-beam-testing.internal:39803/user/taskmanager_0#1422972667]] after [10000 ms]. Message of type [org.apache.flink.runtime.rpc.messages.RemoteRpcInvocation]. A typical reason for `AskTimeoutException` is that the recipient actor didn't send a reply.
	at akka.pattern.PromiseActorRef$$anonfun$2.apply(AskSupport.scala:635)
	at akka.pattern.PromiseActorRef$$anonfun$2.apply(AskSupport.scala:635)
	at akka.pattern.PromiseActorRef$$anonfun$1.apply$mcV$sp(AskSupport.scala:648)
	... 9 more

root: ERROR: akka.pattern.AskTimeoutException: Ask timed out on [Actor[akka.tcp://flink@beam-loadtests-python-gbk-flink-batch-199-w-0.c.apache-beam-testing.internal:39803/user/taskmanager_0#1422972667]] after [10000 ms]. Message of type [org.apache.flink.runtime.rpc.messages.RemoteRpcInvocation]. A typical reason for `AskTimeoutException` is that the recipient actor didn't send a reply.
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 54.298s

FAILED (errors=1)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 53

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 59s
3 actionable tasks: 2 executed, 1 up-to-date

Publishing build scan...
https://gradle.com/s/5gwi5n6un3ztc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #198

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/198/display/redirect?page=changes>

Changes:

[relax] Merge pull request #10422: [BEAM-2535] TimerData signature update


------------------------------------------
[...truncated 266.22 KB...]
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:flink_version:v1"
    value {
      string_value: "1.9"
    }
  }
  fields {
    key: "beam:option:gcs_performance_metrics:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:job_endpoint:v1"
    value {
      string_value: "localhost:8099"
    }
  }
  fields {
    key: "beam:option:job_name:v1"
    value {
      string_value: "load_tests_Python_Flink_Batch_GBK_3_1229101312"
    }
  }
  fields {
    key: "beam:option:job_port:v1"
    value {
      string_value: "0"
    }
  }
  fields {
    key: "beam:option:job_server_timeout:v1"
    value {
      string_value: "60"
    }
  }
  fields {
    key: "beam:option:load_balance_bundles:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:no_auth:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:object_reuse:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:parallelism:v1"
    value {
      string_value: "5"
    }
  }
  fields {
    key: "beam:option:pipeline_type_check:v1"
    value {
      bool_value: true
    }
  }
  fields {
    key: "beam:option:profile_cpu:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_memory:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_sample_rate:v1"
    value {
      number_value: 1.0
    }
  }
  fields {
    key: "beam:option:project:v1"
    value {
      string_value: "apache-beam-testing"
    }
  }
  fields {
    key: "beam:option:retain_docker_containers:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:runtime_type_check:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:save_main_session:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:sdk_location:v1"
    value {
      string_value: "container"
    }
  }
  fields {
    key: "beam:option:sdk_worker_parallelism:v1"
    value {
      string_value: "1"
    }
  }
  fields {
    key: "beam:option:shutdown_sources_on_final_watermark:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:streaming:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:type_check_strictness:v1"
    value {
      string_value: "DEFAULT_TO_ANY"
    }
  }
  fields {
    key: "beam:option:update:v1"
    value {
      bool_value: false
    }
  }
}
job_name: "job"

apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: 1ce8602dcc30810c0acc047334416dfd)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:301)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:209)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:186)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:116)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:84)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:81)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
	... 16 more
Caused by: java.util.concurrent.ExecutionException: org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: CANCELLED: cancelled before receiving half close
	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$ActiveBundle.close(SdkHarnessClient.java:310)
	at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.$closeResource(FlinkExecutableStageFunction.java:204)
	at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.mapPartition(FlinkExecutableStageFunction.java:204)
	at org.apache.flink.runtime.operators.MapPartitionDriver.run(MapPartitionDriver.java:103)
	at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:504)
	at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:369)
	at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
	... 1 more
Caused by: org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: CANCELLED: cancelled before receiving half close
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.Status.asRuntimeException(Status.java:524)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onCancel(ServerCalls.java:273)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.PartialForwardingServerCallListener.onCancel(PartialForwardingServerCallListener.java:40)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.ForwardingServerCallListener.onCancel(ForwardingServerCallListener.java:23)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.ForwardingServerCallListener$SimpleForwardingServerCallListener.onCancel(ForwardingServerCallListener.java:40)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.Contexts$ContextualizedServerCallListener.onCancel(Contexts.java:96)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.closed(ServerCallImpl.java:337)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1Closed.runInContext(ServerImpl.java:793)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	... 1 more

root: ERROR: org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: CANCELLED: cancelled before receiving half close
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 58.222s

FAILED (errors=1)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 53

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 2s
3 actionable tasks: 2 executed, 1 up-to-date

Publishing build scan...
https://gradle.com/s/d55ndxh6cmfsk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #197

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/197/display/redirect?page=changes>

Changes:

[zyichi] [BEAM-8824] Add support to allow specify window allowed_lateness in

[ehudm] Set TMPDIR for tox environments

[mxm] [BEAM-8962] Report Flink metric accumulator only when pipeline ends

[github] Revert "[BEAM-8932]  Modify PubsubClient to use the proto message


------------------------------------------
[...truncated 266.32 KB...]
      string_value: "1.9"
    }
  }
  fields {
    key: "beam:option:gcs_performance_metrics:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:job_endpoint:v1"
    value {
      string_value: "localhost:8099"
    }
  }
  fields {
    key: "beam:option:job_name:v1"
    value {
      string_value: "load_tests_Python_Flink_Batch_GBK_3_1228101140"
    }
  }
  fields {
    key: "beam:option:job_port:v1"
    value {
      string_value: "0"
    }
  }
  fields {
    key: "beam:option:job_server_timeout:v1"
    value {
      string_value: "60"
    }
  }
  fields {
    key: "beam:option:load_balance_bundles:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:no_auth:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:object_reuse:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:parallelism:v1"
    value {
      string_value: "5"
    }
  }
  fields {
    key: "beam:option:pipeline_type_check:v1"
    value {
      bool_value: true
    }
  }
  fields {
    key: "beam:option:profile_cpu:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_memory:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_sample_rate:v1"
    value {
      number_value: 1.0
    }
  }
  fields {
    key: "beam:option:project:v1"
    value {
      string_value: "apache-beam-testing"
    }
  }
  fields {
    key: "beam:option:retain_docker_containers:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:runtime_type_check:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:save_main_session:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:sdk_location:v1"
    value {
      string_value: "container"
    }
  }
  fields {
    key: "beam:option:sdk_worker_parallelism:v1"
    value {
      string_value: "1"
    }
  }
  fields {
    key: "beam:option:shutdown_sources_on_final_watermark:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:streaming:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:type_check_strictness:v1"
    value {
      string_value: "DEFAULT_TO_ANY"
    }
  }
  fields {
    key: "beam:option:update:v1"
    value {
      bool_value: false
    }
  }
}
job_name: "job"

apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: 0590fadaf2c6a05ca7de2c3083269c65)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:301)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:209)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:186)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:116)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:84)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:81)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
	... 16 more
Caused by: java.lang.Exception: The data preparation for task 'GroupReduce (GroupReduce at GroupByKey 0)' , caused an error: Error obtaining the sorted input: Thread 'SortMerger Reading Thread' terminated due to an exception: Connection unexpectedly closed by remote task manager 'beam-loadtests-python-gbk-flink-batch-197-w-1.c.apache-beam-testing.internal/10.128.0.75:39655'. This might indicate that the remote task manager was lost.
	at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:480)
	at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:369)
	at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
	... 1 more
Caused by: java.lang.RuntimeException: Error obtaining the sorted input: Thread 'SortMerger Reading Thread' terminated due to an exception: Connection unexpectedly closed by remote task manager 'beam-loadtests-python-gbk-flink-batch-197-w-1.c.apache-beam-testing.internal/10.128.0.75:39655'. This might indicate that the remote task manager was lost.
	at org.apache.flink.runtime.operators.sort.UnilateralSortMerger.getIterator(UnilateralSortMerger.java:650)
	at org.apache.flink.runtime.operators.BatchTask.getInput(BatchTask.java:1109)
	at org.apache.flink.runtime.operators.GroupReduceDriver.prepare(GroupReduceDriver.java:99)
	at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:474)
	... 4 more
Caused by: java.io.IOException: Thread 'SortMerger Reading Thread' terminated due to an exception: Connection unexpectedly closed by remote task manager 'beam-loadtests-python-gbk-flink-batch-197-w-1.c.apache-beam-testing.internal/10.128.0.75:39655'. This might indicate that the remote task manager was lost.
	at org.apache.flink.runtime.operators.sort.UnilateralSortMerger$ThreadBase.run(UnilateralSortMerger.java:831)
Caused by: org.apache.flink.runtime.io.network.netty.exception.RemoteTransportException: Connection unexpectedly closed by remote task manager 'beam-loadtests-python-gbk-flink-batch-197-w-1.c.apache-beam-testing.internal/10.128.0.75:39655'. This might indicate that the remote task manager was lost.
	at org.apache.flink.runtime.io.network.netty.CreditBasedPartitionRequestClientHandler.channelInactive(CreditBasedPartitionRequestClientHandler.java:136)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
	at org.apache.flink.shaded.netty4.io.netty.handler.codec.ByteToMessageDecoder.channelInputClosed(ByteToMessageDecoder.java:390)
	at org.apache.flink.shaded.netty4.io.netty.handler.codec.ByteToMessageDecoder.channelInactive(ByteToMessageDecoder.java:355)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
	at org.apache.flink.shaded.netty4.io.netty.channel.DefaultChannelPipeline$HeadContext.channelInactive(DefaultChannelPipeline.java:1429)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
	at org.apache.flink.shaded.netty4.io.netty.channel.DefaultChannelPipeline.fireChannelInactive(DefaultChannelPipeline.java:947)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannel$AbstractUnsafe$8.run(AbstractChannel.java:826)
	at org.apache.flink.shaded.netty4.io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
	at org.apache.flink.shaded.netty4.io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:404)
	at org.apache.flink.shaded.netty4.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:474)
	at org.apache.flink.shaded.netty4.io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:909)
	at java.lang.Thread.run(Thread.java:748)

root: ERROR: org.apache.flink.runtime.io.network.netty.exception.RemoteTransportException: Connection unexpectedly closed by remote task manager 'beam-loadtests-python-gbk-flink-batch-197-w-1.c.apache-beam-testing.internal/10.128.0.75:39655'. This might indicate that the remote task manager was lost.
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 28.116s

FAILED (errors=1)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 53

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 32s
3 actionable tasks: 2 executed, 1 up-to-date

Publishing build scan...
https://gradle.com/s/5esqtjicqn4k2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



beam_LoadTests_Python_GBK_Flink_Batch - Build # 196 - Aborted

Posted by Apache Jenkins Server <je...@builds.apache.org>.
The Apache Jenkins build system has built beam_LoadTests_Python_GBK_Flink_Batch (build #196)

Status: Aborted

Check console output at https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/196/ to view the results.

Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #195

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/195/display/redirect?page=changes>

Changes:

[github] Merge pull request #10449: [BEAM-7274] Implement the Protobuf schema


------------------------------------------
[...truncated 266.17 KB...]
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:flink_version:v1"
    value {
      string_value: "1.9"
    }
  }
  fields {
    key: "beam:option:gcs_performance_metrics:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:job_endpoint:v1"
    value {
      string_value: "localhost:8099"
    }
  }
  fields {
    key: "beam:option:job_name:v1"
    value {
      string_value: "load_tests_Python_Flink_Batch_GBK_3_1226100541"
    }
  }
  fields {
    key: "beam:option:job_port:v1"
    value {
      string_value: "0"
    }
  }
  fields {
    key: "beam:option:job_server_timeout:v1"
    value {
      string_value: "60"
    }
  }
  fields {
    key: "beam:option:load_balance_bundles:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:no_auth:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:object_reuse:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:parallelism:v1"
    value {
      string_value: "5"
    }
  }
  fields {
    key: "beam:option:pipeline_type_check:v1"
    value {
      bool_value: true
    }
  }
  fields {
    key: "beam:option:profile_cpu:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_memory:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_sample_rate:v1"
    value {
      number_value: 1.0
    }
  }
  fields {
    key: "beam:option:project:v1"
    value {
      string_value: "apache-beam-testing"
    }
  }
  fields {
    key: "beam:option:retain_docker_containers:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:runtime_type_check:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:save_main_session:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:sdk_location:v1"
    value {
      string_value: "container"
    }
  }
  fields {
    key: "beam:option:sdk_worker_parallelism:v1"
    value {
      string_value: "1"
    }
  }
  fields {
    key: "beam:option:shutdown_sources_on_final_watermark:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:streaming:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:type_check_strictness:v1"
    value {
      string_value: "DEFAULT_TO_ANY"
    }
  }
  fields {
    key: "beam:option:update:v1"
    value {
      bool_value: false
    }
  }
}
job_name: "job"

apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: 4765c7e4ad3d8c0280584b7357fc2f79)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:301)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:209)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:186)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:116)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:84)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:81)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
	... 16 more
Caused by: java.util.concurrent.TimeoutException: The heartbeat of TaskManager with id container_e01_1577369957034_0001_01_000002  timed out.
	at org.apache.flink.runtime.resourcemanager.ResourceManager$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(ResourceManager.java:1146)
	at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:109)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:397)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:190)
	at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:74)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:152)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
	at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
	at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:170)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at akka.actor.Actor$class.aroundReceive(Actor.scala:517)
	at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
	at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
	at akka.actor.ActorCell.invoke(ActorCell.scala:561)
	at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
	at akka.dispatch.Mailbox.run(Mailbox.scala:225)
	at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
	at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

root: ERROR: java.util.concurrent.TimeoutException: The heartbeat of TaskManager with id container_e01_1577369957034_0001_01_000002  timed out.
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 78.353s

FAILED (errors=1)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 53

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 22s
3 actionable tasks: 2 executed, 1 up-to-date

Publishing build scan...
https://gradle.com/s/oqcdbta5j7rj4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #194

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/194/display/redirect?page=changes>

Changes:

[echauchot] [BEAM-5192] Migrate ElasticsearchIO to v7

[echauchot] [BEAM-5192] Minor change of ESIO public configuration API:

[echauchot] [BEAM-5192] Fix missing ifs for ES7 specificities.

[echauchot] [BEAM-5192] Remove unneeded transitive dependencies, upgrade ES and

[echauchot] [BEAM-5192] Disable MockHttpTransport plugin to enabe http dialog to

[echauchot] [BEAM-5192] Fix util class, elasticsearch changed their json output of

[echauchot] [BEAM-5192] Set a custom json serializer for document metadata to be

[echauchot] [BEAM-5192] Remove testWritePartialUpdateWithErrors because triggering

[sunjincheng121] [BEAM-7949] Add time-based cache threshold support in the data service

[sunjincheng121] [BEAM-7949] Introduce PeriodicThread for time-based cache threshold

[echauchot] [BEAM-5192] use <= and >= in version specific code instead of == to be

[relax] Merge pull request #10444: [BEAM-9010] Proper TableRow size calculation


------------------------------------------
[...truncated 266.11 KB...]
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:flink_version:v1"
    value {
      string_value: "1.9"
    }
  }
  fields {
    key: "beam:option:gcs_performance_metrics:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:job_endpoint:v1"
    value {
      string_value: "localhost:8099"
    }
  }
  fields {
    key: "beam:option:job_name:v1"
    value {
      string_value: "load_tests_Python_Flink_Batch_GBK_3_1225100328"
    }
  }
  fields {
    key: "beam:option:job_port:v1"
    value {
      string_value: "0"
    }
  }
  fields {
    key: "beam:option:job_server_timeout:v1"
    value {
      string_value: "60"
    }
  }
  fields {
    key: "beam:option:load_balance_bundles:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:no_auth:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:object_reuse:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:parallelism:v1"
    value {
      string_value: "5"
    }
  }
  fields {
    key: "beam:option:pipeline_type_check:v1"
    value {
      bool_value: true
    }
  }
  fields {
    key: "beam:option:profile_cpu:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_memory:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_sample_rate:v1"
    value {
      number_value: 1.0
    }
  }
  fields {
    key: "beam:option:project:v1"
    value {
      string_value: "apache-beam-testing"
    }
  }
  fields {
    key: "beam:option:retain_docker_containers:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:runtime_type_check:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:save_main_session:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:sdk_location:v1"
    value {
      string_value: "container"
    }
  }
  fields {
    key: "beam:option:sdk_worker_parallelism:v1"
    value {
      string_value: "1"
    }
  }
  fields {
    key: "beam:option:shutdown_sources_on_final_watermark:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:streaming:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:type_check_strictness:v1"
    value {
      string_value: "DEFAULT_TO_ANY"
    }
  }
  fields {
    key: "beam:option:update:v1"
    value {
      bool_value: false
    }
  }
}
job_name: "job"

apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: 5c3b9f83d9934978513be4e68c0d6b80)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:301)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:209)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:186)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:116)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:84)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:81)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
	... 16 more
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_e01_1577280463212_0001_01_000004 timed out.
	at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1149)
	at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:109)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:397)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:190)
	at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:74)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:152)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
	at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
	at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:170)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at akka.actor.Actor$class.aroundReceive(Actor.scala:517)
	at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
	at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
	at akka.actor.ActorCell.invoke(ActorCell.scala:561)
	at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
	at akka.dispatch.Mailbox.run(Mailbox.scala:225)
	at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
	at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

root: ERROR: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_e01_1577280463212_0001_01_000004 timed out.
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 134.252s

FAILED (errors=1)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 53

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2m 18s
3 actionable tasks: 2 executed, 1 up-to-date

Publishing build scan...
https://gradle.com/s/mbcvcdngpy77y

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


beam_LoadTests_Python_GBK_Flink_Batch - Build # 193 - Aborted

Posted by Apache Jenkins Server <je...@builds.apache.org>.
The Apache Jenkins build system has built beam_LoadTests_Python_GBK_Flink_Batch (build #193)

Status: Aborted

Check console output at https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/193/ to view the results.

Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #192

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/192/display/redirect>

Changes:


------------------------------------------
[...truncated 266.44 KB...]
      string_value: "1.9"
    }
  }
  fields {
    key: "beam:option:gcs_performance_metrics:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:job_endpoint:v1"
    value {
      string_value: "localhost:8099"
    }
  }
  fields {
    key: "beam:option:job_name:v1"
    value {
      string_value: "load_tests_Python_Flink_Batch_GBK_3_1223100308"
    }
  }
  fields {
    key: "beam:option:job_port:v1"
    value {
      string_value: "0"
    }
  }
  fields {
    key: "beam:option:job_server_timeout:v1"
    value {
      string_value: "60"
    }
  }
  fields {
    key: "beam:option:load_balance_bundles:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:no_auth:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:object_reuse:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:parallelism:v1"
    value {
      string_value: "5"
    }
  }
  fields {
    key: "beam:option:pipeline_type_check:v1"
    value {
      bool_value: true
    }
  }
  fields {
    key: "beam:option:profile_cpu:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_memory:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_sample_rate:v1"
    value {
      number_value: 1.0
    }
  }
  fields {
    key: "beam:option:project:v1"
    value {
      string_value: "apache-beam-testing"
    }
  }
  fields {
    key: "beam:option:retain_docker_containers:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:runtime_type_check:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:save_main_session:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:sdk_location:v1"
    value {
      string_value: "container"
    }
  }
  fields {
    key: "beam:option:sdk_worker_parallelism:v1"
    value {
      string_value: "1"
    }
  }
  fields {
    key: "beam:option:shutdown_sources_on_final_watermark:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:streaming:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:type_check_strictness:v1"
    value {
      string_value: "DEFAULT_TO_ANY"
    }
  }
  fields {
    key: "beam:option:update:v1"
    value {
      bool_value: false
    }
  }
}
job_name: "job"

apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: 20c4871a57427fde37dbb3ff287d0317)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:301)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:209)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:186)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:116)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:84)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:81)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
	... 16 more
Caused by: java.lang.Exception: The data preparation for task 'GroupReduce (GroupReduce at GroupByKey 0)' , caused an error: Error obtaining the sorted input: Thread 'SortMerger Reading Thread' terminated due to an exception: Connection unexpectedly closed by remote task manager 'beam-loadtests-python-gbk-flink-batch-192-w-2.c.apache-beam-testing.internal/10.128.0.30:35007'. This might indicate that the remote task manager was lost.
	at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:480)
	at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:369)
	at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
	... 1 more
Caused by: java.lang.RuntimeException: Error obtaining the sorted input: Thread 'SortMerger Reading Thread' terminated due to an exception: Connection unexpectedly closed by remote task manager 'beam-loadtests-python-gbk-flink-batch-192-w-2.c.apache-beam-testing.internal/10.128.0.30:35007'. This might indicate that the remote task manager was lost.
	at org.apache.flink.runtime.operators.sort.UnilateralSortMerger.getIterator(UnilateralSortMerger.java:650)
	at org.apache.flink.runtime.operators.BatchTask.getInput(BatchTask.java:1109)
	at org.apache.flink.runtime.operators.GroupReduceDriver.prepare(GroupReduceDriver.java:99)
	at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:474)
	... 4 more
Caused by: java.io.IOException: Thread 'SortMerger Reading Thread' terminated due to an exception: Connection unexpectedly closed by remote task manager 'beam-loadtests-python-gbk-flink-batch-192-w-2.c.apache-beam-testing.internal/10.128.0.30:35007'. This might indicate that the remote task manager was lost.
	at org.apache.flink.runtime.operators.sort.UnilateralSortMerger$ThreadBase.run(UnilateralSortMerger.java:831)
Caused by: org.apache.flink.runtime.io.network.netty.exception.RemoteTransportException: Connection unexpectedly closed by remote task manager 'beam-loadtests-python-gbk-flink-batch-192-w-2.c.apache-beam-testing.internal/10.128.0.30:35007'. This might indicate that the remote task manager was lost.
	at org.apache.flink.runtime.io.network.netty.CreditBasedPartitionRequestClientHandler.channelInactive(CreditBasedPartitionRequestClientHandler.java:136)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
	at org.apache.flink.shaded.netty4.io.netty.handler.codec.ByteToMessageDecoder.channelInputClosed(ByteToMessageDecoder.java:390)
	at org.apache.flink.shaded.netty4.io.netty.handler.codec.ByteToMessageDecoder.channelInactive(ByteToMessageDecoder.java:355)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
	at org.apache.flink.shaded.netty4.io.netty.channel.DefaultChannelPipeline$HeadContext.channelInactive(DefaultChannelPipeline.java:1429)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
	at org.apache.flink.shaded.netty4.io.netty.channel.DefaultChannelPipeline.fireChannelInactive(DefaultChannelPipeline.java:947)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannel$AbstractUnsafe$8.run(AbstractChannel.java:826)
	at org.apache.flink.shaded.netty4.io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
	at org.apache.flink.shaded.netty4.io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:404)
	at org.apache.flink.shaded.netty4.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:474)
	at org.apache.flink.shaded.netty4.io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:909)
	at java.lang.Thread.run(Thread.java:748)

root: ERROR: org.apache.flink.runtime.io.network.netty.exception.RemoteTransportException: Connection unexpectedly closed by remote task manager 'beam-loadtests-python-gbk-flink-batch-192-w-2.c.apache-beam-testing.internal/10.128.0.30:35007'. This might indicate that the remote task manager was lost.
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 56.430s

FAILED (errors=1)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 53

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 0s
3 actionable tasks: 2 executed, 1 up-to-date

Publishing build scan...
https://gradle.com/s/sv6r6y2uute46

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #191

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/191/display/redirect?page=changes>

Changes:

[github] Merge pull request #10356: [BEAM-7274] Infer a Beam Schema from a


------------------------------------------
[...truncated 152.90 KB...]
+ local application_masters
+ echo 'Yarn Applications'
Yarn Applications
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-gbk-flink-batch-191-m '--command=yarn application -list'
++ grep beam-loadtests-python-gbk-flink-batch-191
Warning: Permanently added 'compute.4248156419579368494' (ECDSA) to the list of known hosts.
19/12/22 13:11:05 INFO client.RMProxy: Connecting to ResourceManager at beam-loadtests-python-gbk-flink-batch-191-m/10.128.0.114:8032
+ read line
+ echo application_1577020187185_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 0% http://beam-loadtests-python-gbk-flink-batch-191-w-4.c.apache-beam-testing.internal:37687
application_1577020187185_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 0% http://beam-loadtests-python-gbk-flink-batch-191-w-4.c.apache-beam-testing.internal:37687
++ echo application_1577020187185_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 0% http://beam-loadtests-python-gbk-flink-batch-191-w-4.c.apache-beam-testing.internal:37687
++ sed 's/ .*//'
+ application_ids[$i]=application_1577020187185_0001
++ echo application_1577020187185_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 0% http://beam-loadtests-python-gbk-flink-batch-191-w-4.c.apache-beam-testing.internal:37687
++ sed 's/.*beam-loadtests-python-gbk-flink-batch-191/beam-loadtests-python-gbk-flink-batch-191/'
++ sed 's/ .*//'
+ application_masters[$i]=beam-loadtests-python-gbk-flink-batch-191-w-4.c.apache-beam-testing.internal:37687
+ i=1
+ read line
+ '[' 1 '!=' 1 ']'
+ YARN_APPLICATION_MASTER=beam-loadtests-python-gbk-flink-batch-191-w-4.c.apache-beam-testing.internal:37687
+ echo 'Using Yarn Application master: beam-loadtests-python-gbk-flink-batch-191-w-4.c.apache-beam-testing.internal:37687'
Using Yarn Application master: beam-loadtests-python-gbk-flink-batch-191-w-4.c.apache-beam-testing.internal:37687
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
+ start_job_server
+ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-gbk-flink-batch-191-m '--command=sudo --user yarn docker run --detach --publish 8099:8099 --publish 8098:8098 --publish 8097:8097 --volume ~/.config/gcloud:/root/.config/gcloud gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest --flink-master=beam-loadtests-python-gbk-flink-batch-191-w-4.c.apache-beam-testing.internal:37687 --artifacts-dir=gs://beam-flink-cluster/beam-loadtests-python-gbk-flink-batch-191'
786c5c968f517be8eeb7c109245943e2e153ef87ec381112f0cafa59b23345df
+ start_tunnel
++ gcloud compute ssh --quiet --zone=us-central1-a yarn@beam-loadtests-python-gbk-flink-batch-191-m '--command=curl -s "http://beam-loadtests-python-gbk-flink-batch-191-w-4.c.apache-beam-testing.internal:37687/jobmanager/config"'
+ local 'job_server_config=[{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1577020187185_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-python-gbk-flink-batch-191-w-4.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"FLINK_PLUGINS_DIR","value":"/usr/lib/flink/plugins"},{"key":"web.tmpdir","value":"/tmp/flink-web-1b6d5c1d-c5fb-4182-8352-61ca61bf551a"},{"key":"jobmanager.rpc.port","value":"40991"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1577020187185_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-python-gbk-flink-batch-191-w-4.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local key=jobmanager.rpc.port
++ echo beam-loadtests-python-gbk-flink-batch-191-w-4.c.apache-beam-testing.internal:37687
++ cut -d : -f1
+ local yarn_application_master_host=beam-loadtests-python-gbk-flink-batch-191-w-4.c.apache-beam-testing.internal
++ python -c 'import sys, json; print [ e['\''value'\''] for e in json.load(sys.stdin) if e['\''key'\''] == u'\''jobmanager.rpc.port'\''][0]'
++ echo '[{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1577020187185_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-python-gbk-flink-batch-191-w-4.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"FLINK_PLUGINS_DIR","value":"/usr/lib/flink/plugins"},{"key":"web.tmpdir","value":"/tmp/flink-web-1b6d5c1d-c5fb-4182-8352-61ca61bf551a"},{"key":"jobmanager.rpc.port","value":"40991"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1577020187185_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-python-gbk-flink-batch-191-w-4.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local jobmanager_rpc_port=40991
++ [[ true == \t\r\u\e ]]
++ echo ' -Nf >& /dev/null'
+ local 'detached_mode_params= -Nf >& /dev/null'
++ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
++ echo '-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'job_server_ports_forwarding=-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'tunnel_command=gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-gbk-flink-batch-191-m -- -L 8081:beam-loadtests-python-gbk-flink-batch-191-w-4.c.apache-beam-testing.internal:37687 -L 40991:beam-loadtests-python-gbk-flink-batch-191-w-4.c.apache-beam-testing.internal:40991 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080  -Nf >& /dev/null'
+ eval gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-gbk-flink-batch-191-m -- -L 8081:beam-loadtests-python-gbk-flink-batch-191-w-4.c.apache-beam-testing.internal:37687 -L 40991:beam-loadtests-python-gbk-flink-batch-191-w-4.c.apache-beam-testing.internal:40991 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf '>&' /dev/null
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-gbk-flink-batch-191-m -- -L 8081:beam-loadtests-python-gbk-flink-batch-191-w-4.c.apache-beam-testing.internal:37687 -L 40991:beam-loadtests-python-gbk-flink-batch-191-w-4.c.apache-beam-testing.internal:40991 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf
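
The inline python -c above is what extracts jobmanager.rpc.port from the JSON returned by the Flink REST endpoint's /jobmanager/config. Written out, the same lookup is roughly the following sketch (config_json stands for the string captured into job_server_config above; the trimmed literal in the example call is illustrative):

    # Sketch of the jobmanager.rpc.port lookup done by the inline python above.
    # /jobmanager/config returns a list of {"key": ..., "value": ...} pairs.
    import json

    def jobmanager_rpc_port(config_json):
        entries = json.loads(config_json)
        return [e['value'] for e in entries if e['key'] == 'jobmanager.rpc.port'][0]

    # With the config captured above this yields '40991'.
    print(jobmanager_rpc_port('[{"key": "jobmanager.rpc.port", "value": "40991"}]'))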
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins446342973168931005.sh
+ echo src Load test: 2GB of 10B records src
src Load test: 2GB of 10B records src
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/gradlew> -PloadTest.mainClass=apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey -Prunner=PortableRunner '-PloadTest.args=--job_name=load_tests_Python_Flink_Batch_GBK_1_1222100308 --publish_to_big_query=true --project=apache-beam-testing --metrics_dataset=load_test --metrics_table=python_flink_batch_GBK_1 --input_options='{"num_records": 200000000,"key_size": 1,"value_size":9}' --iterations=1 --fanout=1 --parallelism=5 --job_endpoint=localhost:8099 --environment_config=gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest --environment_type=DOCKER --runner=PortableRunner' --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :sdks:python:apache_beam:testing:load_tests:run
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.
> Task :sdks:python:apache_beam:testing:load_tests:setupVirtualenv UP-TO-DATE

> Task :sdks:python:apache_beam:testing:load_tests:installGcpTest
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Obtaining file://<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python>
Requirement already satisfied: crcmod<2.0,>=1.7 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.7)
Requirement already satisfied: dill<0.3.2,>=0.3.1.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.3.1.1)
Requirement already satisfied: fastavro<0.22,>=0.21.4 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.21.24)
Requirement already satisfied: future<1.0.0,>=0.16.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.16.0)
Requirement already satisfied: grpcio<2,>=1.12.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.26.0)
Requirement already satisfied: hdfs<3.0.0,>=2.1.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (2.5.8)
Requirement already satisfied: httplib2<=0.12.0,>=0.8 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.12.0)
Requirement already satisfied: mock<3.0.0,>=1.0.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (2.0.0)
Requirement already satisfied: numpy<2,>=1.14.3 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.16.5)
Requirement already satisfied: pymongo<4.0.0,>=3.8.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (3.10.0)
Requirement already satisfied: oauth2client<4,>=2.0.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (3.0.0)
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (3.11.2)
Requirement already satisfied: pydot<2,>=1.2.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.4.1)
Requirement already satisfied: python-dateutil<3,>=2.8.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (2.8.1)
Requirement already satisfied: pytz>=2018.3 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (2019.3)
Requirement already satisfied: avro<2.0.0,>=1.8.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.9.1)
Requirement already satisfied: funcsigs<2,>=1.0.2 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.0.2)
Requirement already satisfied: futures<4.0.0,>=3.2.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (3.3.0)
Requirement already satisfied: pyvcf<0.7.0,>=0.6.8 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.6.8)
Requirement already satisfied: pyarrow<0.16.0,>=0.15.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.15.1)
Requirement already satisfied: typing<3.8.0,>=3.7.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (3.7.4.1)
Requirement already satisfied: typing-extensions<3.8.0,>=3.7.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (3.7.4.1)
Requirement already satisfied: cachetools<4,>=3.1.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (3.1.1)
Requirement already satisfied: google-apitools<0.5.29,>=0.5.28 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.5.28)
Requirement already satisfied: google-cloud-datastore<1.8.0,>=1.7.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.7.4)
Requirement already satisfied: google-cloud-pubsub<1.1.0,>=0.39.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.0.2)
Requirement already satisfied: google-cloud-bigquery<1.18.0,>=1.6.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.17.1)
Requirement already satisfied: google-cloud-core<2,>=0.28.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.1.0)
Requirement already satisfied: google-cloud-bigtable<1.1.0,>=0.31.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.0.0)
Requirement already satisfied: googledatastore<7.1,>=7.0.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (7.0.2)
Requirement already satisfied: proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.90.4)
Requirement already satisfied: nose>=1.3.7 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.3.7)
Requirement already satisfied: nose_xunitmp>=0.4.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.4.1)
Requirement already satisfied: pandas<0.25,>=0.23.4 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.24.2)
Requirement already satisfied: parameterized<0.8.0,>=0.6.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.7.1)
Requirement already satisfied: pyhamcrest<2.0,>=1.9 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.9.0)
Requirement already satisfied: pyyaml<6.0.0,>=3.12 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (5.2)
Requirement already satisfied: requests_mock<2.0,>=1.7 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.7.0)
Requirement already satisfied: tenacity<6.0,>=5.0.2 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (5.1.5)
Requirement already satisfied: pytest<5.0,>=4.4.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (4.6.8)
Requirement already satisfied: pytest-xdist<2,>=1.29.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.31.0)
Requirement already satisfied: six>=1.5.2 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from grpcio<2,>=1.12.1->apache-beam==2.19.0.dev0) (1.13.0)
Requirement already satisfied: enum34>=1.0.4; python_version < "3.4" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from grpcio<2,>=1.12.1->apache-beam==2.19.0.dev0) (1.1.6)
Requirement already satisfied: docopt in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from hdfs<3.0.0,>=2.1.0->apache-beam==2.19.0.dev0) (0.6.2)
Requirement already satisfied: requests>=2.7.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from hdfs<3.0.0,>=2.1.0->apache-beam==2.19.0.dev0) (2.22.0)
Requirement already satisfied: pbr>=0.11 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from mock<3.0.0,>=1.0.1->apache-beam==2.19.0.dev0) (5.4.4)
Requirement already satisfied: pyasn1>=0.1.7 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from oauth2client<4,>=2.0.1->apache-beam==2.19.0.dev0) (0.4.8)
Requirement already satisfied: pyasn1-modules>=0.0.5 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from oauth2client<4,>=2.0.1->apache-beam==2.19.0.dev0) (0.2.7)
Requirement already satisfied: rsa>=3.1.4 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from oauth2client<4,>=2.0.1->apache-beam==2.19.0.dev0) (4.0)
Requirement already satisfied: setuptools in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from protobuf<4,>=3.5.0.post1->apache-beam==2.19.0.dev0) (42.0.2)
Requirement already satisfied: pyparsing>=2.1.4 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pydot<2,>=1.2.0->apache-beam==2.19.0.dev0) (2.4.5)
Requirement already satisfied: fasteners>=0.14 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from google-apitools<0.5.29,>=0.5.28->apache-beam==2.19.0.dev0) (0.15)
Requirement already satisfied: google-api-core[grpc]<2.0.0dev,>=1.6.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from google-cloud-datastore<1.8.0,>=1.7.1->apache-beam==2.19.0.dev0) (1.15.0)
Requirement already satisfied: grpc-google-iam-v1<0.13dev,>=0.12.3 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from google-cloud-pubsub<1.1.0,>=0.39.0->apache-beam==2.19.0.dev0) (0.12.3)
Requirement already satisfied: google-resumable-media<0.5.0dev,>=0.3.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from google-cloud-bigquery<1.18.0,>=1.6.0->apache-beam==2.19.0.dev0) (0.4.1)
Requirement already satisfied: googleapis-common-protos<2.0dev,>=1.5.2 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0->apache-beam==2.19.0.dev0) (1.6.0)
Requirement already satisfied: monotonic>=0.6; python_version == "2.7" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from tenacity<6.0,>=5.0.2->apache-beam==2.19.0.dev0) (1.5)
Requirement already satisfied: atomicwrites>=1.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (1.3.0)
Requirement already satisfied: packaging in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (19.2)
Requirement already satisfied: wcwidth in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (0.1.7)
Requirement already satisfied: importlib-metadata>=0.12; python_version < "3.8" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (1.3.0)
Requirement already satisfied: py>=1.5.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (1.8.0)
Requirement already satisfied: pathlib2>=2.2.0; python_version < "3.6" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (2.3.5)
Requirement already satisfied: pluggy<1.0,>=0.12 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (0.13.1)
Requirement already satisfied: attrs>=17.4.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (19.3.0)
Requirement already satisfied: more-itertools<6.0.0,>=4.0.0; python_version <= "2.7" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (5.0.0)
Requirement already satisfied: pytest-forked in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest-xdist<2,>=1.29.0->apache-beam==2.19.0.dev0) (1.1.3)
Requirement already satisfied: execnet>=1.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest-xdist<2,>=1.29.0->apache-beam==2.19.0.dev0) (1.7.1)
Requirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.19.0.dev0) (1.25.7)
Requirement already satisfied: certifi>=2017.4.17 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.19.0.dev0) (2019.11.28)
Requirement already satisfied: chardet<3.1.0,>=3.0.2 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.19.0.dev0) (3.0.4)
Requirement already satisfied: idna<2.9,>=2.5 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.19.0.dev0) (2.8)
Requirement already satisfied: google-auth<2.0dev,>=0.4.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from google-api-core[grpc]<2.0.0dev,>=1.6.0->google-cloud-datastore<1.8.0,>=1.7.1->apache-beam==2.19.0.dev0) (1.10.0)
Requirement already satisfied: contextlib2; python_version < "3" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (0.6.0.post1)
Requirement already satisfied: zipp>=0.5 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (0.6.0)
Requirement already satisfied: configparser>=3.5; python_version < "3" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (4.0.2)
Requirement already satisfied: scandir; python_version < "3.5" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pathlib2>=2.2.0; python_version < "3.6"->pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (1.10.0)
Requirement already satisfied: apipkg>=1.4 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from execnet>=1.1->pytest-xdist<2,>=1.29.0->apache-beam==2.19.0.dev0) (1.5)
Installing collected packages: apache-beam
  Found existing installation: apache-beam 2.19.0.dev0
    Not uninstalling apache-beam at <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python,> outside environment <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227>
    Can't uninstall 'apache-beam'. No files were found to uninstall.
  Running setup.py develop for apache-beam
Successfully installed apache-beam

> Task :sdks:python:apache_beam:testing:load_tests:run
setup.py:236: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/local/lib/python2.7/site-packages/setuptools/dist.py>:476: UserWarning: Normalizing '2.19.0.dev' to '2.19.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
testGroupByKey (apache_beam.testing.load_tests.group_by_key_test.GroupByKeyTest) ... Terminated

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 53

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 22m 55s
3 actionable tasks: 2 executed, 1 up-to-date

Publishing build scan...
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=945cd940-7fb3-4432-8d57-5ef70d38d318, currentDir=<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src}>
Attempting to read last messages from the daemon log...
Daemon pid: 15860
  log file: /home/jenkins/.gradle/daemon/5.2.1/daemon-15860.out.log
----- Last  20 lines from daemon log file - daemon-15860.out.log -----
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 53

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 22m 55s
3 actionable tasks: 2 executed, 1 up-to-date

Publishing build scan...
Daemon vm is shutting down... The daemon has exited normally or was terminated in response to a user interrupt.
----- End of the daemon log -----


FAILURE: Build failed with an exception.

* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may have crashed)

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #190

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/190/display/redirect?page=changes>

Changes:

[kamil.wasilewski] [BEAM-8671] Add Python 3.7 support for LoadTestBuilder

[kamil.wasilewski] [BEAM-8671] Add ParDo test running on Python 3.7

[tysonjh] CachingShuffleBatchReader use bytes to limit size.

[valentyn] Sickbay VR tests that don't pass

[chamikara] Setting environment ID for ParDo and Combine transforms

[mxm] [BEAM-8996] Improvements to the Flink runner page

[ehudm] Upgrade parameterized version to 0.7.0+

[lukecwik] [BEAM-9004] Migrate org.mockito.Matchers#anyString to

[chamikara] Fixes Go formatting.

[robertwb] [BEAM-8335] Add a TestStreamService Python Implementation (#10120)

[lukecwik] Minor cleanup of tests using TestStream. (#10188)

[pabloem] [BEAM-2572] Python SDK S3 Filesystem (#9955)

[github] [BEAM-8974] Wait for log messages to be processed before checking them.


------------------------------------------
[...truncated 124.91 KB...]
Configuration on demand is an incubating feature.
> Task :sdks:python:apache_beam:testing:load_tests:setupVirtualenv UP-TO-DATE

> Task :sdks:python:apache_beam:testing:load_tests:installGcpTest
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Obtaining file://<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python>
Requirement already satisfied: crcmod<2.0,>=1.7 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.7)
Requirement already satisfied: dill<0.3.2,>=0.3.1.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.3.1.1)
Requirement already satisfied: fastavro<0.22,>=0.21.4 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.21.24)
Requirement already satisfied: future<1.0.0,>=0.16.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.16.0)
Requirement already satisfied: grpcio<2,>=1.12.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.26.0)
Requirement already satisfied: hdfs<3.0.0,>=2.1.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (2.5.8)
Requirement already satisfied: httplib2<=0.12.0,>=0.8 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.12.0)
Requirement already satisfied: mock<3.0.0,>=1.0.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (2.0.0)
Requirement already satisfied: numpy<2,>=1.14.3 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.16.5)
Requirement already satisfied: pymongo<4.0.0,>=3.8.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (3.10.0)
Requirement already satisfied: oauth2client<4,>=2.0.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (3.0.0)
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (3.11.2)
Requirement already satisfied: pydot<2,>=1.2.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.4.1)
Requirement already satisfied: python-dateutil<3,>=2.8.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (2.8.1)
Requirement already satisfied: pytz>=2018.3 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (2019.3)
Requirement already satisfied: avro<2.0.0,>=1.8.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.9.1)
Requirement already satisfied: funcsigs<2,>=1.0.2 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.0.2)
Requirement already satisfied: futures<4.0.0,>=3.2.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (3.3.0)
Requirement already satisfied: pyvcf<0.7.0,>=0.6.8 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.6.8)
Requirement already satisfied: pyarrow<0.16.0,>=0.15.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.15.1)
Requirement already satisfied: typing<3.8.0,>=3.7.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (3.7.4.1)
Requirement already satisfied: typing-extensions<3.8.0,>=3.7.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (3.7.4.1)
Requirement already satisfied: cachetools<4,>=3.1.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (3.1.1)
Requirement already satisfied: google-apitools<0.5.29,>=0.5.28 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.5.28)
Requirement already satisfied: google-cloud-datastore<1.8.0,>=1.7.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.7.4)
Requirement already satisfied: google-cloud-pubsub<1.1.0,>=0.39.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.0.2)
Requirement already satisfied: google-cloud-bigquery<1.18.0,>=1.6.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.17.1)
Requirement already satisfied: google-cloud-core<2,>=0.28.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.1.0)
Requirement already satisfied: google-cloud-bigtable<1.1.0,>=0.31.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.0.0)
Requirement already satisfied: googledatastore<7.1,>=7.0.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (7.0.2)
Requirement already satisfied: proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.90.4)
Requirement already satisfied: nose>=1.3.7 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.3.7)
Requirement already satisfied: nose_xunitmp>=0.4.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.4.1)
Requirement already satisfied: pandas<0.25,>=0.23.4 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.24.2)
Requirement already satisfied: parameterized<0.8.0,>=0.6.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.7.1)
Requirement already satisfied: pyhamcrest<2.0,>=1.9 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.9.0)
Requirement already satisfied: pyyaml<6.0.0,>=3.12 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (5.2)
Requirement already satisfied: requests_mock<2.0,>=1.7 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.7.0)
Requirement already satisfied: tenacity<6.0,>=5.0.2 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (5.1.5)
Requirement already satisfied: pytest<5.0,>=4.4.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (4.6.8)
Requirement already satisfied: pytest-xdist<2,>=1.29.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.31.0)
Requirement already satisfied: six>=1.5.2 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from grpcio<2,>=1.12.1->apache-beam==2.19.0.dev0) (1.13.0)
Requirement already satisfied: enum34>=1.0.4; python_version < "3.4" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from grpcio<2,>=1.12.1->apache-beam==2.19.0.dev0) (1.1.6)
Requirement already satisfied: docopt in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from hdfs<3.0.0,>=2.1.0->apache-beam==2.19.0.dev0) (0.6.2)
Requirement already satisfied: requests>=2.7.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from hdfs<3.0.0,>=2.1.0->apache-beam==2.19.0.dev0) (2.22.0)
Requirement already satisfied: pbr>=0.11 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from mock<3.0.0,>=1.0.1->apache-beam==2.19.0.dev0) (5.4.4)
Requirement already satisfied: pyasn1>=0.1.7 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from oauth2client<4,>=2.0.1->apache-beam==2.19.0.dev0) (0.4.8)
Requirement already satisfied: pyasn1-modules>=0.0.5 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from oauth2client<4,>=2.0.1->apache-beam==2.19.0.dev0) (0.2.7)
Requirement already satisfied: rsa>=3.1.4 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from oauth2client<4,>=2.0.1->apache-beam==2.19.0.dev0) (4.0)
Requirement already satisfied: setuptools in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from protobuf<4,>=3.5.0.post1->apache-beam==2.19.0.dev0) (42.0.2)
Requirement already satisfied: pyparsing>=2.1.4 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pydot<2,>=1.2.0->apache-beam==2.19.0.dev0) (2.4.5)
Requirement already satisfied: fasteners>=0.14 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from google-apitools<0.5.29,>=0.5.28->apache-beam==2.19.0.dev0) (0.15)
Requirement already satisfied: google-api-core[grpc]<2.0.0dev,>=1.6.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from google-cloud-datastore<1.8.0,>=1.7.1->apache-beam==2.19.0.dev0) (1.15.0)
Requirement already satisfied: grpc-google-iam-v1<0.13dev,>=0.12.3 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from google-cloud-pubsub<1.1.0,>=0.39.0->apache-beam==2.19.0.dev0) (0.12.3)
Requirement already satisfied: google-resumable-media<0.5.0dev,>=0.3.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from google-cloud-bigquery<1.18.0,>=1.6.0->apache-beam==2.19.0.dev0) (0.4.1)
Requirement already satisfied: googleapis-common-protos<2.0dev,>=1.5.2 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0->apache-beam==2.19.0.dev0) (1.6.0)
Requirement already satisfied: monotonic>=0.6; python_version == "2.7" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from tenacity<6.0,>=5.0.2->apache-beam==2.19.0.dev0) (1.5)
Requirement already satisfied: atomicwrites>=1.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (1.3.0)
Requirement already satisfied: packaging in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (19.2)
Requirement already satisfied: wcwidth in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (0.1.7)
Requirement already satisfied: importlib-metadata>=0.12; python_version < "3.8" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (1.3.0)
Requirement already satisfied: py>=1.5.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (1.8.0)
Requirement already satisfied: pathlib2>=2.2.0; python_version < "3.6" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (2.3.5)
Requirement already satisfied: pluggy<1.0,>=0.12 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (0.13.1)
Requirement already satisfied: attrs>=17.4.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (19.3.0)
Requirement already satisfied: more-itertools<6.0.0,>=4.0.0; python_version <= "2.7" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (5.0.0)
Requirement already satisfied: pytest-forked in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest-xdist<2,>=1.29.0->apache-beam==2.19.0.dev0) (1.1.3)
Requirement already satisfied: execnet>=1.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest-xdist<2,>=1.29.0->apache-beam==2.19.0.dev0) (1.7.1)
Requirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.19.0.dev0) (1.25.7)
Requirement already satisfied: certifi>=2017.4.17 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.19.0.dev0) (2019.11.28)
Requirement already satisfied: chardet<3.1.0,>=3.0.2 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.19.0.dev0) (3.0.4)
Requirement already satisfied: idna<2.9,>=2.5 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.19.0.dev0) (2.8)
Requirement already satisfied: google-auth<2.0dev,>=0.4.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from google-api-core[grpc]<2.0.0dev,>=1.6.0->google-cloud-datastore<1.8.0,>=1.7.1->apache-beam==2.19.0.dev0) (1.10.0)
Requirement already satisfied: contextlib2; python_version < "3" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (0.6.0.post1)
Requirement already satisfied: zipp>=0.5 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (0.6.0)
Requirement already satisfied: configparser>=3.5; python_version < "3" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (4.0.2)
Requirement already satisfied: scandir; python_version < "3.5" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pathlib2>=2.2.0; python_version < "3.6"->pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (1.10.0)
Requirement already satisfied: apipkg>=1.4 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from execnet>=1.1->pytest-xdist<2,>=1.29.0->apache-beam==2.19.0.dev0) (1.5)
Installing collected packages: apache-beam
  Found existing installation: apache-beam 2.19.0.dev0
    Not uninstalling apache-beam at <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python>, outside environment <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227>
    Can't uninstall 'apache-beam'. No files were found to uninstall.
  Running setup.py develop for apache-beam
Successfully installed apache-beam

> Task :sdks:python:apache_beam:testing:load_tests:run
setup.py:236: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/local/lib/python2.7/site-packages/setuptools/dist.py>:476: UserWarning: Normalizing '2.19.0.dev' to '2.19.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
testGroupByKey (apache_beam.testing.load_tests.group_by_key_test.GroupByKeyTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 208.346s

OK

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD SUCCESSFUL in 3m 35s
3 actionable tasks: 2 executed, 1 up-to-date

Publishing build scan...
https://gradle.com/s/lwhaktgujzxh4
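
The single test above, testGroupByKey, drives a GroupByKey through the portable Flink runner against synthetic input. For orientation only, the transform in its simplest form looks like the sketch below; this is not the load test itself, just a minimal GroupByKey that runs on the local DirectRunner once the apache_beam package is installed:

    import apache_beam as beam

    def show(kv):
        # After GroupByKey, each element is (key, iterable-of-values).
        print(kv[0], list(kv[1]))
        return kv

    # Minimal pipeline of the same shape the load test exercises.
    with beam.Pipeline() as p:
        (p
         | beam.Create([('a', 1), ('b', 2), ('a', 3)])
         | beam.GroupByKey()
         | beam.Map(show))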

[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins6757572478933334788.sh
+ echo Changing number of workers to 5
Changing number of workers to 5
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content 
FLINK_NUM_WORKERS=5

[EnvInject] - Variables injected successfully.
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins8393484126683978163.sh
+ cd <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/.test-infra/dataproc>
+ ./flink_cluster.sh restart
+ GCLOUD_ZONE=us-central1-a
+ DATAPROC_VERSION=1.2
+ MASTER_NAME=beam-loadtests-python-gbk-flink-batch-190-m
+ INIT_ACTIONS_FOLDER_NAME=init-actions
+ FLINK_INIT=gs://beam-flink-cluster/init-actions/flink.sh
+ BEAM_INIT=gs://beam-flink-cluster/init-actions/beam.sh
+ DOCKER_INIT=gs://beam-flink-cluster/init-actions/docker.sh
+ FLINK_LOCAL_PORT=8081
+ FLINK_TASKMANAGER_SLOTS=1
+ TASK_MANAGER_MEM=10240
+ YARN_APPLICATION_MASTER=
+ restart
+ delete
+ gcloud dataproc clusters delete beam-loadtests-python-gbk-flink-batch-190 --region=global --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/f392dfe0-1e09-3e30-bcd0-566f58948ffa].
Waiting for cluster deletion operation...
.......................................................done.
Deleted [https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/global/clusters/beam-loadtests-python-gbk-flink-batch-190].
+ create
+ upload_init_actions
+ echo 'Uploading initialization actions to GCS bucket: gs://beam-flink-cluster'
Uploading initialization actions to GCS bucket: gs://beam-flink-cluster
+ gsutil cp -r init-actions/beam.sh init-actions/docker.sh init-actions/flink.sh gs://beam-flink-cluster/init-actions
Copying file://init-actions/beam.sh [Content-Type=text/x-sh]...
Copying file://init-actions/docker.sh [Content-Type=text/x-sh]...
Copying file://init-actions/flink.sh [Content-Type=text/x-sh]...
Operation completed over 3 objects/13.4 KiB.
+ create_cluster
+ local metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1,
+ metadata+=hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar
+ [[ -n gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_workers=6
+ gcloud dataproc clusters create beam-loadtests-python-gbk-flink-batch-190 --region=global --num-workers=6 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/fdd17d4e-84be-30b4-b7f8-6db7a25398d6].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
.....................................................................................................................................................................................done.
Created [https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/global/clusters/beam-loadtests-python-gbk-flink-batch-190] Cluster placed in zone [us-central1-a].
+ get_leader
+ local i=0
+ local application_ids
+ local application_masters
+ echo 'Yarn Applications'
Yarn Applications
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-gbk-flink-batch-190-m '--command=yarn application -list'
++ grep beam-loadtests-python-gbk-flink-batch-190
Warning: Permanently added 'compute.937606799032559409' (ECDSA) to the list of known hosts.
19/12/21 13:13:57 INFO client.RMProxy: Connecting to ResourceManager at beam-loadtests-python-gbk-flink-batch-190-m/10.128.0.8:8032
+ read line
+ echo

++ echo
++ sed 's/ .*//'
+ application_ids[$i]=
++ echo
++ sed 's/.*beam-loadtests-python-gbk-flink-batch-190/beam-loadtests-python-gbk-flink-batch-190/'
++ sed 's/ .*//'
+ application_masters[$i]=
+ i=1
+ read line
+ '[' 1 '!=' 1 ']'
+ YARN_APPLICATION_MASTER=
+ echo 'Using Yarn Application master: '
Using Yarn Application master: 
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
+ start_job_server
+ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-gbk-flink-batch-190-m '--command=sudo --user yarn docker run --detach --publish 8099:8099 --publish 8098:8098 --publish 8097:8097 --volume ~/.config/gcloud:/root/.config/gcloud gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest --flink-master= --artifacts-dir=gs://beam-flink-cluster/beam-loadtests-python-gbk-flink-batch-190'
d038c5196f852557b18af6654509e5b0db3ddbb9f421fe52ea2d3a3574ad7f4c
+ start_tunnel
++ gcloud compute ssh --quiet --zone=us-central1-a yarn@beam-loadtests-python-gbk-flink-batch-190-m '--command=curl -s "http:///jobmanager/config"'
+ local job_server_config=
+ local key=jobmanager.rpc.port
++ echo
++ cut -d : -f1
+ local yarn_application_master_host=
++ echo
++ python -c 'import sys, json; print [ e['\''value'\''] for e in json.load(sys.stdin) if e['\''key'\''] == u'\''jobmanager.rpc.port'\''][0]'
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/usr/lib/python2.7/json/__init__.py", line 291, in load
    **kw)
  File "/usr/lib/python2.7/json/__init__.py", line 339, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python2.7/json/decoder.py", line 364, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python2.7/json/decoder.py", line 382, in raw_decode
    raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded
+ local jobmanager_rpc_port=
++ [[ true == \t\r\u\e ]]
++ echo ' -Nf >& /dev/null'
+ local 'detached_mode_params= -Nf >& /dev/null'
++ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
++ echo '-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'job_server_ports_forwarding=-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'tunnel_command=gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-gbk-flink-batch-190-m -- -L 8081: -L :: -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080  -Nf >& /dev/null'
+ eval gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-gbk-flink-batch-190-m -- -L 8081: -L :: -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf '>&' /dev/null
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-gbk-flink-batch-190-m -- -L 8081: -L :: -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf
Build step 'Execute shell' marked build as failure
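
The ValueError above is the direct consequence of the empty grep earlier in the trace: no YARN application matched, so YARN_APPLICATION_MASTER stayed empty, curl was pointed at "http:///jobmanager/config" (no host), and the python -c one-liner in start_tunnel was handed an empty string instead of JSON. Written out, the same extraction with a readable failure for that case could look like the following sketch; it expects Flink's /jobmanager/config response (a JSON list of {"key": ..., "value": ...} entries) on stdin:

    import json
    import sys

    # Expanded form of the one-liner above: pull jobmanager.rpc.port out of the
    # JobManager config JSON piped in on stdin.
    raw = sys.stdin.read()
    if not raw.strip():
        # Exactly the situation in this log: the application master address was
        # never resolved, so curl returned nothing.
        sys.exit("empty response from the JobManager config endpoint")

    entries = json.loads(raw)
    ports = [e["value"] for e in entries if e["key"] == "jobmanager.rpc.port"]
    if not ports:
        sys.exit("jobmanager.rpc.port not present in the config response")
    print(ports[0])

Piped the same way the script does it, e.g. curl -s http://<application-master>/jobmanager/config | python extract_rpc_port.py (the file name is hypothetical), this fails loudly instead of with a bare traceback.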

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #189

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/189/display/redirect?page=changes>

Changes:

[ehudm] [BEAM-8269] Convert from_callable type hints to Beam types

[ehudm] Fix _get_args for typing.Tuple in Py3.5.2

[ehudm] Fix cleanPython race with :clean

[pabloem] Initialize logging configuration in Pipeline object

[pabloem] Initialize logging configuration in PipelineOptions object.

[lukasz.gajowy] [BEAM-5495] Make PipelineResourcesDetectorAbstractFactory an inner

[lukasz.gajowy] [BEAM-5495] Change detect() return type to List

[lukasz.gajowy] [BEAM-5495] Minor docs and test fixes

[mxm] [BEAM-8959] Invert metrics flag in Flink Runner

[lukasz.gajowy] [BEAM-5495] Re-add test verifying order of resources detection

[heejong] [BEAM-8902] parameterize input type of Java external transform

[lukasz.gajowy] [BEAM-5495] Prevent nested jar scanning (jarfiles in jarfiles)

[ehudm] Dicts are not valid DoFn.process return values

[chamikara] Makes environment ID a top level attribute of PTransform.

[angoenka] [BEAM-8944] Change to use single thread in py sdk bundle progress report

[aaltay] [BEAM-8335] Background caching job (#10405)

[pawel.pasterz] [BEAM-8978] Publish table size of data written during HadoopFormatIOIT

[mxm] [BEAM-8996] Auto-generate pipeline options documentation for FlinkRunner

[mxm] Regenerate Flink options table with the latest master


------------------------------------------
[...truncated 45.10 KB...]
2e517d68c391: Waiting
c28383f3aac4: Waiting
cbdd44c5cbac: Waiting
22b7de8c8281: Waiting
7ae12a0444ca: Pushed
bceae855adb2: Pushed
0b868092a3c9: Pushed
4dcb766a9c4f: Pushed
5d5bcc311400: Pushed
c28383f3aac4: Layer already exists
22b7de8c8281: Layer already exists
82d91760b0c3: Pushed
6f029f7fa589: Layer already exists
cbdd44c5cbac: Layer already exists
2e517d68c391: Layer already exists
5f3a5adb8e97: Layer already exists
73bfa217d66f: Layer already exists
91ecdd7165d3: Layer already exists
e4b20fcc48f4: Layer already exists
b7a1aafaf7f3: Pushed
6ec28f69df87: Pushed
f0706aefc3ce: Pushed
latest: digest: sha256:c8ff4e10d30e6d1e121d61dc3b402185c989ea8046528046c95a978a4f6546d5 size: 4110
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/gradlew> --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Pdocker-repository-root=gcr.io/apache-beam-testing/beam_portability -Pdocker-tag=latest :runners:flink:1.9:job-server-container:docker
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :runners:flink:1.9:copyResourcesOverrides NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :runners:flink:1.9:job-server:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :runners:flink:1.9:job-server-container:dockerClean UP-TO-DATE
> Task :model:job-management:extractProto
> Task :sdks:java:extensions:protobuf:extractProto
> Task :model:fn-execution:extractProto
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :model:job-management:processResources
> Task :runners:flink:1.9:copySourceOverrides
> Task :runners:flink:1.9:copyTestResourcesOverrides NO-SOURCE
> Task :model:fn-execution:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :runners:flink:1.9:processResources
> Task :sdks:java:build-tools:compileJava FROM-CACHE
> Task :sdks:java:build-tools:processResources
> Task :sdks:java:build-tools:classes
> Task :sdks:java:core:processResources
> Task :sdks:java:build-tools:jar
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :model:pipeline:generateProto
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:extractIncludeProto
> Task :model:job-management:generateProto
> Task :model:fn-execution:generateProto
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :model:pipeline:shadowJar
> Task :model:job-management:shadowJar
> Task :model:fn-execution:shadowJar
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :sdks:java:io:kafka:jar
> Task :vendor:sdks-java-extensions-protobuf:shadowJar
> Task :runners:core-construction-java:jar
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar
> Task :sdks:java:harness:shadowJar
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :runners:flink:1.9:compileJava FROM-CACHE
> Task :runners:flink:1.9:classes
> Task :runners:flink:1.9:jar
> Task :runners:flink:1.9:job-server:compileJava NO-SOURCE
> Task :runners:flink:1.9:job-server:classes UP-TO-DATE
> Task :runners:flink:1.9:job-server:shadowJar
> Task :runners:flink:1.9:job-server-container:copyDockerfileDependencies
> Task :runners:flink:1.9:job-server-container:dockerPrepare
> Task :runners:flink:1.9:job-server-container:docker

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD SUCCESSFUL in 1m 2s
58 actionable tasks: 40 executed, 17 from cache, 1 up-to-date

Publishing build scan...
https://gradle.com/s/fugfkqbq47ox6

[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins5441446177706157517.sh
+ echo 'Tagging image...'
Tagging image...
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins7960056027280697758.sh
+ docker tag gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins3173555553397021840.sh
+ echo 'Pushing image...'
Pushing image...
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins3619550950399134954.sh
+ docker push gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest
The push refers to repository [gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server]
0b2afc5617b8: Preparing
205231f065eb: Preparing
adcdecaa4781: Preparing
2ee490fbc316: Preparing
b18043518924: Preparing
9a11244a7e74: Preparing
5f3a5adb8e97: Preparing
73bfa217d66f: Preparing
91ecdd7165d3: Preparing
e4b20fcc48f4: Preparing
9a11244a7e74: Waiting
5f3a5adb8e97: Waiting
73bfa217d66f: Waiting
91ecdd7165d3: Waiting
e4b20fcc48f4: Waiting
b18043518924: Layer already exists
2ee490fbc316: Layer already exists
9a11244a7e74: Layer already exists
5f3a5adb8e97: Layer already exists
91ecdd7165d3: Layer already exists
73bfa217d66f: Layer already exists
e4b20fcc48f4: Layer already exists
0b2afc5617b8: Pushed
adcdecaa4781: Pushed
205231f065eb: Pushed
latest: digest: sha256:e87b47dee5dd9da7f23dc130f6e3c80bb858b30503ddf60a1bb5f4e18d3193ad size: 2427
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content 
JOB_SERVER_IMAGE=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest
CLUSTER_NAME=beam-loadtests-python-gbk-flink-batch-189
DETACHED_MODE=true
HARNESS_IMAGES_TO_PULL=gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest
FLINK_NUM_WORKERS=16
FLINK_DOWNLOAD_URL=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz
GCS_BUCKET=gs://beam-flink-cluster
HADOOP_DOWNLOAD_URL=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar
FLINK_TASKMANAGER_SLOTS=1
ARTIFACTS_DIR=gs://beam-flink-cluster/beam-loadtests-python-gbk-flink-batch-189
GCLOUD_ZONE=us-central1-a

[EnvInject] - Variables injected successfully.
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins7654084739639269443.sh
+ echo Setting up flink cluster
Setting up flink cluster
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins4930209717452215268.sh
+ cd <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/.test-infra/dataproc>
+ ./flink_cluster.sh create
+ GCLOUD_ZONE=us-central1-a
+ DATAPROC_VERSION=1.2
+ MASTER_NAME=beam-loadtests-python-gbk-flink-batch-189-m
+ INIT_ACTIONS_FOLDER_NAME=init-actions
+ FLINK_INIT=gs://beam-flink-cluster/init-actions/flink.sh
+ BEAM_INIT=gs://beam-flink-cluster/init-actions/beam.sh
+ DOCKER_INIT=gs://beam-flink-cluster/init-actions/docker.sh
+ FLINK_LOCAL_PORT=8081
+ FLINK_TASKMANAGER_SLOTS=1
+ TASK_MANAGER_MEM=10240
+ YARN_APPLICATION_MASTER=
+ create
+ upload_init_actions
+ echo 'Uploading initialization actions to GCS bucket: gs://beam-flink-cluster'
Uploading initialization actions to GCS bucket: gs://beam-flink-cluster
+ gsutil cp -r init-actions/beam.sh init-actions/docker.sh init-actions/flink.sh gs://beam-flink-cluster/init-actions
Copying file://init-actions/beam.sh [Content-Type=text/x-sh]...
Copying file://init-actions/docker.sh [Content-Type=text/x-sh]...
Copying file://init-actions/flink.sh [Content-Type=text/x-sh]...
Operation completed over 3 objects/13.4 KiB.
+ create_cluster
+ local metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1,
+ metadata+=hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar
+ [[ -n gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_workers=17
+ gcloud dataproc clusters create beam-loadtests-python-gbk-flink-batch-189 --region=global --num-workers=17 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/ed60aa7f-0f4f-368e-a084-64bde7f7a8f8].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................done.
ERROR: (gcloud.dataproc.clusters.create) Operation [projects/apache-beam-testing/regions/global/operations/ed60aa7f-0f4f-368e-a084-64bde7f7a8f8] failed: Initialization action timed out. Failed action 'gs://beam-flink-cluster/init-actions/flink.sh', see output in: gs://dataproc-6c5fbcbb-a2de-406e-9cf7-8c1ce0b6a604-us/google-cloud-dataproc-metainfo/1a00f1b7-4228-4f62-92e0-a7f34daffbc0/beam-loadtests-python-gbk-flink-batch-189-m/dataproc-initialization-script-2_output.
Build step 'Execute shell' marked build as failure
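
This run never reached the tests: the flink.sh initialization action timed out during cluster creation, and gcloud points at the GCS object holding that action's output. A small sketch for fetching it, assuming gsutil is on PATH and the active credentials can read the apache-beam-testing Dataproc staging bucket:

    import subprocess

    # GCS URI reported by the failed "gcloud dataproc clusters create" above.
    OUTPUT_URI = (
        "gs://dataproc-6c5fbcbb-a2de-406e-9cf7-8c1ce0b6a604-us/"
        "google-cloud-dataproc-metainfo/1a00f1b7-4228-4f62-92e0-a7f34daffbc0/"
        "beam-loadtests-python-gbk-flink-batch-189-m/"
        "dataproc-initialization-script-2_output"
    )

    # Stream the init-action log so the cause of the timeout can be read directly.
    subprocess.check_call(["gsutil", "cat", OUTPUT_URI])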

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #188

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/188/display/redirect?page=changes>

Changes:

[kenn] Use more informative assertions in some py tests

[kcweaver] Add FlinkMiniClusterEntryPoint for testing the uber jar submission

[kcweaver] [BEAM-8512] Add integration tests for flink_runner.py.

[kcweaver] Build mini cluster jar excluding unwanted classes.

[kcweaver] Rename to testFlinkUberJarPyX.Y

[kcweaver] Increase timeout on beam_PostCommit_PortableJar_Flink.

[github] Update release guide for cherry picks (#10399)


------------------------------------------
[...truncated 266.55 KB...]
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:flink_version:v1"
    value {
      string_value: "1.9"
    }
  }
  fields {
    key: "beam:option:gcs_performance_metrics:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:job_endpoint:v1"
    value {
      string_value: "localhost:8099"
    }
  }
  fields {
    key: "beam:option:job_name:v1"
    value {
      string_value: "load_tests_Python_Flink_Batch_GBK_3_1219100250"
    }
  }
  fields {
    key: "beam:option:job_port:v1"
    value {
      string_value: "0"
    }
  }
  fields {
    key: "beam:option:job_server_timeout:v1"
    value {
      string_value: "60"
    }
  }
  fields {
    key: "beam:option:load_balance_bundles:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:no_auth:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:object_reuse:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:parallelism:v1"
    value {
      string_value: "5"
    }
  }
  fields {
    key: "beam:option:pipeline_type_check:v1"
    value {
      bool_value: true
    }
  }
  fields {
    key: "beam:option:profile_cpu:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_memory:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_sample_rate:v1"
    value {
      number_value: 1.0
    }
  }
  fields {
    key: "beam:option:project:v1"
    value {
      string_value: "apache-beam-testing"
    }
  }
  fields {
    key: "beam:option:retain_docker_containers:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:runtime_type_check:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:save_main_session:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:sdk_location:v1"
    value {
      string_value: "container"
    }
  }
  fields {
    key: "beam:option:sdk_worker_parallelism:v1"
    value {
      string_value: "1"
    }
  }
  fields {
    key: "beam:option:shutdown_sources_on_final_watermark:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:streaming:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:type_check_strictness:v1"
    value {
      string_value: "DEFAULT_TO_ANY"
    }
  }
  fields {
    key: "beam:option:update:v1"
    value {
      bool_value: false
    }
  }
}
job_name: "job"
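
The fields above are the pipeline options the failed job was submitted with: Flink 1.9 via the job endpoint at localhost:8099, parallelism 5, project apache-beam-testing. For orientation, a small subset reconstructed in the form the Python SDK accepts them; this is only an illustration, not the full flag list the load test passes:

    from apache_beam.options.pipeline_options import PipelineOptions

    # Illustrative subset of the options dumped above, in command-line form.
    # --runner is an assumption here, implied by the portable_runner log lines below.
    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:8099',
        '--flink_version=1.9',
        '--parallelism=5',
        '--project=apache-beam-testing',
    ])

    # get_all_options() shows the parsed values, matching the dump above.
    print(options.get_all_options().get('parallelism'))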

apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: 62aa37201b19eb747c4d28b5cba4fbeb)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:301)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:209)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:186)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:116)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:84)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:81)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
	... 16 more
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_e01_1576761862873_0001_01_000003 timed out.
	at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1149)
	at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:109)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:397)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:190)
	at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:74)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:152)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
	at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
	at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:170)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at akka.actor.Actor$class.aroundReceive(Actor.scala:517)
	at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
	at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
	at akka.actor.ActorCell.invoke(ActorCell.scala:561)
	at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
	at akka.dispatch.Mailbox.run(Mailbox.scala:225)
	at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
	at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

root: ERROR: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_e01_1576761862873_0001_01_000003 timed out.
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 90.051s

FAILED (errors=1)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 55

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 34s
3 actionable tasks: 2 executed, 1 up-to-date

Publishing build scan...
https://gradle.com/s/64oqmymxpicau

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #187

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/187/display/redirect?page=changes>

Changes:

[dcavazos] [BEAM-7390] Add code snippet for Max

[kamil.wasilewski] [BEAM-1440] Provide functions for waiting for BQ job and exporting

[kamil.wasilewski] [BEAM-1440] Create _BigQuerySource that implements iobase.BoundedSource

[kamil.wasilewski] [BEAM-1440] Reorganised BigQuery read IT tests

[kamil.wasilewski] [BEAM-1440] Create postCommitIT jobs running on Flink Runner

[kamil.wasilewski] [BEAM-1440] Convert strings to bytes on Python 3 if field type is BYTES

[kamil.wasilewski] [BEAM-1440]: Support RECORD fields in coder

[kamil.wasilewski] [BEAM-1440] Remove json files after reading

[kamil.wasilewski] [BEAM-1440] Marked classes as private

[kamil.wasilewski] [BEAM-1440] Do not force to create temp dataset when using dry run

[boyuanz] [BEAM-8536] Migrate using requested_execution_time to

[daniel.o.programmer] [BEAM-7970] Touch-up on Go protobuf generation instructions.

[kamil.wasilewski] [BEAM-8979] Remove mypy-protobuf dependency

[relax] Merge pull request #10311: [BEAM-8810] Detect stuck commits in

[github] [GoSDK] Make data channel splits idempotent (#10406)


------------------------------------------
[...truncated 266.26 KB...]
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:flink_version:v1"
    value {
      string_value: "1.9"
    }
  }
  fields {
    key: "beam:option:gcs_performance_metrics:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:job_endpoint:v1"
    value {
      string_value: "localhost:8099"
    }
  }
  fields {
    key: "beam:option:job_name:v1"
    value {
      string_value: "load_tests_Python_Flink_Batch_GBK_3_1218100242"
    }
  }
  fields {
    key: "beam:option:job_port:v1"
    value {
      string_value: "0"
    }
  }
  fields {
    key: "beam:option:job_server_timeout:v1"
    value {
      string_value: "60"
    }
  }
  fields {
    key: "beam:option:load_balance_bundles:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:no_auth:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:object_reuse:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:parallelism:v1"
    value {
      string_value: "5"
    }
  }
  fields {
    key: "beam:option:pipeline_type_check:v1"
    value {
      bool_value: true
    }
  }
  fields {
    key: "beam:option:profile_cpu:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_memory:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_sample_rate:v1"
    value {
      number_value: 1.0
    }
  }
  fields {
    key: "beam:option:project:v1"
    value {
      string_value: "apache-beam-testing"
    }
  }
  fields {
    key: "beam:option:retain_docker_containers:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:runtime_type_check:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:save_main_session:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:sdk_location:v1"
    value {
      string_value: "container"
    }
  }
  fields {
    key: "beam:option:sdk_worker_parallelism:v1"
    value {
      string_value: "1"
    }
  }
  fields {
    key: "beam:option:shutdown_sources_on_final_watermark:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:streaming:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:type_check_strictness:v1"
    value {
      string_value: "DEFAULT_TO_ANY"
    }
  }
  fields {
    key: "beam:option:update:v1"
    value {
      bool_value: false
    }
  }
}
job_name: "job"

apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: 18d319bc29251ce24697ab3770897644)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:301)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:209)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:186)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:116)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:84)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:81)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
	... 16 more
Caused by: java.util.concurrent.ExecutionException: org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: CANCELLED: cancelled before receiving half close
	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$ActiveBundle.close(SdkHarnessClient.java:310)
	at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.$closeResource(FlinkExecutableStageFunction.java:204)
	at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.mapPartition(FlinkExecutableStageFunction.java:204)
	at org.apache.flink.runtime.operators.MapPartitionDriver.run(MapPartitionDriver.java:103)
	at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:504)
	at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:369)
	at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
	... 1 more
Caused by: org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: CANCELLED: cancelled before receiving half close
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.Status.asRuntimeException(Status.java:524)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onCancel(ServerCalls.java:273)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.PartialForwardingServerCallListener.onCancel(PartialForwardingServerCallListener.java:40)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.ForwardingServerCallListener.onCancel(ForwardingServerCallListener.java:23)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.ForwardingServerCallListener$SimpleForwardingServerCallListener.onCancel(ForwardingServerCallListener.java:40)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.Contexts$ContextualizedServerCallListener.onCancel(Contexts.java:96)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.closed(ServerCallImpl.java:337)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1Closed.runInContext(ServerImpl.java:793)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	... 1 more

root: ERROR: org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: CANCELLED: cancelled before receiving half close
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 58.015s

FAILED (errors=1)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 55

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 2s
3 actionable tasks: 2 executed, 1 up-to-date

Publishing build scan...
https://gradle.com/s/jmyjhkrhwuzro

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #186

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/186/display/redirect?page=changes>

Changes:

[suztomo] [BEAM-8865] Updating Javadoc of FileIO example

[ningk] [BEAM-8837] Fix pcoll_visualization tests

[heejong] [BEAM-8904] properly update output pcollections from expanded transforms

[ningk] Use explicitly named dictionary when watching local scope variables in

[lukasz.gajowy] [BEAM-5495] Add possibility of scanning the classpath without

[lukasz.gajowy] [BEAM-5495] Enable injecting custom resources detection algorithms.

[lukasz.gajowy] [BEAM-5495] Move PipelineResources to

[mxm] [BEAM-8962] Add option to disable the metric container accumulator

[ningk] Avoid patching test pcoll data and use real logic to produce test data.

[lukasz.gajowy] [BEAM-5495] Remove unneeded and deprecated code

[lukasz.gajowy] [BEAM-5495] Use ClassGraph to detect classpath resources

[github] [BEAM-8446] Retrying BQ query on timeouts (#9855)

[thw] [BEAM-8816] Option to load balance bundle processing w/ multiple SDK

[pabloem] Removing LINT.If/LINT.Then statement

[bhulette] [BEAM-8801] PubsubMessageToRow should not check useFlatSchema() in

[robertwb] [BEAM-8575] To test state backed iterable coder in py sdk. (#10143)

[robertwb] [BEAM-8575] Added two unit tests in CombineTest class to test

[daniel.o.programmer] [BEAM-7970] Rebuild Go protos with new protobuf version

[aaltay] [BEAM-7390] Add code snippet for Latest (#10166)

[chamikara] Merge pull request #10383: [BEAM-8575] Added a unit test to test that


------------------------------------------
[...truncated 267.68 KB...]
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:flink_version:v1"
    value {
      string_value: "1.9"
    }
  }
  fields {
    key: "beam:option:gcs_performance_metrics:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:job_endpoint:v1"
    value {
      string_value: "localhost:8099"
    }
  }
  fields {
    key: "beam:option:job_name:v1"
    value {
      string_value: "load_tests_Python_Flink_Batch_GBK_3_1217102102"
    }
  }
  fields {
    key: "beam:option:job_port:v1"
    value {
      string_value: "0"
    }
  }
  fields {
    key: "beam:option:job_server_timeout:v1"
    value {
      string_value: "60"
    }
  }
  fields {
    key: "beam:option:load_balance_bundles:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:no_auth:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:object_reuse:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:parallelism:v1"
    value {
      string_value: "5"
    }
  }
  fields {
    key: "beam:option:pipeline_type_check:v1"
    value {
      bool_value: true
    }
  }
  fields {
    key: "beam:option:profile_cpu:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_memory:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_sample_rate:v1"
    value {
      number_value: 1.0
    }
  }
  fields {
    key: "beam:option:project:v1"
    value {
      string_value: "apache-beam-testing"
    }
  }
  fields {
    key: "beam:option:retain_docker_containers:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:runtime_type_check:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:save_main_session:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:sdk_location:v1"
    value {
      string_value: "container"
    }
  }
  fields {
    key: "beam:option:sdk_worker_parallelism:v1"
    value {
      string_value: "1"
    }
  }
  fields {
    key: "beam:option:shutdown_sources_on_final_watermark:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:streaming:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:type_check_strictness:v1"
    value {
      string_value: "DEFAULT_TO_ANY"
    }
  }
  fields {
    key: "beam:option:update:v1"
    value {
      bool_value: false
    }
  }
}
job_name: "job"

apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: 3370a9da8454c0356995fcae7040da2f)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:301)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:209)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:186)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:116)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:84)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:81)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
	... 16 more
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_e01_1576588819681_0001_01_000005 timed out.
	at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1149)
	at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:109)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:397)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:190)
	at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:74)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:152)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
	at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
	at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:170)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at akka.actor.Actor$class.aroundReceive(Actor.scala:517)
	at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
	at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
	at akka.actor.ActorCell.invoke(ActorCell.scala:561)
	at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
	at akka.dispatch.Mailbox.run(Mailbox.scala:225)
	at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
	at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

root: ERROR: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_e01_1576588819681_0001_01_000005 timed out.
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 64.130s

FAILED (errors=1)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 55

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 8s
3 actionable tasks: 2 executed, 1 up-to-date

Publishing build scan...
https://gradle.com/s/qznj34nawt632

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #185

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/185/display/redirect?page=changes>

Changes:

[suztomo] Declare JSR305 dependency as 'shadow'


------------------------------------------
[...truncated 267.54 KB...]
    value {
      string_value: "[auto]"
    }
  }
  fields {
    key: "beam:option:flink_submit_uber_jar:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:flink_version:v1"
    value {
      string_value: "1.9"
    }
  }
  fields {
    key: "beam:option:gcs_performance_metrics:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:job_endpoint:v1"
    value {
      string_value: "localhost:8099"
    }
  }
  fields {
    key: "beam:option:job_name:v1"
    value {
      string_value: "load_tests_Python_Flink_Batch_GBK_3_1216124436"
    }
  }
  fields {
    key: "beam:option:job_port:v1"
    value {
      string_value: "0"
    }
  }
  fields {
    key: "beam:option:job_server_timeout:v1"
    value {
      string_value: "60"
    }
  }
  fields {
    key: "beam:option:no_auth:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:object_reuse:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:parallelism:v1"
    value {
      string_value: "5"
    }
  }
  fields {
    key: "beam:option:pipeline_type_check:v1"
    value {
      bool_value: true
    }
  }
  fields {
    key: "beam:option:profile_cpu:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_memory:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_sample_rate:v1"
    value {
      number_value: 1.0
    }
  }
  fields {
    key: "beam:option:project:v1"
    value {
      string_value: "apache-beam-testing"
    }
  }
  fields {
    key: "beam:option:retain_docker_containers:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:runtime_type_check:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:save_main_session:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:sdk_location:v1"
    value {
      string_value: "container"
    }
  }
  fields {
    key: "beam:option:sdk_worker_parallelism:v1"
    value {
      string_value: "1"
    }
  }
  fields {
    key: "beam:option:shutdown_sources_on_final_watermark:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:streaming:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:type_check_strictness:v1"
    value {
      string_value: "DEFAULT_TO_ANY"
    }
  }
  fields {
    key: "beam:option:update:v1"
    value {
      bool_value: false
    }
  }
}
job_name: "job"

apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: ca4b5500a97c568eb11801e74cc2d949)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:301)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:209)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:186)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:116)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:84)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:81)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
	... 16 more
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_e01_1576501939057_0001_01_000003 timed out.
	at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1149)
	at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:109)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:397)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:190)
	at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:74)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:152)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
	at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
	at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:170)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at akka.actor.Actor$class.aroundReceive(Actor.scala:517)
	at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
	at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
	at akka.actor.ActorCell.invoke(ActorCell.scala:561)
	at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
	at akka.dispatch.Mailbox.run(Mailbox.scala:225)
	at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
	at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

root: ERROR: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_e01_1576501939057_0001_01_000003 timed out.
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 65.646s

FAILED (errors=1)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 55

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 10s
3 actionable tasks: 2 executed, 1 up-to-date

Publishing build scan...
https://gradle.com/s/skkn3mcfahfac

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #184

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/184/display/redirect>

Changes:


------------------------------------------
[...truncated 267.68 KB...]
      bool_value: false
    }
  }
  fields {
    key: "beam:option:flink_version:v1"
    value {
      string_value: "1.9"
    }
  }
  fields {
    key: "beam:option:gcs_performance_metrics:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:job_endpoint:v1"
    value {
      string_value: "localhost:8099"
    }
  }
  fields {
    key: "beam:option:job_name:v1"
    value {
      string_value: "load_tests_Python_Flink_Batch_GBK_3_1215100241"
    }
  }
  fields {
    key: "beam:option:job_port:v1"
    value {
      string_value: "0"
    }
  }
  fields {
    key: "beam:option:job_server_timeout:v1"
    value {
      string_value: "60"
    }
  }
  fields {
    key: "beam:option:no_auth:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:object_reuse:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:parallelism:v1"
    value {
      string_value: "5"
    }
  }
  fields {
    key: "beam:option:pipeline_type_check:v1"
    value {
      bool_value: true
    }
  }
  fields {
    key: "beam:option:profile_cpu:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_memory:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_sample_rate:v1"
    value {
      number_value: 1.0
    }
  }
  fields {
    key: "beam:option:project:v1"
    value {
      string_value: "apache-beam-testing"
    }
  }
  fields {
    key: "beam:option:retain_docker_containers:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:runtime_type_check:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:save_main_session:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:sdk_location:v1"
    value {
      string_value: "container"
    }
  }
  fields {
    key: "beam:option:sdk_worker_parallelism:v1"
    value {
      string_value: "1"
    }
  }
  fields {
    key: "beam:option:shutdown_sources_on_final_watermark:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:streaming:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:type_check_strictness:v1"
    value {
      string_value: "DEFAULT_TO_ANY"
    }
  }
  fields {
    key: "beam:option:update:v1"
    value {
      bool_value: false
    }
  }
}
job_name: "job"

apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: 7de515ea85bee051b36a3ebdf9008b8f)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:301)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:209)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:186)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:116)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:84)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:81)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
	... 16 more
Caused by: java.lang.Exception: The data preparation for task 'GroupReduce (GroupReduce at GroupByKey 0)' , caused an error: Error obtaining the sorted input: Thread 'SortMerger Reading Thread' terminated due to an exception: Connection unexpectedly closed by remote task manager 'beam-loadtests-python-gbk-flink-batch-184-w-3.c.apache-beam-testing.internal/10.128.0.110:33053'. This might indicate that the remote task manager was lost.
	at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:480)
	at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:369)
	at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
	... 1 more
Caused by: java.lang.RuntimeException: Error obtaining the sorted input: Thread 'SortMerger Reading Thread' terminated due to an exception: Connection unexpectedly closed by remote task manager 'beam-loadtests-python-gbk-flink-batch-184-w-3.c.apache-beam-testing.internal/10.128.0.110:33053'. This might indicate that the remote task manager was lost.
	at org.apache.flink.runtime.operators.sort.UnilateralSortMerger.getIterator(UnilateralSortMerger.java:650)
	at org.apache.flink.runtime.operators.BatchTask.getInput(BatchTask.java:1109)
	at org.apache.flink.runtime.operators.GroupReduceDriver.prepare(GroupReduceDriver.java:99)
	at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:474)
	... 4 more
Caused by: java.io.IOException: Thread 'SortMerger Reading Thread' terminated due to an exception: Connection unexpectedly closed by remote task manager 'beam-loadtests-python-gbk-flink-batch-184-w-3.c.apache-beam-testing.internal/10.128.0.110:33053'. This might indicate that the remote task manager was lost.
	at org.apache.flink.runtime.operators.sort.UnilateralSortMerger$ThreadBase.run(UnilateralSortMerger.java:831)
Caused by: org.apache.flink.runtime.io.network.netty.exception.RemoteTransportException: Connection unexpectedly closed by remote task manager 'beam-loadtests-python-gbk-flink-batch-184-w-3.c.apache-beam-testing.internal/10.128.0.110:33053'. This might indicate that the remote task manager was lost.
	at org.apache.flink.runtime.io.network.netty.CreditBasedPartitionRequestClientHandler.channelInactive(CreditBasedPartitionRequestClientHandler.java:136)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
	at org.apache.flink.shaded.netty4.io.netty.handler.codec.ByteToMessageDecoder.channelInputClosed(ByteToMessageDecoder.java:390)
	at org.apache.flink.shaded.netty4.io.netty.handler.codec.ByteToMessageDecoder.channelInactive(ByteToMessageDecoder.java:355)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
	at org.apache.flink.shaded.netty4.io.netty.channel.DefaultChannelPipeline$HeadContext.channelInactive(DefaultChannelPipeline.java:1429)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
	at org.apache.flink.shaded.netty4.io.netty.channel.DefaultChannelPipeline.fireChannelInactive(DefaultChannelPipeline.java:947)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannel$AbstractUnsafe$8.run(AbstractChannel.java:826)
	at org.apache.flink.shaded.netty4.io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
	at org.apache.flink.shaded.netty4.io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:404)
	at org.apache.flink.shaded.netty4.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:474)
	at org.apache.flink.shaded.netty4.io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:909)
	at java.lang.Thread.run(Thread.java:748)

root: ERROR: org.apache.flink.runtime.io.network.netty.exception.RemoteTransportException: Connection unexpectedly closed by remote task manager 'beam-loadtests-python-gbk-flink-batch-184-w-3.c.apache-beam-testing.internal/10.128.0.110:33053'. This might indicate that the remote task manager was lost.
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 23.567s

FAILED (errors=1)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 55

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 28s
3 actionable tasks: 2 executed, 1 up-to-date

Publishing build scan...
https://scans.gradle.com/s/hjncq7rtarsii

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #183

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/183/display/redirect?page=changes>

Changes:

[michal.walenia] [BEAM-8947] Change Flink docker image names in Jenkins jobs to match

[ehudm] [BEAM-3713] pytest migration: py3x-{gcp,cython}

[younghoono] [BEAM-7970] Improved error help in Go PROTOBUF.md

[robertwb] Fixed instructions and the example of running Interactive Beam on Flink.


------------------------------------------
[...truncated 151.64 KB...]
Operation completed over 3 objects/13.4 KiB.                                     
+ create_cluster
+ local metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1,
+ metadata+=hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar
+ [[ -n gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_workers=6
+ gcloud dataproc clusters create beam-loadtests-python-gbk-flink-batch-183 --region=global --num-workers=6 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/7542ab1f-d3b7-3834-8a61-bedad6f2548c].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
..........................................................................................................................................................................................done.
Created [https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/global/clusters/beam-loadtests-python-gbk-flink-batch-183] Cluster placed in zone [us-central1-a].
+ get_leader
+ local i=0
+ local application_ids
+ local application_masters
+ echo 'Yarn Applications'
Yarn Applications
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-gbk-flink-batch-183-m '--command=yarn application -list'
++ grep beam-loadtests-python-gbk-flink-batch-183
Warning: Permanently added 'compute.2795878091033502135' (ECDSA) to the list of known hosts.
19/12/14 13:13:56 INFO client.RMProxy: Connecting to ResourceManager at beam-loadtests-python-gbk-flink-batch-183-m/10.128.0.50:8032
+ read line
+ echo application_1576329159792_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-python-gbk-flink-batch-183-w-0.c.apache-beam-testing.internal:39911
application_1576329159792_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-python-gbk-flink-batch-183-w-0.c.apache-beam-testing.internal:39911
++ echo application_1576329159792_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-python-gbk-flink-batch-183-w-0.c.apache-beam-testing.internal:39911
++ sed 's/ .*//'
+ application_ids[$i]=application_1576329159792_0001
++ echo application_1576329159792_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-python-gbk-flink-batch-183-w-0.c.apache-beam-testing.internal:39911
++ sed 's/.*beam-loadtests-python-gbk-flink-batch-183/beam-loadtests-python-gbk-flink-batch-183/'
++ sed 's/ .*//'
+ application_masters[$i]=beam-loadtests-python-gbk-flink-batch-183-w-0.c.apache-beam-testing.internal:39911
+ i=1
+ read line
+ '[' 1 '!=' 1 ']'
+ YARN_APPLICATION_MASTER=beam-loadtests-python-gbk-flink-batch-183-w-0.c.apache-beam-testing.internal:39911
+ echo 'Using Yarn Application master: beam-loadtests-python-gbk-flink-batch-183-w-0.c.apache-beam-testing.internal:39911'
Using Yarn Application master: beam-loadtests-python-gbk-flink-batch-183-w-0.c.apache-beam-testing.internal:39911
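For readability: the get_leader step above recovers the YARN application id and the application-master host:port from the single `yarn application -list` line using sed. A rough Python equivalent of that extraction, assuming the one-line format shown in the log; parse_yarn_application is an illustrative name, not part of the Jenkins scripts.

import re

def parse_yarn_application(line):
    # Expects one line of `yarn application -list` output, e.g.
    # "application_1576329159792_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://host:port"
    fields = line.split()
    app_id = fields[0]
    master = re.sub(r'^https?://', '', fields[-1])  # keep only host:port
    return app_id, master

line = ('application_1576329159792_0001 flink-dataproc Apache Flink yarn default '
        'RUNNING UNDEFINED 100% '
        'http://beam-loadtests-python-gbk-flink-batch-183-w-0.c.apache-beam-testing.internal:39911')
print(parse_yarn_application(line))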
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
+ start_job_server
+ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-gbk-flink-batch-183-m '--command=sudo --user yarn docker run --detach --publish 8099:8099 --publish 8098:8098 --publish 8097:8097 --volume ~/.config/gcloud:/root/.config/gcloud gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest --flink-master=beam-loadtests-python-gbk-flink-batch-183-w-0.c.apache-beam-testing.internal:39911 --artifacts-dir=gs://beam-flink-cluster/beam-loadtests-python-gbk-flink-batch-183'
60e1a7f37690ce49d9ab999cf55a9a4a247d16f3ad2642dac2e382971eac3b83
+ start_tunnel
++ gcloud compute ssh --quiet --zone=us-central1-a yarn@beam-loadtests-python-gbk-flink-batch-183-m '--command=curl -s "http://beam-loadtests-python-gbk-flink-batch-183-w-0.c.apache-beam-testing.internal:39911/jobmanager/config"'
+ local 'job_server_config=[{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1576329159792_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-python-gbk-flink-batch-183-w-0.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"FLINK_PLUGINS_DIR","value":"/usr/lib/flink/plugins"},{"key":"web.tmpdir","value":"/tmp/flink-web-c1755b00-47cc-4eca-8e41-1d0433afca10"},{"key":"jobmanager.rpc.port","value":"39597"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1576329159792_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-python-gbk-flink-batch-183-w-0.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local key=jobmanager.rpc.port
++ echo beam-loadtests-python-gbk-flink-batch-183-w-0.c.apache-beam-testing.internal:39911
++ cut -d : -f1
+ local yarn_application_master_host=beam-loadtests-python-gbk-flink-batch-183-w-0.c.apache-beam-testing.internal
++ echo '[{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1576329159792_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-python-gbk-flink-batch-183-w-0.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"FLINK_PLUGINS_DIR","value":"/usr/lib/flink/plugins"},{"key":"web.tmpdir","value":"/tmp/flink-web-c1755b00-47cc-4eca-8e41-1d0433afca10"},{"key":"jobmanager.rpc.port","value":"39597"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1576329159792_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address++ python -c 'import sys, json; print [ e['\''value'\''] for e in json.load(sys.stdin) if e['\''key'\''] == u'\''jobmanager.rpc.port'\''][0]'
","value":"beam-loadtests-python-gbk-flink-batch-183-w-0.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local jobmanager_rpc_port=39597
++ [[ true == \t\r\u\e ]]
++ echo ' -Nf >& /dev/null'
+ local 'detached_mode_params= -Nf >& /dev/null'
++ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
++ echo '-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'job_server_ports_forwarding=-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'tunnel_command=gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-gbk-flink-batch-183-m -- -L 8081:beam-loadtests-python-gbk-flink-batch-183-w-0.c.apache-beam-testing.internal:39911 -L 39597:beam-loadtests-python-gbk-flink-batch-183-w-0.c.apache-beam-testing.internal:39597 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080  -Nf >& /dev/null'
+ eval gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-gbk-flink-batch-183-m -- -L 8081:beam-loadtests-python-gbk-flink-batch-183-w-0.c.apache-beam-testing.internal:39911 -L 39597:beam-loadtests-python-gbk-flink-batch-183-w-0.c.apache-beam-testing.internal:39597 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf '>&' /dev/null
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-python-gbk-flink-batch-183-m -- -L 8081:beam-loadtests-python-gbk-flink-batch-183-w-0.c.apache-beam-testing.internal:39911 -L 39597:beam-loadtests-python-gbk-flink-batch-183-w-0.c.apache-beam-testing.internal:39597 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf
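The `python -c` one-liner embedded in the start_tunnel trace above pulls `jobmanager.rpc.port` out of the JSON returned by the Flink job manager's /jobmanager/config endpoint. A standalone sketch of that lookup, assuming the same list-of-{"key","value"} JSON shape shown in the log; extract_config_value is an illustrative name.

import json

def extract_config_value(config_json, key):
    # /jobmanager/config returns a JSON list of {"key": ..., "value": ...} entries.
    entries = json.loads(config_json)
    return next(e['value'] for e in entries if e['key'] == key)

sample = ('[{"key":"jobmanager.rpc.port","value":"39597"},'
          '{"key":"rest.address","value":"beam-loadtests-python-gbk-flink-batch-183-w-0.c.apache-beam-testing.internal"}]')
print(extract_config_value(sample, 'jobmanager.rpc.port'))  # prints 39597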
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins8848763266173780123.sh
+ echo src Load test: 2GB of 10B records src
src Load test: 2GB of 10B records src
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/gradlew> --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -PloadTest.mainClass=apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey -Prunner=PortableRunner '-PloadTest.args=--job_name=load_tests_Python_Flink_Batch_GBK_1_1214100241 --publish_to_big_query=true --project=apache-beam-testing --metrics_dataset=load_test --metrics_table=python_flink_batch_GBK_1 --input_options='{"num_records": 200000000,"key_size": 1,"value_size":9}' --iterations=1 --fanout=1 --parallelism=5 --job_endpoint=localhost:8099 --environment_config=gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest --environment_type=DOCKER --runner=PortableRunner' :sdks:python:apache_beam:testing:load_tests:run
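The -PloadTest.args value above is what ultimately reaches the Python SDK as pipeline options; 200,000,000 records of a 1-byte key plus a 9-byte value works out to roughly 2 GB, matching the "2GB of 10B records" banner. A rough sketch of handing the same runner flags to the SDK directly is below. PipelineOptions is the standard Beam API, but the load-test-specific flags (--publish_to_big_query, --input_options, --metrics_*) are handled by Beam's load-test harness rather than core PipelineOptions, and the toy Create merely stands in for the synthetic source; running it assumes the tunnelled job server on localhost:8099 set up above.

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Runner flags copied from the -PloadTest.args value above (load-test-only flags omitted).
options = PipelineOptions([
    '--runner=PortableRunner',
    '--job_endpoint=localhost:8099',
    '--environment_type=DOCKER',
    '--environment_config=gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest',
    '--parallelism=5',
])

# Toy stand-in for the synthetic source: keyed 10-byte records fed into GroupByKey.
with beam.Pipeline(options=options) as p:
    (p
     | 'Create' >> beam.Create([(i % 5, b'x' * 9) for i in range(1000)])
     | 'GroupByKey' >> beam.GroupByKey()
     | 'Count' >> beam.combiners.Count.Globally())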
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.
> Task :sdks:python:apache_beam:testing:load_tests:setupVirtualenv UP-TO-DATE

> Task :sdks:python:apache_beam:testing:load_tests:installGcpTest
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Obtaining file://<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python>
Requirement already satisfied: crcmod<2.0,>=1.7 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.7)
Requirement already satisfied: dill<0.3.2,>=0.3.1.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.3.1.1)
Requirement already satisfied: fastavro<0.22,>=0.21.4 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.21.24)
Requirement already satisfied: future<1.0.0,>=0.16.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.16.0)
Requirement already satisfied: grpcio<2,>=1.12.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.25.0)
Requirement already satisfied: hdfs<3.0.0,>=2.1.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (2.5.8)
Requirement already satisfied: httplib2<=0.12.0,>=0.8 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.12.0)
Requirement already satisfied: mock<3.0.0,>=1.0.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (2.0.0)
Requirement already satisfied: numpy<2,>=1.14.3 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.16.5)
Requirement already satisfied: pymongo<4.0.0,>=3.8.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (3.10.0)
Requirement already satisfied: oauth2client<4,>=2.0.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (3.0.0)
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (3.11.1)
Requirement already satisfied: pydot<2,>=1.2.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.4.1)
Requirement already satisfied: python-dateutil<3,>=2.8.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (2.8.1)
Requirement already satisfied: pytz>=2018.3 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (2019.3)
Requirement already satisfied: avro<2.0.0,>=1.8.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.9.1)
Requirement already satisfied: funcsigs<2,>=1.0.2 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.0.2)
Requirement already satisfied: futures<4.0.0,>=3.2.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (3.3.0)
Requirement already satisfied: pyvcf<0.7.0,>=0.6.8 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.6.8)
Requirement already satisfied: pyarrow<0.16.0,>=0.15.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.15.1)
Requirement already satisfied: typing<3.8.0,>=3.7.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (3.7.4.1)
Requirement already satisfied: typing-extensions<3.8.0,>=3.7.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (3.7.4.1)
Requirement already satisfied: cachetools<4,>=3.1.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (3.1.1)
Requirement already satisfied: google-apitools<0.5.29,>=0.5.28 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.5.28)
Requirement already satisfied: google-cloud-datastore<1.8.0,>=1.7.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.7.4)
Requirement already satisfied: google-cloud-pubsub<1.1.0,>=0.39.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.0.2)
Requirement already satisfied: google-cloud-bigquery<1.18.0,>=1.6.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.17.1)
Requirement already satisfied: google-cloud-core<2,>=0.28.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.1.0)
Requirement already satisfied: google-cloud-bigtable<1.1.0,>=0.31.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.0.0)
Requirement already satisfied: googledatastore<7.1,>=7.0.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (7.0.2)
Requirement already satisfied: proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.90.4)
Requirement already satisfied: nose>=1.3.7 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.3.7)
Requirement already satisfied: nose_xunitmp>=0.4.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.4.1)
Requirement already satisfied: pandas<0.25,>=0.23.4 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.24.2)
Requirement already satisfied: parameterized<0.7.0,>=0.6.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (0.6.3)
Requirement already satisfied: pyhamcrest<2.0,>=1.9 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.9.0)
Requirement already satisfied: pyyaml<6.0.0,>=3.12 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (5.2)
Requirement already satisfied: requests_mock<2.0,>=1.7 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.7.0)
Requirement already satisfied: tenacity<6.0,>=5.0.2 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (5.1.5)
Requirement already satisfied: pytest<5.0,>=4.4.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (4.6.7)
Requirement already satisfied: pytest-xdist<2,>=1.29.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.19.0.dev0) (1.30.0)
Requirement already satisfied: six>=1.5.2 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from grpcio<2,>=1.12.1->apache-beam==2.19.0.dev0) (1.13.0)
Requirement already satisfied: enum34>=1.0.4; python_version < "3.4" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from grpcio<2,>=1.12.1->apache-beam==2.19.0.dev0) (1.1.6)
Requirement already satisfied: docopt in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from hdfs<3.0.0,>=2.1.0->apache-beam==2.19.0.dev0) (0.6.2)
Requirement already satisfied: requests>=2.7.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from hdfs<3.0.0,>=2.1.0->apache-beam==2.19.0.dev0) (2.22.0)
Requirement already satisfied: pbr>=0.11 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from mock<3.0.0,>=1.0.1->apache-beam==2.19.0.dev0) (5.4.4)
Requirement already satisfied: pyasn1>=0.1.7 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from oauth2client<4,>=2.0.1->apache-beam==2.19.0.dev0) (0.4.8)
Requirement already satisfied: pyasn1-modules>=0.0.5 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from oauth2client<4,>=2.0.1->apache-beam==2.19.0.dev0) (0.2.7)
Requirement already satisfied: rsa>=3.1.4 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from oauth2client<4,>=2.0.1->apache-beam==2.19.0.dev0) (4.0)
Requirement already satisfied: setuptools in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from protobuf<4,>=3.5.0.post1->apache-beam==2.19.0.dev0) (42.0.2)
Requirement already satisfied: pyparsing>=2.1.4 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pydot<2,>=1.2.0->apache-beam==2.19.0.dev0) (2.4.5)
Requirement already satisfied: fasteners>=0.14 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from google-apitools<0.5.29,>=0.5.28->apache-beam==2.19.0.dev0) (0.15)
Requirement already satisfied: google-api-core[grpc]<2.0.0dev,>=1.6.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from google-cloud-datastore<1.8.0,>=1.7.1->apache-beam==2.19.0.dev0) (1.14.3)
Requirement already satisfied: grpc-google-iam-v1<0.13dev,>=0.12.3 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from google-cloud-pubsub<1.1.0,>=0.39.0->apache-beam==2.19.0.dev0) (0.12.3)
Requirement already satisfied: google-resumable-media<0.5.0dev,>=0.3.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from google-cloud-bigquery<1.18.0,>=1.6.0->apache-beam==2.19.0.dev0) (0.4.1)
Requirement already satisfied: googleapis-common-protos<2.0dev,>=1.5.2 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0->apache-beam==2.19.0.dev0) (1.6.0)
Requirement already satisfied: monotonic>=0.6; python_version == "2.7" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from tenacity<6.0,>=5.0.2->apache-beam==2.19.0.dev0) (1.5)
Requirement already satisfied: atomicwrites>=1.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (1.3.0)
Requirement already satisfied: packaging in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (19.2)
Requirement already satisfied: wcwidth in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (0.1.7)
Requirement already satisfied: importlib-metadata>=0.12; python_version < "3.8" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (1.3.0)
Requirement already satisfied: py>=1.5.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (1.8.0)
Requirement already satisfied: pathlib2>=2.2.0; python_version < "3.6" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (2.3.5)
Requirement already satisfied: pluggy<1.0,>=0.12 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (0.13.1)
Requirement already satisfied: attrs>=17.4.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (19.3.0)
Requirement already satisfied: more-itertools<6.0.0,>=4.0.0; python_version <= "2.7" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (5.0.0)
Requirement already satisfied: pytest-forked in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest-xdist<2,>=1.29.0->apache-beam==2.19.0.dev0) (1.1.3)
Requirement already satisfied: execnet>=1.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pytest-xdist<2,>=1.29.0->apache-beam==2.19.0.dev0) (1.7.1)
Requirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.19.0.dev0) (1.25.7)
Requirement already satisfied: certifi>=2017.4.17 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.19.0.dev0) (2019.11.28)
Requirement already satisfied: chardet<3.1.0,>=3.0.2 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.19.0.dev0) (3.0.4)
Requirement already satisfied: idna<2.9,>=2.5 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.19.0.dev0) (2.8)
Requirement already satisfied: google-auth<2.0dev,>=0.4.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from google-api-core[grpc]<2.0.0dev,>=1.6.0->google-cloud-datastore<1.8.0,>=1.7.1->apache-beam==2.19.0.dev0) (1.8.2)
Requirement already satisfied: contextlib2; python_version < "3" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (0.6.0.post1)
Requirement already satisfied: zipp>=0.5 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (0.6.0)
Requirement already satisfied: configparser>=3.5; python_version < "3" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (4.0.2)
Requirement already satisfied: scandir; python_version < "3.5" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from pathlib2>=2.2.0; python_version < "3.6"->pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (1.10.0)
Requirement already satisfied: apipkg>=1.4 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from execnet>=1.1->pytest-xdist<2,>=1.29.0->apache-beam==2.19.0.dev0) (1.5)
Installing collected packages: apache-beam
  Found existing installation: apache-beam 2.19.0.dev0
    Not uninstalling apache-beam at <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python>, outside environment <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227>
    Can't uninstall 'apache-beam'. No files were found to uninstall.
  Running setup.py develop for apache-beam
Successfully installed apache-beam

> Task :sdks:python:apache_beam:testing:load_tests:run
setup.py:232: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/local/lib/python2.7/site-packages/setuptools/dist.py>:476: UserWarning: Normalizing '2.19.0.dev' to '2.19.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
testGroupByKey (apache_beam.testing.load_tests.group_by_key_test.GroupByKeyTest) ... Terminated

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=4b96ec81-b578-44cc-994b-e619bd9a13eb, currentDir=<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src>}
Attempting to read last messages from the daemon log...
Daemon pid: 2806
  log file: /home/jenkins/.gradle/daemon/5.2.1/daemon-2806.out.log
----- Last  20 lines from daemon log file - daemon-2806.out.log -----
  'You are using Apache Beam with Python 2. '
<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/local/lib/python2.7/site-packages/setuptools/dist.py>:476: UserWarning: Normalizing '2.19.0.dev' to '2.19.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
testGroupByKey (apache_beam.testing.load_tests.group_by_key_test.GroupByKeyTest) ... Terminated
Daemon vm is shutting down... The daemon has exited normally or was terminated in response to a user interrupt.
----- End of the daemon log -----


FAILURE: Build failed with an exception.

* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may have crashed)

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #182

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/182/display/redirect?page=changes>

Changes:

[suztomo] [BEAM-8911] New Guava version: 25.1-jre

[zyichi] [BEAM-8886] Add a mongodb io dataflow integration test

[heejong] [BEAM-8905] matching Java PCollectionTuple translation naming convention

[echauchot] [BEAM-8894] exclude FlattenWithHeterogeneousCoders category because

[echauchot] [BEAM-8025] Under load we get a NoHostAvailable exception at cluster

[github] [BEAM-8786] Fixes a link in readme

[echauchot] [BEAM-8025] Remove temporary folder rule because it suppresses files on

[echauchot] [BEAM-8025] Disable auto compaction on Cassandra node to avoid race

[pawel.pasterz] [BEAM-8946] Publish collection size of data written during MongoDBIOIT

[chamikara] Merge pull request #10347: [BEAM-8885] PubsubGrpcClient doesn't respect

[kenn] [BEAM-8917] jsr305 dependency declaration for Nullable class (#10324)

[xinyuliu.us] [BEAM-8342]: upgrade to samza 1.3.0 (#10357)

[aaltay] Make model_pcollection snippet self-contained (#10343)

[pabloem] Merge pull request #10050 from [BEAM-8575] Add streaming test case for

[github] Run beam_CancelStaleDataflowJobs every 4 hours.

[tvalentyn] [BEAM-8575] Added a unit test to test Combine works with sessions.

[github] [GoSDK] Improve StateChannel resilience. (#10363)

[tweise] [BEAM-8273] Expand portability environment documentation (#10116)


------------------------------------------
[...truncated 267.75 KB...]
      bool_value: false
    }
  }
  fields {
    key: "beam:option:flink_version:v1"
    value {
      string_value: "1.9"
    }
  }
  fields {
    key: "beam:option:gcs_performance_metrics:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:job_endpoint:v1"
    value {
      string_value: "localhost:8099"
    }
  }
  fields {
    key: "beam:option:job_name:v1"
    value {
      string_value: "load_tests_Python_Flink_Batch_GBK_3_1213101743"
    }
  }
  fields {
    key: "beam:option:job_port:v1"
    value {
      string_value: "0"
    }
  }
  fields {
    key: "beam:option:job_server_timeout:v1"
    value {
      string_value: "60"
    }
  }
  fields {
    key: "beam:option:no_auth:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:object_reuse:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:parallelism:v1"
    value {
      string_value: "5"
    }
  }
  fields {
    key: "beam:option:pipeline_type_check:v1"
    value {
      bool_value: true
    }
  }
  fields {
    key: "beam:option:profile_cpu:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_memory:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_sample_rate:v1"
    value {
      number_value: 1.0
    }
  }
  fields {
    key: "beam:option:project:v1"
    value {
      string_value: "apache-beam-testing"
    }
  }
  fields {
    key: "beam:option:retain_docker_containers:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:runtime_type_check:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:save_main_session:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:sdk_location:v1"
    value {
      string_value: "container"
    }
  }
  fields {
    key: "beam:option:sdk_worker_parallelism:v1"
    value {
      string_value: "1"
    }
  }
  fields {
    key: "beam:option:shutdown_sources_on_final_watermark:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:streaming:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:type_check_strictness:v1"
    value {
      string_value: "DEFAULT_TO_ANY"
    }
  }
  fields {
    key: "beam:option:update:v1"
    value {
      bool_value: false
    }
  }
}
job_name: "job"

apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: 9c0195229a203121c5606cda801adf24)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:301)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:209)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:186)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:116)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:84)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:81)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
	... 16 more
Caused by: java.lang.Exception: The data preparation for task 'GroupReduce (GroupReduce at GroupByKey 0)' , caused an error: Error obtaining the sorted input: Thread 'SortMerger Reading Thread' terminated due to an exception: Connection unexpectedly closed by remote task manager 'beam-loadtests-python-gbk-flink-batch-182-w-1.c.apache-beam-testing.internal/10.128.0.125:44701'. This might indicate that the remote task manager was lost.
	at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:480)
	at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:369)
	at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
	... 1 more
Caused by: java.lang.RuntimeException: Error obtaining the sorted input: Thread 'SortMerger Reading Thread' terminated due to an exception: Connection unexpectedly closed by remote task manager 'beam-loadtests-python-gbk-flink-batch-182-w-1.c.apache-beam-testing.internal/10.128.0.125:44701'. This might indicate that the remote task manager was lost.
	at org.apache.flink.runtime.operators.sort.UnilateralSortMerger.getIterator(UnilateralSortMerger.java:650)
	at org.apache.flink.runtime.operators.BatchTask.getInput(BatchTask.java:1109)
	at org.apache.flink.runtime.operators.GroupReduceDriver.prepare(GroupReduceDriver.java:99)
	at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:474)
	... 4 more
Caused by: java.io.IOException: Thread 'SortMerger Reading Thread' terminated due to an exception: Connection unexpectedly closed by remote task manager 'beam-loadtests-python-gbk-flink-batch-182-w-1.c.apache-beam-testing.internal/10.128.0.125:44701'. This might indicate that the remote task manager was lost.
	at org.apache.flink.runtime.operators.sort.UnilateralSortMerger$ThreadBase.run(UnilateralSortMerger.java:831)
Caused by: org.apache.flink.runtime.io.network.netty.exception.RemoteTransportException: Connection unexpectedly closed by remote task manager 'beam-loadtests-python-gbk-flink-batch-182-w-1.c.apache-beam-testing.internal/10.128.0.125:44701'. This might indicate that the remote task manager was lost.
	at org.apache.flink.runtime.io.network.netty.CreditBasedPartitionRequestClientHandler.channelInactive(CreditBasedPartitionRequestClientHandler.java:136)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
	at org.apache.flink.shaded.netty4.io.netty.handler.codec.ByteToMessageDecoder.channelInputClosed(ByteToMessageDecoder.java:390)
	at org.apache.flink.shaded.netty4.io.netty.handler.codec.ByteToMessageDecoder.channelInactive(ByteToMessageDecoder.java:355)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
	at org.apache.flink.shaded.netty4.io.netty.channel.DefaultChannelPipeline$HeadContext.channelInactive(DefaultChannelPipeline.java:1429)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
	at org.apache.flink.shaded.netty4.io.netty.channel.DefaultChannelPipeline.fireChannelInactive(DefaultChannelPipeline.java:947)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannel$AbstractUnsafe$8.run(AbstractChannel.java:826)
	at org.apache.flink.shaded.netty4.io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
	at org.apache.flink.shaded.netty4.io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:404)
	at org.apache.flink.shaded.netty4.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:474)
	at org.apache.flink.shaded.netty4.io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:909)
	at java.lang.Thread.run(Thread.java:748)

root: ERROR: org.apache.flink.runtime.io.network.netty.exception.RemoteTransportException: Connection unexpectedly closed by remote task manager 'beam-loadtests-python-gbk-flink-batch-182-w-1.c.apache-beam-testing.internal/10.128.0.125:44701'. This might indicate that the remote task manager was lost.
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------
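
The RemoteTransportException above means the connection to the task manager on the named worker dropped mid-shuffle, which, as the message itself suggests, usually indicates that the task manager process exited or the node became unreachable. One hedged way to confirm, assuming the SSH tunnel from the setup step is still forwarding the Flink REST port to localhost:8081, is to ask the standard Flink REST API which task managers are still registered:

# Sketch only: assumes the tunnel to the Flink REST endpoint is still up on localhost:8081.
curl -s http://localhost:8081/taskmanagers
# A worker missing from this list would match the "remote task manager was lost" error above.
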

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 25.545s

FAILED (errors=1)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 55

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 30s
3 actionable tasks: 2 executed, 1 up-to-date

Publishing build scan...
https://scans.gradle.com/s/cf5fizkmatwde

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #181

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/181/display/redirect?page=changes>

Changes:

[echauchot] [BEAM-8830] flatten: better detect empty collections

[kcweaver] [BEAM-8139] Test Spark portable job jar.

[kcweaver] [BEAM-8139] Add main method to SparkPipelineRunner.

[kcweaver] [BEAM-8139] Add beam_PostCommit_PortableJar_Spark.

[echauchot] [BEAM-8830] Fix empty dataset creation: use the beam coder wrapper that

[suztomo] BEAM-8693 com.google.cloud.datastore:datastore-v1-proto-client 1.6.3

[suztomo] google_cloud_bigquery is not declaring google_http_client_jackson

[chadrik] Pass artifact and provision endpoints to external workers from python

[apilloud] [BEAM-8362] Don't use toString() for accessing Enum Types

[echauchot] [BEAM-8830] Exclude failing SDF tests

[lostluck] [BEAM-8920] Go SDK: faster transforms/filter.Distinct with CombinePerKey

[ehudm] [BEAM-3713] pytest migration: py27-cython-pytest

[aaltay] [BEAM-8811] Upgrade Beam pipeline diagrams in docs (#10200)

[bhulette] [BEAM-8933] Update BigQuery proto dependency (#10334)

[pabloem] [BEAM-7746] Add python type hints (part 1) (#9915)

[heejong] [BEAM-8943] SDK harness servers don't shut down properly when SDK

[github] [BEAM-8955] Exclude AvroSchemaTest for Spark Runner (#10358)

[relax] attach values so we don't iterate over everything to verify types


------------------------------------------
[...truncated 37.76 KB...]
RefactoringTool: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/portability/api/beam_interactive_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/portability/api/endpoints_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/portability/api/external_transforms_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/portability/api/metrics_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/portability/api/schema_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/portability/api/standard_window_fns_pb2.py>
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
warning: sdist: standard file not found: should have one of README, README.rst, README.txt, README.md


> Task :sdks:python:container:py2:copyDockerfileDependencies

> Task :sdks:go:resolveBuildDependencies
Resolving google.golang.org/api: commit='386d4e5f4f92f86e6aec85985761bba4b938a2d5', urls=[https://code.googlesource.com/google-api-go-client]
Resolving google.golang.org/genproto: commit='2b5a72b8730b0b16380010cfe5286c42108d88e7', urls=[https://github.com/google/go-genproto]
Resolving google.golang.org/grpc: commit='7646b5360d049a7ca31e9133315db43456f39e2e', urls=[https://github.com/grpc/grpc-go]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]

> Task :sdks:go:installDependencies
> Task :sdks:go:buildLinuxAmd64
> Task :sdks:go:goBuild

> Task :sdks:python:container:resolveBuildDependencies
Resolving ./github.com/apache/beam/sdks/go@<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/go>

> Task :sdks:python:container:installDependencies
> Task :sdks:python:container:buildDarwinAmd64
> Task :sdks:python:container:buildLinuxAmd64
> Task :sdks:python:container:goBuild
> Task :sdks:python:container:py2:copyLauncherDependencies
> Task :sdks:python:container:py2:dockerPrepare
> Task :sdks:python:container:py2:docker

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD SUCCESSFUL in 6m 35s
18 actionable tasks: 17 executed, 1 up-to-date

Publishing build scan...
https://scans.gradle.com/s/u7kyhko2oyla4

Build step 'Invoke Gradle script' changed build result to SUCCESS
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins6730970150396342327.sh
+ echo 'Tagging image...'
Tagging image...
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins1852381186790434017.sh
+ docker tag gcr.io/apache-beam-testing/beam_portability/python2.7_sdk gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins2983378696994595922.sh
+ echo 'Pushing image...'
Pushing image...
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins3892506087671191257.sh
+ docker push gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest
The push refers to repository [gcr.io/apache-beam-testing/beam_portability/python2.7_sdk]
6411f59b62cb: Preparing
d6920fccae0c: Preparing
f4f5b53194e3: Preparing
1d1bc28e1102: Preparing
66bf7e315339: Preparing
a7cc192cc693: Preparing
2fae77f261a0: Preparing
44f4885eef1d: Preparing
fdd18ad126a3: Preparing
c28383f3aac4: Preparing
22b7de8c8281: Preparing
6f029f7fa589: Preparing
cbdd44c5cbac: Preparing
2e517d68c391: Preparing
5f3a5adb8e97: Preparing
73bfa217d66f: Preparing
91ecdd7165d3: Preparing
e4b20fcc48f4: Preparing
fdd18ad126a3: Waiting
6f029f7fa589: Waiting
2fae77f261a0: Waiting
a7cc192cc693: Waiting
44f4885eef1d: Waiting
c28383f3aac4: Waiting
73bfa217d66f: Waiting
22b7de8c8281: Waiting
91ecdd7165d3: Waiting
5f3a5adb8e97: Waiting
e4b20fcc48f4: Waiting
66bf7e315339: Pushed
1d1bc28e1102: Pushed
d6920fccae0c: Pushed
a7cc192cc693: Pushed
6411f59b62cb: Pushed
c28383f3aac4: Layer already exists
44f4885eef1d: Pushed
22b7de8c8281: Layer already exists
6f029f7fa589: Layer already exists
cbdd44c5cbac: Layer already exists
2e517d68c391: Layer already exists
5f3a5adb8e97: Layer already exists
73bfa217d66f: Layer already exists
91ecdd7165d3: Layer already exists
e4b20fcc48f4: Layer already exists
fdd18ad126a3: Pushed
f4f5b53194e3: Pushed
2fae77f261a0: Pushed
latest: digest: sha256:75dd797ee632c662076e847126bae64e8152b9b92ec4a8288d13b7b87af110ab size: 4110
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/gradlew> --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Pdocker-repository-root=gcr.io/apache-beam-testing/beam_portability -Pdocker-tag=latest :runners:flink:1.9:job-server-container:docker
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :runners:flink:1.9:copyResourcesOverrides NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :runners:flink:1.9:job-server:processResources NO-SOURCE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :runners:flink:1.9:job-server-container:dockerClean UP-TO-DATE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :model:job-management:extractProto
> Task :sdks:java:extensions:protobuf:extractProto
> Task :model:fn-execution:extractProto
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :model:job-management:processResources
> Task :runners:flink:1.9:copySourceOverrides
> Task :model:fn-execution:processResources
> Task :runners:flink:1.9:copyTestResourcesOverrides NO-SOURCE
> Task :runners:flink:1.9:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :sdks:java:build-tools:compileJava FROM-CACHE
> Task :sdks:java:build-tools:processResources
> Task :sdks:java:build-tools:classes
> Task :sdks:java:core:processResources
> Task :sdks:java:build-tools:jar
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :model:pipeline:generateProto
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:extractIncludeProto
> Task :model:job-management:generateProto
> Task :model:fn-execution:generateProto
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :model:pipeline:shadowJar
> Task :model:job-management:shadowJar
> Task :model:fn-execution:shadowJar
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar
> Task :sdks:java:io:kafka:jar
> Task :runners:core-construction-java:jar
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar
> Task :runners:core-java:jar
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :sdks:java:io:google-cloud-platform:jar
> Task :sdks:java:harness:shadowJar
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :runners:flink:1.9:compileJava FROM-CACHE
> Task :runners:flink:1.9:classes
> Task :runners:flink:1.9:jar
> Task :runners:flink:1.9:job-server:compileJava NO-SOURCE
> Task :runners:flink:1.9:job-server:classes UP-TO-DATE
> Task :runners:flink:1.9:job-server:shadowJar
> Task :runners:flink:1.9:job-server-container:copyDockerfileDependencies
> Task :runners:flink:1.9:job-server-container:dockerPrepare
> Task :runners:flink:1.9:job-server-container:docker

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD SUCCESSFUL in 1m 3s
58 actionable tasks: 40 executed, 17 from cache, 1 up-to-date

Publishing build scan...
https://scans.gradle.com/s/6swloocahpraa

[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins7634343505637831474.sh
+ echo 'Tagging image...'
Tagging image...
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins4863665693933392586.sh
+ docker tag gcr.io/apache-beam-testing/beam_portability/flink-job-server gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest
Error response from daemon: No such image: gcr.io/apache-beam-testing/beam_portability/flink-job-server:latest
Build step 'Execute shell' marked build as failure
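
The "No such image" response means there is no local image named gcr.io/apache-beam-testing/beam_portability/flink-job-server at the time the tag step runs, even though the preceding Gradle build reported success. Before retrying, a hedged first check is to list what the :runners:flink:1.9:job-server-container:docker task actually left behind locally:

# Sketch only: shows which job-server images (if any) exist locally before tagging.
docker images | grep -i job-server || echo "no job-server image found locally"
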

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


beam_LoadTests_Python_GBK_Flink_Batch - Build # 180 - Aborted

Posted by Apache Jenkins Server <je...@builds.apache.org>.
The Apache Jenkins build system has built beam_LoadTests_Python_GBK_Flink_Batch (build #180)

Status: Aborted

Check console output at https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/180/ to view the results.

beam_LoadTests_Python_GBK_Flink_Batch - Build # 179 - Aborted

Posted by Apache Jenkins Server <je...@builds.apache.org>.
The Apache Jenkins build system has built beam_LoadTests_Python_GBK_Flink_Batch (build #179)

Status: Aborted

Check console output at https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/179/ to view the results.