Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2020/03/31 23:35:17 UTC
Build failed in Jenkins: beam_PostCommit_Python35 #2132
See <https://builds.apache.org/job/beam_PostCommit_Python35/2132/display/redirect?page=changes>
Changes:
[robertwb] [BEAM-9577] Rename the Artifact{Staging,Retrieval}Service.
[robertwb] [BEAM-9577] Define the new Artifact{Staging,Retrieval}Service.
[robertwb] [BEAM-9577] Regenerate protos.
[robertwb] [BEAM-9577] Implement the new Artifact{Staging,Retrieval}Services in
------------------------------------------
[...truncated 9.47 MB...]
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:33:03.031Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests2/BigQueryBatchFileLoads/Flatten
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:33:03.041Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:33:03.076Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests2/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:33:03.111Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests2/BigQueryBatchFileLoads/Flatten.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:33:03.141Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests2/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+WriteWithMultipleDests2/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:33:05.518Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests2/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+WriteWithMultipleDests2/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:33:05.579Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs).out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:33:05.644Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests2/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:33:05.704Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests2/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:33:05.767Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests2/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:33:05.840Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests2/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:33:06.183Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:33:08.521Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests2/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:33:08.590Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:33:08.634Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:33:08.707Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames+WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/Delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:33:10.708Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames+WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/Delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:33:10.792Z: JOB_MESSAGE_DEBUG: Executing success step success54
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:33:10.907Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:33:10.956Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:33:10.985Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:33:17.929Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2020-03-31_16_30_18-7737913159561168083 after 182 seconds
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:33:41.432Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:33:41.467Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:34:55.955Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:34:56.006Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:34:56.045Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-31_16_25_43-18333156156785595030 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:34:58.794Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:34:58.821Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Start verify Bigquery table properties.
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Table proto is <Table
clustering: <Clustering
fields: ['language']>
creationTime: 1585697558255
etag: 'ZsFejVU+exdBAYfvnhOqng=='
id: 'apache-beam-testing:python_bq_streaming_inserts_15856971201898.output_table1'
kind: 'bigquery#table'
lastModifiedTime: 1585697558510
location: 'US'
numBytes: 0
numLongTermBytes: 0
numRows: 0
schema: <TableSchema
fields: [<TableFieldSchema
fields: []
mode: 'NULLABLE'
name: 'name'
type: 'STRING'>, <TableFieldSchema
fields: []
mode: 'NULLABLE'
name: 'language'
type: 'STRING'>]>
selfLink: 'https://bigquery.googleapis.com/bigquery/v2/projects/apache-beam-testing/datasets/python_bq_streaming_inserts_15856971201898/tables/output_table1'
streamingBuffer: <Streamingbuffer
estimatedBytes: 98
estimatedRows: 8
oldestEntryTime: 1585697520000>
tableReference: <TableReference
datasetId: 'python_bq_streaming_inserts_15856971201898'
projectId: 'apache-beam-testing'
tableId: 'output_table1'>
timePartitioning: <TimePartitioning
type: 'DAY'>
type: 'TABLE'>
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Matching {'type': 'DAY'} to <TimePartitioning
type: 'DAY'>
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Matching DAY to DAY
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Matching {'fields': ['language']} to <Clustering
fields: ['language']>
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Matching ['language'] to ['language']
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Start verify Bigquery table properties.
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Table proto is <Table
clustering: <Clustering
fields: ['language']>
creationTime: 1585697575027
etag: 'E1wahir3Z3KLhptOfPqv4A=='
id: 'apache-beam-testing:python_bq_streaming_inserts_15856971201898.output_table2'
kind: 'bigquery#table'
lastModifiedTime: 1585697575027
location: 'US'
numBytes: 98
numLongTermBytes: 0
numRows: 8
schema: <TableSchema
fields: [<TableFieldSchema
fields: []
mode: 'NULLABLE'
name: 'language'
type: 'STRING'>, <TableFieldSchema
fields: []
mode: 'NULLABLE'
name: 'name'
type: 'STRING'>]>
selfLink: 'https://bigquery.googleapis.com/bigquery/v2/projects/apache-beam-testing/datasets/python_bq_streaming_inserts_15856971201898/tables/output_table2'
tableReference: <TableReference
datasetId: 'python_bq_streaming_inserts_15856971201898'
projectId: 'apache-beam-testing'
tableId: 'output_table2'>
timePartitioning: <TimePartitioning
type: 'DAY'>
type: 'TABLE'>
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Matching {'type': 'DAY'} to <TimePartitioning
type: 'DAY'>
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Matching DAY to DAY
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Matching {'fields': ['language']} to <Clustering
fields: ['language']>
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Matching ['language'] to ['language']
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT name, language FROM python_bq_streaming_inserts_15856971201898.output_table1 to BQ
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/c51252cb-0243-4247-86c4-6c4d247b379d?location=US&maxResults=0 HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon38ae0ead_f19f_4dbe_867f_59f24e83dc81/data HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Result of query is: [('beam', 'java'), ('flink', 'scala'), ('beam', 'py'), ('spark', 'py'), ('beam', 'go'), ('spark', 'scala'), ('flink', 'java'), ('spark', 'scala')]
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT name, language FROM python_bq_streaming_inserts_15856971201898.output_table2 to BQ
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/43bedf83-6ff7-4517-98d3-7bffe6ae45d1?location=US&maxResults=0 HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon1f18711fe91b33abe0c8ab4230b6da6d0e47ef3f/data HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Result of query is: [('beam', 'py'), ('beam', 'go'), ('beam', 'java'), ('flink', 'java'), ('flink', 'scala'), ('spark', 'py'), ('spark', 'scala'), ('spark', 'scala')]
INFO:apache_beam.io.gcp.bigquery_test:Deleting dataset python_bq_streaming_inserts_15856971201898 in project apache-beam-testing
WARNING:apache_beam.options.pipeline_options:--region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
WARNING:apache_beam.options.pipeline_options:--region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python35/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py:79: FutureWarning: MaskDetectedDetails is experimental.
inspection_config=INSPECT_CONFIG))
WARNING:apache_beam.runners.dataflow.dataflow_runner:Typical end users should not use this worker jar feature. It can only be used when FnAPI is enabled.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0331233508-768759.1585697708.768967/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0331233508-768759.1585697708.768967/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.portability.stager:Executing command: ['/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python35/src/build/gradleenv/-1734967054/bin/python', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:35:06.164Z: JOB_MESSAGE_BASIC: Executing BigQuery import job "dataflow_job_10128494505000200159". You can check its status with the bq tool: "bq show -j --project_id=apache-beam-testing dataflow_job_10128494505000200159".
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python35/src/sdks/python/build/apache-beam.tar.gz" to staging location.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0331233508-768759.1585697708.768967/requirements.txt...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0331233508-768759.1585697708.768967/requirements.txt in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0331233508-768759.1585697708.768967/six-1.14.0.tar.gz...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0331233508-768759.1585697708.768967/six-1.14.0.tar.gz in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0331233508-768759.1585697708.768967/parameterized-0.7.1.tar.gz...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0331233508-768759.1585697708.768967/parameterized-0.7.1.tar.gz in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0331233508-768759.1585697708.768967/mock-2.0.0.tar.gz...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0331233508-768759.1585697708.768967/mock-2.0.0.tar.gz in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0331233508-768759.1585697708.768967/pbr-5.4.4.tar.gz...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0331233508-768759.1585697708.768967/pbr-5.4.4.tar.gz in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0331233508-768759.1585697708.768967/funcsigs-1.0.2.tar.gz...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0331233508-768759.1585697708.768967/funcsigs-1.0.2.tar.gz in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0331233508-768759.1585697708.768967/PyHamcrest-1.10.1.tar.gz...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0331233508-768759.1585697708.768967/PyHamcrest-1.10.1.tar.gz in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0331233508-768759.1585697708.768967/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0331233508-768759.1585697708.768967/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0331233508-768759.1585697708.768967/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:35:14.334Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
FATAL: command execution failed
java.io.IOException: Backing channel 'JNLP4-connect connection from 140.111.238.35.bc.googleusercontent.com/35.238.111.140:47762' is disconnected.
at hudson.remoting.RemoteInvocationHandler.channelOrFail(RemoteInvocationHandler.java:214)
at hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:283)
at com.sun.proxy.$Proxy135.isAlive(Unknown Source)
at hudson.Launcher$RemoteLauncher$ProcImpl.isAlive(Launcher.java:1150)
at hudson.Launcher$RemoteLauncher$ProcImpl.join(Launcher.java:1142)
at hudson.Launcher$ProcStarter.join(Launcher.java:470)
at hudson.plugins.gradle.Gradle.perform(Gradle.java:317)
at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:741)
at hudson.model.Build$BuildExecution.build(Build.java:206)
at hudson.model.Build$BuildExecution.doRun(Build.java:163)
at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:504)
at hudson.model.Run.execute(Run.java:1815)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
at hudson.model.ResourceController.execute(ResourceController.java:97)
at hudson.model.Executor.run(Executor.java:429)
Caused by: java.nio.channels.ClosedChannelException
at org.jenkinsci.remoting.protocol.impl.ChannelApplicationLayer.onReadClosed(ChannelApplicationLayer.java:209)
at org.jenkinsci.remoting.protocol.ApplicationLayer.onRecvClosed(ApplicationLayer.java:222)
at org.jenkinsci.remoting.protocol.ProtocolStack$Ptr.onRecvClosed(ProtocolStack.java:816)
at org.jenkinsci.remoting.protocol.FilterLayer.onRecvClosed(FilterLayer.java:287)
at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.onRecvClosed(SSLEngineFilterLayer.java:181)
at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.switchToNoSecure(SSLEngineFilterLayer.java:283)
at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.processWrite(SSLEngineFilterLayer.java:503)
at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.processQueuedWrites(SSLEngineFilterLayer.java:248)
at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.doSend(SSLEngineFilterLayer.java:200)
at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.doCloseSend(SSLEngineFilterLayer.java:213)
at org.jenkinsci.remoting.protocol.ProtocolStack$Ptr.doCloseSend(ProtocolStack.java:784)
at org.jenkinsci.remoting.protocol.ApplicationLayer.doCloseWrite(ApplicationLayer.java:173)
at org.jenkinsci.remoting.protocol.impl.ChannelApplicationLayer$ByteBufferCommandTransport.closeWrite(ChannelApplicationLayer.java:314)
at hudson.remoting.Channel.close(Channel.java:1452)
at hudson.remoting.Channel.close(Channel.java:1405)
at hudson.slaves.SlaveComputer.closeChannel(SlaveComputer.java:847)
at hudson.slaves.SlaveComputer.kill(SlaveComputer.java:814)
at hudson.model.AbstractCIBase.killComputer(AbstractCIBase.java:89)
at jenkins.model.Jenkins.access$2100(Jenkins.java:312)
at jenkins.model.Jenkins$19.run(Jenkins.java:3464)
at hudson.model.Queue._withLock(Queue.java:1379)
at hudson.model.Queue.withLock(Queue.java:1256)
at jenkins.model.Jenkins._cleanUpDisconnectComputers(Jenkins.java:3458)
at jenkins.model.Jenkins.cleanUp(Jenkins.java:3336)
at hudson.WebAppMain.contextDestroyed(WebAppMain.java:379)
at org.apache.catalina.core.StandardContext.listenerStop(StandardContext.java:4732)
at org.apache.catalina.core.StandardContext.stopInternal(StandardContext.java:5396)
at org.apache.catalina.util.LifecycleBase.stop(LifecycleBase.java:257)
at org.apache.catalina.core.ContainerBase$StopChild.call(ContainerBase.java:1400)
at org.apache.catalina.core.ContainerBase$StopChild.call(ContainerBase.java:1389)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:134)
at org.apache.catalina.core.ContainerBase.stopInternal(ContainerBase.java:976)
at org.apache.catalina.util.LifecycleBase.stop(LifecycleBase.java:257)
at org.apache.catalina.core.ContainerBase$StopChild.call(ContainerBase.java:1400)
at org.apache.catalina.core.ContainerBase$StopChild.call(ContainerBase.java:1389)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:134)
at org.apache.catalina.core.ContainerBase.stopInternal(ContainerBase.java:976)
at org.apache.catalina.util.LifecycleBase.stop(LifecycleBase.java:257)
at org.apache.catalina.core.StandardService.stopInternal(StandardService.java:473)
at org.apache.catalina.util.LifecycleBase.stop(LifecycleBase.java:257)
at org.apache.catalina.core.StandardServer.stopInternal(StandardServer.java:994)
at org.apache.catalina.util.LifecycleBase.stop(LifecycleBase.java:257)
at org.apache.catalina.startup.Catalina.stop(Catalina.java:706)
at org.apache.catalina.startup.Catalina.start(Catalina.java:668)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:344)
at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:475)
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
ERROR: apache-beam-jenkins-1 is offline; cannot locate JDK 1.8 (latest)
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Jenkins build is back to normal : beam_PostCommit_Python35 #2133
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python35/2133/display/redirect>