Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2021/10/06 07:41:05 UTC

Build failed in Jenkins: beam_PostCommit_Python36 #4444

See <https://ci-beam.apache.org/job/beam_PostCommit_Python36/4444/display/redirect?page=changes>

Changes:

[noreply] Merge pull request #15602 from [BEAM-10917] Add support for BigQuery


------------------------------------------
[...truncated 78.00 MB...]
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((x coord/Read)+(ref_AppliedPTransform_format_10))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WindowInto-WindowIntoFn-_20))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WriteBundles_21))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Pair_22))+(WriteToText/Write/WriteImpl/GroupByKey/Write)
INFO:root:Running (((((x coord/Read)+(ref_AppliedPTransform_format_10))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WindowInto-WindowIntoFn-_20))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WriteBundles_21))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Pair_22))+(WriteToText/Write/WriteImpl/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((WriteToText/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Extract_24))+(ref_PCollection_PCollection_16/Write)
INFO:root:Running ((WriteToText/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Extract_24))+(ref_PCollection_PCollection_16/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:root:Running ((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-FinalizeWrite_26)
INFO:root:Running (ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-FinalizeWrite_26)
INFO:root:severity: INFO
timestamp {
  seconds: 1633501625
  nanos: 506383895
}
message: "Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1"
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/io/filebasedsink.py:303"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1633501625
  nanos: 642156600
}
message: "Renamed 1 shards in 0.14 seconds."
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/io/filebasedsink.py:348"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1633501625
  nanos: 650502681
}
message: "No more requests from control plane"
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py:256"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633501625
  nanos: 650734424
}
message: "SDK Harness waiting for in-flight requests to complete"
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py:257"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633501625
  nanos: 650862693
}
message: "Closing all cached grpc data channels."
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/data_plane.py:782"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633501625
  nanos: 651022434
}
message: "Closing all cached gRPC state handlers."
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py:902"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633501625
  nanos: 651531457
}
message: "Done consuming work."
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py:269"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633501625
  nanos: 651685476
}
message: "Python sdk harness exiting."
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker_main.py:154"
thread: "MainThread"

INFO:apache_beam.runners.portability.local_job_service:Successfully completed job in 7.890704154968262 seconds.
INFO:root:Successfully completed job in 7.890704154968262 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py36:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:45345
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.6_sdk:2.34.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7f319a0317b8> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7f319a031840> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7f319a031f28> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.34.0-SNAPSHOT.jar>' '--spark-master-url' 'local[4]' '--artifacts-dir' '/tmp/beam-temp45tv0m79/artifactsmsiq5cyi' '--job-port' '41473' '--artifact-port' '0' '--expansion-port' '0']
WARNING:root:Waiting for grpc channel to be ready at localhost:41473.
INFO:apache_beam.utils.subprocess_server:b'21/10/06 06:27:13 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:37329'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 06:27:13 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:44117'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 06:27:13 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on localhost:41473'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 06:27:13 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Job server now running, terminate with Ctrl+C'
WARNING:root:Waiting for grpc channel to be ready at localhost:41473.
WARNING:root:Waiting for grpc channel to be ready at localhost:41473.
WARNING:root:Waiting for grpc channel to be ready at localhost:41473.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2']
INFO:apache_beam.utils.subprocess_server:b'21/10/06 06:27:18 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for job_bce737e6-e997-4097-8467-f1388206da96.'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 06:27:18 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for job_bce737e6-e997-4097-8467-f1388206da96.ref_Environment_default_environment_1.'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 06:27:18 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 1 artifacts for job_bce737e6-e997-4097-8467-f1388206da96.null.'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 06:27:18 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for job_bce737e6-e997-4097-8467-f1388206da96.'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 06:27:19 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1006062719-50f30094_8bf94f48-d0df-4bf1-9924-7f565869201f'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 06:27:19 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1006062719-50f30094_8bf94f48-d0df-4bf1-9924-7f565869201f'
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for the execution. Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
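For reference, the shape this message asks for is a pipeline run inside a context manager, so the program blocks until the job completes. A minimal sketch, assuming the PortableRunner/LOOPBACK setup from this log; the transforms and the job endpoint value are illustrative only:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Illustrative options; the endpoint matches the job-server port logged
    # above, but any PortableRunner endpoint works the same way.
    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:41473',
        '--environment_type=LOOPBACK',
    ])

    # Leaving the "with" block waits on the pipeline result, which is what
    # the message above recommends.
    with beam.Pipeline(options=options) as p:
        (p
         | 'Create' >> beam.Create(['hello', 'world'])
         | 'Print' >> beam.Map(print))
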
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:apache_beam.utils.subprocess_server:b'21/10/06 06:27:20 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 06:27:21 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 06:27:24 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator:'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 06:27:24 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 06:27:24 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1006062719-50f30094_8bf94f48-d0df-4bf1-9924-7f565869201f on Spark master local[4]'
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:45437.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:b'21/10/06 06:27:26 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 06:27:26 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-2'
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:35019.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:44557
INFO:apache_beam.utils.subprocess_server:b'21/10/06 06:27:26 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 06:27:26 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-3'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 06:27:27 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 06:27:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-4'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 06:27:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-5'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 06:27:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-8'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 06:27:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-7'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 06:27:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-9'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 06:27:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-6'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 06:27:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 06:27:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 06:27:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 06:27:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 06:27:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 06:27:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 06:27:27 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1006062719-50f30094_8bf94f48-d0df-4bf1-9924-7f565869201f: Pipeline translated successfully. Computing outputs'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 06:27:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16'
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.14 seconds.
INFO:apache_beam.utils.subprocess_server:b'21/10/06 06:27:28 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1006062719-50f30094_8bf94f48-d0df-4bf1-9924-7f565869201f finished.'
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py>", line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py>", line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py>", line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Connection reset by peer"
	debug_error_string = "{"created":"@1633501648.678487910","description":"Error received from peer ipv4:127.0.0.1:44557","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Connection reset by peer","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.6/threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py>", line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py>", line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py>", line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py>", line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Connection reset by peer"
	debug_error_string = "{"created":"@1633501648.678487910","description":"Error received from peer ipv4:127.0.0.1:44557","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Connection reset by peer","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.6/threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py>", line 1033, in pull_responses
    for response in responses:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py>", line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py>", line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1633501648.678496778","description":"Error received from peer ipv4:127.0.0.1:35019","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.6/threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py>", line 246, in run
    for work_request in self._control_stub.Control(get_responses()):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py>", line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py>", line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1633501648.678528949","description":"Error received from peer ipv4:127.0.0.1:45437","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>


FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 225

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py36:postCommitPy36IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 40m 31s
211 actionable tasks: 147 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/ghs55uot67na6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python36 #4454

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python36/4454/display/redirect?page=changes>




Build failed in Jenkins: beam_PostCommit_Python36 #4453

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python36/4453/display/redirect>

Changes:


------------------------------------------
[...truncated 51.52 MB...]
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2109: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    experiments = p.options.view_as(DebugOptions).experiments or []

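The deprecation above is about transforms reading options back off an already-constructed pipeline; the supported direction is to build a PipelineOptions object up front and pass it to the Pipeline. A minimal sketch, with a placeholder temp location and illustrative transforms:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    # Placeholder values; set whatever options the job actually needs.
    options = PipelineOptions()
    options.view_as(GoogleCloudOptions).temp_location = 'gs://my-bucket/tmp'

    with beam.Pipeline(options=options) as p:
        p | beam.Create([1, 2, 3]) | beam.Map(print)
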
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2395: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2397: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2428
apache_beam/io/gcp/bigquery.py:2428
apache_beam/io/gcp/bigquery.py:2428
apache_beam/io/gcp/bigquery.py:2428
apache_beam/io/gcp/bigquery.py:2428
apache_beam/io/gcp/bigquery.py:2428
apache_beam/io/gcp/bigquery.py:2428
apache_beam/io/gcp/bigquery.py:2428
apache_beam/io/gcp/bigquery.py:2428
apache_beam/io/gcp/bigquery.py:2428
apache_beam/io/gcp/bigquery.py:2428
apache_beam/io/gcp/bigquery.py:2428
apache_beam/io/gcp/bigquery.py:2428
apache_beam/io/gcp/bigquery.py:2428
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2428: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    | _PassThroughThenCleanup(files_to_remove_pcoll))

apache_beam/dataframe/io.py:572
apache_beam/dataframe/io.py:572
apache_beam/dataframe/io.py:572
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/dataframe/io.py>:572: FutureWarning: WriteToFiles is experimental.
    sink=lambda _: _WriteToPandasFileSink(

apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2099: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1112: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1114: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/io/gcp/tests/utils.py:100
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

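The warning above comes from the google-cloud-bigquery client; both of the alternatives it names look roughly like this sketch (project, dataset and table names are placeholders):

    from google.cloud import bigquery

    client = bigquery.Client()

    # Alternative 1: a plain "project.dataset.table" string.
    table = client.get_table('my_project.my_dataset.my_table')

    # Alternative 2: explicit reference objects instead of client.dataset().
    dataset_ref = bigquery.DatasetReference('my_project', 'my_dataset')
    table_ref = dataset_ref.table('my_table')
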
<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/avro/schema.py>:1251
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/avro/schema.py>:1251: DeprecationWarning: `Parse` is deprecated in avro 1.9.2. Please use `parse` (lowercase) instead.
    DeprecationWarning)

apache_beam/io/gcp/bigquery_test.py:1124
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1124: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    streaming = self.test_pipeline.options.view_as(StandardOptions).streaming

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:181
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:181: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/bigquery_read_it_test.py:167
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:167: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

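The replacement named in the warning above is ReadFromBigQuery; a minimal sketch, with a placeholder query and temp bucket:

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | 'ReadFromBQ' >> beam.io.ReadFromBigQuery(
             query='SELECT 1 AS x',              # placeholder query
             use_standard_sql=True,
             gcs_location='gs://my-bucket/tmp')  # placeholder temp location
         | beam.Map(print))
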
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:162
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:162: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/ml/gcp/cloud_dlp_it_test.py:77
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:77: FutureWarning: MaskDetectedDetails is experimental.
    inspection_config=INSPECT_CONFIG))

apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:84: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    kms_key=kms_key))

apache_beam/runners/dataflow/ptransform_overrides.py:323
apache_beam/runners/dataflow/ptransform_overrides.py:323
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:323: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
    kms_key=self.kms_key))

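Likewise for the sink: the replacement suggested for BigQuerySink is WriteToBigQuery. A sketch with a placeholder table spec and schema:

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | beam.Create([{'name': 'a', 'count': 1}])
         | beam.io.WriteToBigQuery(
             'my-project:my_dataset.my_table',   # placeholder table spec
             schema='name:STRING,count:INTEGER',
             write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
             create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED))
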
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:126
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:126: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

apache_beam/ml/gcp/cloud_dlp_it_test.py:87
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:87: FutureWarning: InspectForDetails is experimental.
    | beam.ParDo(extract_inspection_results).with_outputs(

apache_beam/io/gcp/bigquery_read_it_test.py:556
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:556: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_read_it_test.py:670
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:670: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())

apache_beam/io/gcp/bigquery.py:2569
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2569: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2570
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2570: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2583
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2583: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:124
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:124: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:113
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:113: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/pytest_postCommitIT-df-py36.xml> -
======= 1 failed, 62 passed, 11 skipped, 183 warnings in 5637.74 seconds =======

> Task :sdks:python:test-suites:dataflow:py36:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 225

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py36:postCommitPy36IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 37m 25s
211 actionable tasks: 147 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/son2qb7luuslq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python36 #4452

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python36/4452/display/redirect?page=changes>

Changes:

[Udi Meiri] Release script fixes


------------------------------------------
[...truncated 24.60 MB...]
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Un-registering task and sending final execution state FINISHED to JobManager for task ToKeyedWorkItem (15/16)#0 3e4e1f7a40b6382f334a5ea51248b2e8.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.flink.runtime.taskexecutor.TaskExecutor unregisterTaskAndNotifyFinalState'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Un-registering task and sending final execution state FINISHED to JobManager for task [1]Read from debezium/ExternalTransform(beam:external:java:debezium:read:v1)/ParDo(KafkaSourceConsumer)/ParMultiDo(KafkaSourceConsumer)/ProcessSizedElementsAndRestrictions0 -> [5]{Read from debezium, assert_that} (11/16)#0 27cb6e071353f3315ea37c7ea7b837a7.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.flink.runtime.executiongraph.Execution transitionState'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: [1]Read from debezium/ExternalTransform(beam:external:java:debezium:read:v1)/ParDo(KafkaSourceConsumer)/ParMultiDo(KafkaSourceConsumer)/ProcessSizedElementsAndRestrictions0 -> [5]{Read from debezium, assert_that} (5/16) (f7026658732c76dd966d4d8f0225645a) switched from RUNNING to FINISHED.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.flink.runtime.taskexecutor.TaskExecutor unregisterTaskAndNotifyFinalState'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Un-registering task and sending final execution state FINISHED to JobManager for task ToKeyedWorkItem (2/16)#0 5878f09bf848a0be73739354e281be8e.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.flink.runtime.executiongraph.Execution transitionState'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: ToKeyedWorkItem (7/16) (7a21416725ebec8a4089b2482819900e) switched from RUNNING to FINISHED.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.flink.runtime.taskexecutor.TaskExecutor unregisterTaskAndNotifyFinalState'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Un-registering task and sending final execution state FINISHED to JobManager for task [1]Read from debezium/ExternalTransform(beam:external:java:debezium:read:v1)/ParDo(KafkaSourceConsumer)/ParMultiDo(KafkaSourceConsumer)/ProcessSizedElementsAndRestrictions0 -> [5]{Read from debezium, assert_that} (12/16)#0 eb01b1babd6bbbbf480d631ac73d05ca.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.flink.runtime.executiongraph.Execution transitionState'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: [1]Read from debezium/ExternalTransform(beam:external:java:debezium:read:v1)/ParDo(KafkaSourceConsumer)/ParMultiDo(KafkaSourceConsumer)/ProcessSizedElementsAndRestrictions0 -> [5]{Read from debezium, assert_that} (13/16) (d02300f1cd9f4333113c17add106b9a3) switched from RUNNING to FINISHED.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.flink.runtime.taskexecutor.TaskExecutor unregisterTaskAndNotifyFinalState'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Un-registering task and sending final execution state FINISHED to JobManager for task [1]Read from debezium/ExternalTransform(beam:external:java:debezium:read:v1)/ParDo(KafkaSourceConsumer)/ParMultiDo(KafkaSourceConsumer)/ProcessSizedElementsAndRestrictions0 -> [5]{Read from debezium, assert_that} (14/16)#0 4faf30e5cf5a7e0a1699ce9f8d3af23a.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.flink.runtime.executiongraph.Execution transitionState'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: ToKeyedWorkItem (13/16) (f409bf966510f07790941bae1fe82116) switched from RUNNING to FINISHED.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.flink.runtime.taskexecutor.TaskExecutor unregisterTaskAndNotifyFinalState'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Un-registering task and sending final execution state FINISHED to JobManager for task ToKeyedWorkItem (4/16)#0 47323f90d98398bd028cba618bbcd704.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.flink.runtime.taskexecutor.TaskExecutor unregisterTaskAndNotifyFinalState'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Un-registering task and sending final execution state FINISHED to JobManager for task ToKeyedWorkItem (14/16)#0 3627c3b2c9dc945fc2b275aaaf0d1e93.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.flink.runtime.executiongraph.Execution transitionState'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: [1]Read from debezium/ExternalTransform(beam:external:java:debezium:read:v1)/ParDo(KafkaSourceConsumer)/ParMultiDo(KafkaSourceConsumer)/ProcessSizedElementsAndRestrictions0 -> [5]{Read from debezium, assert_that} (3/16) (3f259cb648fbb955826d9986631f3d2c) switched from RUNNING to FINISHED.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.flink.runtime.taskexecutor.TaskExecutor unregisterTaskAndNotifyFinalState'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Un-registering task and sending final execution state FINISHED to JobManager for task ToKeyedWorkItem (16/16)#0 ca9bbaab1b678aa6882de52c1b289034.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.flink.runtime.executiongraph.Execution transitionState'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: ToKeyedWorkItem (9/16) (2f6c4951ea9ec48a262586cebea801e3) switched from RUNNING to FINISHED.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.flink.runtime.taskexecutor.TaskExecutor unregisterTaskAndNotifyFinalState'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Un-registering task and sending final execution state FINISHED to JobManager for task ToKeyedWorkItem (1/16)#0 4f0915226ba599bcf5dce12ec0843b1b.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.flink.runtime.taskexecutor.TaskExecutor unregisterTaskAndNotifyFinalState'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Un-registering task and sending final execution state FINISHED to JobManager for task ToKeyedWorkItem (11/16)#0 c1585bc3543a1c18a1fdc87ef20eb24e.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.flink.runtime.taskexecutor.TaskExecutor unregisterTaskAndNotifyFinalState'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Un-registering task and sending final execution state FINISHED to JobManager for task [1]Read from debezium/ExternalTransform(beam:external:java:debezium:read:v1)/ParDo(KafkaSourceConsumer)/ParMultiDo(KafkaSourceConsumer)/ProcessSizedElementsAndRestrictions0 -> [5]{Read from debezium, assert_that} (10/16)#0 0e1fc04cae27f306c2a75ba95ef18d87.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.flink.runtime.executiongraph.Execution transitionState'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: ToKeyedWorkItem (15/16) (3e4e1f7a40b6382f334a5ea51248b2e8) switched from RUNNING to FINISHED.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.flink.runtime.executiongraph.Execution transitionState'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: [1]Read from debezium/ExternalTransform(beam:external:java:debezium:read:v1)/ParDo(KafkaSourceConsumer)/ParMultiDo(KafkaSourceConsumer)/ProcessSizedElementsAndRestrictions0 -> [5]{Read from debezium, assert_that} (11/16) (27cb6e071353f3315ea37c7ea7b837a7) switched from RUNNING to FINISHED.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.flink.runtime.taskexecutor.TaskExecutor unregisterTaskAndNotifyFinalState'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Un-registering task and sending final execution state FINISHED to JobManager for task ToKeyedWorkItem (10/16)#0 65b8d8fbc8d00691c655b9e501f0afb9.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.flink.runtime.executiongraph.Execution transitionState'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: ToKeyedWorkItem (2/16) (5878f09bf848a0be73739354e281be8e) switched from RUNNING to FINISHED.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.flink.runtime.taskexecutor.TaskExecutor unregisterTaskAndNotifyFinalState'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Un-registering task and sending final execution state FINISHED to JobManager for task [1]Read from debezium/ExternalTransform(beam:external:java:debezium:read:v1)/ParDo(KafkaSourceConsumer)/ParMultiDo(KafkaSourceConsumer)/ProcessSizedElementsAndRestrictions0 -> [5]{Read from debezium, assert_that} (16/16)#0 631cba9724ca0181458fad2926522be5.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.flink.runtime.taskexecutor.TaskExecutor unregisterTaskAndNotifyFinalState'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Un-registering task and sending final execution state FINISHED to JobManager for task [1]Read from debezium/ExternalTransform(beam:external:java:debezium:read:v1)/ParDo(KafkaSourceConsumer)/ParMultiDo(KafkaSourceConsumer)/ProcessSizedElementsAndRestrictions0 -> [5]{Read from debezium, assert_that} (1/16)#0 cbf49da408d2bed8f33f8677e2a2e0f0.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.flink.runtime.executiongraph.Execution transitionState'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: [1]Read from debezium/ExternalTransform(beam:external:java:debezium:read:v1)/ParDo(KafkaSourceConsumer)/ParMultiDo(KafkaSourceConsumer)/ProcessSizedElementsAndRestrictions0 -> [5]{Read from debezium, assert_that} (12/16) (eb01b1babd6bbbbf480d631ac73d05ca) switched from RUNNING to FINISHED.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.flink.runtime.taskexecutor.TaskExecutor unregisterTaskAndNotifyFinalState'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Un-registering task and sending final execution state FINISHED to JobManager for task ToKeyedWorkItem (5/16)#0 457f8a7a87303b999bf3e60b1dc5196a.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.flink.runtime.executiongraph.Execution transitionState'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: [1]Read from debezium/ExternalTransform(beam:external:java:debezium:read:v1)/ParDo(KafkaSourceConsumer)/ParMultiDo(KafkaSourceConsumer)/ProcessSizedElementsAndRestrictions0 -> [5]{Read from debezium, assert_that} (14/16) (4faf30e5cf5a7e0a1699ce9f8d3af23a) switched from RUNNING to FINISHED.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.flink.runtime.executiongraph.Execution transitionState'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: ToKeyedWorkItem (4/16) (47323f90d98398bd028cba618bbcd704) switched from RUNNING to FINISHED.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.flink.runtime.executiongraph.Execution transitionState'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: ToKeyedWorkItem (14/16) (3627c3b2c9dc945fc2b275aaaf0d1e93) switched from RUNNING to FINISHED.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.flink.runtime.executiongraph.Execution transitionState'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: ToKeyedWorkItem (16/16) (ca9bbaab1b678aa6882de52c1b289034) switched from RUNNING to FINISHED.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.flink.runtime.executiongraph.Execution transitionState'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: ToKeyedWorkItem (1/16) (4f0915226ba599bcf5dce12ec0843b1b) switched from RUNNING to FINISHED.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.flink.runtime.executiongraph.Execution transitionState'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: ToKeyedWorkItem (11/16) (c1585bc3543a1c18a1fdc87ef20eb24e) switched from RUNNING to FINISHED.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.flink.runtime.executiongraph.Execution transitionState'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: [1]Read from debezium/ExternalTransform(beam:external:java:debezium:read:v1)/ParDo(KafkaSourceConsumer)/ParMultiDo(KafkaSourceConsumer)/ProcessSizedElementsAndRestrictions0 -> [5]{Read from debezium, assert_that} (10/16) (0e1fc04cae27f306c2a75ba95ef18d87) switched from RUNNING to FINISHED.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.flink.runtime.executiongraph.Execution transitionState'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: ToKeyedWorkItem (10/16) (65b8d8fbc8d00691c655b9e501f0afb9) switched from RUNNING to FINISHED.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.flink.runtime.executiongraph.Execution transitionState'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: [1]Read from debezium/ExternalTransform(beam:external:java:debezium:read:v1)/ParDo(KafkaSourceConsumer)/ParMultiDo(KafkaSourceConsumer)/ProcessSizedElementsAndRestrictions0 -> [5]{Read from debezium, assert_that} (16/16) (631cba9724ca0181458fad2926522be5) switched from RUNNING to FINISHED.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.flink.runtime.executiongraph.Execution transitionState'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: [1]Read from debezium/ExternalTransform(beam:external:java:debezium:read:v1)/ParDo(KafkaSourceConsumer)/ParMultiDo(KafkaSourceConsumer)/ProcessSizedElementsAndRestrictions0 -> [5]{Read from debezium, assert_that} (1/16) (cbf49da408d2bed8f33f8677e2a2e0f0) switched from RUNNING to FINISHED.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.flink.runtime.executiongraph.Execution transitionState'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: ToKeyedWorkItem (5/16) (457f8a7a87303b999bf3e60b1dc5196a) switched from RUNNING to FINISHED.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'WARNING: Using configuration property "column.whitelist" is deprecated and will be removed in future versions. Please use "column.include.list" instead.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'WARNING: Using configuration property "column.blacklist" is deprecated and will be removed in future versions. Please use "column.exclude.list" instead.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'WARNING: Using configuration property "column.whitelist" is deprecated and will be removed in future versions. Please use "column.include.list" instead.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'WARNING: Using configuration property "column.blacklist" is deprecated and will be removed in future versions. Please use "column.exclude.list" instead.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'WARNING: Using configuration property "column.whitelist" is deprecated and will be removed in future versions. Please use "column.include.list" instead.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'WARNING: Using configuration property "column.blacklist" is deprecated and will be removed in future versions. Please use "column.exclude.list" instead.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'WARNING: Using configuration property "column.whitelist" is deprecated and will be removed in future versions. Please use "column.include.list" instead.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'WARNING: Using configuration property "column.blacklist" is deprecated and will be removed in future versions. Please use "column.exclude.list" instead.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'WARNING: Using configuration property "table.whitelist" is deprecated and will be removed in future versions. Please use "table.include.list" instead.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'WARNING: Using configuration property "table.blacklist" is deprecated and will be removed in future versions. Please use "table.exclude.list" instead.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'WARNING: Using configuration property "table.whitelist" is deprecated and will be removed in future versions. Please use "table.include.list" instead.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'WARNING: Using configuration property "table.blacklist" is deprecated and will be removed in future versions. Please use "table.exclude.list" instead.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'WARNING: Using configuration property "schema.whitelist" is deprecated and will be removed in future versions. Please use "schema.include.list" instead.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'WARNING: Using configuration property "schema.blacklist" is deprecated and will be removed in future versions. Please use "schema.exclude.list" instead.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'WARNING: Using configuration property "schema.whitelist" is deprecated and will be removed in future versions. Please use "schema.include.list" instead.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'WARNING: Using configuration property "schema.blacklist" is deprecated and will be removed in future versions. Please use "schema.exclude.list" instead.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Starting PostgresConnectorTask with configuration:'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    connector.class = io.debezium.connector.postgresql.PostgresConnector'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.dbname = inventory'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.user = debezium'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.hostname = localhost'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.password = ********'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    beam.parent.instance = 29779206'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.server.name = dbserver1'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.history = org.apache.beam.io.debezium.KafkaSourceConsumerFn$DebeziumSDFDatabaseHistory'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    include.schema.changes = false'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.include.list = inventory'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.port = 39362'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'WARNING: Using configuration property "schema.whitelist" is deprecated and will be removed in future versions. Please use "schema.include.list" instead.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'WARNING: Using configuration property "schema.blacklist" is deprecated and will be removed in future versions. Please use "schema.exclude.list" instead.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'WARNING: Using configuration property "table.whitelist" is deprecated and will be removed in future versions. Please use "table.include.list" instead.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'WARNING: Using configuration property "table.blacklist" is deprecated and will be removed in future versions. Please use "table.exclude.list" instead.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:18 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'WARNING: Using configuration property "column.blacklist" is deprecated and will be removed in future versions. Please use "column.exclude.list" instead.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'WARNING: Using configuration property "schema.blacklist" is deprecated and will be removed in future versions. Please use "schema.exclude.list" instead.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'WARNING: Using configuration property "table.blacklist" is deprecated and will be removed in future versions. Please use "table.exclude.list" instead.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'WARNING: Using configuration property "table.whitelist" is deprecated and will be removed in future versions. Please use "table.include.list" instead.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'WARNING: Using configuration property "schema.whitelist" is deprecated and will be removed in future versions. Please use "schema.include.list" instead.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'WARNING: Using configuration property "column.whitelist" is deprecated and will be removed in future versions. Please use "column.include.list" instead.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'WARNING: Using configuration property "column.blacklist" is deprecated and will be removed in future versions. Please use "column.exclude.list" instead.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'WARNING: Using configuration property "schema.blacklist" is deprecated and will be removed in future versions. Please use "schema.exclude.list" instead.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'WARNING: Using configuration property "table.blacklist" is deprecated and will be removed in future versions. Please use "table.exclude.list" instead.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'WARNING: Using configuration property "table.whitelist" is deprecated and will be removed in future versions. Please use "table.include.list" instead.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'WARNING: Using configuration property "schema.whitelist" is deprecated and will be removed in future versions. Please use "schema.include.list" instead.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'WARNING: Using configuration property "column.whitelist" is deprecated and will be removed in future versions. Please use "column.include.list" instead.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'WARNING: Using configuration property "column.blacklist" is deprecated and will be removed in future versions. Please use "column.exclude.list" instead.'
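
The WARNING lines above all come from Debezium: the connector is still being handed the legacy filter property names, which are accepted but reported as deprecated in favour of the include/exclude spellings. As a purely illustrative sketch (not tied to Beam's DebeziumIO API), the renaming is one-to-one and can be applied mechanically to a connector configuration:

    # Hypothetical helper: maps the deprecated Debezium filter keys seen in the
    # warnings above to their replacements. Keys not listed are left untouched.
    RENAMED_PROPERTIES = {
        "column.whitelist": "column.include.list",
        "column.blacklist": "column.exclude.list",
        "table.whitelist": "table.include.list",
        "table.blacklist": "table.exclude.list",
        "schema.whitelist": "schema.include.list",
        "schema.blacklist": "schema.exclude.list",
    }

    def modernize(config):
        """Return a copy of a connector config dict with legacy keys renamed."""
        return {RENAMED_PROPERTIES.get(key, key): value
                for key, value in config.items()}
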
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: user 'debezium' connected to database 'inventory' on PostgreSQL 11.13 (Debian 11.13-1.pgdg90+1) on x86_64-pc-linux-gnu, compiled by gcc (Debian 6.3.0-18+deb9u1) 6.3.0 20170516, 64-bit with roles:"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"\trole 'pg_read_all_settings' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"\trole 'pg_stat_scan_tables' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"\trole 'pg_write_server_files' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"\trole 'debezium' [superuser: true, replication: true, inherit: true, create role: true, create db: true, can log in: true]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"\trole 'pg_monitor' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"\trole 'pg_read_server_files' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"\trole 'pg_execute_server_program' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"\trole 'pg_read_all_stats' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"\trole 'pg_signal_backend' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Obtained valid replication slot ReplicationSlot [active=false, latestFlushedLsn=null, catalogXmin=null]'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: No previous offset found'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Taking initial snapshot for new datasource'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Creating replication slot with command CREATE_REPLICATION_SLOT debezium  LOGICAL decoderbufs'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Requested thread factory for connector PostgresConnector, id = dbserver1 named = change-event-source-coordinator'

> Task :sdks:python:test-suites:dataflow:py36:postCommitIT

[gw0] PASSED apache_beam/io/gcp/bigquery_io_read_it_test.py::BigqueryIOReadIT::test_bigquery_read_custom_1M_python 
apache_beam/io/gcp/gcsio_integration_test.py::GcsIOIntegrationTest::test_copy 
[gw0] PASSED apache_beam/io/gcp/gcsio_integration_test.py::GcsIOIntegrationTest::test_copy 
apache_beam/io/gcp/gcsio_integration_test.py::GcsIOIntegrationTest::test_copy_batch 
[gw0] PASSED apache_beam/io/gcp/gcsio_integration_test.py::GcsIOIntegrationTest::test_copy_batch 
apache_beam/io/gcp/gcsio_integration_test.py::GcsIOIntegrationTest::test_copy_batch_kms 
[gw0] PASSED apache_beam/io/gcp/gcsio_integration_test.py::GcsIOIntegrationTest::test_copy_batch_kms 
apache_beam/io/gcp/gcsio_integration_test.py::GcsIOIntegrationTest::test_copy_batch_rewrite_token 
[gw0] SKIPPED apache_beam/io/gcp/gcsio_integration_test.py::GcsIOIntegrationTest::test_copy_batch_rewrite_token 
apache_beam/io/gcp/gcsio_integration_test.py::GcsIOIntegrationTest::test_copy_kms 
> Task :sdks:python:test-suites:portable:py36:postCommitPy36IT
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Creating thread debezium-postgresconnector-dbserver1-change-event-source-coordinator'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Metrics registered'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Context created'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Taking initial snapshot for new datasource'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: According to the connector configuration data will be snapshotted'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshot step 1 - Preparing'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Setting isolation level'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Opening transaction with statement SET TRANSACTION ISOLATION LEVEL SERIALIZABLE, READ ONLY, DEFERRABLE;'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshot step 2 - Determining captured tables'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'WARNING: Using configuration property "table.whitelist" is deprecated and will be removed in future versions. Please use "table.include.list" instead.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshot step 3 - Locking captured tables'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: Waiting a maximum of '10' seconds for each table lock"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshot step 4 - Determining snapshot offset'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Creating initial offset context'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: Read xlogStart at 'LSN{0/2078000}' from transaction '602'"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: Read xlogStart at 'LSN{0/2078000}' from transaction '602'"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshot step 5 - Reading structure of captured tables'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: Reading structure of schema 'inventory'"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:20 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshot step 6 - Persisting schema history'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:20 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
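
Snapshot steps 1-6 above are Debezium's standard initial-snapshot sequence: prepare and lock the captured tables, record the WAL position to resume streaming from (the xlogStart LSN), then read and persist the table structure before change streaming begins. A minimal sketch of the kind of read-only, serializable, deferrable transaction the connector opens for this, assuming psycopg2 is available; the password and table name are placeholders, while host, port, database and user match the configuration logged above:

    import psycopg2

    conn = psycopg2.connect(host="localhost", port=39362, dbname="inventory",
                            user="debezium", password="...")  # placeholder password
    with conn.cursor() as cur:
        # Same statement Debezium logs under "Snapshot step 1 - Preparing".
        cur.execute(
            "SET TRANSACTION ISOLATION LEVEL SERIALIZABLE, READ ONLY, DEFERRABLE;")
        # Inside this transaction the connector reads each captured table;
        # "inventory.customers" is only a placeholder here.
        cur.execute("SELECT * FROM inventory.customers LIMIT 5;")
        rows = cur.fetchall()
    conn.rollback()
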

> Task :sdks:python:test-suites:portable:py36:postCommitPy36IT
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Creating thread debezium-postgresconnector-dbserver1-change-event-source-coordinator'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Metrics registered'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Context created'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Taking initial snapshot for new datasource'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 08, 2021 6:46:19 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
java.lang.OutOfMemoryError: GC overhead limit exceeded
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
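
The java.lang.OutOfMemoryError above ("GC overhead limit exceeded") is the JVM giving up after spending almost all of its time in garbage collection while reclaiming almost no memory, and it is what flipped this run to FAILURE. The log does not say which JVM ran out (the Gradle build JVM or one of the Java services the tests launch), but if it is the Gradle JVM, one common mitigation is to raise its heap in gradle.properties, for example:

    # gradle.properties -- illustrative values, not the project's actual settings
    org.gradle.jvmargs=-Xmx4g -XX:+HeapDumpOnOutOfMemoryError
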


---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python36 #4451

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python36/4451/display/redirect?page=changes>

Changes:

[noreply] Revert "[BEAM-12993] Update to Debezium 1.7.0.Final (#15636)"

[kawaigin] Updated screendiff integration test golden screenshots.

[noreply] [BEAM-12769] Few fixes related to Java Class Lookup based cross-language


------------------------------------------
[...truncated 20.61 MB...]
INFO:root:severity: INFO
timestamp {
  seconds: 1633652634
  nanos: 725547313
}
message: "Renamed 1 shards in 0.12 seconds."
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/io/filebasedsink.py:348"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1633652634
  nanos: 750567197
}
message: "No more requests from control plane"
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py:256"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633652634
  nanos: 750779867
}
message: "SDK Harness waiting for in-flight requests to complete"
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py:257"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633652634
  nanos: 750883340
}
message: "Closing all cached grpc data channels."
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/data_plane.py:782"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633652634
  nanos: 750987529
}
message: "Closing all cached gRPC state handlers."
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py:902"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633652634
  nanos: 751507520
}
message: "Done consuming work."
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py:269"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633652634
  nanos: 751646041
}
message: "Python sdk harness exiting."
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker_main.py:154"
thread: "MainThread"

INFO:apache_beam.runners.portability.local_job_service:Successfully completed job in 25.54339075088501 seconds.
INFO:root:Successfully completed job in 25.54339075088501 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py36:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:38489
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.6_sdk:2.35.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7fa9fab9d950> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7fa9fab9d9d8> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7fa9fab9e158> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.35.0-SNAPSHOT.jar'> '--spark-master-url' 'local[4]' '--artifacts-dir' '/tmp/beam-templzjdgt_s/artifactstl339hy6' '--job-port' '45005' '--artifact-port' '0' '--expansion-port' '0']
WARNING:root:Waiting for grpc channel to be ready at localhost:45005.
INFO:apache_beam.utils.subprocess_server:b'21/10/08 00:24:03 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:43231'
WARNING:root:Waiting for grpc channel to be ready at localhost:45005.
INFO:apache_beam.utils.subprocess_server:b'21/10/08 00:24:03 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:40131'
INFO:apache_beam.utils.subprocess_server:b'21/10/08 00:24:03 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on localhost:45005'
INFO:apache_beam.utils.subprocess_server:b'21/10/08 00:24:03 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Job server now running, terminate with Ctrl+C'
WARNING:root:Waiting for grpc channel to be ready at localhost:45005.
WARNING:root:Waiting for grpc channel to be ready at localhost:45005.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2']
INFO:apache_beam.utils.subprocess_server:b'21/10/08 00:24:08 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for job_d3f55d20-3ee0-4f87-94fb-07d006d3d1b1.'
INFO:apache_beam.utils.subprocess_server:b'21/10/08 00:24:08 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for job_d3f55d20-3ee0-4f87-94fb-07d006d3d1b1.ref_Environment_default_environment_1.'
INFO:apache_beam.utils.subprocess_server:b'21/10/08 00:24:08 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 1 artifacts for job_d3f55d20-3ee0-4f87-94fb-07d006d3d1b1.null.'
INFO:apache_beam.utils.subprocess_server:b'21/10/08 00:24:08 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for job_d3f55d20-3ee0-4f87-94fb-07d006d3d1b1.'
INFO:apache_beam.utils.subprocess_server:b'21/10/08 00:24:09 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1008002409-1a5d1202_7ef23aa1-6d14-4b5e-b5c7-f3ae155ac843'
INFO:apache_beam.utils.subprocess_server:b'21/10/08 00:24:09 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1008002409-1a5d1202_7ef23aa1-6d14-4b5e-b5c7-f3ae155ac843'
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for the execution. Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
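
The "LOOPBACK" hint above is Beam's reminder that in LOOPBACK mode the worker code runs inside this same Python process, so the pipeline should be run as a context manager (or run().wait_until_finish() called explicitly) so the process does not exit before the job finishes. A minimal sketch against the portable job server started earlier in this log; the job endpoint and environment type match the log, while the input data is a placeholder:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        "--runner=PortableRunner",
        "--job_endpoint=localhost:45005",   # JobService port reported above
        "--environment_type=LOOPBACK",
    ])

    # Exiting the `with` block runs the pipeline and waits for it to finish.
    with beam.Pipeline(options=options) as p:
        (p
         | beam.Create(["to be or not to be"])     # placeholder input
         | beam.FlatMap(str.split)
         | beam.Map(lambda word: (word, 1))
         | beam.CombinePerKey(sum)
         | beam.Map(print))
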
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:apache_beam.utils.subprocess_server:b'21/10/08 00:24:10 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.'
INFO:apache_beam.utils.subprocess_server:b'21/10/08 00:24:12 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable'
INFO:apache_beam.utils.subprocess_server:b"21/10/08 00:24:15 WARN org.apache.spark.util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041."
INFO:apache_beam.utils.subprocess_server:b'21/10/08 00:24:16 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator:'
INFO:apache_beam.utils.subprocess_server:b'21/10/08 00:24:16 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()'
INFO:apache_beam.utils.subprocess_server:b'21/10/08 00:24:16 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1008002409-1a5d1202_7ef23aa1-6d14-4b5e-b5c7-f3ae155ac843 on Spark master local[4]'
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:41321.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:b'21/10/08 00:24:18 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1'
INFO:apache_beam.utils.subprocess_server:b'21/10/08 00:24:18 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-2'
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:37689.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:43405
INFO:apache_beam.utils.subprocess_server:b'21/10/08 00:24:18 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.'
INFO:apache_beam.utils.subprocess_server:b'21/10/08 00:24:18 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-3'
INFO:apache_beam.utils.subprocess_server:b'21/10/08 00:24:18 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.'
INFO:apache_beam.utils.subprocess_server:b'21/10/08 00:24:18 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-4'
INFO:apache_beam.utils.subprocess_server:b'21/10/08 00:24:19 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-5'
INFO:apache_beam.utils.subprocess_server:b'21/10/08 00:24:19 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-6'
INFO:apache_beam.utils.subprocess_server:b'21/10/08 00:24:19 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-7'
INFO:apache_beam.utils.subprocess_server:b'21/10/08 00:24:19 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-8'
INFO:apache_beam.utils.subprocess_server:b'21/10/08 00:24:19 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-9'
INFO:apache_beam.utils.subprocess_server:b'21/10/08 00:24:19 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10'
INFO:apache_beam.utils.subprocess_server:b'21/10/08 00:24:19 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11'
INFO:apache_beam.utils.subprocess_server:b'21/10/08 00:24:19 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13'
INFO:apache_beam.utils.subprocess_server:b'21/10/08 00:24:19 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12'
INFO:apache_beam.utils.subprocess_server:b'21/10/08 00:24:19 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14'
INFO:apache_beam.utils.subprocess_server:b'21/10/08 00:24:19 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15'
INFO:apache_beam.utils.subprocess_server:b'21/10/08 00:24:19 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1008002409-1a5d1202_7ef23aa1-6d14-4b5e-b5c7-f3ae155ac843: Pipeline translated successfully. Computing outputs'
INFO:apache_beam.utils.subprocess_server:b'21/10/08 00:24:19 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16'
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.15 seconds.
INFO:apache_beam.utils.subprocess_server:b'21/10/08 00:24:19 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1008002409-1a5d1202_7ef23aa1-6d14-4b5e-b5c7-f3ae155ac843 finished.'
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1633652660.541537978","description":"Error received from peer ipv4:127.0.0.1:43405","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.6/threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1633652660.541537978","description":"Error received from peer ipv4:127.0.0.1:43405","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.6/threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 1033, in pull_responses
    for response in responses:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1633652660.541548988","description":"Error received from peer ipv4:127.0.0.1:37689","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.6/threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 246, in run
    for work_request in self._control_stub.Control(get_responses()):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1633652660.541553491","description":"Error received from peer ipv4:127.0.0.1:41321","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>
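
The three tracebacks above are all raised during teardown: the job has already reached DONE, the runner shuts down its data, state and control gRPC services, and the SDK-harness reader threads that were still blocked on those streams see the closed channel as StatusCode.UNAVAILABLE / "Socket closed". A minimal sketch of how that kind of shutdown error surfaces from the grpc package and can be told apart from a real failure (the helper name is made up; this is not Beam's internal handling):

    import grpc

    def drain(response_stream):
        """Consume a gRPC response stream, treating server teardown as a clean stop."""
        try:
            for element in response_stream:
                yield element
        except grpc.RpcError as err:
            # UNAVAILABLE ("Socket closed") is what the tracebacks above show when
            # the server side goes away first; anything else is a genuine error.
            if err.code() != grpc.StatusCode.UNAVAILABLE:
                raise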


> Task :sdks:python:test-suites:dataflow:py36:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/portable/common.gradle'> line: 225

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py36:postCommitPy36IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
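
As the "Try:" section above suggests, re-running just the failing task with more verbosity is usually the quickest way to surface the underlying error; for this failure that would look something like the following, assuming it is run from the Beam source checkout:

    ./gradlew :sdks:python:test-suites:portable:py36:postCommitPy36IT --stacktrace --info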

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 49m 48s
211 actionable tasks: 154 executed, 53 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/ffnvyewwkddic

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python36 #4450

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python36/4450/display/redirect?page=changes>

Changes:

[noreply] [BEAM-13015] Remove the overhead of SpecMonitoringInfoValidator

[noreply] Minor: Replace generic external.py links in multi-language documentation


------------------------------------------
[...truncated 29.98 MB...]
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2109: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    experiments = p.options.view_as(DebugOptions).experiments or []
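
The BeamDeprecationWarning repeated throughout this summary is about reading settings back off an already-constructed pipeline via <pipeline>.options (or pcoll.pipeline.options, as in the bigquery.py lines quoted here); the usual replacement is to build a PipelineOptions object up front, keep a reference to it, and read views from that object rather than from the pipeline. A hedged sketch, with project and bucket names as placeholders:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import (GoogleCloudOptions,
                                                      PipelineOptions)

    options = PipelineOptions()
    gcp = options.view_as(GoogleCloudOptions)
    gcp.project = "my-project"                 # placeholder
    gcp.temp_location = "gs://my-bucket/tmp"   # placeholder

    # Read settings from `options` directly instead of reaching back through
    # pipeline.options after construction.
    with beam.Pipeline(options=options) as p:
        p | beam.Create([1, 2, 3]) | beam.Map(print)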

apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2395: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2397: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2428
apache_beam/io/gcp/bigquery.py:2428
apache_beam/io/gcp/bigquery.py:2428
apache_beam/io/gcp/bigquery.py:2428
apache_beam/io/gcp/bigquery.py:2428
apache_beam/io/gcp/bigquery.py:2428
apache_beam/io/gcp/bigquery.py:2428
apache_beam/io/gcp/bigquery.py:2428
apache_beam/io/gcp/bigquery.py:2428
apache_beam/io/gcp/bigquery.py:2428
apache_beam/io/gcp/bigquery.py:2428
apache_beam/io/gcp/bigquery.py:2428
apache_beam/io/gcp/bigquery.py:2428
apache_beam/io/gcp/bigquery.py:2428
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2428: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    | _PassThroughThenCleanup(files_to_remove_pcoll))

apache_beam/dataframe/io.py:572
apache_beam/dataframe/io.py:572
apache_beam/dataframe/io.py:572
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/dataframe/io.py>:572: FutureWarning: WriteToFiles is experimental.
    sink=lambda _: _WriteToPandasFileSink(

apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2099: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1112: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1114: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/io/gcp/tests/utils.py:100
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/avro/schema.py>:1251
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/avro/schema.py>:1251: DeprecationWarning: `Parse` is deprecated in avro 1.9.2. Please use `parse` (lowercase) instead.
    DeprecationWarning)

apache_beam/io/gcp/bigquery_test.py:1124
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1124: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    streaming = self.test_pipeline.options.view_as(StandardOptions).streaming

apache_beam/io/gcp/bigquery_read_it_test.py:167
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:167: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
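
BigQuerySource, flagged here as deprecated since 2.25.0, is replaced by ReadFromBigQuery, which the warning itself points to. A minimal migration sketch; the query is a placeholder rather than the one used by these tests, and query reads may also need a GCS temp location via --temp_location or gcs_location:

    import apache_beam as beam

    with beam.Pipeline() as p:
        # Previously: beam.io.Read(beam.io.BigQuerySource(query=..., use_standard_sql=True))
        (p
         | "ReadFromBQ" >> beam.io.ReadFromBigQuery(
               query="SELECT 1 AS x",     # placeholder query
               use_standard_sql=True)
         | beam.Map(print))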

apache_beam/ml/gcp/cloud_dlp_it_test.py:77
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:77: FutureWarning: MaskDetectedDetails is experimental.
    inspection_config=INSPECT_CONFIG))

apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:84: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    kms_key=kms_key))

apache_beam/runners/dataflow/ptransform_overrides.py:323
apache_beam/runners/dataflow/ptransform_overrides.py:323
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:323: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
    kms_key=self.kms_key))

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:181
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:181: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/ml/gcp/cloud_dlp_it_test.py:87
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:87: FutureWarning: InspectForDetails is experimental.
    | beam.ParDo(extract_inspection_results).with_outputs(

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:162
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:162: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/bigquery_read_it_test.py:556
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:556: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:126
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:126: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

apache_beam/io/gcp/bigquery_read_it_test.py:670
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:670: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())

apache_beam/io/gcp/bigquery.py:2569
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2569: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2570
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2570: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2583
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2583: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:124
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:124: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:113
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:113: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])
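
For the experimental ReadFromSpanner transform flagged in the two warnings above, a
minimal sketch with placeholder project/instance/database names; per the quoted test
code, either sql= or table=/columns= can be supplied:

  import apache_beam as beam
  from apache_beam.io.gcp.experimental.spannerio import ReadFromSpanner

  with beam.Pipeline() as p:
      _ = (p
           | ReadFromSpanner(
               project_id='my-project',       # placeholder identifiers
               instance_id='my-instance',
               database_id='my-database',
               sql='select * from Users')
           | beam.Map(print))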

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/pytest_postCommitIT-df-py36.xml> -
======= 1 failed, 62 passed, 11 skipped, 180 warnings in 5837.41 seconds =======

> Task :sdks:python:test-suites:dataflow:py36:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 225

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py36:postCommitPy36IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 39m 54s
211 actionable tasks: 159 executed, 48 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/7p4iigtqcapse

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python36 #4449

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python36/4449/display/redirect>

Changes:


------------------------------------------
[...truncated 81.40 MB...]
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2109: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    experiments = p.options.view_as(DebugOptions).experiments or []

apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2395: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2397: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2428
apache_beam/io/gcp/bigquery.py:2428
apache_beam/io/gcp/bigquery.py:2428
apache_beam/io/gcp/bigquery.py:2428
apache_beam/io/gcp/bigquery.py:2428
apache_beam/io/gcp/bigquery.py:2428
apache_beam/io/gcp/bigquery.py:2428
apache_beam/io/gcp/bigquery.py:2428
apache_beam/io/gcp/bigquery.py:2428
apache_beam/io/gcp/bigquery.py:2428
apache_beam/io/gcp/bigquery.py:2428
apache_beam/io/gcp/bigquery.py:2428
apache_beam/io/gcp/bigquery.py:2428
apache_beam/io/gcp/bigquery.py:2428
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2428: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    | _PassThroughThenCleanup(files_to_remove_pcoll))

apache_beam/dataframe/io.py:572
apache_beam/dataframe/io.py:572
apache_beam/dataframe/io.py:572
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/dataframe/io.py>:572: FutureWarning: WriteToFiles is experimental.
    sink=lambda _: _WriteToPandasFileSink(

apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2099: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1112: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1114: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/io/gcp/tests/utils.py:100
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)
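
The replacement this google-cloud-bigquery warning suggests looks roughly like the
following; the project, dataset and table names are placeholders.

  from google.cloud import bigquery

  # Instead of client.dataset(dataset_id).table(table_id):
  table_ref = bigquery.TableReference.from_string('my_project.my_dataset.my_table')
  # or, equivalently, via an explicit DatasetReference:
  table_ref = bigquery.DatasetReference('my_project', 'my_dataset').table('my_table')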

<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/avro/schema.py>:1251
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/avro/schema.py>:1251: DeprecationWarning: `Parse` is deprecated in avro 1.9.2. Please use `parse` (lowercase) instead.
    DeprecationWarning)
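
The avro warning above is only a rename: parse (lowercase) accepts the same JSON schema
string that Parse did. A minimal sketch with a placeholder schema:

  import avro.schema

  schema_json = '{"type": "record", "name": "Rec", "fields": [{"name": "x", "type": "int"}]}'
  schema = avro.schema.parse(schema_json)   # lowercase parse instead of the deprecated Parse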

apache_beam/io/gcp/bigquery_test.py:1124
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1124: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    streaming = self.test_pipeline.options.view_as(StandardOptions).streaming

apache_beam/io/gcp/bigquery_read_it_test.py:167
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:167: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:84: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    kms_key=kms_key))

apache_beam/runners/dataflow/ptransform_overrides.py:323
apache_beam/runners/dataflow/ptransform_overrides.py:323
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:323: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
    kms_key=self.kms_key))

apache_beam/io/gcp/bigquery_read_it_test.py:556
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:556: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:124
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:124: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/ml/gcp/cloud_dlp_it_test.py:77
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:77: FutureWarning: MaskDetectedDetails is experimental.
    inspection_config=INSPECT_CONFIG))

apache_beam/io/gcp/bigquery_read_it_test.py:670
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:670: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())

apache_beam/io/gcp/bigquery.py:2569
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2569: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2570
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2570: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2583
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2583: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:113
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:113: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/ml/gcp/cloud_dlp_it_test.py:87
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:87: FutureWarning: InspectForDetails is experimental.
    | beam.ParDo(extract_inspection_results).with_outputs(

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:181
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:181: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:162
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:162: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:126
  <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:126: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/pytest_postCommitIT-df-py36.xml> -
============ 63 passed, 11 skipped, 186 warnings in 6962.37 seconds ============

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 225

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py36:postCommitPy36IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 59m 56s
211 actionable tasks: 147 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/a5fit42uxjtn4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python36 #4448

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python36/4448/display/redirect?page=changes>

Changes:

[david.prieto] [BEAM-12950] Not delete orphaned files to avoid missing events

[david.prieto] [BEAM-12950] Add Bug fix description to CHANGES.md

[david.prieto] [BEAM-12950] fix linter issues

[david.prieto] [BEAM-12950] Skip unit test


------------------------------------------
[...truncated 54.25 MB...]
thread: "Thread-14"

INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((ref_AppliedPTransform_add-points-Impulse_3)+(ref_AppliedPTransform_add-points-FlatMap-lambda-at-core-py-3222-_4))+(ref_AppliedPTransform_add-points-Map-decode-_6))+(ref_AppliedPTransform_Map-get_julia_set_point_color-_7))+(ref_AppliedPTransform_x-coord-key_8))+(x coord/Write)
INFO:root:Running (((((ref_AppliedPTransform_add-points-Impulse_3)+(ref_AppliedPTransform_add-points-FlatMap-lambda-at-core-py-3222-_4))+(ref_AppliedPTransform_add-points-Map-decode-_6))+(ref_AppliedPTransform_Map-get_julia_set_point_color-_7))+(ref_AppliedPTransform_x-coord-key_8))+(x coord/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((x coord/Read)+(ref_AppliedPTransform_format_10))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WindowInto-WindowIntoFn-_20))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WriteBundles_21))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Pair_22))+(WriteToText/Write/WriteImpl/GroupByKey/Write)
INFO:root:Running (((((x coord/Read)+(ref_AppliedPTransform_format_10))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WindowInto-WindowIntoFn-_20))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WriteBundles_21))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Pair_22))+(WriteToText/Write/WriteImpl/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((WriteToText/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Extract_24))+(ref_PCollection_PCollection_16/Write)
INFO:root:Running ((WriteToText/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Extract_24))+(ref_PCollection_PCollection_16/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:root:Running ((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-FinalizeWrite_26)
INFO:root:Running (ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-FinalizeWrite_26)
INFO:root:severity: INFO
timestamp {
  seconds: 1633588555
  nanos: 611454963
}
message: "Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1"
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/io/filebasedsink.py:303"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1633588555
  nanos: 766582012
}
message: "Renamed 1 shards in 0.15 seconds."
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/io/filebasedsink.py:348"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1633588555
  nanos: 787149667
}
message: "No more requests from control plane"
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py:256"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633588555
  nanos: 787340402
}
message: "SDK Harness waiting for in-flight requests to complete"
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py:257"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633588555
  nanos: 787430286
}
message: "Closing all cached grpc data channels."
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/data_plane.py:782"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633588555
  nanos: 787506103
}
message: "Closing all cached gRPC state handlers."
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py:902"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633588555
  nanos: 788505315
}
message: "Done consuming work."
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py:269"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633588555
  nanos: 788610458
}
message: "Python sdk harness exiting."
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker_main.py:154"
thread: "MainThread"

INFO:apache_beam.runners.portability.local_job_service:Successfully completed job in 18.605895042419434 seconds.
INFO:root:Successfully completed job in 18.605895042419434 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py36:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:46643
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.6_sdk:2.35.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7f8af3a0e6a8> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7f8af3a0e730> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7f8af3a0ee18> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.35.0-SNAPSHOT.jar'> '--spark-master-url' 'local[4]' '--artifacts-dir' '/tmp/beam-tempudd7zk66/artifactsmm4b8hij' '--job-port' '46345' '--artifact-port' '0' '--expansion-port' '0']
INFO:apache_beam.utils.subprocess_server:b'21/10/07 06:36:00 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:42947'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 06:36:00 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:40161'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 06:36:00 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on localhost:46345'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 06:36:00 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Job server now running, terminate with Ctrl+C'
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2']
INFO:apache_beam.utils.subprocess_server:b'21/10/07 06:36:01 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for job_d7b4a543-57a4-496d-b191-1944ca11763c.'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 06:36:01 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for job_d7b4a543-57a4-496d-b191-1944ca11763c.ref_Environment_default_environment_1.'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 06:36:01 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 1 artifacts for job_d7b4a543-57a4-496d-b191-1944ca11763c.null.'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 06:36:01 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for job_d7b4a543-57a4-496d-b191-1944ca11763c.'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 06:36:01 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1007063601-2128a7c4_b6b61325-dd6a-4abe-a95d-d2fc1725b151'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 06:36:01 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1007063601-2128a7c4_b6b61325-dd6a-4abe-a95d-d2fc1725b151'
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for the execution. Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
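
The p.apply(..) spelling in that runner hint is the Java-style form; in the Python SDK
the same advice, running the pipeline inside a with block so the program waits for it to
finish, looks roughly like this sketch (the input data is a placeholder):

  import apache_beam as beam
  from apache_beam.options.pipeline_options import PipelineOptions

  with beam.Pipeline(options=PipelineOptions()) as p:
      _ = (p
           | beam.Create(['to', 'be', 'or', 'not', 'to', 'be'])   # placeholder input
           | beam.combiners.Count.PerElement()
           | beam.Map(print))
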
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:apache_beam.utils.subprocess_server:b'21/10/07 06:36:01 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 06:36:02 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 06:36:02 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator:'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 06:36:02 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 06:36:02 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1007063601-2128a7c4_b6b61325-dd6a-4abe-a95d-d2fc1725b151 on Spark master local[4]'
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:41535.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:b'21/10/07 06:36:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 06:36:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-2'
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:45273.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:46043
INFO:apache_beam.utils.subprocess_server:b'21/10/07 06:36:03 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 06:36:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-3'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 06:36:04 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 06:36:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-4'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 06:36:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-5'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 06:36:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-8'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 06:36:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-9'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 06:36:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-7'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 06:36:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-6'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 06:36:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 06:36:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 06:36:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 06:36:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 06:36:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 06:36:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 06:36:04 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1007063601-2128a7c4_b6b61325-dd6a-4abe-a95d-d2fc1725b151: Pipeline translated successfully. Computing outputs'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 06:36:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16'
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.17 seconds.
INFO:apache_beam.utils.subprocess_server:b'21/10/07 06:36:05 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1007063601-2128a7c4_b6b61325-dd6a-4abe-a95d-d2fc1725b151 finished.'
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1633588565.494371723","description":"Error received from peer ipv4:127.0.0.1:46043","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.6/threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1633588565.494371723","description":"Error received from peer ipv4:127.0.0.1:46043","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.6/threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 246, in run
    for work_request in self._control_stub.Control(get_responses()):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1633588565.494408583","description":"Error received from peer ipv4:127.0.0.1:41535","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.6/threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 1033, in pull_responses
    for response in responses:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1633588565.494396456","description":"Error received from peer ipv4:127.0.0.1:45273","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>


FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 225

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py36:postCommitPy36IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 43m 11s
211 actionable tasks: 153 executed, 54 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/2n5lkb6diohyy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


beam_PostCommit_Python36 - Build # 4447 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python36 - Build # 4447 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python36/4447/ to view the results.

beam_PostCommit_Python36 - Build # 4446 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python36 - Build # 4446 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python36/4446/ to view the results.

Build failed in Jenkins: beam_PostCommit_Python36 #4445

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python36/4445/display/redirect>

Changes:


------------------------------------------
[...truncated 51.14 MB...]

INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((ref_AppliedPTransform_WriteToText-Write-WriteImpl-DoOnce-Impulse_15)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-DoOnce-FlatMap-lambda-at-core-py-2965-_16))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-DoOnce-Map-decode-_18))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-InitializeWrite_19))+(ref_PCollection_PCollection_10/Write))+(ref_PCollection_PCollection_11/Write)
INFO:root:Running (((((ref_AppliedPTransform_WriteToText-Write-WriteImpl-DoOnce-Impulse_15)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-DoOnce-FlatMap-lambda-at-core-py-2965-_16))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-DoOnce-Map-decode-_18))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-InitializeWrite_19))+(ref_PCollection_PCollection_10/Write))+(ref_PCollection_PCollection_11/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((x coord/Read)+(ref_AppliedPTransform_format_10))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WindowInto-WindowIntoFn-_20))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WriteBundles_21))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Pair_22))+(WriteToText/Write/WriteImpl/GroupByKey/Write)
INFO:root:Running (((((x coord/Read)+(ref_AppliedPTransform_format_10))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WindowInto-WindowIntoFn-_20))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WriteBundles_21))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Pair_22))+(WriteToText/Write/WriteImpl/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((WriteToText/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Extract_24))+(ref_PCollection_PCollection_16/Write)
INFO:root:Running ((WriteToText/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Extract_24))+(ref_PCollection_PCollection_16/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:root:Running ((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-FinalizeWrite_26)
INFO:root:Running (ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-FinalizeWrite_26)
INFO:root:severity: INFO
timestamp {
  seconds: 1633523613
  nanos: 848572969
}
message: "Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1"
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/io/filebasedsink.py:303"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1633523613
  nanos: 994351387
}
message: "Renamed 1 shards in 0.15 seconds."
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/io/filebasedsink.py:348"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1633523614
  nanos: 23498773
}
message: "No more requests from control plane"
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py:256"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633523614
  nanos: 23718595
}
message: "SDK Harness waiting for in-flight requests to complete"
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py:257"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633523614
  nanos: 23811817
}
message: "Closing all cached grpc data channels."
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/data_plane.py:782"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633523614
  nanos: 23946046
}
message: "Closing all cached gRPC state handlers."
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py:902"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633523614
  nanos: 25882959
}
message: "Done consuming work."
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py:269"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633523614
  nanos: 26071786
}
message: "Python sdk harness exiting."
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker_main.py:154"
thread: "MainThread"

INFO:apache_beam.runners.portability.local_job_service:Successfully completed job in 5.947239875793457 seconds.
INFO:root:Successfully completed job in 5.947239875793457 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py36:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:42719
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.6_sdk:2.34.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7fe18730b7b8> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7fe18730b840> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7fe18730bf28> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.34.0-SNAPSHOT.jar'> '--spark-master-url' 'local[4]' '--artifacts-dir' '/tmp/beam-tempxb1b86jp/artifactsgknbapzk' '--job-port' '44611' '--artifact-port' '0' '--expansion-port' '0']
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:33:41 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:37051'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:33:41 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:35551'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:33:41 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on localhost:44611'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:33:41 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Job server now running, terminate with Ctrl+C'
WARNING:root:Waiting for grpc channel to be ready at localhost:44611.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2']
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:33:43 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for job_53215139-bf33-4b64-9a1b-7c7bb3bf22ea.'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:33:43 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for job_53215139-bf33-4b64-9a1b-7c7bb3bf22ea.ref_Environment_default_environment_1.'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:33:43 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 1 artifacts for job_53215139-bf33-4b64-9a1b-7c7bb3bf22ea.null.'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:33:43 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for job_53215139-bf33-4b64-9a1b-7c7bb3bf22ea.'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:33:43 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1006123343-3a1971e_dae3d6fa-5a40-41a6-9f06-fe1077c5313e'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:33:43 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1006123343-3a1971e_dae3d6fa-5a40-41a6-9f06-fe1077c5313e'
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for the execution. Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:33:43 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:33:44 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:33:45 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator:'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:33:45 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:33:45 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1006123343-3a1971e_dae3d6fa-5a40-41a6-9f06-fe1077c5313e on Spark master local[4]'
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:41083.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:33:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:33:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-2'
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:33155.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:39939
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:33:46 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:33:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-3'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:33:46 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:33:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-4'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:33:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-5'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:33:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-9'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:33:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-7'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:33:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-8'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:33:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-6'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:33:47 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:33:47 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:33:47 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:33:47 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:33:47 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:33:47 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:33:47 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1006123343-3a1971e_dae3d6fa-5a40-41a6-9f06-fe1077c5313e: Pipeline translated successfully. Computing outputs'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:33:47 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16'
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.10 seconds.
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:33:47 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1006123343-3a1971e_dae3d6fa-5a40-41a6-9f06-fe1077c5313e finished.'
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.6/threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 246, in run
    for work_request in self._control_stub.Control(get_responses()):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Connection reset by peer"
	debug_error_string = "{"created":"@1633523627.927301758","description":"Error received from peer ipv4:127.0.0.1:41083","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Connection reset by peer","grpc_status":14}"
>

ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1633523627.927249460","description":"Error received from peer ipv4:127.0.0.1:39939","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.6/threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1633523627.927249460","description":"Error received from peer ipv4:127.0.0.1:39939","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.6/threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 1033, in pull_responses
    for response in responses:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1633523627.927278117","description":"Error received from peer ipv4:127.0.0.1:33155","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>


FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 225

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py36:postCommitPy36IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 40m 35s
211 actionable tasks: 147 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/lmiqt6ww6lzzm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
