Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2021/10/04 19:47:46 UTC

Build failed in Jenkins: beam_PostCommit_Python38 #1761

See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/1761/display/redirect?page=changes>

Changes:

[noreply] [BEAM-11516] Upgrade to pylint 2.11.1, fix warnings (#15612)

[noreply] [BEAM-12979, BEAM-11097] Change cache to store ReStreams, clean up to…

[noreply] [BEAM-3304, BEAM-12513] Trigger changes and Windowing. (#15644)


------------------------------------------
[...truncated 65.78 MB...]
  seconds: 1633376653
  nanos: 620733499
}
message: "Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1"
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/io/filebasedsink.py:297"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1633376653
  nanos: 668570518
}
message: "Renamed 1 shards in 0.05 seconds."
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/io/filebasedsink.py:345"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1633376653
  nanos: 672554492
}
message: "No more requests from control plane"
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:256"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633376653
  nanos: 672692537
}
message: "SDK Harness waiting for in-flight requests to complete"
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:257"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633376653
  nanos: 672781705
}
message: "Closing all cached grpc data channels."
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/data_plane.py:782"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633376653
  nanos: 672845125
}
message: "Closing all cached gRPC state handlers."
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:902"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633376653
  nanos: 674208164
}
message: "Done consuming work."
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:269"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633376653
  nanos: 674318790
}
message: "Python sdk harness exiting."
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker_main.py:154"
thread: "MainThread"

INFO:apache_beam.runners.portability.local_job_service:Successfully completed job in 6.976028919219971 seconds.
INFO:root:Successfully completed job in 6.976028919219971 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py38:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:45955
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.8 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.8_sdk:2.34.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7fb11dd6b310> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7fb11dd6b3a0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7fb11dd6baf0> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.34.0-SNAPSHOT.jar'> '--spark-master-url' 'local[4]' '--artifacts-dir' '/tmp/beam-tempxmk9tmca/artifactsshqbgdy3' '--job-port' '42605' '--artifact-port' '0' '--expansion-port' '0']
WARNING:root:Waiting for grpc channel to be ready at localhost:42605.
INFO:apache_beam.utils.subprocess_server:b'21/10/04 19:44:21 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:35495'
INFO:apache_beam.utils.subprocess_server:b'21/10/04 19:44:22 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:42191'
INFO:apache_beam.utils.subprocess_server:b'21/10/04 19:44:22 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on localhost:42605'
INFO:apache_beam.utils.subprocess_server:b'21/10/04 19:44:22 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Job server now running, terminate with Ctrl+C'
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2']
INFO:apache_beam.utils.subprocess_server:b'21/10/04 19:44:22 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for job_fcc9de5c-edf4-4ec3-915c-755d84b54248.'
INFO:apache_beam.utils.subprocess_server:b'21/10/04 19:44:22 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for job_fcc9de5c-edf4-4ec3-915c-755d84b54248.ref_Environment_default_environment_1.'
INFO:apache_beam.utils.subprocess_server:b'21/10/04 19:44:22 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 1 artifacts for job_fcc9de5c-edf4-4ec3-915c-755d84b54248.null.'
INFO:apache_beam.utils.subprocess_server:b'21/10/04 19:44:22 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for job_fcc9de5c-edf4-4ec3-915c-755d84b54248.'
INFO:apache_beam.utils.subprocess_server:b'21/10/04 19:44:23 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1004194423-1ad9f6ab_e232d001-b2b2-47ca-afec-043f3670b83f'
INFO:apache_beam.utils.subprocess_server:b'21/10/04 19:44:23 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1004194423-1ad9f6ab_e232d001-b2b2-47ca-afec-043f3670b83f'
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for the execution. Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
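
A minimal sketch of the pattern the message above recommends: construct the pipeline inside a `with` block so the program blocks until the job finishes. The options and input data below are illustrative assumptions, not values taken from this build.

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Default (DirectRunner) options; in the build above these would instead
    # point at the Spark job server with --environment_type=LOOPBACK.
    options = PipelineOptions()

    with beam.Pipeline(options=options) as p:
        (p
         | beam.Create(['to be or not to be'])
         | beam.FlatMap(str.split)
         | beam.combiners.Count.PerElement()
         | beam.Map(print))
    # Leaving the `with` block waits for the job, which is what the message
    # means by "the pipeline finishes before this program exits".
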
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:apache_beam.utils.subprocess_server:b'21/10/04 19:44:24 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.'
INFO:apache_beam.utils.subprocess_server:b'21/10/04 19:44:25 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable'
INFO:apache_beam.utils.subprocess_server:b'21/10/04 19:44:28 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator:'
INFO:apache_beam.utils.subprocess_server:b'21/10/04 19:44:28 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()'
INFO:apache_beam.utils.subprocess_server:b'21/10/04 19:44:28 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1004194423-1ad9f6ab_e232d001-b2b2-47ca-afec-043f3670b83f on Spark master local[4]'
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:36241.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:b'21/10/04 19:44:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1'
INFO:apache_beam.utils.subprocess_server:b'21/10/04 19:44:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-2'
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:40201.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:33679
INFO:apache_beam.utils.subprocess_server:b'21/10/04 19:44:30 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.'
INFO:apache_beam.utils.subprocess_server:b'21/10/04 19:44:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-3'
INFO:apache_beam.utils.subprocess_server:b'21/10/04 19:44:30 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.'
INFO:apache_beam.utils.subprocess_server:b'21/10/04 19:44:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-4'
INFO:apache_beam.utils.subprocess_server:b'21/10/04 19:44:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-5'
INFO:apache_beam.utils.subprocess_server:b'21/10/04 19:44:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-8'
INFO:apache_beam.utils.subprocess_server:b'21/10/04 19:44:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-9'
INFO:apache_beam.utils.subprocess_server:b'21/10/04 19:44:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-7'
INFO:apache_beam.utils.subprocess_server:b'21/10/04 19:44:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-6'
INFO:apache_beam.utils.subprocess_server:b'21/10/04 19:44:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10'
INFO:apache_beam.utils.subprocess_server:b'21/10/04 19:44:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11'
INFO:apache_beam.utils.subprocess_server:b'21/10/04 19:44:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12'
INFO:apache_beam.utils.subprocess_server:b'21/10/04 19:44:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13'
INFO:apache_beam.utils.subprocess_server:b'21/10/04 19:44:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14'
INFO:apache_beam.utils.subprocess_server:b'21/10/04 19:44:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15'
INFO:apache_beam.utils.subprocess_server:b'21/10/04 19:44:31 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1004194423-1ad9f6ab_e232d001-b2b2-47ca-afec-043f3670b83f: Pipeline translated successfully. Computing outputs'
INFO:apache_beam.utils.subprocess_server:b'21/10/04 19:44:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16'
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.01 seconds.
INFO:apache_beam.utils.subprocess_server:b'21/10/04 19:44:31 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1004194423-1ad9f6ab_e232d001-b2b2-47ca-afec-043f3670b83f finished.'
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.8/threading.py", line 932, in _bootstrap_inner
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.8/threading.py", line 932, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.8/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 246, in run
    self.run()
  File "/usr/lib/python3.8/threading.py", line 870, in run
    for work_request in self._control_stub.Control(get_responses()):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",> line 426, in __next__
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 1033, in pull_responses
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",> line 826, in _next
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1633376672.359075495","description":"Error received from peer ipv4:127.0.0.1:33679","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>
    for response in responses:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",> line 426, in __next__
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.8/threading.py", line 932, in _bootstrap_inner
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Connection reset by peer"
	debug_error_string = "{"created":"@1633376672.359155191","description":"Error received from peer ipv4:127.0.0.1:36241","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Connection reset by peer","grpc_status":14}"
>
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",> line 826, in _next
    self.run()
  File "/usr/lib/python3.8/threading.py", line 870, in run
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1633376672.359113506","description":"Error received from peer ipv4:127.0.0.1:40201","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1633376672.359075495","description":"Error received from peer ipv4:127.0.0.1:33679","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/direct/common.gradle'> line: 143

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py38:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/portable/common.gradle'> line: 225

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py38:postCommitPy38IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 46m 59s
212 actionable tasks: 155 executed, 53 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/i7a3xkj53szuo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python38 #1774

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/1774/display/redirect?page=changes>




Build failed in Jenkins: beam_PostCommit_Python38 #1773

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/1773/display/redirect?page=changes>

Changes:

[noreply] [BEAM-13015] Remove the overhead of SpecMonitoringInfoValidator

[noreply] Minor: Replace generic external.py links in multi-language documentation


------------------------------------------
[...truncated 3.41 MB...]
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2109: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    experiments = p.options.view_as(DebugOptions).experiments or []
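
The pattern this warning flags is reading options back through <pipeline>.options. A minimal sketch of the non-deprecated direction, assuming you still hold the PipelineOptions the pipeline was built with (the flag value is illustrative):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import DebugOptions, PipelineOptions

    # Keep a handle on the options used to construct the pipeline...
    options = PipelineOptions(['--experiments=use_runner_v2'])  # illustrative flag

    # ...and consult that object rather than <pipeline>.options.
    experiments = options.view_as(DebugOptions).experiments or []

    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create([1, 2, 3]) | beam.Map(lambda x: x * x)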

apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2395: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2397: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2421
apache_beam/io/gcp/bigquery.py:2421
apache_beam/io/gcp/bigquery.py:2421
apache_beam/io/gcp/bigquery.py:2421
apache_beam/io/gcp/bigquery.py:2421
apache_beam/io/gcp/bigquery.py:2421
apache_beam/io/gcp/bigquery.py:2421
apache_beam/io/gcp/bigquery.py:2421
apache_beam/io/gcp/bigquery.py:2421
apache_beam/io/gcp/bigquery.py:2421
apache_beam/io/gcp/bigquery.py:2421
apache_beam/io/gcp/bigquery.py:2421
apache_beam/io/gcp/bigquery.py:2421
apache_beam/io/gcp/bigquery.py:2421
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2421: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    pipeline_options=pcoll.pipeline.options,

apache_beam/examples/dataframe/flight_delays.py:45
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:45: FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 'numeric_only=None') is deprecated; in a future version this will raise TypeError.  Select only valid columns before calling the reduction.
    return airline_df[at_top_airports].mean()
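
The FutureWarning above concerns pandas silently dropping non-numeric ("nuisance") columns in a reduction. A minimal sketch of the explicit form it asks for, with placeholder data:

    import pandas as pd

    df = pd.DataFrame({'airline': ['AA', 'DL'], 'arrival_delay': [3.0, -1.0]})
    # Select numeric columns explicitly before the reduction, as the warning asks.
    means = df.select_dtypes(include='number').mean()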

apache_beam/dataframe/io.py:569
apache_beam/dataframe/io.py:569
apache_beam/dataframe/io.py:569
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/dataframe/io.py>:569: FutureWarning: WriteToFiles is experimental.
    return pcoll | fileio.WriteToFiles(

apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2099: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1112: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1114: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/io/gcp/tests/utils.py:100
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)
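
The PendingDeprecationWarning above names the replacement for Client.dataset(); a minimal sketch with placeholder project, dataset and table names:

    from google.cloud import bigquery

    client = bigquery.Client(project='my-project')  # placeholder project
    # Build an explicit reference instead of client.dataset(...)...
    table_ref = bigquery.DatasetReference('my-project', 'my_dataset').table('my_table')
    table = client.get_table(table_ref)
    # ...or simply pass a fully qualified string.
    table = client.get_table('my-project.my_dataset.my_table')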

<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/avro/schema.py>:1249
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/avro/schema.py>:1249: DeprecationWarning: `Parse` is deprecated in avro 1.9.2. Please use `parse` (lowercase) instead.
    warnings.warn("`Parse` is deprecated in avro 1.9.2. "
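
A one-line sketch of the rename the avro DeprecationWarning above asks for (the schema literal is a placeholder):

    import avro.schema

    # Lowercase `parse` replaces the deprecated `Parse` as of avro 1.9.2.
    schema = avro.schema.parse('{"type": "string"}')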

apache_beam/io/gcp/bigquery_test.py:1124
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1124: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    streaming = self.test_pipeline.options.view_as(StandardOptions).streaming

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:178
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:178: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    p | beam.Create(mutations_update) | WriteToSpanner(

apache_beam/io/gcp/bigquery_read_it_test.py:167
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:167: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
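
The warning above points from the deprecated BigQuerySource to ReadFromBigQuery; a minimal sketch with a placeholder query and GCS temp location:

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (
            p
            | beam.io.ReadFromBigQuery(
                query='SELECT 1 AS x',              # placeholder query
                use_standard_sql=True,
                gcs_location='gs://my-bucket/tmp')  # placeholder temp location
            | beam.Map(print))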

apache_beam/io/gcp/big_query_query_to_table_pipeline.py:81
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:81
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:81: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(

apache_beam/runners/dataflow/ptransform_overrides.py:316
apache_beam/runners/dataflow/ptransform_overrides.py:316
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:316: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
    io.BigQuerySink(
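
Likewise, the BigQuerySink deprecation above points to WriteToBigQuery; a minimal sketch with placeholder table spec and schema:

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (
            p
            | beam.Create([{'x': 1}, {'x': 2}])
            | beam.io.WriteToBigQuery(
                'my-project:my_dataset.my_table',   # placeholder table spec
                schema='x:INTEGER',
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))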

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:159
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:159: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    p | beam.Create(mutations_update) | WriteToSpanner(

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:122
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:122: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    p | beam.Create(mutations) | WriteToSpanner(

apache_beam/ml/gcp/cloud_dlp_it_test.py:74
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:74: FutureWarning: MaskDetectedDetails is experimental.
    | MaskDetectedDetails(

apache_beam/io/gcp/bigquery_read_it_test.py:556
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:556: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/ml/gcp/cloud_dlp_it_test.py:85
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:85: FutureWarning: InspectForDetails is experimental.
    | InspectForDetails(

apache_beam/io/gcp/bigquery_read_it_test.py:670
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:670: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())

apache_beam/io/gcp/bigquery.py:2569
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2569: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2570
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2570: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2583
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2583: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:120
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:120: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    r = p | ReadFromSpanner(

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:108
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:108: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    r = p | ReadFromSpanner(

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/pytest_postCommitIT-df-py38.xml> -
============ 63 passed, 11 skipped, 189 warnings in 5903.65 seconds ============

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/portable/common.gradle'> line: 225

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py38:postCommitPy38IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 52m 33s
212 actionable tasks: 166 executed, 42 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/kd33p2behq5xu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python38 #1772

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/1772/display/redirect>

Changes:


------------------------------------------
[...truncated 48.09 MB...]
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/io/filebasedsink.py:297"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1633610046
  nanos: 994677543
}
message: "Renamed 1 shards in 0.09 seconds."
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/io/filebasedsink.py:345"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1633610047
  nanos: 3046512
}
message: "No more requests from control plane"
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:256"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633610047
  nanos: 3221273
}
message: "SDK Harness waiting for in-flight requests to complete"
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:257"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633610047
  nanos: 3301858
}
message: "Closing all cached grpc data channels."
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/data_plane.py:782"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633610047
  nanos: 3416776
}
message: "Closing all cached gRPC state handlers."
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:902"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633610047
  nanos: 4333734
}
message: "Done consuming work."
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:269"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633610047
  nanos: 4452943
}
message: "Python sdk harness exiting."
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker_main.py:154"
thread: "MainThread"

INFO:apache_beam.runners.portability.local_job_service:Successfully completed job in 17.164924383163452 seconds.
INFO:root:Successfully completed job in 17.164924383163452 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py38:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:46359
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.8 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.8_sdk:2.35.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7f70154243a0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7f7015424430> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7f7015424b80> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.35.0-SNAPSHOT.jar'> '--spark-master-url' 'local[4]' '--artifacts-dir' '/tmp/beam-tempncdos4gz/artifacts1n143ztx' '--job-port' '55383' '--artifact-port' '0' '--expansion-port' '0']
WARNING:root:Waiting for grpc channel to be ready at localhost:55383.
WARNING:root:Waiting for grpc channel to be ready at localhost:55383.
INFO:apache_beam.utils.subprocess_server:b'21/10/07 12:34:18 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:43895'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 12:34:19 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:43415'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 12:34:19 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on localhost:55383'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 12:34:19 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Job server now running, terminate with Ctrl+C'
WARNING:root:Waiting for grpc channel to be ready at localhost:55383.
WARNING:root:Waiting for grpc channel to be ready at localhost:55383.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2']
INFO:apache_beam.utils.subprocess_server:b'21/10/07 12:34:22 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for job_71cc2949-ff3c-4209-ac36-9ba01f0aa2dc.'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 12:34:22 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for job_71cc2949-ff3c-4209-ac36-9ba01f0aa2dc.ref_Environment_default_environment_1.'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 12:34:22 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 1 artifacts for job_71cc2949-ff3c-4209-ac36-9ba01f0aa2dc.null.'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 12:34:22 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for job_71cc2949-ff3c-4209-ac36-9ba01f0aa2dc.'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 12:34:23 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1007123423-20f7e3bf_6ebfa65e-0ea1-4266-9d4d-64ebc6df68d6'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 12:34:23 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1007123423-20f7e3bf_6ebfa65e-0ea1-4266-9d4d-64ebc6df68d6'
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for the execution. Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:apache_beam.utils.subprocess_server:b'21/10/07 12:34:24 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 12:34:26 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 12:34:29 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator:'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 12:34:30 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 12:34:30 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1007123423-20f7e3bf_6ebfa65e-0ea1-4266-9d4d-64ebc6df68d6 on Spark master local[4]'
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:42297.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:b'21/10/07 12:34:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 12:34:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-2'
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:39727.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:36479
INFO:apache_beam.utils.subprocess_server:b'21/10/07 12:34:32 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 12:34:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-3'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 12:34:32 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 12:34:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-4'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 12:34:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-5'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 12:34:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-7'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 12:34:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-8'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 12:34:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-9'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 12:34:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-6'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 12:34:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 12:34:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 12:34:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 12:34:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 12:34:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 12:34:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 12:34:33 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1007123423-20f7e3bf_6ebfa65e-0ea1-4266-9d4d-64ebc6df68d6: Pipeline translated successfully. Computing outputs'
INFO:apache_beam.utils.subprocess_server:b'21/10/07 12:34:33 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16'
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.04 seconds.
INFO:apache_beam.utils.subprocess_server:b'21/10/07 12:34:33 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1007123423-20f7e3bf_6ebfa65e-0ea1-4266-9d4d-64ebc6df68d6 finished.'
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.8/threading.py", line 932, in _bootstrap_inner
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.8/threading.py", line 932, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.8/threading.py", line 870, in run
    self.run()
  File "/usr/lib/python3.8/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 1033, in pull_responses
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 246, in run
    for work_request in self._control_stub.Control(get_responses()):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",> line 426, in __next__
    for response in responses:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",> line 826, in _next
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Connection reset by peer"
	debug_error_string = "{"created":"@1633610073.779323123","description":"Error received from peer ipv4:127.0.0.1:42297","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Connection reset by peer","grpc_status":14}"
>
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1633610073.779266564","description":"Error received from peer ipv4:127.0.0.1:39727","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1633610073.779247568","description":"Error received from peer ipv4:127.0.0.1:36479","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.8/threading.py", line 932, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.8/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1633610073.779247568","description":"Error received from peer ipv4:127.0.0.1:36479","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>

> Task :sdks:python:test-suites:dataflow:py38:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/portable/common.gradle'> line: 225

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py38:postCommitPy38IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 45m 7s
212 actionable tasks: 157 executed, 51 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/apojyydorpdeu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python38 #1771

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/1771/display/redirect?page=changes>

Changes:

[david.prieto] [BEAM-12950] Not delete orphaned files to avoid missing events

[david.prieto] [BEAM-12950] Add Bug fix description to CHANGES.md

[david.prieto] [BEAM-12950] fix linter issues

[david.prieto] [BEAN-12950] Skip unit test


------------------------------------------
[...truncated 55.13 MB...]
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Taking initial snapshot for new datasource'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Requested thread factory for connector PostgresConnector, id = dbserver1 named = change-event-source-coordinator'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Creating thread debezium-postgresconnector-dbserver1-change-event-source-coordinator'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Metrics registered'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Context created'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Taking initial snapshot for new datasource'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: According to the connector configuration data will be snapshotted'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshot step 1 - Preparing'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshot step 2 - Determining captured tables'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshot step 3 - Locking captured tables []'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshot step 4 - Determining snapshot offset'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Creating initial offset context'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: Read xlogStart at 'LSN{0/2082248}' from transaction '1368'"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: Read xlogStart at 'LSN{0/2082248}' from transaction '1368'"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshot step 5 - Reading structure of captured tables'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshot step 6 - Persisting schema history'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshot step 7 - Snapshotting data'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshotting contents of 0 tables while still in transaction'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshot - Final stage'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: Snapshot ended with SnapshotResult [status=COMPLETED, offset=PostgresOffsetContext [sourceInfoSchema=Schema{io.debezium.connector.postgresql.Source:STRUCT}, sourceInfo=source_info[server='dbserver1'db='inventory', lsn=LSN{0/2082248}, txId=1368, timestamp=2021-10-07T06:27:24.094Z, snapshot=FALSE], lastSnapshotRecord=true, lastCompletelyProcessedLsn=null, lastCommitLsn=null, streamingStoppingLsn=null, transactionContext=TransactionContext [currentTransactionId=null, perTableEventCount={}, totalEventCount=0], incrementalSnapshotContext=IncrementalSnapshotContext [windowOpened=false, chunkEndPosition=null, dataCollectionsToSnapshot=[], lastEventKeySent=null, maximumKey=null]]]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: Connected metrics set to 'true'"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Starting streaming'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.spatial_ref_sys' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.geom' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.products_on_hand' is 'FULL'; UPDATE AND DELETE events will contain the previous values of all the columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.customers' is 'FULL'; UPDATE AND DELETE events will contain the previous values of all the columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.orders' is 'FULL'; UPDATE AND DELETE events will contain the previous values of all the columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.products' is 'FULL'; UPDATE AND DELETE events will contain the previous values of all the columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: No incremental snapshot in progress, no action needed on start'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: Retrieved latest position from stored offset 'LSN{0/2082248}'"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: Looking for WAL restart position for last commit LSN 'null' and last change LSN 'LSN{0/2082248}'"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Obtained valid replication slot ReplicationSlot [active=false, latestFlushedLsn=LSN{0/2038920}, catalogXmin=597]'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Connection gracefully closed'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Requested thread factory for connector PostgresConnector, id = dbserver1 named = keep-alive'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Creating thread debezium-postgresconnector-dbserver1-keep-alive'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.spatial_ref_sys' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.geom' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.products_on_hand' is 'FULL'; UPDATE AND DELETE events will contain the previous values of all the columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.customers' is 'FULL'; UPDATE AND DELETE events will contain the previous values of all the columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.orders' is 'FULL'; UPDATE AND DELETE events will contain the previous values of all the columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.products' is 'FULL'; UPDATE AND DELETE events will contain the previous values of all the columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Searching for WAL resume position'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Stopping down connector'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: WAL resume position 'null' discovered"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Connection gracefully closed'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Requested thread factory for connector PostgresConnector, id = dbserver1 named = keep-alive'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Creating thread debezium-postgresconnector-dbserver1-keep-alive'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Processing messages'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Connection gracefully closed'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Finished streaming'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: Connected metrics set to 'false'"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Connection gracefully closed'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Connection gracefully closed'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Starting PostgresConnectorTask with configuration:'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    connector.class = io.debezium.connector.postgresql.PostgresConnector'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.dbname = inventory'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.user = debezium'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.hostname = localhost'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.password = ********'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    beam.parent.instance = 1301752466'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.server.name = dbserver1'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.history = org.apache.beam.io.debezium.KafkaSourceConsumerFn$DebeziumSDFDatabaseHistory'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    include.schema.changes = false'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.include.list = inventory'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.port = 36658'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: No previous offsets found'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: user 'debezium' connected to database 'inventory' on PostgreSQL 11.13 (Debian 11.13-1.pgdg90+1) on x86_64-pc-linux-gnu, compiled by gcc (Debian 6.3.0-18+deb9u1) 6.3.0 20170516, 64-bit with roles:"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"\trole 'pg_read_all_settings' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"\trole 'pg_stat_scan_tables' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"\trole 'pg_write_server_files' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"\trole 'debezium' [superuser: true, replication: true, inherit: true, create role: true, create db: true, can log in: true]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"\trole 'pg_monitor' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"\trole 'pg_read_server_files' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"\trole 'pg_execute_server_program' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"\trole 'pg_read_all_stats' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"\trole 'pg_signal_backend' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Obtained valid replication slot ReplicationSlot [active=false, latestFlushedLsn=LSN{0/2038920}, catalogXmin=597]'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: No previous offset found'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Taking initial snapshot for new datasource'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Requested thread factory for connector PostgresConnector, id = dbserver1 named = change-event-source-coordinator'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Creating thread debezium-postgresconnector-dbserver1-change-event-source-coordinator'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Metrics registered'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Context created'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Taking initial snapshot for new datasource'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: According to the connector configuration data will be snapshotted'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshot step 1 - Preparing'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshot step 2 - Determining captured tables'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshot step 3 - Locking captured tables []'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshot step 4 - Determining snapshot offset'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Creating initial offset context'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: Read xlogStart at 'LSN{0/2082270}' from transaction '1369'"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: Read xlogStart at 'LSN{0/2082270}' from transaction '1369'"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshot step 5 - Reading structure of captured tables'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshot step 6 - Persisting schema history'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshot step 7 - Snapshotting data'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshotting contents of 0 tables while still in transaction'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshot - Final stage'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: Snapshot ended with SnapshotResult [status=COMPLETED, offset=PostgresOffsetContext [sourceInfoSchema=Schema{io.debezium.connector.postgresql.Source:STRUCT}, sourceInfo=source_info[server='dbserver1'db='inventory', lsn=LSN{0/2082270}, txId=1369, timestamp=2021-10-07T06:27:24.807Z, snapshot=FALSE], lastSnapshotRecord=true, lastCompletelyProcessedLsn=null, lastCommitLsn=null, streamingStoppingLsn=null, transactionContext=TransactionContext [currentTransactionId=null, perTableEventCount={}, totalEventCount=0], incrementalSnapshotContext=IncrementalSnapshotContext [windowOpened=false, chunkEndPosition=null, dataCollectionsToSnapshot=[], lastEventKeySent=null, maximumKey=null]]]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: Connected metrics set to 'true'"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Starting streaming'

> Task :sdks:python:test-suites:dataflow:py38:postCommitIT

[gw0] PASSED apache_beam/examples/fastavro_it_test.py::FastavroIT::test_avro_it 
apache_beam/examples/complete/game/user_score_it_test.py::UserScoreIT::test_user_score_it 
> Task :sdks:python:test-suites:portable:py38:postCommitPy38IT
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.spatial_ref_sys' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.geom' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.products_on_hand' is 'FULL'; UPDATE AND DELETE events will contain the previous values of all the columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.customers' is 'FULL'; UPDATE AND DELETE events will contain the previous values of all the columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.orders' is 'FULL'; UPDATE AND DELETE events will contain the previous values of all the columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.products' is 'FULL'; UPDATE AND DELETE events will contain the previous values of all the columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: No incremental snapshot in progress, no action needed on start'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: Retrieved latest position from stored offset 'LSN{0/2082270}'"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: Looking for WAL restart position for last commit LSN 'null' and last change LSN 'LSN{0/2082270}'"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Obtained valid replication slot ReplicationSlot [active=false, latestFlushedLsn=LSN{0/2038920}, catalogXmin=597]'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Connection gracefully closed'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Requested thread factory for connector PostgresConnector, id = dbserver1 named = keep-alive'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Creating thread debezium-postgresconnector-dbserver1-keep-alive'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.spatial_ref_sys' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.geom' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.products_on_hand' is 'FULL'; UPDATE AND DELETE events will contain the previous values of all the columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.customers' is 'FULL'; UPDATE AND DELETE events will contain the previous values of all the columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.orders' is 'FULL'; UPDATE AND DELETE events will contain the previous values of all the columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.products' is 'FULL'; UPDATE AND DELETE events will contain the previous values of all the columns"

> Task :sdks:python:test-suites:portable:py38:postCommitPy38IT
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.spatial_ref_sys' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.geom' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.products_on_hand' is 'FULL'; UPDATE AND DELETE events will contain the previous values of all the columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.customers' is 'FULL'; UPDATE AND DELETE events will contain the previous values of all the columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 07, 2021 6:27:24 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
java.lang.OutOfMemoryError: GC overhead limit exceeded
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



beam_PostCommit_Python38 - Build # 1770 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python38 - Build # 1770 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python38/1770/ to view the results.

Build failed in Jenkins: beam_PostCommit_Python38 #1769

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/1769/display/redirect?page=changes>

Changes:

[noreply] [BEAM-12909][BEAM-12849]  Add support for running spark3 nexmark queries


------------------------------------------
[...truncated 67.18 MB...]
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
apache_beam/io/gcp/bigquery.py:2109
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2109: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    experiments = p.options.view_as(DebugOptions).experiments or []
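
The BeamDeprecationWarning above flags code that reads options back off the pipeline object (<pipeline>.options). A minimal sketch of the non-deprecated direction, where the flag and the toy pipeline are assumptions for illustration rather than details of this build: construct PipelineOptions once, query it directly, and pass it to the Pipeline.

    # Minimal sketch (assumed names): build PipelineOptions up front and query it
    # directly, instead of reading <pipeline>.options later, which is deprecated.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import DebugOptions, PipelineOptions

    options = PipelineOptions(['--experiments=use_runner_v2'])   # illustrative flag
    experiments = options.view_as(DebugOptions).experiments or []

    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create([1, 2, 3]) | beam.Map(print)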

apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
apache_beam/io/gcp/bigquery.py:2395
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2395: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
apache_beam/io/gcp/bigquery.py:2397
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2397: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2421
apache_beam/io/gcp/bigquery.py:2421
apache_beam/io/gcp/bigquery.py:2421
apache_beam/io/gcp/bigquery.py:2421
apache_beam/io/gcp/bigquery.py:2421
apache_beam/io/gcp/bigquery.py:2421
apache_beam/io/gcp/bigquery.py:2421
apache_beam/io/gcp/bigquery.py:2421
apache_beam/io/gcp/bigquery.py:2421
apache_beam/io/gcp/bigquery.py:2421
apache_beam/io/gcp/bigquery.py:2421
apache_beam/io/gcp/bigquery.py:2421
apache_beam/io/gcp/bigquery.py:2421
apache_beam/io/gcp/bigquery.py:2421
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2421: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    pipeline_options=pcoll.pipeline.options,

apache_beam/examples/dataframe/flight_delays.py:45
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:45: FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 'numeric_only=None') is deprecated; in a future version this will raise TypeError.  Select only valid columns before calling the reduction.
    return airline_df[at_top_airports].mean()

apache_beam/dataframe/io.py:569
apache_beam/dataframe/io.py:569
apache_beam/dataframe/io.py:569
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/dataframe/io.py>:569: FutureWarning: WriteToFiles is experimental.
    return pcoll | fileio.WriteToFiles(

apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
apache_beam/io/gcp/bigquery.py:2099
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2099: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1112: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1114: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/io/gcp/tests/utils.py:100
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)
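
The PendingDeprecationWarning above already names the replacement; a minimal sketch using google-cloud-bigquery, where the project, dataset and table names are placeholders, not values from this build:

    # Minimal sketch (assumed names): reference a table without Client.dataset().
    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")
    dataset_ref = bigquery.DatasetReference("my-project", "my_dataset")
    table_ref = dataset_ref.table("my_table")
    table = client.get_table(table_ref)
    # Equivalent string form: client.get_table("my-project.my_dataset.my_table")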

<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/avro/schema.py>:1249
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/avro/schema.py>:1249: DeprecationWarning: `Parse` is deprecated in avro 1.9.2. Please use `parse` (lowercase) instead.
    warnings.warn("`Parse` is deprecated in avro 1.9.2. "
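
The avro DeprecationWarning above points to the renamed entry point; a minimal sketch, assuming avro 1.9.2+ and an illustrative record schema:

    # Minimal sketch: avro.schema.parse (lowercase) replaces avro.schema.Parse.
    import avro.schema

    schema_json = '{"type": "record", "name": "User", "fields": [{"name": "name", "type": "string"}]}'
    schema = avro.schema.parse(schema_json)
    print(schema.name)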

apache_beam/io/gcp/bigquery_test.py:1124
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1124: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    streaming = self.test_pipeline.options.view_as(StandardOptions).streaming

apache_beam/ml/gcp/cloud_dlp_it_test.py:74
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:74: FutureWarning: MaskDetectedDetails is experimental.
    | MaskDetectedDetails(

apache_beam/io/gcp/bigquery_read_it_test.py:167
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:167: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
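
The warning above names ReadFromBigQuery as the replacement for BigQuerySource; a minimal sketch, where the query and temp bucket are assumptions rather than details of this test:

    # Minimal sketch (assumed names): ReadFromBigQuery instead of BigQuerySource.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(temp_location="gs://my-bucket/tmp")   # illustrative bucket
    with beam.Pipeline(options=options) as p:
        _ = (
            p
            | beam.io.ReadFromBigQuery(query="SELECT 1 AS x", use_standard_sql=True)
            | beam.Map(print))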

apache_beam/io/gcp/big_query_query_to_table_pipeline.py:81
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:81
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:81: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(

apache_beam/runners/dataflow/ptransform_overrides.py:316
apache_beam/runners/dataflow/ptransform_overrides.py:316
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:316: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
    io.BigQuerySink(
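
Likewise, the warning above names WriteToBigQuery as the replacement for BigQuerySink; a minimal sketch with placeholder table, schema and bucket:

    # Minimal sketch (assumed names): WriteToBigQuery instead of BigQuerySink.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(temp_location="gs://my-bucket/tmp")
    with beam.Pipeline(options=options) as p:
        _ = (
            p
            | beam.Create([{"name": "a", "count": 1}])
            | beam.io.WriteToBigQuery(
                table="my-project:my_dataset.my_table",
                schema="name:STRING,count:INTEGER",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED))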

apache_beam/ml/gcp/cloud_dlp_it_test.py:85
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:85: FutureWarning: InspectForDetails is experimental.
    | InspectForDetails(

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:178
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:178: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    p | beam.Create(mutations_update) | WriteToSpanner(

apache_beam/io/gcp/bigquery_read_it_test.py:556
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:556: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:159
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:159: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    p | beam.Create(mutations_update) | WriteToSpanner(

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:122
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:122: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    p | beam.Create(mutations) | WriteToSpanner(

apache_beam/io/gcp/bigquery_read_it_test.py:670
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:670: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())

apache_beam/io/gcp/bigquery.py:2569
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2569: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2570
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2570: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2583
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2583: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:120
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:120: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    r = p | ReadFromSpanner(

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:108
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:108: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    r = p | ReadFromSpanner(

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/pytest_postCommitIT-df-py38.xml> -
============ 63 passed, 11 skipped, 191 warnings in 5885.11 seconds ============

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 225

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py38:postCommitPy38IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 45m 15s
212 actionable tasks: 167 executed, 41 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/na6pln7mz6lo6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python38 #1768

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/1768/display/redirect>

Changes:


------------------------------------------
[...truncated 68.16 MB...]
}
message: "Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1"
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/io/filebasedsink.py:297"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1633523212
  nanos: 54699420
}
message: "Renamed 1 shards in 0.04 seconds."
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/io/filebasedsink.py:345"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1633523212
  nanos: 57889223
}
message: "No more requests from control plane"
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:256"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633523212
  nanos: 58022737
}
message: "SDK Harness waiting for in-flight requests to complete"
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:257"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633523212
  nanos: 58086872
}
message: "Closing all cached grpc data channels."
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/data_plane.py:782"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633523212
  nanos: 58143854
}
message: "Closing all cached gRPC state handlers."
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:902"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633523212
  nanos: 58866977
}
message: "Done consuming work."
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:269"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633523212
  nanos: 58957338
}
message: "Python sdk harness exiting."
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker_main.py:154"
thread: "MainThread"

INFO:apache_beam.runners.portability.local_job_service:Successfully completed job in 4.879696369171143 seconds.
INFO:root:Successfully completed job in 4.879696369171143 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py38:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:45751
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.8 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.8_sdk:2.34.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7f6821243430> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7f68212434c0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7f6821243c10> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.34.0-SNAPSHOT.jar'> '--spark-master-url' 'local[4]' '--artifacts-dir' '/tmp/beam-tempc_fnzhf3/artifacts2mpl7sqp' '--job-port' '55121' '--artifact-port' '0' '--expansion-port' '0']
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:26:56 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:45917'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:26:56 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:44313'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:26:56 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on localhost:55121'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:26:56 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Job server now running, terminate with Ctrl+C'
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2']
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:26:57 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for job_f0ae5131-89b8-4461-975d-9cfd509c8cdd.'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:26:57 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for job_f0ae5131-89b8-4461-975d-9cfd509c8cdd.ref_Environment_default_environment_1.'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:26:57 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 1 artifacts for job_f0ae5131-89b8-4461-975d-9cfd509c8cdd.null.'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:26:57 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for job_f0ae5131-89b8-4461-975d-9cfd509c8cdd.'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:26:57 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1006122657-640bb4d6_6f5ea1ab-e18c-40fc-87c3-29cc06ffc084'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:26:57 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1006122657-640bb4d6_6f5ea1ab-e18c-40fc-87c3-29cc06ffc084'
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for the execution. Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:26:58 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:26:58 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable'
INFO:apache_beam.utils.subprocess_server:b"21/10/06 12:26:59 WARN org.apache.spark.util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041."
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:26:59 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator:'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:26:59 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:26:59 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1006122657-640bb4d6_6f5ea1ab-e18c-40fc-87c3-29cc06ffc084 on Spark master local[4]'
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:42573.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:27:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:27:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-2'
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:40285.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:45865
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:27:00 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:27:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-3'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:27:00 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:27:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-4'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:27:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-5'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:27:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-8'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:27:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-9'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:27:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-7'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:27:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-6'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:27:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:27:01 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:27:01 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:27:01 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:27:01 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:27:01 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:27:01 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1006122657-640bb4d6_6f5ea1ab-e18c-40fc-87c3-29cc06ffc084: Pipeline translated successfully. Computing outputs'
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:27:01 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16'
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.01 seconds.
INFO:apache_beam.utils.subprocess_server:b'21/10/06 12:27:01 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1006122657-640bb4d6_6f5ea1ab-e18c-40fc-87c3-29cc06ffc084 finished.'
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.8/threading.py", line 932, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.8/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 1033, in pull_responses
Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.8/threading.py", line 932, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.8/threading.py", line 870, in run
    for response in responses:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",> line 426, in __next__
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 246, in run
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Connection reset by peer"
	debug_error_string = "{"created":"@1633523221.769375880","description":"Error received from peer ipv4:127.0.0.1:40285","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Connection reset by peer","grpc_status":14}"
>
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Connection reset by peer"
	debug_error_string = "{"created":"@1633523221.769349438","description":"Error received from peer ipv4:127.0.0.1:45865","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Connection reset by peer","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.8/threading.py", line 932, in _bootstrap_inner
    for work_request in self._control_stub.Control(get_responses()):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Connection reset by peer"
	debug_error_string = "{"created":"@1633523221.769377661","description":"Error received from peer ipv4:127.0.0.1:42573","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Connection reset by peer","grpc_status":14}"
>
    self.run()
  File "/usr/lib/python3.8/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Connection reset by peer"
	debug_error_string = "{"created":"@1633523221.769349438","description":"Error received from peer ipv4:127.0.0.1:45865","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Connection reset by peer","grpc_status":14}"
>

> Task :sdks:python:test-suites:dataflow:py38:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/portable/common.gradle'> line: 225

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py38:postCommitPy38IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 54m 21s
212 actionable tasks: 148 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/fw3rzjet56ism

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python38 #1767

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/1767/display/redirect?page=changes>

Changes:

[noreply] Merge pull request #15602 from [BEAM-10917] Add support for BigQuery


------------------------------------------
[...truncated 69.56 MB...]
apache_beam/io/gcp/bigquery.py:2109
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2109: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    experiments = p.options.view_as(DebugOptions).experiments or []

apache_beam/io/gcp/bigquery.py:2395
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2395: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery.py:2397
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2397: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2421
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2421: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    pipeline_options=pcoll.pipeline.options,

apache_beam/examples/dataframe/flight_delays.py:45
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:45: FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 'numeric_only=None') is deprecated; in a future version this will raise TypeError.  Select only valid columns before calling the reduction.
    return airline_df[at_top_airports].mean()

apache_beam/dataframe/io.py:569
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/dataframe/io.py>:569: FutureWarning: WriteToFiles is experimental.
    return pcoll | fileio.WriteToFiles(

apache_beam/io/fileio.py:550
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

apache_beam/io/gcp/bigquery.py:2099
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2099: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery_file_loads.py:1112
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1112: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/io/gcp/bigquery_file_loads.py:1114
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1114: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/io/gcp/tests/utils.py:100
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/avro/schema.py>:1249
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/avro/schema.py>:1249: DeprecationWarning: `Parse` is deprecated in avro 1.9.2. Please use `parse` (lowercase) instead.
    warnings.warn("`Parse` is deprecated in avro 1.9.2. "

apache_beam/io/gcp/bigquery_test.py:1124
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1124: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    streaming = self.test_pipeline.options.view_as(StandardOptions).streaming

apache_beam/ml/gcp/cloud_dlp_it_test.py:74
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:74: FutureWarning: MaskDetectedDetails is experimental.
    | MaskDetectedDetails(

apache_beam/io/gcp/bigquery_read_it_test.py:167
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:167: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/big_query_query_to_table_pipeline.py:81
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:81: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(

apache_beam/runners/dataflow/ptransform_overrides.py:316
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:316: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
    io.BigQuerySink(

apache_beam/ml/gcp/cloud_dlp_it_test.py:85
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:85: FutureWarning: InspectForDetails is experimental.
    | InspectForDetails(

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:178
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:178: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    p | beam.Create(mutations_update) | WriteToSpanner(

apache_beam/io/gcp/bigquery_read_it_test.py:556
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:556: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:159
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:159: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    p | beam.Create(mutations_update) | WriteToSpanner(

apache_beam/io/gcp/bigquery_read_it_test.py:670
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:670: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())

apache_beam/io/gcp/bigquery.py:2569
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2569: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2570
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2570: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2583
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2583: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:122
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:122: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    p | beam.Create(mutations) | WriteToSpanner(

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:120
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:120: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    r = p | ReadFromSpanner(

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:108
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:108: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    r = p | ReadFromSpanner(

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/pytest_postCommitIT-df-py38.xml> -
============ 63 passed, 11 skipped, 185 warnings in 5764.53 seconds ============

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/portable/common.gradle'> line: 225

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py38:postCommitPy38IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 43m 6s
212 actionable tasks: 151 executed, 57 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/d2owlm5ef6sso

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python38 #1766

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/1766/display/redirect?page=changes>

Changes:

[noreply] Merge pull request #15614 from [BEAM-12953] [Playground] Create protobuf

[noreply] [BEAM-11831] Parially Revert "[BEAM-11805] Replace user-agent for

[kawaigin] [BEAM-10708] Enable submit beam_sql built jobs to Dataflow


------------------------------------------
[...truncated 37.39 MB...]
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.password = ********'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    beam.parent.instance = 1242728333'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.server.name = dbserver1'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.history = org.apache.beam.io.debezium.KafkaSourceConsumerFn$DebeziumSDFDatabaseHistory'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    include.schema.changes = false'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.include.list = inventory'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.port = 48509'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"SEVERE: -------- Error on consumer: Couldn't obtain encoding for database inventory. with stacktrace: [io.debezium.connector.postgresql.connection.PostgresConnection.getDatabaseCharset(PostgresConnection.java:469), io.debezium.connector.postgresql.PostgresConnectorTask.start(PostgresConnectorTask.java:76), io.debezium.connector.common.BaseSourceTask.start(BaseSourceTask.java:133), org.apache.beam.io.debezium.KafkaSourceConsumerFn.process(KafkaSourceConsumerFn.java:161), org.apache.beam.io.debezium.KafkaSourceConsumerFn$DoFnInvoker.invokeProcessElement(Unknown Source), org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForWindowObservingSizedElementAndRestriction(FnApiDoFnRunner.java:1065), org.apache.beam.fn.harness.FnApiDoFnRunner.access$1000(FnApiDoFnRunner.java:144), org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:645), org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:640), org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:266), org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:218), org.apache.beam.fn.harness.BeamFnDataReadRunner.forwardElementToConsumer(BeamFnDataReadRunner.java:221), org.apache.beam.sdk.fn.data.DecodingFnDataReceiver.accept(DecodingFnDataReceiver.java:43), org.apache.beam.sdk.fn.data.DecodingFnDataReceiver.accept(DecodingFnDataReceiver.java:25), org.apache.beam.fn.harness.data.QueueingBeamFnDataClient$ConsumerAndData.accept(QueueingBeamFnDataClient.java:316), org.apache.beam.fn.harness.data.QueueingBeamFnDataClient.drainAndBlock(QueueingBeamFnDataClient.java:219), org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:351), org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:151), org.apache.beam.fn.harness.control.BeamFnControlClient$InboundObserver.lambda$onNext$0(BeamFnControlClient.java:116), java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149), java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624), java.lang.Thread.run(Thread.java:748)]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Stopping down connector'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Starting PostgresConnectorTask with configuration:'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    connector.class = io.debezium.connector.postgresql.PostgresConnector'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.dbname = inventory'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.user = debezium'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.hostname = localhost'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.password = ********'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    beam.parent.instance = 1242728333'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.server.name = dbserver1'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.history = org.apache.beam.io.debezium.KafkaSourceConsumerFn$DebeziumSDFDatabaseHistory'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    include.schema.changes = false'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.include.list = inventory'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.port = 48509'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"SEVERE: -------- Error on consumer: Couldn't obtain encoding for database inventory. with stacktrace: [io.debezium.connector.postgresql.connection.PostgresConnection.getDatabaseCharset(PostgresConnection.java:469), io.debezium.connector.postgresql.PostgresConnectorTask.start(PostgresConnectorTask.java:76), io.debezium.connector.common.BaseSourceTask.start(BaseSourceTask.java:133), org.apache.beam.io.debezium.KafkaSourceConsumerFn.process(KafkaSourceConsumerFn.java:161), org.apache.beam.io.debezium.KafkaSourceConsumerFn$DoFnInvoker.invokeProcessElement(Unknown Source), org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForWindowObservingSizedElementAndRestriction(FnApiDoFnRunner.java:1065), org.apache.beam.fn.harness.FnApiDoFnRunner.access$1000(FnApiDoFnRunner.java:144), org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:645), org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:640), org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:266), org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:218), org.apache.beam.fn.harness.BeamFnDataReadRunner.forwardElementToConsumer(BeamFnDataReadRunner.java:221), org.apache.beam.sdk.fn.data.DecodingFnDataReceiver.accept(DecodingFnDataReceiver.java:43), org.apache.beam.sdk.fn.data.DecodingFnDataReceiver.accept(DecodingFnDataReceiver.java:25), org.apache.beam.fn.harness.data.QueueingBeamFnDataClient$ConsumerAndData.accept(QueueingBeamFnDataClient.java:316), org.apache.beam.fn.harness.data.QueueingBeamFnDataClient.drainAndBlock(QueueingBeamFnDataClient.java:219), org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:351), org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:151), org.apache.beam.fn.harness.control.BeamFnControlClient$InboundObserver.lambda$onNext$0(BeamFnControlClient.java:116), java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149), java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624), java.lang.Thread.run(Thread.java:748)]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Stopping down connector'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Starting PostgresConnectorTask with configuration:'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    connector.class = io.debezium.connector.postgresql.PostgresConnector'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.dbname = inventory'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.user = debezium'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.hostname = localhost'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.password = ********'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    beam.parent.instance = 1242728333'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.server.name = dbserver1'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.history = org.apache.beam.io.debezium.KafkaSourceConsumerFn$DebeziumSDFDatabaseHistory'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    include.schema.changes = false'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.include.list = inventory'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.port = 48509'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"SEVERE: -------- Error on consumer: Couldn't obtain encoding for database inventory. with stacktrace: [io.debezium.connector.postgresql.connection.PostgresConnection.getDatabaseCharset(PostgresConnection.java:469), io.debezium.connector.postgresql.PostgresConnectorTask.start(PostgresConnectorTask.java:76), io.debezium.connector.common.BaseSourceTask.start(BaseSourceTask.java:133), org.apache.beam.io.debezium.KafkaSourceConsumerFn.process(KafkaSourceConsumerFn.java:161), org.apache.beam.io.debezium.KafkaSourceConsumerFn$DoFnInvoker.invokeProcessElement(Unknown Source), org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForWindowObservingSizedElementAndRestriction(FnApiDoFnRunner.java:1065), org.apache.beam.fn.harness.FnApiDoFnRunner.access$1000(FnApiDoFnRunner.java:144), org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:645), org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:640), org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:266), org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:218), org.apache.beam.fn.harness.BeamFnDataReadRunner.forwardElementToConsumer(BeamFnDataReadRunner.java:221), org.apache.beam.sdk.fn.data.DecodingFnDataReceiver.accept(DecodingFnDataReceiver.java:43), org.apache.beam.sdk.fn.data.DecodingFnDataReceiver.accept(DecodingFnDataReceiver.java:25), org.apache.beam.fn.harness.data.QueueingBeamFnDataClient$ConsumerAndData.accept(QueueingBeamFnDataClient.java:316), org.apache.beam.fn.harness.data.QueueingBeamFnDataClient.drainAndBlock(QueueingBeamFnDataClient.java:219), org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:351), org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:151), org.apache.beam.fn.harness.control.BeamFnControlClient$InboundObserver.lambda$onNext$0(BeamFnControlClient.java:116), java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149), java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624), java.lang.Thread.run(Thread.java:748)]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Stopping down connector'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Starting PostgresConnectorTask with configuration:'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    connector.class = io.debezium.connector.postgresql.PostgresConnector'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.dbname = inventory'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.user = debezium'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.hostname = localhost'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.password = ********'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    beam.parent.instance = 1242728333'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.server.name = dbserver1'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.history = org.apache.beam.io.debezium.KafkaSourceConsumerFn$DebeziumSDFDatabaseHistory'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    include.schema.changes = false'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.include.list = inventory'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.port = 48509'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"SEVERE: -------- Error on consumer: Couldn't obtain encoding for database inventory. with stacktrace: [io.debezium.connector.postgresql.connection.PostgresConnection.getDatabaseCharset(PostgresConnection.java:469), io.debezium.connector.postgresql.PostgresConnectorTask.start(PostgresConnectorTask.java:76), io.debezium.connector.common.BaseSourceTask.start(BaseSourceTask.java:133), org.apache.beam.io.debezium.KafkaSourceConsumerFn.process(KafkaSourceConsumerFn.java:161), org.apache.beam.io.debezium.KafkaSourceConsumerFn$DoFnInvoker.invokeProcessElement(Unknown Source), org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForWindowObservingSizedElementAndRestriction(FnApiDoFnRunner.java:1065), org.apache.beam.fn.harness.FnApiDoFnRunner.access$1000(FnApiDoFnRunner.java:144), org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:645), org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:640), org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:266), org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:218), org.apache.beam.fn.harness.BeamFnDataReadRunner.forwardElementToConsumer(BeamFnDataReadRunner.java:221), org.apache.beam.sdk.fn.data.DecodingFnDataReceiver.accept(DecodingFnDataReceiver.java:43), org.apache.beam.sdk.fn.data.DecodingFnDataReceiver.accept(DecodingFnDataReceiver.java:25), org.apache.beam.fn.harness.data.QueueingBeamFnDataClient$ConsumerAndData.accept(QueueingBeamFnDataClient.java:316), org.apache.beam.fn.harness.data.QueueingBeamFnDataClient.drainAndBlock(QueueingBeamFnDataClient.java:219), org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:351), org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:151), org.apache.beam.fn.harness.control.BeamFnControlClient$InboundObserver.lambda$onNext$0(BeamFnControlClient.java:116), java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149), java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624), java.lang.Thread.run(Thread.java:748)]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Stopping down connector'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Starting PostgresConnectorTask with configuration:'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    connector.class = io.debezium.connector.postgresql.PostgresConnector'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.dbname = inventory'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.user = debezium'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.hostname = localhost'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.password = ********'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    beam.parent.instance = 1242728333'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.server.name = dbserver1'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.history = org.apache.beam.io.debezium.KafkaSourceConsumerFn$DebeziumSDFDatabaseHistory'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    include.schema.changes = false'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.include.list = inventory'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 06, 2021 12:41:34 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.port = 48509'
[...truncated: the same "Error on consumer: Couldn't obtain encoding for database inventory" stacktrace and PostgresConnectorTask restart/configuration dump repeat several more times with identical content...]
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python38 #1765

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/1765/display/redirect?page=changes>

Changes:

[noreply] Only stage the single (fat) jar when auto-starting expansion service.

[noreply] Disable samza counters (#15659)


------------------------------------------
[...truncated 30.96 MB...]
message: "Creating client data channel for localhost:41819"
instruction_id: "bundle_1"
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/data_plane.py:750"
thread: "Thread-14"

INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((ref_AppliedPTransform_WriteToText-Write-WriteImpl-DoOnce-Impulse_15)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-DoOnce-FlatMap-lambda-at-core-py-2965-_16))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-DoOnce-Map-decode-_18))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-InitializeWrite_19))+(ref_PCollection_PCollection_10/Write))+(ref_PCollection_PCollection_11/Write)
INFO:root:Running (((((ref_AppliedPTransform_WriteToText-Write-WriteImpl-DoOnce-Impulse_15)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-DoOnce-FlatMap-lambda-at-core-py-2965-_16))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-DoOnce-Map-decode-_18))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-InitializeWrite_19))+(ref_PCollection_PCollection_10/Write))+(ref_PCollection_PCollection_11/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((x coord/Read)+(ref_AppliedPTransform_format_10))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WindowInto-WindowIntoFn-_20))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WriteBundles_21))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Pair_22))+(WriteToText/Write/WriteImpl/GroupByKey/Write)
INFO:root:Running (((((x coord/Read)+(ref_AppliedPTransform_format_10))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WindowInto-WindowIntoFn-_20))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WriteBundles_21))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Pair_22))+(WriteToText/Write/WriteImpl/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((WriteToText/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Extract_24))+(ref_PCollection_PCollection_16/Write)
INFO:root:Running ((WriteToText/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Extract_24))+(ref_PCollection_PCollection_16/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:root:Running ((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-FinalizeWrite_26)
INFO:root:Running (ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-FinalizeWrite_26)
INFO:root:severity: INFO
timestamp {
  seconds: 1633458398
  nanos: 698507547
}
message: "Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1"
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/io/filebasedsink.py:297"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1633458398
  nanos: 717587471
}
message: "Renamed 1 shards in 0.02 seconds."
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/io/filebasedsink.py:345"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1633458398
  nanos: 723153829
}
message: "No more requests from control plane"
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:256"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633458398
  nanos: 723319292
}
message: "SDK Harness waiting for in-flight requests to complete"
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:257"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633458398
  nanos: 723425865
}
message: "Closing all cached grpc data channels."
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/data_plane.py:782"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633458398
  nanos: 723504543
}
message: "Closing all cached gRPC state handlers."
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:902"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633458398
  nanos: 724567651
}
message: "Done consuming work."
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:269"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1633458398
  nanos: 724731445
}
message: "Python sdk harness exiting."
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker_main.py:154"
thread: "MainThread"

INFO:apache_beam.runners.portability.local_job_service:Successfully completed job in 9.199021816253662 seconds.
INFO:root:Successfully completed job in 9.199021816253662 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py38:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:34209
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.8 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.8_sdk:2.34.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7f78a77a73a0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7f78a77a7430> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7f78a77a7b80> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.34.0-SNAPSHOT.jar'> '--spark-master-url' 'local[4]' '--artifacts-dir' '/tmp/beam-temps0dbd7lp/artifactsvwrsc6dj' '--job-port' '36419' '--artifact-port' '0' '--expansion-port' '0']
INFO:apache_beam.utils.subprocess_server:b'21/10/05 18:26:44 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:45487'
INFO:apache_beam.utils.subprocess_server:b'21/10/05 18:26:44 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:33621'
INFO:apache_beam.utils.subprocess_server:b'21/10/05 18:26:44 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on localhost:36419'
INFO:apache_beam.utils.subprocess_server:b'21/10/05 18:26:44 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Job server now running, terminate with Ctrl+C'
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2']
INFO:apache_beam.utils.subprocess_server:b'21/10/05 18:26:45 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for job_aa04762c-bb86-4a45-ac3f-607a46de0424.'
INFO:apache_beam.utils.subprocess_server:b'21/10/05 18:26:45 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for job_aa04762c-bb86-4a45-ac3f-607a46de0424.ref_Environment_default_environment_1.'
INFO:apache_beam.utils.subprocess_server:b'21/10/05 18:26:45 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 1 artifacts for job_aa04762c-bb86-4a45-ac3f-607a46de0424.null.'
INFO:apache_beam.utils.subprocess_server:b'21/10/05 18:26:45 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for job_aa04762c-bb86-4a45-ac3f-607a46de0424.'
INFO:apache_beam.utils.subprocess_server:b'21/10/05 18:26:45 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1005182645-1c4ad21_4901a453-ab08-4895-ad84-62f21ad5c23d'
INFO:apache_beam.utils.subprocess_server:b'21/10/05 18:26:46 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1005182645-1c4ad21_4901a453-ab08-4895-ad84-62f21ad5c23d'
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for the execution. Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
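The LOOPBACK note above refers to the usual Beam Python pattern of running the pipeline as a context manager, which blocks until the job (and the loopback worker it started) has finished. A minimal sketch of that pattern follows; the runner, job endpoint and environment type are illustrative values, not the exact flags used by this test:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Illustrative flags; the post-commit tests construct their own options.
    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:36419',
        '--environment_type=LOOPBACK',
    ])

    with beam.Pipeline(options=options) as p:
        p | beam.Create(['hello', 'world']) | beam.Map(print)
    # Leaving the "with" block waits for the job to finish before the program
    # exits, which is what the log message above is asking for.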
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:apache_beam.utils.subprocess_server:b'21/10/05 18:26:46 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.'
INFO:apache_beam.utils.subprocess_server:b'21/10/05 18:26:46 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable'
INFO:apache_beam.utils.subprocess_server:b'21/10/05 18:26:47 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator:'
INFO:apache_beam.utils.subprocess_server:b'21/10/05 18:26:47 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()'
INFO:apache_beam.utils.subprocess_server:b'21/10/05 18:26:47 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1005182645-1c4ad21_4901a453-ab08-4895-ad84-62f21ad5c23d on Spark master local[4]'
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:39117.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:b'21/10/05 18:26:49 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1'
INFO:apache_beam.utils.subprocess_server:b'21/10/05 18:26:49 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-2'
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:40659.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:33861
INFO:apache_beam.utils.subprocess_server:b'21/10/05 18:26:49 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.'
INFO:apache_beam.utils.subprocess_server:b'21/10/05 18:26:49 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-3'
INFO:apache_beam.utils.subprocess_server:b'21/10/05 18:26:49 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.'
INFO:apache_beam.utils.subprocess_server:b'21/10/05 18:26:49 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-4'
INFO:apache_beam.utils.subprocess_server:b'21/10/05 18:26:49 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-5'
INFO:apache_beam.utils.subprocess_server:b'21/10/05 18:26:49 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-7'
INFO:apache_beam.utils.subprocess_server:b'21/10/05 18:26:49 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-9'
INFO:apache_beam.utils.subprocess_server:b'21/10/05 18:26:49 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-6'
INFO:apache_beam.utils.subprocess_server:b'21/10/05 18:26:49 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-8'
INFO:apache_beam.utils.subprocess_server:b'21/10/05 18:26:49 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11'
INFO:apache_beam.utils.subprocess_server:b'21/10/05 18:26:49 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12'
INFO:apache_beam.utils.subprocess_server:b'21/10/05 18:26:49 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10'
INFO:apache_beam.utils.subprocess_server:b'21/10/05 18:26:49 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13'
INFO:apache_beam.utils.subprocess_server:b'21/10/05 18:26:49 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14'
INFO:apache_beam.utils.subprocess_server:b'21/10/05 18:26:50 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15'
INFO:apache_beam.utils.subprocess_server:b'21/10/05 18:26:50 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1005182645-1c4ad21_4901a453-ab08-4895-ad84-62f21ad5c23d: Pipeline translated successfully. Computing outputs'
INFO:apache_beam.utils.subprocess_server:b'21/10/05 18:26:50 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16'
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.01 seconds.
INFO:apache_beam.utils.subprocess_server:b'21/10/05 18:26:50 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1005182645-1c4ad21_4901a453-ab08-4895-ad84-62f21ad5c23d finished.'
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.8/threading.py", line 932, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.8/threading.py", line 870, in run
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.8/threading.py", line 932, in _bootstrap_inner
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 246, in run
    self.run()
  File "/usr/lib/python3.8/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 1033, in pull_responses
    for response in responses:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",> line 426, in __next__
    for work_request in self._control_stub.Control(get_responses()):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Connection reset by peer"
	debug_error_string = "{"created":"@1633458410.730201256","description":"Error received from peer ipv4:127.0.0.1:40659","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Connection reset by peer","grpc_status":14}"
>
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",> line 826, in _next
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1633458410.730185771","description":"Error received from peer ipv4:127.0.0.1:33861","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.8/threading.py", line 932, in _bootstrap_inner
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Connection reset by peer"
	debug_error_string = "{"created":"@1633458410.730203121","description":"Error received from peer ipv4:127.0.0.1:39117","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Connection reset by peer","grpc_status":14}"
>
    self.run()
  File "/usr/lib/python3.8/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1633458410.730185771","description":"Error received from peer ipv4:127.0.0.1:33861","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/portable/common.gradle'> line: 225

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py38:postCommitPy38IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 44m 33s
212 actionable tasks: 157 executed, 51 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/d44vli4gu5mh2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python38 #1764

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/1764/display/redirect>

Changes:


------------------------------------------
[...truncated 42.81 MB...]
apache_beam/io/gcp/bigquery.py:1712
apache_beam/io/gcp/bigquery.py:1712
apache_beam/io/gcp/bigquery.py:1712
apache_beam/io/gcp/bigquery.py:1712
apache_beam/io/gcp/bigquery.py:1712
apache_beam/io/gcp/bigquery.py:1712
apache_beam/io/gcp/bigquery.py:1712
apache_beam/io/gcp/bigquery.py:1712
apache_beam/io/gcp/bigquery.py:1712
apache_beam/io/gcp/bigquery.py:1712
apache_beam/io/gcp/bigquery.py:1712
apache_beam/io/gcp/bigquery.py:1712
apache_beam/io/gcp/bigquery.py:1712
apache_beam/io/gcp/bigquery.py:1712
apache_beam/io/gcp/bigquery.py:1712
apache_beam/io/gcp/bigquery.py:1712
apache_beam/io/gcp/bigquery.py:1712
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1712: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    experiments = p.options.view_as(DebugOptions).experiments or []
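
The BeamDeprecationWarning above is emitted because Beam's own I/O code reads options back off the pipeline object (p.options / pcoll.pipeline.options). User code can avoid the same deprecated access by keeping a reference to the PipelineOptions it constructed; a small sketch with a made-up option value:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import DebugOptions, PipelineOptions

    options = PipelineOptions(['--experiments=use_runner_v2'])  # illustrative flag
    p = beam.Pipeline(options=options)

    # Deprecated access (the pattern that triggers the warning in this log):
    experiments = p.options.view_as(DebugOptions).experiments or []

    # Equivalent access through the options object you already hold:
    experiments = options.view_as(DebugOptions).experiments or []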

apache_beam/io/gcp/bigquery.py:1954
apache_beam/io/gcp/bigquery.py:1954
apache_beam/io/gcp/bigquery.py:1954
apache_beam/io/gcp/bigquery.py:1954
apache_beam/io/gcp/bigquery.py:1954
apache_beam/io/gcp/bigquery.py:1954
apache_beam/io/gcp/bigquery.py:1954
apache_beam/io/gcp/bigquery.py:1954
apache_beam/io/gcp/bigquery.py:1954
apache_beam/io/gcp/bigquery.py:1954
apache_beam/io/gcp/bigquery.py:1954
apache_beam/io/gcp/bigquery.py:1954
apache_beam/io/gcp/bigquery.py:1954
apache_beam/io/gcp/bigquery.py:1954
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1954: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery.py:1956
apache_beam/io/gcp/bigquery.py:1956
apache_beam/io/gcp/bigquery.py:1956
apache_beam/io/gcp/bigquery.py:1956
apache_beam/io/gcp/bigquery.py:1956
apache_beam/io/gcp/bigquery.py:1956
apache_beam/io/gcp/bigquery.py:1956
apache_beam/io/gcp/bigquery.py:1956
apache_beam/io/gcp/bigquery.py:1956
apache_beam/io/gcp/bigquery.py:1956
apache_beam/io/gcp/bigquery.py:1956
apache_beam/io/gcp/bigquery.py:1956
apache_beam/io/gcp/bigquery.py:1956
apache_beam/io/gcp/bigquery.py:1956
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1956: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:1980
apache_beam/io/gcp/bigquery.py:1980
apache_beam/io/gcp/bigquery.py:1980
apache_beam/io/gcp/bigquery.py:1980
apache_beam/io/gcp/bigquery.py:1980
apache_beam/io/gcp/bigquery.py:1980
apache_beam/io/gcp/bigquery.py:1980
apache_beam/io/gcp/bigquery.py:1980
apache_beam/io/gcp/bigquery.py:1980
apache_beam/io/gcp/bigquery.py:1980
apache_beam/io/gcp/bigquery.py:1980
apache_beam/io/gcp/bigquery.py:1980
apache_beam/io/gcp/bigquery.py:1980
apache_beam/io/gcp/bigquery.py:1980
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1980: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    pipeline_options=pcoll.pipeline.options,

apache_beam/examples/dataframe/flight_delays.py:45
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:45: FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 'numeric_only=None') is deprecated; in a future version this will raise TypeError.  Select only valid columns before calling the reduction.
    return airline_df[at_top_airports].mean()
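
The FutureWarning above comes from calling .mean() on a DataFrame that still contains non-numeric ("nuisance") columns; newer pandas stops dropping them silently. A small sketch of the behaviour and the explicit alternatives, using made-up columns rather than the flight-delay data:

    import pandas as pd

    df = pd.DataFrame({
        'airline': ['AA', 'UA', 'AA'],    # non-numeric "nuisance" column
        'dep_delay': [5.0, 12.0, 3.0],
        'arr_delay': [7.0, 15.0, 1.0],
    })

    # df.mean()  # pandas 1.x: warns and silently drops 'airline'; pandas 2.x raises TypeError.

    # Explicit alternatives that avoid the warning:
    means = df[['dep_delay', 'arr_delay']].mean()
    means = df.mean(numeric_only=True)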

apache_beam/dataframe/io.py:569
apache_beam/dataframe/io.py:569
apache_beam/dataframe/io.py:569
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/dataframe/io.py>:569: FutureWarning: WriteToFiles is experimental.
    return pcoll | fileio.WriteToFiles(

apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

apache_beam/io/gcp/bigquery.py:1702  (13 occurrences)
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1702: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery_file_loads.py:1112  (18 occurrences)
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1112: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/io/gcp/bigquery_file_loads.py:1114  (18 occurrences)
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1114: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/io/gcp/tests/utils.py:100
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

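The PendingDeprecationWarning above names its own replacement. A minimal sketch with the google-cloud-bigquery client, using placeholder project/dataset/table names:

    from google.cloud import bigquery

    client = bigquery.Client(project='my-project')   # placeholder project

    # Instead of client.dataset(dataset_id).table(table_id):
    table_ref = bigquery.DatasetReference(
        'my-project', 'my_dataset').table('my_table')
    # A fully qualified string works for most client methods as well.
    table = client.get_table('my-project.my_dataset.my_table')
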
<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/avro/schema.py>:1249
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/avro/schema.py>:1249: DeprecationWarning: `Parse` is deprecated in avro 1.9.2. Please use `parse` (lowercase) instead.
    warnings.warn("`Parse` is deprecated in avro 1.9.2. "

apache_beam/io/gcp/bigquery_test.py:1123
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1123: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    streaming = self.test_pipeline.options.view_as(StandardOptions).streaming

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:178
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:178: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    p | beam.Create(mutations_update) | WriteToSpanner(

apache_beam/io/gcp/bigquery_read_it_test.py:165
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:165: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

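The warning above points at ReadFromBigQuery as the replacement for Read(BigQuerySource(...)). A minimal sketch, with a placeholder query and temp bucket:

    import apache_beam as beam

    with beam.Pipeline() as p:
        rows = (
            p
            | 'ReadFromBQ' >> beam.io.ReadFromBigQuery(
                query='SELECT 1 AS x',               # placeholder query
                use_standard_sql=True,
                gcs_location='gs://my-bucket/tmp'))  # placeholder bucket
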
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:81
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:81
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:81: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(

apache_beam/runners/dataflow/ptransform_overrides.py:316
apache_beam/runners/dataflow/ptransform_overrides.py:316
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:316: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
    io.BigQuerySink(

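Likewise, BigQuerySink's replacement is WriteToBigQuery. A minimal sketch with a placeholder table and schema:

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (
            p
            | beam.Create([{'name': 'beam', 'year': 2021}])
            | beam.io.WriteToBigQuery(
                table='my-project:my_dataset.my_table',   # placeholder table
                schema='name:STRING,year:INTEGER',
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED))
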
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:159
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:159: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    p | beam.Create(mutations_update) | WriteToSpanner(

apache_beam/ml/gcp/cloud_dlp_it_test.py:74
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:74: FutureWarning: MaskDetectedDetails is experimental.
    | MaskDetectedDetails(

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:122
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:122: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    p | beam.Create(mutations) | WriteToSpanner(

apache_beam/io/gcp/bigquery_read_it_test.py:281
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:281: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/ml/gcp/cloud_dlp_it_test.py:85
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:85: FutureWarning: InspectForDetails is experimental.
    | InspectForDetails(

apache_beam/io/gcp/bigquery_read_it_test.py:395
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:395: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())

apache_beam/io/gcp/bigquery.py:2088
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2088: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2089
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2089: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2102
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2102: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:120
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:120: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    r = p | ReadFromSpanner(

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:108
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:108: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    r = p | ReadFromSpanner(

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/pytest_postCommitIT-df-py38.xml> -
============ 63 passed, 11 skipped, 191 warnings in 6100.29 seconds ============

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 225

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py38:postCommitPy38IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 53m 1s
212 actionable tasks: 151 executed, 57 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/viui5cxpbciwq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python38 #1763

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/1763/display/redirect?page=changes>

Changes:

[danthev] Switch hintNumWorkers to ValueProvider, switch firstInstant to side

[noreply] [BEAM-13000] Disable Reshuffle Translation in Samza Portable Mode


------------------------------------------
[...truncated 57.16 MB...]
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.port = 36490'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: No previous offsets found'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: user 'debezium' connected to database 'inventory' on PostgreSQL 11.13 (Debian 11.13-1.pgdg90+1) on x86_64-pc-linux-gnu, compiled by gcc (Debian 6.3.0-18+deb9u1) 6.3.0 20170516, 64-bit with roles:"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"\trole 'pg_read_all_settings' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"\trole 'pg_stat_scan_tables' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"\trole 'pg_write_server_files' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"\trole 'debezium' [superuser: true, replication: true, inherit: true, create role: true, create db: true, can log in: true]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"\trole 'pg_monitor' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"\trole 'pg_read_server_files' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"\trole 'pg_execute_server_program' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"\trole 'pg_read_all_stats' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"\trole 'pg_signal_backend' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Obtained valid replication slot ReplicationSlot [active=false, latestFlushedLsn=LSN{0/203DCE0}, catalogXmin=597]'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: No previous offset found'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Taking initial snapshot for new datasource'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Requested thread factory for connector PostgresConnector, id = dbserver1 named = change-event-source-coordinator'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Creating thread debezium-postgresconnector-dbserver1-change-event-source-coordinator'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Metrics registered'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Context created'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Taking initial snapshot for new datasource'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: According to the connector configuration data will be snapshotted'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshot step 1 - Preparing'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshot step 2 - Determining captured tables'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshot step 3 - Locking captured tables []'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshot step 4 - Determining snapshot offset'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Creating initial offset context'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: Read xlogStart at 'LSN{0/2083E50}' from transaction '1028'"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: Read xlogStart at 'LSN{0/2083E50}' from transaction '1028'"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshot step 5 - Reading structure of captured tables'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshot step 6 - Persisting schema history'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshot step 7 - Snapshotting data'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshotting contents of 0 tables while still in transaction'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshot - Final stage'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: Snapshot ended with SnapshotResult [status=COMPLETED, offset=PostgresOffsetContext [sourceInfoSchema=Schema{io.debezium.connector.postgresql.Source:STRUCT}, sourceInfo=source_info[server='dbserver1'db='inventory', lsn=LSN{0/2083E50}, txId=1028, timestamp=2021-10-05T06:27:21.631Z, snapshot=FALSE], lastSnapshotRecord=true, lastCompletelyProcessedLsn=null, lastCommitLsn=null, streamingStoppingLsn=null, transactionContext=TransactionContext [currentTransactionId=null, perTableEventCount={}, totalEventCount=0], incrementalSnapshotContext=IncrementalSnapshotContext [windowOpened=false, chunkEndPosition=null, dataCollectionsToSnapshot=[], lastEventKeySent=null, maximumKey=null]]]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: Connected metrics set to 'true'"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Starting streaming'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.spatial_ref_sys' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.geom' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.products_on_hand' is 'FULL'; UPDATE AND DELETE events will contain the previous values of all the columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.customers' is 'FULL'; UPDATE AND DELETE events will contain the previous values of all the columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.orders' is 'FULL'; UPDATE AND DELETE events will contain the previous values of all the columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.products' is 'FULL'; UPDATE AND DELETE events will contain the previous values of all the columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: No incremental snapshot in progress, no action needed on start'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: Retrieved latest position from stored offset 'LSN{0/2083E50}'"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: Looking for WAL restart position for last commit LSN 'null' and last change LSN 'LSN{0/2083E50}'"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Obtained valid replication slot ReplicationSlot [active=false, latestFlushedLsn=LSN{0/203DCE0}, catalogXmin=597]'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Connection gracefully closed'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Requested thread factory for connector PostgresConnector, id = dbserver1 named = keep-alive'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Creating thread debezium-postgresconnector-dbserver1-keep-alive'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.spatial_ref_sys' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.geom' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.products_on_hand' is 'FULL'; UPDATE AND DELETE events will contain the previous values of all the columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.customers' is 'FULL'; UPDATE AND DELETE events will contain the previous values of all the columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.orders' is 'FULL'; UPDATE AND DELETE events will contain the previous values of all the columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.products' is 'FULL'; UPDATE AND DELETE events will contain the previous values of all the columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:21 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Searching for WAL resume position'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Stopping down connector'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: WAL resume position 'null' discovered"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Connection gracefully closed'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Requested thread factory for connector PostgresConnector, id = dbserver1 named = keep-alive'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Creating thread debezium-postgresconnector-dbserver1-keep-alive'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Processing messages'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Connection gracefully closed'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Finished streaming'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: Connected metrics set to 'false'"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Connection gracefully closed'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Connection gracefully closed'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Starting PostgresConnectorTask with configuration:'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    connector.class = io.debezium.connector.postgresql.PostgresConnector'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.dbname = inventory'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.user = debezium'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.hostname = localhost'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.password = ********'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    beam.parent.instance = 272205475'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.server.name = dbserver1'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.history = org.apache.beam.io.debezium.KafkaSourceConsumerFn$DebeziumSDFDatabaseHistory'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    include.schema.changes = false'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.include.list = inventory'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:    database.port = 36490'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: No previous offsets found'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: user 'debezium' connected to database 'inventory' on PostgreSQL 11.13 (Debian 11.13-1.pgdg90+1) on x86_64-pc-linux-gnu, compiled by gcc (Debian 6.3.0-18+deb9u1) 6.3.0 20170516, 64-bit with roles:"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"\trole 'pg_read_all_settings' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"\trole 'pg_stat_scan_tables' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"\trole 'pg_write_server_files' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"\trole 'debezium' [superuser: true, replication: true, inherit: true, create role: true, create db: true, can log in: true]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"\trole 'pg_monitor' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"\trole 'pg_read_server_files' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"\trole 'pg_execute_server_program' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"\trole 'pg_read_all_stats' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"\trole 'pg_signal_backend' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Obtained valid replication slot ReplicationSlot [active=false, latestFlushedLsn=LSN{0/203DCE0}, catalogXmin=597]'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: No previous offset found'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Taking initial snapshot for new datasource'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Requested thread factory for connector PostgresConnector, id = dbserver1 named = change-event-source-coordinator'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Creating thread debezium-postgresconnector-dbserver1-change-event-source-coordinator'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Metrics registered'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Context created'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Taking initial snapshot for new datasource'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: According to the connector configuration data will be snapshotted'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshot step 1 - Preparing'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshot step 2 - Determining captured tables'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshot step 3 - Locking captured tables []'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshot step 4 - Determining snapshot offset'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Creating initial offset context'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: Read xlogStart at 'LSN{0/2083E78}' from transaction '1029'"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: Read xlogStart at 'LSN{0/2083E78}' from transaction '1029'"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshot step 5 - Reading structure of captured tables'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshot step 6 - Persisting schema history'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshot step 7 - Snapshotting data'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshotting contents of 0 tables while still in transaction'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Snapshot - Final stage'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: Snapshot ended with SnapshotResult [status=COMPLETED, offset=PostgresOffsetContext [sourceInfoSchema=Schema{io.debezium.connector.postgresql.Source:STRUCT}, sourceInfo=source_info[server='dbserver1'db='inventory', lsn=LSN{0/2083E78}, txId=1029, timestamp=2021-10-05T06:27:22.396Z, snapshot=FALSE], lastSnapshotRecord=true, lastCompletelyProcessedLsn=null, lastCommitLsn=null, streamingStoppingLsn=null, transactionContext=TransactionContext [currentTransactionId=null, perTableEventCount={}, totalEventCount=0], incrementalSnapshotContext=IncrementalSnapshotContext [windowOpened=false, chunkEndPosition=null, dataCollectionsToSnapshot=[], lastEventKeySent=null, maximumKey=null]]]"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: Connected metrics set to 'true'"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Starting streaming'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.spatial_ref_sys' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.geom' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.products_on_hand' is 'FULL'; UPDATE AND DELETE events will contain the previous values of all the columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.customers' is 'FULL'; UPDATE AND DELETE events will contain the previous values of all the columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.orders' is 'FULL'; UPDATE AND DELETE events will contain the previous values of all the columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.products' is 'FULL'; UPDATE AND DELETE events will contain the previous values of all the columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: No incremental snapshot in progress, no action needed on start'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: Retrieved latest position from stored offset 'LSN{0/2083E78}'"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: Looking for WAL restart position for last commit LSN 'null' and last change LSN 'LSN{0/2083E78}'"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Obtained valid replication slot ReplicationSlot [active=false, latestFlushedLsn=LSN{0/203DCE0}, catalogXmin=597]'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Connection gracefully closed'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Requested thread factory for connector PostgresConnector, id = dbserver1 named = keep-alive'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO: Creating thread debezium-postgresconnector-dbserver1-keep-alive'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.spatial_ref_sys' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.geom' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.products_on_hand' is 'FULL'; UPDATE AND DELETE events will contain the previous values of all the columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.customers' is 'FULL'; UPDATE AND DELETE events will contain the previous values of all the columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO: REPLICA IDENTITY for 'inventory.orders' is 'FULL'; UPDATE AND DELETE events will contain the previous values of all the columns"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 05, 2021 6:27:22 AM org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter log'
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python38 #1762

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/1762/display/redirect?page=changes>

Changes:

[noreply] Merge pull request #15618 from [BEAM-12968] [Playground] Create README

[noreply] Merge pull request #15626 from [BEAM-12963] [Playground] Create base

[noreply] Merge pull request #15566 from [BEAM-12925] Correct behavior retrieving

[noreply] [BEAM-12911] Update schema translation (Java, Python) to log more

[noreply] Update CHANGES.md for JdbcIO breaking change (#15651)

[noreply] [BEAM-12513] Add Go SDK metrics content to BPG. (#15650)

[noreply] [BEAM-12996] Improve Error Logging in ConfigBuilder (#15646)


------------------------------------------
[...truncated 49.35 MB...]
apache_beam/io/gcp/bigquery.py:1712  (17 occurrences)
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1712: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    experiments = p.options.view_as(DebugOptions).experiments or []

apache_beam/io/gcp/bigquery.py:1954  (14 occurrences)
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1954: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery.py:1956  (14 occurrences)
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1956: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:1980  (14 occurrences)
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1980: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    pipeline_options=pcoll.pipeline.options,

apache_beam/examples/dataframe/flight_delays.py:45
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:45: FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 'numeric_only=None') is deprecated; in a future version this will raise TypeError.  Select only valid columns before calling the reduction.
    return airline_df[at_top_airports].mean()

apache_beam/dataframe/io.py:569  (3 occurrences)
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/dataframe/io.py>:569: FutureWarning: WriteToFiles is experimental.
    return pcoll | fileio.WriteToFiles(

apache_beam/io/fileio.py:550  (3 occurrences)
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

apache_beam/io/gcp/bigquery.py:1702 (×13)
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1702: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery_file_loads.py:1112 (×18)
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1112: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/io/gcp/bigquery_file_loads.py:1114 (×18)
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1114: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/io/gcp/tests/utils.py:100
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)
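
  The Client.dataset deprecation above has a direct replacement in the BigQuery client library: build the table reference from a DatasetReference or from a "project.dataset.table" string. A hedged sketch with placeholder project, dataset, and table IDs:

    from google.cloud import bigquery

    client = bigquery.Client(project='my-project')               # placeholder project
    dataset_ref = bigquery.DatasetReference('my-project', 'my_dataset')
    table_ref = dataset_ref.table('my_table')
    # or: table_ref = bigquery.TableReference.from_string('my-project.my_dataset.my_table')
    table = client.get_table(table_ref)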

<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/avro/schema.py>:1249
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/avro/schema.py>:1249: DeprecationWarning: `Parse` is deprecated in avro 1.9.2. Please use `parse` (lowercase) instead.
    warnings.warn("`Parse` is deprecated in avro 1.9.2. "
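
  The avro warning above is only a rename: since avro 1.9.2 the schema parser is exposed as lowercase parse. A trivial sketch with a minimal example schema:

    import avro.schema

    schema = avro.schema.parse(
        '{"type": "record", "name": "Example",'
        ' "fields": [{"name": "id", "type": "long"}]}')
    print(schema.name)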

apache_beam/io/gcp/bigquery_test.py:1123
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1123: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    streaming = self.test_pipeline.options.view_as(StandardOptions).streaming

apache_beam/io/gcp/bigquery_read_it_test.py:165
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:165: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
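
  The BigQuerySource deprecation above (it recurs further down) maps onto ReadFromBigQuery. A minimal sketch of the migration, with a placeholder query and temp bucket:

    import apache_beam as beam

    with beam.Pipeline() as p:
        rows = (
            p
            | 'ReadFromBQ' >> beam.io.ReadFromBigQuery(
                query='SELECT 1 AS x',                # placeholder query
                use_standard_sql=True,
                gcs_location='gs://my-bucket/tmp')    # placeholder temp bucket
            | beam.Map(print))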

apache_beam/ml/gcp/cloud_dlp_it_test.py:74
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:74: FutureWarning: MaskDetectedDetails is experimental.
    | MaskDetectedDetails(

apache_beam/io/gcp/big_query_query_to_table_pipeline.py:81 (×2)
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:81: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(

apache_beam/runners/dataflow/ptransform_overrides.py:316 (×2)
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:316: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
    io.BigQuerySink(
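
  Likewise, the BigQuerySink deprecation above is resolved by writing with WriteToBigQuery. A hedged sketch with placeholder table spec and schema:

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (
            p
            | beam.Create([{'name': 'a', 'count': 1}])
            | beam.io.WriteToBigQuery(
                'my-project:my_dataset.my_table',      # placeholder table spec
                schema='name:STRING,count:INTEGER',
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED))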

apache_beam/ml/gcp/cloud_dlp_it_test.py:85
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:85: FutureWarning: InspectForDetails is experimental.
    | InspectForDetails(

apache_beam/io/gcp/bigquery_read_it_test.py:281
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:281: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:178
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:178: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    p | beam.Create(mutations_update) | WriteToSpanner(

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:159
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:159: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    p | beam.Create(mutations_update) | WriteToSpanner(

apache_beam/io/gcp/bigquery_read_it_test.py:395
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:395: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())

apache_beam/io/gcp/bigquery.py:2088
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2088: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2089
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2089: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2102
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2102: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:122
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:122: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    p | beam.Create(mutations) | WriteToSpanner(

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:120
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:120: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    r = p | ReadFromSpanner(

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:108
  <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:108: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    r = p | ReadFromSpanner(

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/pytest_postCommitIT-df-py38.xml> -
============ 63 passed, 11 skipped, 189 warnings in 5855.88 seconds ============

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 225

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py38:postCommitPy38IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 40m 4s
212 actionable tasks: 155 executed, 53 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/eaje2zrufwjhq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org