Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2021/09/12 06:57:11 UTC

Build failed in Jenkins: beam_PostCommit_Python37 #4262

See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4262/display/redirect>

Changes:


------------------------------------------
[...truncated 21.64 MB...]
INFO:root:Running (((((ref_AppliedPTransform_WriteToText-Write-WriteImpl-DoOnce-Impulse_15)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-DoOnce-FlatMap-lambda-at-core-py-2965-_16))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-DoOnce-Map-decode-_18))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-InitializeWrite_19))+(ref_PCollection_PCollection_10/Write))+(ref_PCollection_PCollection_11/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((x coord/Read)+(ref_AppliedPTransform_format_10))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WindowInto-WindowIntoFn-_20))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WriteBundles_21))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Pair_22))+(WriteToText/Write/WriteImpl/GroupByKey/Write)
INFO:root:Running (((((x coord/Read)+(ref_AppliedPTransform_format_10))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WindowInto-WindowIntoFn-_20))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WriteBundles_21))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Pair_22))+(WriteToText/Write/WriteImpl/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((WriteToText/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Extract_24))+(ref_PCollection_PCollection_16/Write)
INFO:root:Running ((WriteToText/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Extract_24))+(ref_PCollection_PCollection_16/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:root:Running ((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-FinalizeWrite_26)
INFO:root:Running (ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-FinalizeWrite_26)
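
The fused stage names above come from Beam's WriteToText sink (WriteImpl/DoOnce, WriteBundles, Pair, GroupByKey, Extract, PreFinalize, FinalizeWrite). A minimal sketch, assuming a wordcount-style pipeline with illustrative labels and an illustrative output path, of code that produces stages like these; the FinalizeWrite step is what emits the "Starting finalize_write threads" and "Renamed N shards" entries that follow:

    import apache_beam as beam

    # Illustrative only: labels, data and output path are assumptions,
    # not taken from the job above.
    with beam.Pipeline() as p:
        (p
         | 'x coord' >> beam.Create([('a', 1), ('b', 2)])
         | 'format' >> beam.Map(lambda kv: '%s: %d' % kv)
         | 'WriteToText' >> beam.io.WriteToText('/tmp/counts', num_shards=1))
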
INFO:root:severity: INFO
timestamp {
  seconds: 1631427417
  nanos: 458928823
}
message: "Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1"
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/io/filebasedsink.py:303"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1631427417
  nanos: 575971603
}
message: "Renamed 1 shards in 0.12 seconds."
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/io/filebasedsink.py:348"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1631427417
  nanos: 603681564
}
message: "No more requests from control plane"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:261"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1631427417
  nanos: 603984355
}
message: "SDK Harness waiting for in-flight requests to complete"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:262"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1631427417
  nanos: 604095935
}
message: "Closing all cached grpc data channels."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/data_plane.py:790"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1631427417
  nanos: 604186534
}
message: "Closing all cached gRPC state handlers."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:907"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1631427417
  nanos: 606585264
}
message: "Done consuming work."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:274"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1631427417
  nanos: 606864452
}
message: "Python sdk harness exiting."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker_main.py:154"
thread: "MainThread"

INFO:apache_beam.runners.portability.local_job_service:Successfully completed job in 5.336040258407593 seconds.
INFO:root:Successfully completed job in 5.336040258407593 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py37:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:46373
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.34.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7f3ed7df72f0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7f3ed7df7378> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7f3ed7df7a60> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.34.0-SNAPSHOT.jar'> '--spark-master-url' 'local[4]' '--artifacts-dir' '/tmp/beam-tempvw2yr4h0/artifactswz4d_prv' '--job-port' '52755' '--artifact-port' '0' '--expansion-port' '0']
INFO:apache_beam.utils.subprocess_server:b'21/09/12 06:17:02 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:37249'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 06:17:02 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:41211'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 06:17:02 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on localhost:52755'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 06:17:02 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Job server now running, terminate with Ctrl+C'
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2']
INFO:apache_beam.utils.subprocess_server:b'21/09/12 06:17:03 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for job_47a64e90-1010-49f4-848e-6cb48ab30810.'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 06:17:03 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for job_47a64e90-1010-49f4-848e-6cb48ab30810.ref_Environment_default_environment_1.'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 06:17:03 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 1 artifacts for job_47a64e90-1010-49f4-848e-6cb48ab30810.null.'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 06:17:03 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for job_47a64e90-1010-49f4-848e-6cb48ab30810.'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 06:17:03 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-0912061703-c1ece038_5a636580-1ac0-4c76-af11-b449b2dca81d'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 06:17:03 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-0912061703-c1ece038_5a636580-1ac0-4c76-af11-b449b2dca81d'
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for the execution. Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
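
A minimal sketch of following that advice against the portable job server started above; the job_endpoint port (52755) and the runner/environment flags are taken from this run's log and are illustrative, not prescriptive:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:52755',   # port printed by the job server above
        '--environment_type=LOOPBACK',
    ])

    # Running inside the context manager blocks until the job reaches a
    # terminal state, ensuring the pipeline finishes before the program
    # exits (per the warning above).
    with beam.Pipeline(options=options) as p:
        p | beam.Create(['hello', 'world']) | beam.Map(print)
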
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:apache_beam.utils.subprocess_server:b'21/09/12 06:17:03 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 06:17:04 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 06:17:05 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator:'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 06:17:05 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 06:17:05 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-0912061703-c1ece038_5a636580-1ac0-4c76-af11-b449b2dca81d on Spark master local[4]'
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:45451.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:b'21/09/12 06:17:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 06:17:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-2'
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:35939.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:38897
INFO:apache_beam.utils.subprocess_server:b'21/09/12 06:17:06 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 06:17:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-3'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 06:17:06 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 06:17:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-4'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 06:17:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-5'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 06:17:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-9'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 06:17:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-6'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 06:17:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-7'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 06:17:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-8'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 06:17:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 06:17:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 06:17:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 06:17:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 06:17:07 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 06:17:07 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 06:17:07 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-0912061703-c1ece038_5a636580-1ac0-4c76-af11-b449b2dca81d: Pipeline translated successfully. Computing outputs'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 06:17:07 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16'
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.10 seconds.
INFO:apache_beam.utils.subprocess_server:b'21/09/12 06:17:07 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-0912061703-c1ece038_5a636580-1ac0-4c76-af11-b449b2dca81d finished.'
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 642, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1631427427.908466264","description":"Error received from peer ipv4:127.0.0.1:38897","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 917, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 865, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 659, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 642, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1631427427.908466264","description":"Error received from peer ipv4:127.0.0.1:38897","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 917, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 865, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 1038, in pull_responses
    for response in responses:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1631427427.908491058","description":"Error received from peer ipv4:127.0.0.1:35939","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 917, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 865, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 251, in run
    for work_request in self._control_stub.Control(get_responses()):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Connection reset by peer"
	debug_error_string = "{"created":"@1631427427.908511128","description":"Error received from peer ipv4:127.0.0.1:45451","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Connection reset by peer","grpc_status":14}"
>
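
These UNAVAILABLE errors ("Socket closed", "Connection reset by peer") appear only after the job has already reached DONE: the peer side closes the control, state and data connections (ports 45451, 35939 and 38897 above) while the LOOPBACK worker's reader threads are still blocked on their gRPC streams, and those threads simply log the tracebacks seen here. As a generic illustration (not Beam's own handling), this is how such a terminal-stream status surfaces in grpc's Python API and how a reader can distinguish it:

    import grpc

    def drain(response_iterator):
        # Sketch: treat UNAVAILABLE raised mid-iteration (e.g. "Socket closed"
        # at shutdown) as end-of-stream; re-raise any other status.
        try:
            for response in response_iterator:
                yield response
        except grpc.RpcError as e:
            if e.code() != grpc.StatusCode.UNAVAILABLE:
                raise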


> Task :sdks:python:test-suites:portable:py37:postCommitPy37
> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 56m 34s
214 actionable tasks: 150 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/e56a3gg7hu5uu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python37 #4270

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4270/display/redirect?page=changes>



Build failed in Jenkins: beam_PostCommit_Python37 #4269

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4269/display/redirect?page=changes>

Changes:

[noreply] Avoid apiary submission of job graph when it is not needed. (#15458)

[noreply] [BEAM-7261] Add support for BasicSessionCredentials for AWS credentials.


------------------------------------------
[...truncated 636.85 KB...]
INFO:root:Running ((WriteToText/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Extract_24))+(ref_PCollection_PCollection_16/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:root:Running ((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-FinalizeWrite_26)
INFO:root:Running (ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-FinalizeWrite_26)
INFO:root:severity: INFO
timestamp {
  seconds: 1631579110
  nanos: 62710762
}
message: "Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1"
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/io/filebasedsink.py:303"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1631579110
  nanos: 193339586
}
message: "Renamed 1 shards in 0.13 seconds."
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/io/filebasedsink.py:348"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1631579110
  nanos: 208331346
}
message: "No more requests from control plane"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:261"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1631579110
  nanos: 208565950
}
message: "SDK Harness waiting for in-flight requests to complete"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:262"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1631579110
  nanos: 208664417
}
message: "Closing all cached grpc data channels."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/data_plane.py:790"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1631579110
  nanos: 208753824
}
message: "Closing all cached gRPC state handlers."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:907"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1631579110
  nanos: 209965705
}
message: "Done consuming work."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:274"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1631579110
  nanos: 210144281
}
message: "Python sdk harness exiting."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker_main.py:154"
thread: "MainThread"

INFO:apache_beam.runners.portability.local_job_service:Successfully completed job in 7.568809270858765 seconds.
INFO:root:Successfully completed job in 7.568809270858765 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py37:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:33431
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.34.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7effd4b4a510> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7effd4b4a598> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7effd4b4ac80> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.34.0-SNAPSHOT.jar'> '--spark-master-url' 'local[4]' '--artifacts-dir' '/tmp/beam-temph0hpqcuh/artifactsdb531uf_' '--job-port' '33173' '--artifact-port' '0' '--expansion-port' '0']
WARNING:root:Waiting for grpc channel to be ready at localhost:33173.
INFO:apache_beam.utils.subprocess_server:b'21/09/14 00:25:18 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:44461'
INFO:apache_beam.utils.subprocess_server:b'21/09/14 00:25:18 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:33721'
INFO:apache_beam.utils.subprocess_server:b'21/09/14 00:25:18 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on localhost:33173'
INFO:apache_beam.utils.subprocess_server:b'21/09/14 00:25:18 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Job server now running, terminate with Ctrl+C'
WARNING:root:Waiting for grpc channel to be ready at localhost:33173.
WARNING:root:Waiting for grpc channel to be ready at localhost:33173.
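
The repeated "Waiting for grpc channel to be ready" warnings are the client polling until the job server's JobService port accepts connections. A generic sketch of the same readiness wait with grpc's Python API (not the Beam helper that logs these lines), using the port from this run:

    import grpc

    channel = grpc.insecure_channel('localhost:33173')  # JobService port above
    # Blocks until the channel is connected; raises grpc.FutureTimeoutError
    # if it is not ready within the timeout.
    grpc.channel_ready_future(channel).result(timeout=60)
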
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2']
INFO:apache_beam.utils.subprocess_server:b'21/09/14 00:25:21 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for job_457a6bb4-2509-4d34-a252-9ca3d0a313d2.'
INFO:apache_beam.utils.subprocess_server:b'21/09/14 00:25:21 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for job_457a6bb4-2509-4d34-a252-9ca3d0a313d2.ref_Environment_default_environment_1.'
INFO:apache_beam.utils.subprocess_server:b'21/09/14 00:25:21 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 1 artifacts for job_457a6bb4-2509-4d34-a252-9ca3d0a313d2.null.'
INFO:apache_beam.utils.subprocess_server:b'21/09/14 00:25:22 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for job_457a6bb4-2509-4d34-a252-9ca3d0a313d2.'
INFO:apache_beam.utils.subprocess_server:b'21/09/14 00:25:23 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-0914002523-26aec5c3_df59a5b4-c3ce-4342-8574-f68cbbedf22b'
INFO:apache_beam.utils.subprocess_server:b'21/09/14 00:25:23 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-0914002523-26aec5c3_df59a5b4-c3ce-4342-8574-f68cbbedf22b'
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for the execution. Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:apache_beam.utils.subprocess_server:b'21/09/14 00:25:24 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.'
INFO:apache_beam.utils.subprocess_server:b'21/09/14 00:25:25 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable'
INFO:apache_beam.utils.subprocess_server:b"21/09/14 00:25:28 WARN org.apache.spark.util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041."
INFO:apache_beam.utils.subprocess_server:b'21/09/14 00:25:29 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator:'
INFO:apache_beam.utils.subprocess_server:b'21/09/14 00:25:29 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()'
INFO:apache_beam.utils.subprocess_server:b'21/09/14 00:25:29 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-0914002523-26aec5c3_df59a5b4-c3ce-4342-8574-f68cbbedf22b on Spark master local[4]'
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:43439.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:b'21/09/14 00:25:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1'
INFO:apache_beam.utils.subprocess_server:b'21/09/14 00:25:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-2'
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:35757.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:38375
INFO:apache_beam.utils.subprocess_server:b'21/09/14 00:25:32 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.'
INFO:apache_beam.utils.subprocess_server:b'21/09/14 00:25:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-3'
INFO:apache_beam.utils.subprocess_server:b'21/09/14 00:25:32 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.'
INFO:apache_beam.utils.subprocess_server:b'21/09/14 00:25:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-4'
INFO:apache_beam.utils.subprocess_server:b'21/09/14 00:25:33 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-5'
INFO:apache_beam.utils.subprocess_server:b'21/09/14 00:25:33 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-6'
INFO:apache_beam.utils.subprocess_server:b'21/09/14 00:25:33 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-8'
INFO:apache_beam.utils.subprocess_server:b'21/09/14 00:25:33 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-7'
INFO:apache_beam.utils.subprocess_server:b'21/09/14 00:25:33 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-9'
INFO:apache_beam.utils.subprocess_server:b'21/09/14 00:25:33 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10'
INFO:apache_beam.utils.subprocess_server:b'21/09/14 00:25:33 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11'
INFO:apache_beam.utils.subprocess_server:b'21/09/14 00:25:33 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12'
INFO:apache_beam.utils.subprocess_server:b'21/09/14 00:25:33 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13'
INFO:apache_beam.utils.subprocess_server:b'21/09/14 00:25:33 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14'
INFO:apache_beam.utils.subprocess_server:b'21/09/14 00:25:34 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15'
INFO:apache_beam.utils.subprocess_server:b'21/09/14 00:25:34 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-0914002523-26aec5c3_df59a5b4-c3ce-4342-8574-f68cbbedf22b: Pipeline translated successfully. Computing outputs'
INFO:apache_beam.utils.subprocess_server:b'21/09/14 00:25:34 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16'
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.10 seconds.
INFO:apache_beam.utils.subprocess_server:b'21/09/14 00:25:34 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-0914002523-26aec5c3_df59a5b4-c3ce-4342-8574-f68cbbedf22b finished.'
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 642, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Connection reset by peer"
	debug_error_string = "{"created":"@1631579135.095827827","description":"Error received from peer ipv4:127.0.0.1:38375","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Connection reset by peer","grpc_status":14}"
>
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 917, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 865, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 1038, in pull_responses
    for response in responses:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1631579135.095872199","description":"Error received from peer ipv4:127.0.0.1:35757","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 917, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 865, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 659, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 642, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Connection reset by peer"
	debug_error_string = "{"created":"@1631579135.095827827","description":"Error received from peer ipv4:127.0.0.1:38375","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Connection reset by peer","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 917, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 865, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 251, in run
    for work_request in self._control_stub.Control(get_responses()):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1631579135.095885693","description":"Error received from peer ipv4:127.0.0.1:43439","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>


> Task :sdks:python:test-suites:portable:py37:postCommitPy37
> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 43m 51s
214 actionable tasks: 152 executed, 58 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/v64jjpxv7byz2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python37 #4268

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4268/display/redirect?page=changes>

Changes:

[rohde.samuel] [BEAM-12842] Add timestamp to test work item to deflake

[suztomo] [BEAM-12873] HL7v2IO: to leave schematizedData null, not empty


------------------------------------------
[...truncated 30.45 MB...]
INFO:root:Running (((((x coord/Read)+(ref_AppliedPTransform_format_10))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WindowInto-WindowIntoFn-_20))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WriteBundles_21))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Pair_22))+(WriteToText/Write/WriteImpl/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((WriteToText/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Extract_24))+(ref_PCollection_PCollection_16/Write)
INFO:root:Running ((WriteToText/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Extract_24))+(ref_PCollection_PCollection_16/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:root:Running ((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-FinalizeWrite_26)
INFO:root:Running (ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-FinalizeWrite_26)
INFO:root:severity: INFO
timestamp {
  seconds: 1631557673
  nanos: 799919128
}
message: "Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1"
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/io/filebasedsink.py:303"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1631557673
  nanos: 962991952
}
message: "Renamed 1 shards in 0.16 seconds."
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/io/filebasedsink.py:348"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1631557673
  nanos: 975738286
}
message: "No more requests from control plane"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:261"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1631557673
  nanos: 975980520
}
message: "SDK Harness waiting for in-flight requests to complete"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:262"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1631557673
  nanos: 976086616
}
message: "Closing all cached grpc data channels."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/data_plane.py:790"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1631557673
  nanos: 976180791
}
message: "Closing all cached gRPC state handlers."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:907"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1631557673
  nanos: 976701736
}
message: "Done consuming work."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:274"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1631557673
  nanos: 976841926
}
message: "Python sdk harness exiting."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker_main.py:154"
thread: "MainThread"

ce8c456807af4629561bde0f6963174efae14cd970e5146eb3c89403f19bcb1e
INFO:apache_beam.runners.portability.local_job_service:Successfully completed job in 7.166557312011719 seconds.
INFO:root:Successfully completed job in 7.166557312011719 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py37:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:32821
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.34.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7f52cc25e2f0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7f52cc25e378> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7f52cc25ea60> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.34.0-SNAPSHOT.jar'> '--spark-master-url' 'local[4]' '--artifacts-dir' '/tmp/beam-tempzqp9jwqj/artifactsr9m8hef5' '--job-port' '54485' '--artifact-port' '0' '--expansion-port' '0']
INFO:apache_beam.utils.subprocess_server:b'21/09/13 18:27:59 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:33341'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 18:28:00 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:43817'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 18:28:00 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on localhost:54485'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 18:28:00 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Job server now running, terminate with Ctrl+C'
WARNING:root:Waiting for grpc channel to be ready at localhost:54485.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2']
INFO:apache_beam.utils.subprocess_server:b'21/09/13 18:28:02 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for job_649b6419-f751-4f92-bdb2-e6b74d5ce2ec.'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 18:28:02 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for job_649b6419-f751-4f92-bdb2-e6b74d5ce2ec.ref_Environment_default_environment_1.'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 18:28:02 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 1 artifacts for job_649b6419-f751-4f92-bdb2-e6b74d5ce2ec.null.'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 18:28:02 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for job_649b6419-f751-4f92-bdb2-e6b74d5ce2ec.'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 18:28:03 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-0913182803-7e598ccf_54e33acf-e2ca-479e-ac28-6dd53c8ced99'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 18:28:03 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-0913182803-7e598ccf_54e33acf-e2ca-479e-ac28-6dd53c8ced99'
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for the execution. Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:apache_beam.utils.subprocess_server:b'21/09/13 18:28:03 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 18:28:03 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable'
INFO:apache_beam.utils.subprocess_server:b"21/09/13 18:28:04 WARN org.apache.spark.util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041."
INFO:apache_beam.utils.subprocess_server:b'21/09/13 18:28:04 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator:'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 18:28:04 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 18:28:04 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-0913182803-7e598ccf_54e33acf-e2ca-479e-ac28-6dd53c8ced99 on Spark master local[4]'
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:36525.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:b'21/09/13 18:28:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 18:28:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-2'
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:40713.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:39935
INFO:apache_beam.utils.subprocess_server:b'21/09/13 18:28:06 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 18:28:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-3'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 18:28:06 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 18:28:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-4'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 18:28:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-5'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 18:28:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-6'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 18:28:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-9'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 18:28:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-8'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 18:28:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-7'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 18:28:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 18:28:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 18:28:07 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 18:28:07 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 18:28:07 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 18:28:07 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 18:28:07 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-0913182803-7e598ccf_54e33acf-e2ca-479e-ac28-6dd53c8ced99: Pipeline translated successfully. Computing outputs'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 18:28:07 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16'
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.10 seconds.
INFO:apache_beam.utils.subprocess_server:b'21/09/13 18:28:07 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-0913182803-7e598ccf_54e33acf-e2ca-479e-ac28-6dd53c8ced99 finished.'
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 917, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 865, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 1038, in pull_responses
    for response in responses:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1631557687.928049668","description":"Error received from peer ipv4:127.0.0.1:40713","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 917, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 865, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 251, in run
    for work_request in self._control_stub.Control(get_responses()):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1631557687.928067921","description":"Error received from peer ipv4:127.0.0.1:36525","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>

ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 642, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1631557687.928044975","description":"Error received from peer ipv4:127.0.0.1:39935","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 917, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 865, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 659, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 642, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1631557687.928044975","description":"Error received from peer ipv4:127.0.0.1:39935","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>


> Task :sdks:python:test-suites:portable:py37:postCommitPy37

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 27m 33s
214 actionable tasks: 163 executed, 47 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/rkwpnzm7pbtd2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python37 #4267

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4267/display/redirect>

Changes:


------------------------------------------
[...truncated 25.63 MB...]
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((ref_AppliedPTransform_add-points-Impulse_3)+(ref_AppliedPTransform_add-points-FlatMap-lambda-at-core-py-2965-_4))+(ref_AppliedPTransform_add-points-Map-decode-_6))+(ref_AppliedPTransform_Map-get_julia_set_point_color-_7))+(ref_AppliedPTransform_x-coord-key_8))+(x coord/Write)
INFO:root:Running (((((ref_AppliedPTransform_add-points-Impulse_3)+(ref_AppliedPTransform_add-points-FlatMap-lambda-at-core-py-2965-_4))+(ref_AppliedPTransform_add-points-Map-decode-_6))+(ref_AppliedPTransform_Map-get_julia_set_point_color-_7))+(ref_AppliedPTransform_x-coord-key_8))+(x coord/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((x coord/Read)+(ref_AppliedPTransform_format_10))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WindowInto-WindowIntoFn-_20))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WriteBundles_21))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Pair_22))+(WriteToText/Write/WriteImpl/GroupByKey/Write)
INFO:root:Running (((((x coord/Read)+(ref_AppliedPTransform_format_10))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WindowInto-WindowIntoFn-_20))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WriteBundles_21))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Pair_22))+(WriteToText/Write/WriteImpl/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((WriteToText/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Extract_24))+(ref_PCollection_PCollection_16/Write)
INFO:root:Running ((WriteToText/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Extract_24))+(ref_PCollection_PCollection_16/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:root:Running ((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-FinalizeWrite_26)
INFO:root:Running (ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-FinalizeWrite_26)
INFO:root:severity: INFO
timestamp {
  seconds: 1631535649
  nanos: 117650270
}
message: "Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1"
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/io/filebasedsink.py:303"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1631535649
  nanos: 253380060
}
message: "Renamed 1 shards in 0.14 seconds."
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/io/filebasedsink.py:348"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1631535649
  nanos: 261145353
}
message: "No more requests from control plane"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:261"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1631535649
  nanos: 261362075
}
message: "SDK Harness waiting for in-flight requests to complete"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:262"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1631535649
  nanos: 261444568
}
message: "Closing all cached grpc data channels."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/data_plane.py:790"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1631535649
  nanos: 261527299
}
message: "Closing all cached gRPC state handlers."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:907"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1631535649
  nanos: 263092517
}
message: "Done consuming work."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:274"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1631535649
  nanos: 263278961
}
message: "Python sdk harness exiting."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker_main.py:154"
thread: "MainThread"

INFO:apache_beam.runners.portability.local_job_service:Successfully completed job in 5.521745204925537 seconds.
INFO:root:Successfully completed job in 5.521745204925537 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py37:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:33391
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.34.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7fd75ba442f0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7fd75ba44378> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7fd75ba44a60> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.34.0-SNAPSHOT.jar'> '--spark-master-url' 'local[4]' '--artifacts-dir' '/tmp/beam-tempf6cnfe2d/artifacts0owparfl' '--job-port' '52337' '--artifact-port' '0' '--expansion-port' '0']
INFO:apache_beam.utils.subprocess_server:b'21/09/13 12:20:53 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:37005'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 12:20:53 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:44843'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 12:20:54 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on localhost:52337'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 12:20:54 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Job server now running, terminate with Ctrl+C'
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2']
INFO:apache_beam.utils.subprocess_server:b'21/09/13 12:20:54 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for job_6528e661-354a-4264-b155-279f5f5663eb.'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 12:20:54 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for job_6528e661-354a-4264-b155-279f5f5663eb.ref_Environment_default_environment_1.'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 12:20:54 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 1 artifacts for job_6528e661-354a-4264-b155-279f5f5663eb.null.'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 12:20:54 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for job_6528e661-354a-4264-b155-279f5f5663eb.'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 12:20:55 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-0913122055-1e48bb71_e8b79820-1228-4b46-9839-120ccb2adcd2'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 12:20:55 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-0913122055-1e48bb71_e8b79820-1228-4b46-9839-120ccb2adcd2'
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for the execution. Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
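For context, a minimal sketch of the context-manager pattern this LOOPBACK message recommends; the flags, job-server endpoint, and word-count steps below are illustrative assumptions, not taken from this run:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Assumed options for a portable job server; the endpoint port is hypothetical.
    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:52337',
        '--environment_type=LOOPBACK',
    ])

    # Leaving the with-block runs the pipeline and waits for the result, so the
    # pipeline finishes before the program exits, as the message above asks.
    with beam.Pipeline(options=options) as p:
        (p
         | beam.Create(['to be or not to be'])
         | 'Split' >> beam.FlatMap(str.split)
         | beam.combiners.Count.PerElement()
         | beam.Map(print))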
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:apache_beam.utils.subprocess_server:b'21/09/13 12:20:55 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 12:20:55 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 12:20:56 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator:'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 12:20:56 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 12:20:56 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-0913122055-1e48bb71_e8b79820-1228-4b46-9839-120ccb2adcd2 on Spark master local[4]'
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:45271.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:b'21/09/13 12:20:58 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 12:20:58 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-2'
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:46095.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:41275
INFO:apache_beam.utils.subprocess_server:b'21/09/13 12:20:58 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 12:20:58 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-3'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 12:20:58 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 12:20:58 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-4'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 12:20:58 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-5'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 12:20:58 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-9'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 12:20:58 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-7'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 12:20:58 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-8'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 12:20:58 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-6'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 12:20:58 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 12:20:58 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 12:20:58 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 12:20:58 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 12:20:59 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 12:20:59 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 12:20:59 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-0913122055-1e48bb71_e8b79820-1228-4b46-9839-120ccb2adcd2: Pipeline translated successfully. Computing outputs'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 12:20:59 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16'
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.10 seconds.
INFO:apache_beam.utils.subprocess_server:b'21/09/13 12:20:59 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-0913122055-1e48bb71_e8b79820-1228-4b46-9839-120ccb2adcd2 finished.'
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 917, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 865, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 1038, in pull_responses
    for response in responses:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1631535659.890082488","description":"Error received from peer ipv4:127.0.0.1:46095","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>

ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 642, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1631535659.890063629","description":"Error received from peer ipv4:127.0.0.1:41275","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 917, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 865, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 251, in run
    for work_request in self._control_stub.Control(get_responses()):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Connection reset by peer"
	debug_error_string = "{"created":"@1631535659.890106447","description":"Error received from peer ipv4:127.0.0.1:45271","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Connection reset by peer","grpc_status":14}"
>

Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 917, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 865, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 659, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 642, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1631535659.890063629","description":"Error received from peer ipv4:127.0.0.1:41275","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>


> Task :sdks:python:test-suites:portable:py37:postCommitPy37

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 20m 27s
214 actionable tasks: 150 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/t2uc7khcnkwzg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python37 #4266

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4266/display/redirect>

Changes:


------------------------------------------
[...truncated 150.35 KB...]
INFO:root:Running (((((ref_AppliedPTransform_WriteToText-Write-WriteImpl-DoOnce-Impulse_15)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-DoOnce-FlatMap-lambda-at-core-py-2965-_16))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-DoOnce-Map-decode-_18))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-InitializeWrite_19))+(ref_PCollection_PCollection_10/Write))+(ref_PCollection_PCollection_11/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((x coord/Read)+(ref_AppliedPTransform_format_10))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WindowInto-WindowIntoFn-_20))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WriteBundles_21))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Pair_22))+(WriteToText/Write/WriteImpl/GroupByKey/Write)
INFO:root:Running (((((x coord/Read)+(ref_AppliedPTransform_format_10))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WindowInto-WindowIntoFn-_20))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WriteBundles_21))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Pair_22))+(WriteToText/Write/WriteImpl/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((WriteToText/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Extract_24))+(ref_PCollection_PCollection_16/Write)
INFO:root:Running ((WriteToText/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Extract_24))+(ref_PCollection_PCollection_16/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:root:Running ((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-FinalizeWrite_26)
INFO:root:Running (ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-FinalizeWrite_26)
INFO:root:severity: INFO
timestamp {
  seconds: 1631513772
  nanos: 841908693
}
message: "Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1"
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/io/filebasedsink.py:303"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1631513772
  nanos: 965219736
}
message: "Renamed 1 shards in 0.12 seconds."
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/io/filebasedsink.py:348"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1631513772
  nanos: 985746145
}
message: "No more requests from control plane"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:261"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1631513772
  nanos: 985986471
}
message: "SDK Harness waiting for in-flight requests to complete"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:262"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1631513772
  nanos: 986076116
}
message: "Closing all cached grpc data channels."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/data_plane.py:790"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1631513772
  nanos: 986149072
}
message: "Closing all cached gRPC state handlers."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:907"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1631513772
  nanos: 987932443
}
message: "Done consuming work."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:274"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1631513772
  nanos: 988098621
}
message: "Python sdk harness exiting."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker_main.py:154"
thread: "MainThread"

INFO:apache_beam.runners.portability.local_job_service:Successfully completed job in 5.305278062820435 seconds.
INFO:root:Successfully completed job in 5.305278062820435 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py37:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:45277
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.34.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7ff240ef22f0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7ff240ef2378> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7ff240ef2a60> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.34.0-SNAPSHOT.jar'> '--spark-master-url' 'local[4]' '--artifacts-dir' '/tmp/beam-temp5jct6sdu/artifactskygnl62o' '--job-port' '37779' '--artifact-port' '0' '--expansion-port' '0']
INFO:apache_beam.utils.subprocess_server:b'21/09/13 06:16:17 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:38247'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 06:16:17 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:38943'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 06:16:17 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on localhost:37779'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 06:16:17 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Job server now running, terminate with Ctrl+C'
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2']
INFO:apache_beam.utils.subprocess_server:b'21/09/13 06:16:18 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for job_3e7417be-d203-4dfa-aa82-e32fe80d5170.'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 06:16:18 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for job_3e7417be-d203-4dfa-aa82-e32fe80d5170.ref_Environment_default_environment_1.'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 06:16:18 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 1 artifacts for job_3e7417be-d203-4dfa-aa82-e32fe80d5170.null.'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 06:16:18 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for job_3e7417be-d203-4dfa-aa82-e32fe80d5170.'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 06:16:18 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-0913061618-19c32e95_c7ed5def-2e89-4361-8ac3-161fd59ea244'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 06:16:18 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-0913061618-19c32e95_c7ed5def-2e89-4361-8ac3-161fd59ea244'
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for the execution. Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:apache_beam.utils.subprocess_server:b'21/09/13 06:16:19 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 06:16:19 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 06:16:20 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator:'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 06:16:20 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 06:16:20 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-0913061618-19c32e95_c7ed5def-2e89-4361-8ac3-161fd59ea244 on Spark master local[4]'
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:39671.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:b'21/09/13 06:16:21 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 06:16:21 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-2'
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:39265.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:37959
INFO:apache_beam.utils.subprocess_server:b'21/09/13 06:16:21 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 06:16:21 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-3'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 06:16:21 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 06:16:21 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-4'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 06:16:21 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-5'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 06:16:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-8'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 06:16:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-7'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 06:16:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-6'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 06:16:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-9'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 06:16:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 06:16:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 06:16:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 06:16:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 06:16:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 06:16:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 06:16:22 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-0913061618-19c32e95_c7ed5def-2e89-4361-8ac3-161fd59ea244: Pipeline translated successfully. Computing outputs'
INFO:apache_beam.utils.subprocess_server:b'21/09/13 06:16:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16'
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.10 seconds.
INFO:apache_beam.utils.subprocess_server:b'21/09/13 06:16:22 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-0913061618-19c32e95_c7ed5def-2e89-4361-8ac3-161fd59ea244 finished.'
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 642, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1631513783.144168656","description":"Error received from peer ipv4:127.0.0.1:37959","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 917, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 865, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 1038, in pull_responses
    for response in responses:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1631513783.144204182","description":"Error received from peer ipv4:127.0.0.1:39265","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 917, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 865, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 251, in run
    for work_request in self._control_stub.Control(get_responses()):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1631513783.144211181","description":"Error received from peer ipv4:127.0.0.1:39671","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 917, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 865, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 659, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 642, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1631513783.144168656","description":"Error received from peer ipv4:127.0.0.1:37959","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>


> Task :sdks:python:test-suites:portable:py37:postCommitPy37
> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 29m 14s
214 actionable tasks: 150 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/p5shqjpem7p2k

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python37 #4265

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4265/display/redirect>

Changes:


------------------------------------------
[...truncated 30.41 MB...]
apache_beam/io/gcp/bigquery.py:1700
apache_beam/io/gcp/bigquery.py:1700
apache_beam/io/gcp/bigquery.py:1700
apache_beam/io/gcp/bigquery.py:1700
apache_beam/io/gcp/bigquery.py:1700
apache_beam/io/gcp/bigquery.py:1700
apache_beam/io/gcp/bigquery.py:1700
apache_beam/io/gcp/bigquery.py:1700
apache_beam/io/gcp/bigquery.py:1700
apache_beam/io/gcp/bigquery.py:1700
apache_beam/io/gcp/bigquery.py:1700
apache_beam/io/gcp/bigquery.py:1700
apache_beam/io/gcp/bigquery.py:1700
apache_beam/io/gcp/bigquery.py:1700
apache_beam/io/gcp/bigquery.py:1700
apache_beam/io/gcp/bigquery.py:1700
apache_beam/io/gcp/bigquery.py:1700
apache_beam/io/gcp/bigquery.py:1700
apache_beam/io/gcp/bigquery.py:1700
apache_beam/io/gcp/bigquery.py:1700
apache_beam/io/gcp/bigquery.py:1700
apache_beam/io/gcp/bigquery.py:1700
apache_beam/io/gcp/bigquery.py:1700
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1700: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    experiments = p.options.view_as(DebugOptions).experiments or []

apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1112: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1114: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/dataframe/io.py:569
apache_beam/dataframe/io.py:569
apache_beam/dataframe/io.py:569
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/dataframe/io.py>:569: FutureWarning: WriteToFiles is experimental.
    sink=lambda _: _WriteToPandasFileSink(

apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

apache_beam/io/gcp/bigquery.py:1937
apache_beam/io/gcp/bigquery.py:1937
apache_beam/io/gcp/bigquery.py:1937
apache_beam/io/gcp/bigquery.py:1937
apache_beam/io/gcp/bigquery.py:1937
apache_beam/io/gcp/bigquery.py:1937
apache_beam/io/gcp/bigquery.py:1937
apache_beam/io/gcp/bigquery.py:1937
apache_beam/io/gcp/bigquery.py:1937
apache_beam/io/gcp/bigquery.py:1937
apache_beam/io/gcp/bigquery.py:1937
apache_beam/io/gcp/bigquery.py:1937
apache_beam/io/gcp/bigquery.py:1937
apache_beam/io/gcp/bigquery.py:1937
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1937: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery.py:1939
apache_beam/io/gcp/bigquery.py:1939
apache_beam/io/gcp/bigquery.py:1939
apache_beam/io/gcp/bigquery.py:1939
apache_beam/io/gcp/bigquery.py:1939
apache_beam/io/gcp/bigquery.py:1939
apache_beam/io/gcp/bigquery.py:1939
apache_beam/io/gcp/bigquery.py:1939
apache_beam/io/gcp/bigquery.py:1939
apache_beam/io/gcp/bigquery.py:1939
apache_beam/io/gcp/bigquery.py:1939
apache_beam/io/gcp/bigquery.py:1939
apache_beam/io/gcp/bigquery.py:1939
apache_beam/io/gcp/bigquery.py:1939
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1939: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:1969
apache_beam/io/gcp/bigquery.py:1969
apache_beam/io/gcp/bigquery.py:1969
apache_beam/io/gcp/bigquery.py:1969
apache_beam/io/gcp/bigquery.py:1969
apache_beam/io/gcp/bigquery.py:1969
apache_beam/io/gcp/bigquery.py:1969
apache_beam/io/gcp/bigquery.py:1969
apache_beam/io/gcp/bigquery.py:1969
apache_beam/io/gcp/bigquery.py:1969
apache_beam/io/gcp/bigquery.py:1969
apache_beam/io/gcp/bigquery.py:1969
apache_beam/io/gcp/bigquery.py:1969
apache_beam/io/gcp/bigquery.py:1969
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1969: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    | _PassThroughThenCleanup(files_to_remove_pcoll))

apache_beam/io/gcp/bigquery.py:1690
apache_beam/io/gcp/bigquery.py:1690
apache_beam/io/gcp/bigquery.py:1690
apache_beam/io/gcp/bigquery.py:1690
apache_beam/io/gcp/bigquery.py:1690
apache_beam/io/gcp/bigquery.py:1690
apache_beam/io/gcp/bigquery.py:1690
apache_beam/io/gcp/bigquery.py:1690
apache_beam/io/gcp/bigquery.py:1690
apache_beam/io/gcp/bigquery.py:1690
apache_beam/io/gcp/bigquery.py:1690
apache_beam/io/gcp/bigquery.py:1690
apache_beam/io/gcp/bigquery.py:1690
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1690: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/tests/utils.py:100
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

apache_beam/io/gcp/bigquery_read_it_test.py:165
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:165: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_test.py:1123
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1123: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    streaming = self.test_pipeline.options.view_as(StandardOptions).streaming

apache_beam/ml/gcp/cloud_dlp_it_test.py:77
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:77: FutureWarning: MaskDetectedDetails is experimental.
    inspection_config=INSPECT_CONFIG))

apache_beam/io/gcp/bigquery_read_it_test.py:281
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:281: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/ml/gcp/cloud_dlp_it_test.py:87
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:87: FutureWarning: InspectForDetails is experimental.
    | beam.ParDo(extract_inspection_results).with_outputs(

apache_beam/io/gcp/bigquery_read_it_test.py:395
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:395: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())

apache_beam/io/gcp/bigquery.py:2069
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2069: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2070
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2070: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2083
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2083: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:84: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    kms_key=kms_key))

apache_beam/runners/dataflow/ptransform_overrides.py:323
apache_beam/runners/dataflow/ptransform_overrides.py:323
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:323: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
    kms_key=self.kms_key))
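
Similarly, for the BigQuerySink deprecation above, a sketch of the suggested WriteToBigQuery form with a placeholder table, schema, and input element:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(project='my-project', temp_location='gs://my-bucket/tmp')

    # Deprecated: beam.io.Write(beam.io.BigQuerySink('my_dataset.my_table', schema=...))
    with beam.Pipeline(options=options) as p:
        _ = (
            p
            | beam.Create([{'name': 'a', 'count': 1}])
            | beam.io.WriteToBigQuery(
                'my-project:my_dataset.my_table',
                schema='name:STRING,count:INTEGER',
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))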

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:124
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:124: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:113
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:113: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:181
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:181: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:162
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:162: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:126
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:126: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
== 51 failed, 12 passed, 11 skipped, 184 warnings, 1 error in 1597.53 seconds ==

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 30m 30s
214 actionable tasks: 150 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/f4fmuyvv34aqm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python37 #4264

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4264/display/redirect>

Changes:


------------------------------------------
[...truncated 23.48 MB...]
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((ref_AppliedPTransform_add-points-Impulse_3)+(ref_AppliedPTransform_add-points-FlatMap-lambda-at-core-py-2965-_4))+(ref_AppliedPTransform_add-points-Map-decode-_6))+(ref_AppliedPTransform_Map-get_julia_set_point_color-_7))+(ref_AppliedPTransform_x-coord-key_8))+(x coord/Write)
INFO:root:Running (((((ref_AppliedPTransform_add-points-Impulse_3)+(ref_AppliedPTransform_add-points-FlatMap-lambda-at-core-py-2965-_4))+(ref_AppliedPTransform_add-points-Map-decode-_6))+(ref_AppliedPTransform_Map-get_julia_set_point_color-_7))+(ref_AppliedPTransform_x-coord-key_8))+(x coord/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((x coord/Read)+(ref_AppliedPTransform_format_10))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WindowInto-WindowIntoFn-_20))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WriteBundles_21))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Pair_22))+(WriteToText/Write/WriteImpl/GroupByKey/Write)
INFO:root:Running (((((x coord/Read)+(ref_AppliedPTransform_format_10))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WindowInto-WindowIntoFn-_20))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WriteBundles_21))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Pair_22))+(WriteToText/Write/WriteImpl/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((WriteToText/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Extract_24))+(ref_PCollection_PCollection_16/Write)
INFO:root:Running ((WriteToText/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Extract_24))+(ref_PCollection_PCollection_16/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:root:Running ((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-FinalizeWrite_26)
INFO:root:Running (ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-FinalizeWrite_26)
INFO:root:severity: INFO
timestamp {
  seconds: 1631470633
  nanos: 470688104
}
message: "Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1"
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/io/filebasedsink.py:303"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1631470633
  nanos: 576162099
}
message: "Renamed 1 shards in 0.11 seconds."
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/io/filebasedsink.py:348"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1631470633
  nanos: 588909149
}
message: "No more requests from control plane"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:261"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1631470633
  nanos: 589090585
}
message: "SDK Harness waiting for in-flight requests to complete"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:262"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1631470633
  nanos: 589172840
}
message: "Closing all cached grpc data channels."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/data_plane.py:790"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1631470633
  nanos: 589242458
}
message: "Closing all cached gRPC state handlers."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:907"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1631470633
  nanos: 590445518
}
message: "Done consuming work."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:274"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1631470633
  nanos: 590605258
}
message: "Python sdk harness exiting."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker_main.py:154"
thread: "MainThread"

INFO:apache_beam.runners.portability.local_job_service:Successfully completed job in 4.995410680770874 seconds.
INFO:root:Successfully completed job in 4.995410680770874 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py37:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:35599
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.34.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7f155d1e82f0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7f155d1e8378> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7f155d1e8a60> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.34.0-SNAPSHOT.jar'> '--spark-master-url' 'local[4]' '--artifacts-dir' '/tmp/beam-tempspdf2wcm/artifactsm56qbfxb' '--job-port' '57109' '--artifact-port' '0' '--expansion-port' '0']
INFO:apache_beam.utils.subprocess_server:b'21/09/12 18:17:17 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:43975'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 18:17:18 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:39479'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 18:17:18 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on localhost:57109'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 18:17:18 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Job server now running, terminate with Ctrl+C'
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2']
INFO:apache_beam.utils.subprocess_server:b'21/09/12 18:17:18 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for job_9662c0e7-5ae6-4973-8dd3-fea4cbe73c28.'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 18:17:18 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for job_9662c0e7-5ae6-4973-8dd3-fea4cbe73c28.ref_Environment_default_environment_1.'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 18:17:18 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 1 artifacts for job_9662c0e7-5ae6-4973-8dd3-fea4cbe73c28.null.'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 18:17:18 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for job_9662c0e7-5ae6-4973-8dd3-fea4cbe73c28.'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 18:17:19 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-0912181719-6679729f_3a9214b3-4871-4a3a-85fe-5e6ebf95cdee'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 18:17:19 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-0912181719-6679729f_3a9214b3-4871-4a3a-85fe-5e6ebf95cdee'
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for the execution. Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
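
The runner hint above is pseudocode; in the Python SDK the usual spelling is the pipeline context manager, which blocks until the run completes so that a LOOPBACK worker started in this process is not torn down early. A minimal sketch with placeholder options and a trivial transform:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Placeholder options; for the portable run above these would also carry the
    # --runner, --job_endpoint and --environment_type=LOOPBACK flags.
    options = PipelineOptions()

    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create(['placeholder']) | beam.Map(print)
    # Leaving the with-block runs the pipeline and waits for it to finish, which
    # is what "finishes before this program exits" refers to.
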
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:apache_beam.utils.subprocess_server:b'21/09/12 18:17:19 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 18:17:19 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 18:17:20 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator:'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 18:17:20 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 18:17:20 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-0912181719-6679729f_3a9214b3-4871-4a3a-85fe-5e6ebf95cdee on Spark master local[4]'
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:34341.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:b'21/09/12 18:17:21 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 18:17:21 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-2'
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:42067.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:46567
INFO:apache_beam.utils.subprocess_server:b'21/09/12 18:17:21 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 18:17:21 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-3'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 18:17:21 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 18:17:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-4'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 18:17:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-5'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 18:17:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-9'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 18:17:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-8'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 18:17:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-7'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 18:17:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-6'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 18:17:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 18:17:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 18:17:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 18:17:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 18:17:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 18:17:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 18:17:22 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-0912181719-6679729f_3a9214b3-4871-4a3a-85fe-5e6ebf95cdee: Pipeline translated successfully. Computing outputs'
INFO:apache_beam.utils.subprocess_server:b'21/09/12 18:17:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16'
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.10 seconds.
INFO:apache_beam.utils.subprocess_server:b'21/09/12 18:17:22 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-0912181719-6679729f_3a9214b3-4871-4a3a-85fe-5e6ebf95cdee finished.'
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 642, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1631470643.347470211","description":"Error received from peer ipv4:127.0.0.1:46567","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 917, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 865, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 659, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 642, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1631470643.347470211","description":"Error received from peer ipv4:127.0.0.1:46567","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 917, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 865, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 251, in run
    for work_request in self._control_stub.Control(get_responses()):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1631470643.347486554","description":"Error received from peer ipv4:127.0.0.1:34341","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 917, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 865, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 1038, in pull_responses
    for response in responses:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1631470643.347478472","description":"Error received from peer ipv4:127.0.0.1:42067","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>


> Task :sdks:python:test-suites:portable:py37:postCommitPy37

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 16m 56s
214 actionable tasks: 150 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/o3cwqb3ichvpq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python37 #4263

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4263/display/redirect>

Changes:


------------------------------------------
[...truncated 7.08 MB...]
apache_beam/io/gcp/bigquery.py:1700
apache_beam/io/gcp/bigquery.py:1700
apache_beam/io/gcp/bigquery.py:1700
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1700: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    experiments = p.options.view_as(DebugOptions).experiments or []

apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1112: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1114: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/avro/schema.py>:1251
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/avro/schema.py>:1251: DeprecationWarning: `Parse` is deprecated in avro 1.9.2. Please use `parse` (lowercase) instead.
    DeprecationWarning)

apache_beam/io/gcp/bigquery.py:1937
apache_beam/io/gcp/bigquery.py:1937
apache_beam/io/gcp/bigquery.py:1937
apache_beam/io/gcp/bigquery.py:1937
apache_beam/io/gcp/bigquery.py:1937
apache_beam/io/gcp/bigquery.py:1937
apache_beam/io/gcp/bigquery.py:1937
apache_beam/io/gcp/bigquery.py:1937
apache_beam/io/gcp/bigquery.py:1937
apache_beam/io/gcp/bigquery.py:1937
apache_beam/io/gcp/bigquery.py:1937
apache_beam/io/gcp/bigquery.py:1937
apache_beam/io/gcp/bigquery.py:1937
apache_beam/io/gcp/bigquery.py:1937
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1937: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery.py:1939
apache_beam/io/gcp/bigquery.py:1939
apache_beam/io/gcp/bigquery.py:1939
apache_beam/io/gcp/bigquery.py:1939
apache_beam/io/gcp/bigquery.py:1939
apache_beam/io/gcp/bigquery.py:1939
apache_beam/io/gcp/bigquery.py:1939
apache_beam/io/gcp/bigquery.py:1939
apache_beam/io/gcp/bigquery.py:1939
apache_beam/io/gcp/bigquery.py:1939
apache_beam/io/gcp/bigquery.py:1939
apache_beam/io/gcp/bigquery.py:1939
apache_beam/io/gcp/bigquery.py:1939
apache_beam/io/gcp/bigquery.py:1939
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1939: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:1969
apache_beam/io/gcp/bigquery.py:1969
apache_beam/io/gcp/bigquery.py:1969
apache_beam/io/gcp/bigquery.py:1969
apache_beam/io/gcp/bigquery.py:1969
apache_beam/io/gcp/bigquery.py:1969
apache_beam/io/gcp/bigquery.py:1969
apache_beam/io/gcp/bigquery.py:1969
apache_beam/io/gcp/bigquery.py:1969
apache_beam/io/gcp/bigquery.py:1969
apache_beam/io/gcp/bigquery.py:1969
apache_beam/io/gcp/bigquery.py:1969
apache_beam/io/gcp/bigquery.py:1969
apache_beam/io/gcp/bigquery.py:1969
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1969: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    | _PassThroughThenCleanup(files_to_remove_pcoll))

apache_beam/dataframe/io.py:569
apache_beam/dataframe/io.py:569
apache_beam/dataframe/io.py:569
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/dataframe/io.py>:569: FutureWarning: WriteToFiles is experimental.
    sink=lambda _: _WriteToPandasFileSink(

apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

apache_beam/io/gcp/bigquery.py:1690
apache_beam/io/gcp/bigquery.py:1690
apache_beam/io/gcp/bigquery.py:1690
apache_beam/io/gcp/bigquery.py:1690
apache_beam/io/gcp/bigquery.py:1690
apache_beam/io/gcp/bigquery.py:1690
apache_beam/io/gcp/bigquery.py:1690
apache_beam/io/gcp/bigquery.py:1690
apache_beam/io/gcp/bigquery.py:1690
apache_beam/io/gcp/bigquery.py:1690
apache_beam/io/gcp/bigquery.py:1690
apache_beam/io/gcp/bigquery.py:1690
apache_beam/io/gcp/bigquery.py:1690
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1690: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/tests/utils.py:100
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

apache_beam/io/gcp/bigquery_read_it_test.py:165
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:165: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/ml/gcp/cloud_dlp_it_test.py:77
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:77: FutureWarning: MaskDetectedDetails is experimental.
    inspection_config=INSPECT_CONFIG))

apache_beam/io/gcp/bigquery_test.py:1123
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1123: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    streaming = self.test_pipeline.options.view_as(StandardOptions).streaming

apache_beam/ml/gcp/cloud_dlp_it_test.py:87
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:87: FutureWarning: InspectForDetails is experimental.
    | beam.ParDo(extract_inspection_results).with_outputs(

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:181
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:181: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/bigquery_read_it_test.py:281
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:281: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:162
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:162: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/bigquery_read_it_test.py:395
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:395: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())

apache_beam/io/gcp/bigquery.py:2069
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2069: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2070
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2070: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2083
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2083: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:126
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:126: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:124
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:124: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:113
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:113: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:84: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    kms_key=kms_key))

apache_beam/runners/dataflow/ptransform_overrides.py:323
apache_beam/runners/dataflow/ptransform_overrides.py:323
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:323: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
    kms_key=self.kms_key))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
== 48 failed, 15 passed, 11 skipped, 176 warnings, 1 error in 2249.75 seconds ==

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/portable/common.gradle'> line: 225

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py37:postCommitPy37IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 41m 3s
214 actionable tasks: 150 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/pdyphgpxaisc4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org