Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2022/10/28 03:35:20 UTC

Build failed in Jenkins: beam_PostCommit_Python39 #1025

See <https://ci-beam.apache.org/job/beam_PostCommit_Python39/1025/display/redirect?page=changes>

Changes:

[Heejong Lee] [BEAM-23836] Updating documentation for cross-language Java pipelines

[Heejong Lee] update

[Heejong Lee] update

[Heejong Lee] update

[noreply] Reduce log spam of Py37PostCommit (#23829)

[noreply] Actually use the DatsetService that will be auto-closed (#23873)

[noreply] Migrate BINARY, VARBINARY, CHAR, VARCHAR jdbc logical types to portable

[noreply] [BEAM-12164] Feat: Added SpannerChangeStreamIT to Cloud Spanner Change


------------------------------------------
[...truncated 9.80 MB...]

INFO:root:severity: INFO
timestamp {
  seconds: 1666918458
  nanos: 612725496
}
message: "Done consuming work."
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/sdk_worker.py:277"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1666918458
  nanos: 612818956
}
message: "Python sdk harness exiting."
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/sdk_worker_main.py:188"
thread: "MainThread"

INFO:apache_beam.runners.portability.local_job_service:Completed job in 18.41614031791687 seconds with state DONE.
INFO:root:Completed job in 18.41614031791687 seconds with state DONE.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py39:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:35949
INFO:root:Default Python SDK image for environment is apache/beam_python3.9_sdk:2.44.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7f0fb5d6fe50> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7f0fb5d6fee0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7f0fb5d70670> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/runners/spark/3/job-server/build/libs/beam-runners-spark-3-job-server-2.44.0-SNAPSHOT.jar'> '--spark-master-url' 'local[4]' '--artifacts-dir' '/tmp/beam-temppj56rsrx/artifactsttn5z162' '--job-port' '48989' '--artifact-port' '0' '--expansion-port' '0']
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:24 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:25 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:39367
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:25 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:35305
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:25 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on localhost:48989
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:25 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Job server now running, terminate with Ctrl+C
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:25 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for job_836688c1-c85e-4755-ad75-4401c1c3e967.
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:25 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for job_836688c1-c85e-4755-ad75-4401c1c3e967.ref_Environment_default_environment_1.
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:25 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 1 artifacts for job_836688c1-c85e-4755-ad75-4401c1c3e967.null.
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:25 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for job_836688c1-c85e-4755-ad75-4401c1c3e967.
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:26 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1028005426-43dc91ef_84635f28-b98c-4514-8252-4cdb18418b22
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:26 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1028005426-43dc91ef_84635f28-b98c-4514-8252-4cdb18418b22
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for the execution. Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
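
The log's advice above is the standard Beam context-manager pattern; a minimal runnable sketch (pipeline contents are illustrative, not taken from this build):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Using the pipeline as a context manager blocks until the run completes,
    # so LOOPBACK worker components are not torn down while the job is running.
    with beam.Pipeline(options=PipelineOptions()) as p:
        (p
         | beam.Create(['hello', 'world'])  # placeholder input
         | beam.Map(print))                 # placeholder transform
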
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:26 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:26 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:27 INFO org.sparkproject.jetty.util.log: Logging initialized @4987ms to org.sparkproject.jetty.util.log.Slf4jLog
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:27 INFO org.sparkproject.jetty.server.Server: jetty-9.4.40.v20210413; built: 2021-04-13T20:42:42.668Z; git: b881a572662e1943a14ae12e7e1207989f218b74; jvm 1.8.0_342-8u342-b07-0ubuntu1~20.04-b07
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:27 INFO org.sparkproject.jetty.server.Server: Started @5087ms
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:27 INFO org.sparkproject.jetty.server.AbstractConnector: Started ServerConnector@10db330e{HTTP/1.1, (http/1.1)}{127.0.0.1:4040}
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@25d535e{/jobs,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@24aa587e{/jobs/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@754c0793{/jobs/job,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4c8a5c6c{/jobs/job/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4ca8752{/stages,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4f8e8865{/stages/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3bb2f160{/stages/stage,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@48bcf5d0{/stages/stage/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4008e804{/stages/pool,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@423fbcd5{/stages/pool/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@478614bf{/storage,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@32f3401{/storage/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@78dd9ce0{/storage/rdd,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@71602cdb{/storage/rdd/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@20105b1c{/environment,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@77206271{/environment/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@16682b96{/executors,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@514f0765{/executors/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@207d8dd5{/executors/threadDump,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6a992215{/executors/threadDump/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@203f0f65{/static,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@43113b73{/,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6a20db28{/api,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1057549d{/jobs/job/kill,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6a6dd3a0{/stages/stage/kill,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1ae0aba3{/metrics/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:28 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator:
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:28 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:28 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:28 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1028005426-43dc91ef_84635f28-b98c-4514-8252-4cdb18418b22 on Spark master local[4]
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:29 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:29 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:29 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:29 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:40951.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-2
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:33835.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:38187
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:29 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-3
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:30 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:30 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:30 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:30 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-4
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-5
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:30 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:30 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:30 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:30 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-9
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-6
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-8
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-7
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:31 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:31 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:31 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:31 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:31 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:31 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:31 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:31 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:31 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1028005426-43dc91ef_84635f28-b98c-4514-8252-4cdb18418b22: Pipeline translated successfully. Computing outputs
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:31 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:31 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:31 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:31 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.01 seconds.
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:31 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1028005426-43dc91ef_84635f28-b98c-4514-8252-4cdb18418b22 finished.
INFO:apache_beam.utils.subprocess_server:22/10/28 00:54:31 INFO org.sparkproject.jetty.server.AbstractConnector: Stopped Spark@10db330e{HTTP/1.1, (http/1.1)}{127.0.0.1:4040}
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
Exception in thread read_state:
Exception in thread run_worker_1-1:
Traceback (most recent call last):
Traceback (most recent call last):
  File "/usr/lib/python3.9/threading.py", line 973, in _bootstrap_inner
  File "/usr/lib/python3.9/threading.py", line 973, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.9/threading.py", line 910, in run
    self.run()
  File "/usr/lib/python3.9/threading.py", line 910, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 1006, in pull_responses
    self._target(*self._args, **self._kwargs)
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 654, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Connection reset by peer"
	debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B::1%5D:38187 {grpc_message:"Connection reset by peer", grpc_status:14, created_time:"2022-10-28T00:54:31.921794522+00:00"}"
>
    for response in responses:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 254, in run
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 426, in __next__
    for work_request in self._control_stub.Control(get_responses()):
Exception in thread read_grpc_client_inputs:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 426, in __next__
Traceback (most recent call last):
    return self._next()
  File "/usr/lib/python3.9/threading.py", line 973, in _bootstrap_inner
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 826, in _next
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 826, in _next
    self.run()
  File "/usr/lib/python3.9/threading.py", line 910, in run
    raise self
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B::1%5D:33835 {grpc_message:"Socket closed", grpc_status:14, created_time:"2022-10-28T00:54:31.921796625+00:00"}"
>
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B::1%5D:40951 {created_time:"2022-10-28T00:54:31.921825386+00:00", grpc_status:14, grpc_message:"Socket closed"}"
>
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 671, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 654, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Connection reset by peer"
	debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B::1%5D:38187 {grpc_message:"Connection reset by peer", grpc_status:14, created_time:"2022-10-28T00:54:31.921794522+00:00"}"
>

> Task :sdks:python:test-suites:portable:py39:postCommitPy39

> Task :sdks:python:test-suites:dataflow:py39:postCommitIT
warning: sdist: standard file not found: should have one of README, README.rst, README.txt, README.md

warning: check: missing required meta-data: url

warning: check: missing meta-data: either (author and author_email) or (maintainer and maintainer_email) should be supplied


> Task :sdks:python:test-suites:dataflow:py39:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 121

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py39:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 2h 57m 16s
218 actionable tasks: 158 executed, 54 from cache, 6 up-to-date

Publishing build scan...
https://gradle.com/s/yqtw2rj5v6ucc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python39 #1028

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python39/1028/display/redirect?page=changes>


---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python39 #1027

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python39/1027/display/redirect?page=changes>

Changes:

[Alexey Romanenko] [23832] Update CHANGES.md


------------------------------------------
[...truncated 10.40 MB...]
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/tenacity/_asyncio.py:42
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/tenacity/_asyncio.py:42
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/tenacity/_asyncio.py>:42: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
    def call(self, fn, *args, **kwargs):
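
The DeprecationWarning above comes from tenacity's internals, but the migration it points at is generic; a hedged illustration with a hypothetical function (not tenacity's actual code):

    import asyncio

    # Deprecated since Python 3.8 (and removed in 3.11):
    # @asyncio.coroutine
    # def fetch():
    #     yield from asyncio.sleep(1)

    # Native-coroutine replacement suggested by the warning:
    async def fetch():
        await asyncio.sleep(1)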

apache_beam/typehints/pandas_type_compatibility_test.py:66
apache_beam/typehints/pandas_type_compatibility_test.py:66
apache_beam/typehints/pandas_type_compatibility_test.py:66
apache_beam/typehints/pandas_type_compatibility_test.py:66
apache_beam/typehints/pandas_type_compatibility_test.py:66
apache_beam/typehints/pandas_type_compatibility_test.py:66
apache_beam/typehints/pandas_type_compatibility_test.py:66
apache_beam/typehints/pandas_type_compatibility_test.py:66
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:66: FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas in a future version. Use pandas.Index with the appropriate dtype instead.
    }).set_index(pd.Int64Index(range(123, 223), name='an_index')),
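
For reference, the replacement the FutureWarning suggests looks roughly like this (index values copied from the test line above; the dtype choice is an assumption based on the warning text):

    import pandas as pd

    # Deprecated:
    # idx = pd.Int64Index(range(123, 223), name='an_index')

    # Suggested form: plain pd.Index with an explicit integer dtype.
    idx = pd.Index(range(123, 223), dtype='int64', name='an_index')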

apache_beam/typehints/pandas_type_compatibility_test.py:89
apache_beam/typehints/pandas_type_compatibility_test.py:89
apache_beam/typehints/pandas_type_compatibility_test.py:89
apache_beam/typehints/pandas_type_compatibility_test.py:89
apache_beam/typehints/pandas_type_compatibility_test.py:89
apache_beam/typehints/pandas_type_compatibility_test.py:89
apache_beam/typehints/pandas_type_compatibility_test.py:89
apache_beam/typehints/pandas_type_compatibility_test.py:89
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:89: FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas in a future version. Use pandas.Index with the appropriate dtype instead.
    pd.Int64Index(range(123, 223), name='an_index'),

apache_beam/typehints/pandas_type_compatibility_test.py:90
apache_beam/typehints/pandas_type_compatibility_test.py:90
apache_beam/typehints/pandas_type_compatibility_test.py:90
apache_beam/typehints/pandas_type_compatibility_test.py:90
apache_beam/typehints/pandas_type_compatibility_test.py:90
apache_beam/typehints/pandas_type_compatibility_test.py:90
apache_beam/typehints/pandas_type_compatibility_test.py:90
apache_beam/typehints/pandas_type_compatibility_test.py:90
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:90: FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas in a future version. Use pandas.Index with the appropriate dtype instead.
    pd.Int64Index(range(475, 575), name='another_index'),

apache_beam/dataframe/io_it_test.py: 3 warnings
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py: 1 warning
apache_beam/examples/dataframe/flight_delays_it_test.py: 1 warning
apache_beam/io/gcp/big_query_query_to_table_it_test.py: 4 warnings
apache_beam/io/gcp/bigquery_io_read_it_test.py: 2 warnings
apache_beam/io/gcp/bigquery_read_it_test.py: 7 warnings
apache_beam/io/gcp/bigquery_json_it_test.py: 3 warnings
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2485: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = pcoll.pipeline.options.view_as(

apache_beam/dataframe/io_it_test.py: 3 warnings
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py: 1 warning
apache_beam/examples/dataframe/flight_delays_it_test.py: 1 warning
apache_beam/io/gcp/big_query_query_to_table_it_test.py: 4 warnings
apache_beam/io/gcp/bigquery_io_read_it_test.py: 2 warnings
apache_beam/io/gcp/bigquery_read_it_test.py: 7 warnings
apache_beam/io/gcp/bigquery_json_it_test.py: 3 warnings
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2487: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/dataframe/io_it_test.py: 3 warnings
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py: 1 warning
apache_beam/examples/dataframe/flight_delays_it_test.py: 1 warning
apache_beam/io/gcp/big_query_query_to_table_it_test.py: 4 warnings
apache_beam/io/gcp/bigquery_io_read_it_test.py: 2 warnings
apache_beam/io/gcp/bigquery_read_it_test.py: 7 warnings
apache_beam/io/gcp/bigquery_json_it_test.py: 3 warnings
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2511: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    pipeline_options=pcoll.pipeline.options,

apache_beam/examples/complete/game/game_stats_it_test.py::GameStatsIT::test_game_stats_it
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_it
apache_beam/examples/complete/game/leader_board_it_test.py::LeaderBoardIT::test_leader_board_it
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_file_loads
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_streaming_inserts
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    dataset_ref = client.dataset(unique_dataset_name, project=project)
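
The PendingDeprecationWarning above names two alternatives to Client.dataset(); a hedged sketch of both (project and dataset names are placeholders, not values from this build):

    from google.cloud import bigquery

    project = "my-project"              # placeholder
    unique_dataset_name = "my_dataset"  # placeholder
    client = bigquery.Client(project=project)

    # Deprecated:
    # dataset_ref = client.dataset(unique_dataset_name, project=project)

    # Either build an explicit DatasetReference...
    dataset_ref = bigquery.DatasetReference(project, unique_dataset_name)

    # ...or pass a fully qualified "project.dataset" string.
    dataset = client.get_dataset(f"{project}.{unique_dataset_name}")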

apache_beam/examples/complete/game/game_stats_it_test.py: 2 warnings
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py: 1 warning
apache_beam/examples/complete/game/hourly_team_score_it_test.py: 1 warning
apache_beam/examples/complete/game/leader_board_it_test.py: 2 warnings
apache_beam/io/gcp/big_query_query_to_table_it_test.py: 4 warnings
apache_beam/io/gcp/bigquery_write_it_test.py: 4 warnings
apache_beam/io/gcp/bigquery_test.py: 6 warnings
apache_beam/io/gcp/bigquery_json_it_test.py: 2 warnings
apache_beam/io/gcp/bigquery_file_loads_test.py: 3 warnings
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1992: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    is_streaming_pipeline = p.options.view_as(StandardOptions).streaming

apache_beam/examples/complete/game/game_stats_it_test.py: 2 warnings
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py: 1 warning
apache_beam/examples/complete/game/hourly_team_score_it_test.py: 1 warning
apache_beam/examples/complete/game/leader_board_it_test.py: 2 warnings
apache_beam/io/gcp/big_query_query_to_table_it_test.py: 4 warnings
apache_beam/io/gcp/bigquery_write_it_test.py: 4 warnings
apache_beam/io/gcp/bigquery_test.py: 6 warnings
apache_beam/io/gcp/bigquery_json_it_test.py: 2 warnings
apache_beam/io/gcp/bigquery_file_loads_test.py: 3 warnings
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1998: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    experiments = p.options.view_as(DebugOptions).experiments or []

apache_beam/dataframe/io_it_test.py: 2 warnings
apache_beam/io/gcp/bigquery_json_it_test.py: 1 warning
apache_beam/io/gcp/bigquery_read_it_test.py: 9 warnings
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project_id = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/dataframe/io_it_test.py: 2 warnings
apache_beam/io/gcp/bigquery_json_it_test.py: 1 warning
apache_beam/io/gcp/bigquery_read_it_test.py: 9 warnings
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    pipeline_options=pcoll.pipeline.options,

apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py: 1 warning
apache_beam/io/gcp/big_query_query_to_table_it_test.py: 4 warnings
apache_beam/io/gcp/bigquery_write_it_test.py: 4 warnings
apache_beam/io/gcp/bigquery_test.py: 2 warnings
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1988: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py: 1 warning
apache_beam/io/gcp/big_query_query_to_table_it_test.py: 4 warnings
apache_beam/io/gcp/bigquery_write_it_test.py: 3 warnings
apache_beam/io/gcp/bigquery_test.py: 3 warnings
apache_beam/io/gcp/bigquery_file_loads_test.py: 3 warnings
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1133: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.project = self.project or p.options.view_as(GoogleCloudOptions).project

apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py: 1 warning
apache_beam/examples/complete/game/hourly_team_score_it_test.py: 1 warning
apache_beam/io/gcp/big_query_query_to_table_it_test.py: 4 warnings
apache_beam/io/gcp/bigquery_write_it_test.py: 3 warnings
apache_beam/io/gcp/bigquery_test.py: 3 warnings
apache_beam/io/gcp/bigquery_file_loads_test.py: 3 warnings
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1140: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py: 1 warning
apache_beam/examples/complete/game/hourly_team_score_it_test.py: 1 warning
apache_beam/io/gcp/big_query_query_to_table_it_test.py: 4 warnings
apache_beam/io/gcp/bigquery_write_it_test.py: 3 warnings
apache_beam/io/gcp/bigquery_test.py: 3 warnings
apache_beam/io/gcp/bigquery_file_loads_test.py: 3 warnings
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1142: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:47: FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 'numeric_only=None') is deprecated; in a future version this will raise TypeError.  Select only valid columns before calling the reduction.
    return airline_df[at_top_airports].mean()
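
The FutureWarning above asks for an explicit column selection before the reduction; a sketch of what that could look like (the frame below is a placeholder standing in for the DataFrame used by flight_delays.py):

    import pandas as pd

    airline_df = pd.DataFrame({'airline': ['AA', 'UA'], 'delay': [12.0, 3.5]})
    at_top_airports = pd.Series([True, True])  # placeholder boolean mask

    # Reduce only over numeric columns instead of relying on silent dropping.
    means = airline_df[at_top_airports].select_dtypes(include='number').mean()
    # Equivalent: airline_df[at_top_airports].mean(numeric_only=True)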

apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
apache_beam/examples/dataframe/taxiride_it_test.py::TaxirideIT::test_aggregation
apache_beam/examples/dataframe/taxiride_it_test.py::TaxirideIT::test_enrich
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/dataframe/io.py>:659: FutureWarning: WriteToFiles is experimental.
    return pcoll | fileio.WriteToFiles(

apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
apache_beam/examples/dataframe/taxiride_it_test.py::TaxirideIT::test_aggregation
apache_beam/examples/dataframe/taxiride_it_test.py::TaxirideIT::test_enrich
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/fileio.py>:590: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

apache_beam/io/gcp/bigquery_test.py::BigQueryStreamingInsertTransformIntegrationTests::test_multiple_destinations_transform
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1684: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    streaming = self.test_pipeline.options.view_as(StandardOptions).streaming

apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_native_source
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:172: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
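
The BeamDeprecationWarning above names the replacement transform; a minimal sketch of the migration (query string and pipeline setup are illustrative, and running it requires a configured GCP project):

    import apache_beam as beam

    query = "SELECT 1 AS x"  # placeholder query

    with beam.Pipeline() as p:
        # Previously: beam.io.Read(beam.io.BigQuerySource(query=query, use_standard_sql=True))
        rows = p | beam.io.ReadFromBigQuery(query=query, use_standard_sql=True)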

apache_beam/ml/gcp/cloud_dlp_it_test.py::CloudDLPIT::test_deidentification
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:74: FutureWarning: MaskDetectedDetails is experimental.
    | MaskDetectedDetails(

apache_beam/ml/gcp/cloud_dlp_it_test.py::CloudDLPIT::test_inspection
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:85: FutureWarning: InspectForDetails is experimental.
    | InspectForDetails(

apache_beam/ml/inference/sklearn_inference_it_test.py::SklearnInference::test_sklearn_regression
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/inference/sklearn_japanese_housing_regression.py>:128: FutureWarning: SklearnModelHandlerPandas is experimental. No backwards-compatibility guarantees.
    model_loader = SklearnModelHandlerPandas(

apache_beam/ml/inference/sklearn_inference_it_test.py::SklearnInference::test_sklearn_regression
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/dill/_dill.py>:472: FutureWarning: SklearnModelHandlerPandas is experimental. No backwards-compatibility guarantees.
    obj = StockUnpickler.load(self)

apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:695: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:810: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())

apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2658: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2659: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2672: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/pytest_postCommitIT-df-py39.xml> -
=========================== short test summary info ============================
FAILED apache_beam/io/gcp/pubsub_integration_test.py::PubSubIntegrationTest::test_streaming_with_attributes - RuntimeError: Timeout after 600 seconds while waiting for job 2022-10-28_06_37_24-7169339362459240143 enters expected state CANCELLED. Current state is CANCELLING.
===== 1 failed, 85 passed, 17 skipped, 257 warnings in 10958.28s (3:02:38) =====

> Task :sdks:python:test-suites:dataflow:py39:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 121

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py39:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 5m 20s
218 actionable tasks: 148 executed, 64 from cache, 6 up-to-date

Publishing build scan...
https://gradle.com/s/mzsivhe75e37i

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python39 #1026

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python39/1026/display/redirect?page=changes>

Changes:

[noreply] Use --release 8 for builds targeting Java 8 (#23771)


------------------------------------------
[...truncated 263.33 KB...]
  Building wheel for docopt (setup.py): finished with status 'done'
  Created wheel for docopt: filename=docopt-0.6.2-py2.py3-none-any.whl size=13723 sha256=9a69b8b25dace8c74a2afaec17c1e078fd738fe1ead08662864333e563ce5672
  Stored in directory: /root/.cache/pip/wheels/70/4a/46/1309fc853b8d395e60bafaf1b6df7845bdd82c95fd59dd8d2b
  Building wheel for future (setup.py): started
  Building wheel for future (setup.py): finished with status 'done'
  Created wheel for future: filename=future-0.18.2-py3-none-any.whl size=491070 sha256=73b2334030b5768515a04a1856bf83b62bcc612469c473cbb8dadceb92a0c3c3
  Stored in directory: /root/.cache/pip/wheels/2f/a0/d3/4030d9f80e6b3be787f19fc911b8e7aa462986a40ab1e4bb94
  Building wheel for google-apitools (setup.py): started
  Building wheel for google-apitools (setup.py): finished with status 'done'
  Created wheel for google-apitools: filename=google_apitools-0.5.31-py3-none-any.whl size=131039 sha256=564f26e25722b05ea24a4221884bd65b8da1a0622097ad37cf46302dee17b42e
  Stored in directory: /root/.cache/pip/wheels/6c/f8/60/b9e91899dbaf25b6314047d3daee379bdd8d61b1dc3fd5ec7f
  Building wheel for google-cloud-profiler (setup.py): started
  Building wheel for google-cloud-profiler (setup.py): finished with status 'done'
  Created wheel for google-cloud-profiler: filename=google_cloud_profiler-3.1.0-cp39-cp39-linux_x86_64.whl size=733254 sha256=cc9da040b66e463880bff20df8e8109593052cae30183817e03d98d228de3ed1
  Stored in directory: /root/.cache/pip/wheels/c0/34/01/3b7d3ee8ab5f1c1c6dd51a502d9f72ab41333f7230f0564ade
Successfully built bs4 crcmod dill docopt future google-apitools google-cloud-profiler
Installing collected packages: tensorboard-plugin-wit, tenacity, sortedcontainers, requests-mock, pytz, python-snappy, pyasn1-modules, pyasn1, parameterized, oauth2client, nose, msgpack, mock, mmh3, libclang, Keras-Preprocessing, keras, iniconfig, hdfs, google-python-cloud-debugger, google-pasta, google-cloud-profiler, google-auth-httplib2, flatbuffers, docopt, deprecation, crcmod, cffi, bs4, astunparse, zstandard, zipp, wrapt, Werkzeug, websocket-client, urllib3, uritemplate, typing_extensions, tqdm, tomli, threadpoolctl, testcontainers, termcolor, tensorflow-io-gcs-filesystem, tensorflow-estimator, tensorflow, tensorboard-data-server, tensorboard, sqlparse, SQLAlchemy, soupsieve, six, scipy, scikit-learn, rsa, requests-oauthlib, requests, regex, PyYAML, python-dateutil, pytest-xdist, pytest-timeout, pytest-forked, pytest, pyparsing, PyMySQL, pymongo, PyHamcrest, pydot, pycparser, pyarrow, py, psycopg2-binary, protobuf, proto-plus, pluggy, pbr, pandas, packaging, overrides, orjson, opt-einsum, objsize, oauthlib, numpy, nltk, MarkupSafe, Markdown, joblib, importlib-metadata, idna, hypothesis, httplib2, h5py, guppy3, grpcio-status, grpcio, grpc-google-iam-v1, greenlet, googleapis-common-protos, google-resumable-media, google-crc32c, google-cloud-vision, google-cloud-videointelligence, google-cloud-storage, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsublite, google-cloud-pubsub, google-cloud-language, google-cloud-firestore, google-cloud-dlp, google-cloud-datastore, google-cloud-core, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-auth-oauthlib, google-auth, google-apitools, google-api-python-client, google-api-core, gast, future, freezegun, firebase-admin, fasteners, fastavro, execnet, exceptiongroup, docker, dill, Cython, cryptography, cloudpickle, click, charset-normalizer, certifi, cachetools, CacheControl, beautifulsoup4, attrs, absl-py

> Task :sdks:java:container:java8:dockerPrepare

> Task :sdks:python:test-suites:portable:py39:installGcpTest
Collecting grpcio-status>=1.16.0
  Using cached grpcio_status-1.49.1-py3-none-any.whl (14 kB)
  Using cached grpcio_status-1.48.2-py3-none-any.whl (14 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: py in <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages> (from pytest-forked->pytest-xdist<3,>=2.5.0->apache-beam==2.44.0.dev0) (1.11.0)

> Task :sdks:java:io:kinesis:createCheckerFrameworkManifest

> Task :sdks:python:test-suites:direct:py39:hdfsIntegrationTest
Stopping hdfs_it-jenkins-beam_postcommit_python39-1026_namenode_1 ... done

real	3m50.132s
user	0m1.947s
sys	0m0.245s
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python39-1026 --no-ansi down
Removing hdfs_it-jenkins-beam_postcommit_python39-1026_test_1     ... 
Removing hdfs_it-jenkins-beam_postcommit_python39-1026_datanode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python39-1026_namenode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python39-1026_datanode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python39-1026_namenode_1 ... done

> Task :sdks:java:io:kinesis:compileJava FROM-CACHE
> Task :sdks:java:io:kinesis:processResources NO-SOURCE
> Task :sdks:java:io:kinesis:classes UP-TO-DATE
> Task :sdks:java:io:kinesis:jar
> Task :sdks:java:io:kinesis:expansion-service:createCheckerFrameworkManifest
> Task :sdks:java:io:kinesis:expansion-service:compileJava NO-SOURCE
> Task :sdks:java:io:kinesis:expansion-service:processResources NO-SOURCE
> Task :sdks:java:io:kinesis:expansion-service:classes UP-TO-DATE

> Task :sdks:python:test-suites:direct:py39:hdfsIntegrationTest
Removing hdfs_it-jenkins-beam_postcommit_python39-1026_test_1     ... done
Removing network hdfs_it-jenkins-beam_postcommit_python39-1026_test_net

real	0m7.221s
user	0m0.995s
sys	0m0.170s

> Task :sdks:java:container:java8:docker
Sending build context to Docker daemon  209.1MB
Step 1/21 : ARG java_version
Step 2/21 : FROM openjdk:${java_version}-bullseye
 ---> b273004037cc
Step 3/21 : MAINTAINER "Apache Beam <de...@beam.apache.org>"
 ---> Running in d82f65b0a628
Removing intermediate container d82f65b0a628
 ---> bdea6641973f
Step 4/21 : ARG pull_licenses
 ---> Running in 96104d65b21b
Removing intermediate container 96104d65b21b
 ---> 758ff4fbd1bf
Step 5/21 : ADD target/slf4j-api.jar /opt/apache/beam/jars/
 ---> da2067fd4a5e
Step 6/21 : ADD target/slf4j-jdk14.jar /opt/apache/beam/jars/
 ---> 8dee04161a06
Step 7/21 : ADD target/beam-sdks-java-harness.jar /opt/apache/beam/jars/
 ---> d5ffbef84275
Step 8/21 : ADD target/beam-sdks-java-io-kafka.jar /opt/apache/beam/jars/
 ---> 63f5bd064b57
Step 9/21 : ADD target/kafka-clients.jar /opt/apache/beam/jars/
 ---> d5fdd9b64c25
Step 10/21 : COPY target/jamm.jar target/open-module-agent*.jar /opt/apache/beam/jars/
 ---> 01f13ce5d2f6
Step 11/21 : ADD target/linux_amd64/boot /opt/apache/beam/

> Task :sdks:python:test-suites:portable:py39:installGcpTest
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.44.0.dev0-py3-none-any.whl size=2905045 sha256=e7b20ad71efadf0e27691ded80dbc630b4d25e96714ada0d30629d75704f109c
  Stored in directory: /home/jenkins/.cache/pip/wheels/82/26/da/eb911cb16584806815500e63d1228ac5f2e615f5fd6e144bf2
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, iniconfig, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, typing-extensions, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pymysql, pymongo, pyhamcrest, pydot, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, pbr, overrides, orjson, objsize, oauthlib, joblib, jmespath, isodate, idna, httplib2, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dill, cloudpickle, charset-normalizer, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pytest, pandas, oauth2client, mock, hypothesis, grpcio-status, google-resumable-media, google-auth, freezegun, deprecation, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest-timeout, pytest-forked, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-apitools, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-xdist, msrest, google-cloud-core, boto3, apache-beam, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, azure-storage-blob, google-cloud-pubsublite, google-cloud-bigquery

> Task :sdks:java:io:kinesis:expansion-service:shadowJar

> Task :sdks:python:test-suites:direct:py39:setupVirtualenv
Requirement already satisfied: pip in <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/1398941893/lib/python3.9/site-packages> (21.2.4)
Collecting pip
  Using cached pip-22.3-py3-none-any.whl (2.1 MB)
Installing collected packages: pip
  Attempting uninstall: pip
    Found existing installation: pip 21.2.4
    Uninstalling pip-21.2.4:
      Successfully uninstalled pip-21.2.4
Successfully installed pip-22.3
Ignoring grpcio: markers 'sys_platform == "darwin"' don't match your environment
Ignoring protobuf: markers 'python_version == "3.10" and sys_platform == "darwin"' don't match your environment
Collecting tox==3.20.1
  Using cached tox-3.20.1-py2.py3-none-any.whl (83 kB)
Requirement already satisfied: setuptools in <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/1398941893/lib/python3.9/site-packages> (from -r <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/build-requirements.txt> (line 20)) (58.1.0)
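
(Aside: the setupVirtualenv output above amounts to upgrading pip inside the Gradle-managed virtualenv and installing the Python build requirements. A rough local equivalent, assuming a Beam source checkout; the virtualenv location here is illustrative, the real task uses build/gradleenv/... as shown above:)

    # Sketch of the setupVirtualenv steps, not the exact Gradle invocation:
    python3.9 -m venv beam-venv                 # create an isolated environment
    . beam-venv/bin/activate
    pip install --upgrade pip setuptools wheel  # the log shows pip 21.2.4 -> 22.3
    pip install -r sdks/python/build-requirements.txt  # pulls in tox==3.20.1, grpcio-tools, etc.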

> Task :sdks:java:container:java8:docker
 ---> 63f4f639f9e0
Step 12/21 : COPY target/LICENSE /opt/apache/beam/
 ---> d82a6d763aec
Step 13/21 : COPY target/NOTICE /opt/apache/beam/
 ---> 0325aaf3af5b
Step 14/21 : ADD target/third_party_licenses /opt/apache/beam/third_party_licenses/
 ---> 1c58e9c7a309
Step 15/21 : COPY target/LICENSE target/options/* /opt/apache/beam/options/
 ---> 8d062e954abc
Step 16/21 : RUN rm /opt/apache/beam/options/LICENSE
 ---> Running in e12918ded867
Removing intermediate container e12918ded867
 ---> ede4842bf3ef
Step 17/21 : COPY target/LICENSE target/go-licenses/* /opt/apache/beam/third_party_licenses/golang/
 ---> 19d9d2dc4fe5
Step 18/21 : RUN rm /opt/apache/beam/third_party_licenses/golang/LICENSE
 ---> Running in 15f87b26cb44
Removing intermediate container 15f87b26cb44
 ---> 94f0b34bd8ca
Step 19/21 : RUN if [ "${pull_licenses}" = "false" ] ; then     rm -rf /opt/apache/beam/third_party_licenses ;    fi
 ---> Running in a14dc65182d1
Removing intermediate container a14dc65182d1
 ---> 2b052ed8ec8c
Step 20/21 : COPY target/profiler/* /opt/google_cloud_profiler/
 ---> 21723c5092e9
Step 21/21 : ENTRYPOINT ["/opt/apache/beam/boot"]
 ---> Running in 5bcff7339904
Removing intermediate container 5bcff7339904
 ---> d440b1038e55
Successfully built d440b1038e55
Successfully tagged apache/beam_java8_sdk:2.44.0.dev
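
(Aside: the java8 container build above is driven by Gradle, but the numbered Docker steps correspond to an ordinary docker build of the SDK harness image. A hedged sketch of an equivalent manual invocation; the build args come from Steps 1 and 4, the tag from the line above, and the context directory is a hypothetical staging path, not taken from the task definition:)

    # Approximate manual equivalent of :sdks:java:container:java8:docker
    # (the real task stages the target/ directory into the context first):
    docker build \
      --build-arg java_version=8 \
      --build-arg pull_licenses=true \
      -t apache/beam_java8_sdk:2.44.0.dev \
      sdks/java/container/build/docker   # hypothetical staging directory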

> Task :sdks:python:test-suites:direct:py39:setupVirtualenv
Collecting setuptools
  Using cached setuptools-65.5.0-py3-none-any.whl (1.2 MB)
Collecting wheel>=0.36.0
  Using cached wheel-0.37.1-py2.py3-none-any.whl (35 kB)
Collecting grpcio-tools==1.37.0
  Using cached grpcio_tools-1.37.0-cp39-cp39-manylinux2014_x86_64.whl (2.5 MB)
Collecting mypy-protobuf==1.18
  Using cached mypy_protobuf-1.18-py3-none-any.whl (7.3 kB)
Collecting distlib==0.3.1
  Using cached distlib-0.3.1-py2.py3-none-any.whl (335 kB)
Collecting numpy<1.23.0,>=1.14.3
  Using cached numpy-1.22.4-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (16.8 MB)
Collecting packaging>=14
  Using cached packaging-21.3-py3-none-any.whl (40 kB)
Collecting virtualenv!=20.0.0,!=20.0.1,!=20.0.2,!=20.0.3,!=20.0.4,!=20.0.5,!=20.0.6,!=20.0.7,>=16.0.0
  Using cached virtualenv-20.16.6-py3-none-any.whl (8.8 MB)
Collecting pluggy>=0.12.0
  Using cached pluggy-1.0.0-py2.py3-none-any.whl (13 kB)
Collecting toml>=0.9.4
  Using cached toml-0.10.2-py2.py3-none-any.whl (16 kB)
Collecting py>=1.4.17
  Using cached py-1.11.0-py2.py3-none-any.whl (98 kB)
Collecting six>=1.14.0
  Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Collecting filelock>=3.0.0
  Using cached filelock-3.8.0-py3-none-any.whl (10 kB)
Collecting protobuf<4.0dev,>=3.5.0.post1
  Using cached protobuf-3.20.3-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.whl (1.0 MB)
Collecting grpcio>=1.37.0
  Using cached grpcio-1.50.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (4.7 MB)
Collecting pyparsing!=3.0.5,>=2.0.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting virtualenv!=20.0.0,!=20.0.1,!=20.0.2,!=20.0.3,!=20.0.4,!=20.0.5,!=20.0.6,!=20.0.7,>=16.0.0
  Using cached virtualenv-20.16.5-py3-none-any.whl (8.8 MB)
  Using cached virtualenv-20.16.4-py3-none-any.whl (8.8 MB)
  Using cached virtualenv-20.16.3-py2.py3-none-any.whl (8.8 MB)
  Using cached virtualenv-20.16.2-py2.py3-none-any.whl (8.8 MB)
Collecting platformdirs<3,>=2
  Using cached platformdirs-2.5.2-py3-none-any.whl (14 kB)
Installing collected packages: distlib, wheel, toml, six, setuptools, pyparsing, py, protobuf, pluggy, platformdirs, numpy, filelock, virtualenv, packaging, mypy-protobuf, grpcio, tox, grpcio-tools
  Attempting uninstall: setuptools
    Found existing installation: setuptools 58.1.0
    Uninstalling setuptools-58.1.0:
      Successfully uninstalled setuptools-58.1.0

> Task :sdks:python:test-suites:portable:py39:installGcpTest FAILED
Terminated

> Task :sdks:python:test-suites:dataflow:py39:installGcpTest FAILED
Terminated

> Task :sdks:python:container:py39:docker FAILED

> Task :sdks:python:test-suites:direct:py39:setupVirtualenv
Terminated

The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=8b5d16af-1f23-4e74-9d9a-29316031bb03, currentDir=<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src>}
Attempting to read last messages from the daemon log...
Daemon pid: 2960669
  log file: /home/jenkins/.gradle/daemon/7.5.1/daemon-2960669.out.log
----- Last  20 lines from daemon log file - daemon-2960669.out.log -----
Collecting grpcio>=1.37.0
  Using cached grpcio-1.50.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (4.7 MB)
Collecting pyparsing!=3.0.5,>=2.0.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting virtualenv!=20.0.0,!=20.0.1,!=20.0.2,!=20.0.3,!=20.0.4,!=20.0.5,!=20.0.6,!=20.0.7,>=16.0.0
  Using cached virtualenv-20.16.5-py3-none-any.whl (8.8 MB)
  Using cached virtualenv-20.16.4-py3-none-any.whl (8.8 MB)
  Using cached virtualenv-20.16.3-py2.py3-none-any.whl (8.8 MB)
  Using cached virtualenv-20.16.2-py2.py3-none-any.whl (8.8 MB)
Collecting platformdirs<3,>=2
  Using cached platformdirs-2.5.2-py3-none-any.whl (14 kB)
Installing collected packages: distlib, wheel, toml, six, setuptools, pyparsing, py, protobuf, pluggy, platformdirs, numpy, filelock, virtualenv, packaging, mypy-protobuf, grpcio, tox, grpcio-tools
  Attempting uninstall: setuptools
    Found existing installation: setuptools 58.1.0
    Uninstalling setuptools-58.1.0:
      Successfully uninstalled setuptools-58.1.0
Terminated
Terminated
Terminated
Daemon vm is shutting down... The daemon has exited normally or was terminated in response to a user interrupt.
----- End of the daemon log -----
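
(Aside: the same excerpt can be pulled directly from the daemon log that Gradle names above; pid and log path are taken from the preceding lines:)

    # Inspect the crashed daemon's output on the Jenkins worker
    tail -n 20 /home/jenkins/.gradle/daemon/7.5.1/daemon-2960669.out.log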


FAILURE: Build failed with an exception.

* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may have crashed)

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org
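
(Aside: to follow the suggestions above against a local checkout, the failed suites can be re-run through the Gradle wrapper with extra diagnostics. A sketch, with the task name taken from the FAILED lines earlier and ./gradlew assumed to be the wrapper at the repository root:)

    # Re-run one of the failed suites with more logging and a build scan,
    # as Gradle suggests above:
    ./gradlew :sdks:python:test-suites:portable:py39:installGcpTest \
      --stacktrace --info --scan
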
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org