Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2023/10/13 02:13:17 UTC

Build failed in Jenkins: beam_PostCommit_Python39 #2423

See <https://ci-beam.apache.org/job/beam_PostCommit_Python39/2423/display/redirect>

Changes:


------------------------------------------
[...truncated 8.90 MB...]
INFO:root:severity: INFO
timestamp {
  seconds: 1697158015
  nanos: 233019590
}
message: "No more requests from control plane"
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/sdk_worker.py:274"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1697158015
  nanos: 233436107
}
message: "SDK Harness waiting for in-flight requests to complete"
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/sdk_worker.py:275"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1697158015
  nanos: 233537673
}
message: "Closing all cached grpc data channels."
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/data_plane.py:803"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1697158015
  nanos: 233641624
}
message: "Closing all cached gRPC state handlers."
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/sdk_worker.py:904"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1697158015
  nanos: 234145164
}
message: "Done consuming work."
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/sdk_worker.py:287"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1697158015
  nanos: 234383583
}
message: "Python sdk harness exiting."
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/sdk_worker_main.py:211"
thread: "MainThread"

INFO:apache_beam.runners.portability.local_job_service:Completed job in 24.529298067092896 seconds with state DONE.
INFO:root:Completed job in 24.529298067092896 seconds with state DONE.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py39:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:40971
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7faa10eee9d0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7faa10eeea60> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7faa10eef1f0> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/runners/spark/3/job-server/build/libs/beam-runners-spark-3-job-server-2.52.0-SNAPSHOT.jar>' '--spark-master-url' 'local[4]' '--artifacts-dir' '/tmp/beam-tempcuz611xt/artifactscbl5cr0y' '--job-port' '53683' '--artifact-port' '0' '--expansion-port' '0']
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:03 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
WARNING:root:Waiting for grpc channel to be ready at localhost:53683.
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:04 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:39103
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:05 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:36661
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:05 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on localhost:53683
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:05 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Job server now running, terminate with Ctrl+C
WARNING:root:Waiting for grpc channel to be ready at localhost:53683.
WARNING:root:Waiting for grpc channel to be ready at localhost:53683.
WARNING:root:Waiting for grpc channel to be ready at localhost:53683.
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:09 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for job_37ef9978-9bac-4bba-b76c-901eda4307ce.
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:09 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for job_37ef9978-9bac-4bba-b76c-901eda4307ce.ref_Environment_default_environment_1.
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:09 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 1 artifacts for job_37ef9978-9bac-4bba-b76c-901eda4307ce.null.
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:09 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for job_37ef9978-9bac-4bba-b76c-901eda4307ce.
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:09 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1013004709-58e1b2fc_4293e171-cc76-4cc2-bb89-d8a5596d609e
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:09 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1013004709-58e1b2fc_4293e171-cc76-4cc2-bb89-d8a5596d609e
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for the execution. Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
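For reference, the pattern this message recommends, as a minimal sketch (the Create/Map transforms below are illustrative placeholders, not part of this job's pipeline):

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Leaving the "with" block runs the pipeline and blocks until it finishes,
# so the LOOPBACK worker components started for the job are shut down
# before the program exits.
with beam.Pipeline(options=PipelineOptions()) as p:
    (p
     | beam.Create(['to', 'be', 'or', 'not', 'to', 'be'])
     | beam.Map(lambda word: (word, 1))
     | beam.Map(print))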
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:10 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:11 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:13 INFO org.sparkproject.jetty.util.log: Logging initialized @13785ms to org.sparkproject.jetty.util.log.Slf4jLog
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:13 INFO org.sparkproject.jetty.server.Server: jetty-9.4.44.v20210927; built: 2021-09-27T23:02:44.612Z; git: 8da83308eeca865e495e53ef315a249d63ba9332; jvm 1.8.0_382-8u382-ga-1~20.04.1-b05
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:13 INFO org.sparkproject.jetty.server.Server: Started @13948ms
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:13 INFO org.sparkproject.jetty.server.AbstractConnector: Started ServerConnector@63d0acf{HTTP/1.1, (http/1.1)}{127.0.0.1:4040}
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5df93652{/jobs,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4bab1498{/jobs/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@f076530{/jobs/job,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@17707495{/jobs/job/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3961b99a{/stages,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@42599111{/stages/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@b1914f{/stages/stage,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6e81f218{/stages/stage/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1e74d9e1{/stages/pool,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2c86ddbf{/stages/pool/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6fb37b21{/storage,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@f77ca69{/storage/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2cd18508{/storage/rdd,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3c510d8c{/storage/rdd/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@38d94f17{/environment,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@24ea5a7c{/environment/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4da4a985{/executors,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@33448c57{/executors/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5bf7e67e{/executors/threadDump,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@51e9428f{/executors/threadDump/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3a79972c{/static,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@77d5a24c{/,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@24a12477{/api,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4d137749{/jobs/job/kill,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@28487f6f{/stages/stage/kill,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:14 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@526f8897{/metrics/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:14 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:14 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:14 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1013004709-58e1b2fc_4293e171-cc76-4cc2-bb89-d8a5596d609e on Spark master local[4]
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 104857600
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:38359.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-2
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:37751.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:42613
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:16 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-3
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-4
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:17 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-5
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:17 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-6
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:17 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-7
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:17 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-8
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:17 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-9
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:17 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:17 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:17 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:17 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:17 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:18 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:18 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1013004709-58e1b2fc_4293e171-cc76-4cc2-bb89-d8a5596d609e: Pipeline translated successfully. Computing outputs
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:18 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.01 seconds.
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:18 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1013004709-58e1b2fc_4293e171-cc76-4cc2-bb89-d8a5596d609e finished.
INFO:apache_beam.utils.subprocess_server:23/10/13 00:47:18 INFO org.sparkproject.jetty.server.AbstractConnector: Stopped Spark@63d0acf{HTTP/1.1, (http/1.1)}{127.0.0.1:4040}
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.9/threading.py", line 973, in _bootstrap_inner
Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.9/threading.py", line 973, in _bootstrap_inner
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 652, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 541, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 967, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "recvmsg:Connection reset by peer"
	debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B::1%5D:42613 {grpc_message:"recvmsg:Connection reset by peer", grpc_status:14, created_time:"2023-10-13T00:47:18.964166531+00:00"}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.9/threading.py", line 973, in _bootstrap_inner
    self.run()
    self.run()
  File "/usr/lib/python3.9/threading.py", line 910, in run
  File "/usr/lib/python3.9/threading.py", line 910, in run
    self.run()
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 669, in <lambda>
  File "/usr/lib/python3.9/threading.py", line 910, in run
    self._target(*self._args, **self._kwargs)
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 262, in run
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 652, in _read_inputs
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 1035, in pull_responses
    for work_request in self._control_stub.Control(get_responses()):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 541, in __next__
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 541, in __next__
    for response in responses:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 541, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 967, in _next
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 967, in _next
    return self._next()
    raise self
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 967, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B::1%5D:37751 {created_time:"2023-10-13T00:47:18.964231207+00:00", grpc_status:14, grpc_message:"Socket closed"}"
>
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "recvmsg:Connection reset by peer"
	debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B::1%5D:42613 {grpc_message:"recvmsg:Connection reset by peer", grpc_status:14, created_time:"2023-10-13T00:47:18.964166531+00:00"}"
>grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B::1%5D:38359 {created_time:"2023-10-13T00:47:18.964284499+00:00", grpc_status:14, grpc_message:"Socket closed"}"
>


> Task :sdks:python:test-suites:portable:py39:postCommitPy39
> Task :sdks:python:test-suites:dataflow:py39:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 139

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py39:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.

Deprecated Gradle features were used in this build, making it incompatible with Gradle 9.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

For more on this, please refer to https://docs.gradle.org/8.3/userguide/command_line_interface.html#sec:command_line_warnings in the Gradle documentation.

BUILD FAILED in 2h 1m 49s
219 actionable tasks: 156 executed, 59 from cache, 4 up-to-date

Publishing build scan...
https://ge.apache.org/s/t66jlflnefz24

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python39 #2440

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python39/2440/display/redirect?page=changes>




Build failed in Jenkins: beam_PostCommit_Python39 #2439

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python39/2439/display/redirect>

Changes:


------------------------------------------
[...truncated 8.19 MB...]
  nanos: 21591663
}
message: "No more requests from control plane"
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/sdk_worker.py:274"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1697503251
  nanos: 21735429
}
message: "SDK Harness waiting for in-flight requests to complete"
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/sdk_worker.py:275"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1697503251
  nanos: 21806240
}
message: "Closing all cached grpc data channels."
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/data_plane.py:803"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1697503251
  nanos: 21868705
}
message: "Closing all cached gRPC state handlers."
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/sdk_worker.py:904"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1697503251
  nanos: 22056818
}
message: "Done consuming work."
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/sdk_worker.py:287"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1697503251
  nanos: 22117614
}
message: "Python sdk harness exiting."
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/sdk_worker_main.py:211"
thread: "MainThread"


> Task :sdks:python:test-suites:dataflow:py39:postCommitIT

[gw4] PASSED apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it 
apache_beam/io/fileio_test.py::MatchIntegrationTest::test_transform_on_gcs 
> Task :sdks:python:test-suites:portable:py39:portableLocalRunnerTestWithRequirementsFile
16834cd387c1488afd9e0c3aa479cddabda9bd2d99124d37dc2b74f9ae548ff2
INFO:apache_beam.runners.portability.local_job_service:Completed job in 36.58824133872986 seconds with state DONE.
INFO:root:Completed job in 36.58824133872986 seconds with state DONE.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py39:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:33623
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7f3cdefa59d0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7f3cdefa5a60> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7f3cdefa71f0> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/runners/spark/3/job-server/build/libs/beam-runners-spark-3-job-server-2.52.0-SNAPSHOT.jar>' '--spark-master-url' 'local[4]' '--artifacts-dir' '/tmp/beam-tempxr9f0wah/artifactsxlhf6ysu' '--job-port' '41823' '--artifact-port' '0' '--expansion-port' '0']
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:08 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:08 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:44465
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:08 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:43231
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:08 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on localhost:41823
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:08 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Job server now running, terminate with Ctrl+C
WARNING:root:Waiting for grpc channel to be ready at localhost:41823.
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:11 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for job_7b590d0f-ef95-440c-b8b7-0cc1faa1b24c.
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:11 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for job_7b590d0f-ef95-440c-b8b7-0cc1faa1b24c.ref_Environment_default_environment_1.
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:11 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 1 artifacts for job_7b590d0f-ef95-440c-b8b7-0cc1faa1b24c.null.
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:11 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for job_7b590d0f-ef95-440c-b8b7-0cc1faa1b24c.
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:11 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1017004111-760ff90d_e0c6c3c8-d970-4a2e-b62c-e2a783ec60cf
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:11 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1017004111-760ff90d_e0c6c3c8-d970-4a2e-b62c-e2a783ec60cf
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for the execution. Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
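The same guarantee can also be obtained without the context manager by waiting on the pipeline result explicitly; a minimal sketch (transforms again illustrative placeholders):

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

p = beam.Pipeline(options=PipelineOptions())
p | beam.Create([1, 2, 3]) | beam.Map(print)
# Run the pipeline and block until it completes, so LOOPBACK components
# started for the job are not left running when the program exits.
result = p.run()
result.wait_until_finish()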
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:12 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:12 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.util.log: Logging initialized @7562ms to org.sparkproject.jetty.util.log.Slf4jLog
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.Server: jetty-9.4.44.v20210927; built: 2021-09-27T23:02:44.612Z; git: 8da83308eeca865e495e53ef315a249d63ba9332; jvm 1.8.0_382-8u382-ga-1~20.04.1-b05
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.Server: Started @7665ms
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.AbstractConnector: Started ServerConnector@3c4bd5a3{HTTP/1.1, (http/1.1)}{127.0.0.1:4040}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4ffecd4a{/jobs,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2bdbffaf{/jobs/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@18eb7627{/jobs/job,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3bfd4b09{/jobs/job/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4a84996b{/stages,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4831e9ed{/stages/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7ceef8a7{/stages/stage,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6ae34995{/stages/stage/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@62663df0{/stages/pool,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6e7037a9{/stages/pool/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@51666e5d{/storage,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5bde158a{/storage/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@59a617f5{/storage/rdd,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@235cf174{/storage/rdd/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7551a0a7{/environment,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@175c0565{/environment/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7bee8cfd{/executors,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@23c71438{/executors/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3d97ac31{/executors/threadDump,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@48e47770{/executors/threadDump/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@70048b0a{/static,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1ce95677{/,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5183b49c{/api,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@63e31a89{/jobs/job/kill,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4f34ad13{/stages/stage/kill,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1ba4d4e9{/metrics/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:14 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:14 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1017004111-760ff90d_e0c6c3c8-d970-4a2e-b62c-e2a783ec60cf on Spark master local[4]
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 104857600
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:45397.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:15 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:15 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-2
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:37777.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:37319
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:15 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:15 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-3
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-4
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-5
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-8
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-9
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-6
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-7
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:17 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1017004111-760ff90d_e0c6c3c8-d970-4a2e-b62c-e2a783ec60cf: Pipeline translated successfully. Computing outputs
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:17 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.01 seconds.
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:17 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1017004111-760ff90d_e0c6c3c8-d970-4a2e-b62c-e2a783ec60cf finished.
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:17 INFO org.sparkproject.jetty.server.AbstractConnector: Stopped Spark@3c4bd5a3{HTTP/1.1, (http/1.1)}{127.0.0.1:4040}
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.9/threading.py", line 973, in _bootstrap_inner
Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.9/threading.py", line 973, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.9/threading.py", line 910, in run
    self._target(*self._args, **self._kwargs)
    self.run()
  File "/usr/lib/python3.9/threading.py", line 910, in run
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 1035, in pull_responses
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 652, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 541, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 967, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "recvmsg:Connection reset by peer"
	debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B::1%5D:37319 {created_time:"2023-10-17T00:41:17.671956048+00:00", grpc_status:14, grpc_message:"recvmsg:Connection reset by peer"}"
>
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 262, in run
Exception in thread read_grpc_client_inputs:
    for response in responses:
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 541, in __next__
  File "/usr/lib/python3.9/threading.py", line 973, in _bootstrap_inner
    for work_request in self._control_stub.Control(get_responses()):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 541, in __next__
    self.run()
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 967, in _next
  File "/usr/lib/python3.9/threading.py", line 910, in run
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 967, in _next
    self._target(*self._args, **self._kwargs)
    raise self
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 669, in <lambda>
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "recvmsg:Connection reset by peer"
	debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B::1%5D:37777 {created_time:"2023-10-17T00:41:17.671978731+00:00", grpc_status:14, grpc_message:"recvmsg:Connection reset by peer"}"
>
    raise self
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 652, in _read_inputs
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B::1%5D:45397 {grpc_message:"Socket closed", grpc_status:14, created_time:"2023-10-17T00:41:17.672021915+00:00"}"
>
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 541, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 967, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "recvmsg:Connection reset by peer"
	debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B::1%5D:37319 {created_time:"2023-10-17T00:41:17.671956048+00:00", grpc_status:14, grpc_message:"recvmsg:Connection reset by peer"}"
>

> Task :sdks:python:test-suites:portable:py39:postCommitPy39

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/test-suites/direct/common.gradle>' line: 52

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py39:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.

Deprecated Gradle features were used in this build, making it incompatible with Gradle 9.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

For more on this, please refer to https://docs.gradle.org/8.3/userguide/command_line_interface.html#sec:command_line_warnings in the Gradle documentation.

BUILD FAILED in 2h 57m 50s
219 actionable tasks: 158 executed, 57 from cache, 4 up-to-date

Publishing build scan...
https://ge.apache.org/s/hz3rovvvajcsy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python39 #2438

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python39/2438/display/redirect?page=changes>

Changes:

[noreply] Merge pull request #28656: Update Google Cloud Java Libraries BOM from


------------------------------------------
[...truncated 12.17 MB...]
[gw6] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_iobase_source 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source 
[gw6] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries 
[gw6] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadInteractiveRunnerTests::test_read_in_interactive_runner 
[gw6] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadInteractiveRunnerTests::test_read_in_interactive_runner 

=================================== FAILURES ===================================
_______________ JuliaSetTestIT.test_run_example_with_setup_file ________________
[gw2] linux -- Python 3.9.10 <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9>

args = (['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9>', '-m', 'build', '--sdist', '--outdir', '/tmp/tmpkf_rvzzo', ...],)
kwargs = {}

    def check_output(*args, **kwargs):
      if force_shell:
        kwargs['shell'] = True
      try:
>       out = subprocess.check_output(*args, **kwargs)

apache_beam/utils/processes.py:89: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.9/subprocess.py:424: in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

input = None, capture_output = False, timeout = None, check = True
popenargs = (['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9>', '-m', 'build', '--sdist', '--outdir', '/tmp/tmpkf_rvzzo', ...],)
kwargs = {'stdout': -1}
process = <Popen: returncode: 1 args: ['/home/jenkins/jenkins-slave/workspace/beam_Pos...>
stdout = b'', stderr = None, retcode = 1

    def run(*popenargs,
            input=None, capture_output=False, timeout=None, check=False, **kwargs):
        """Run command with arguments and return a CompletedProcess instance.
    
        The returned instance will have attributes args, returncode, stdout and
        stderr. By default, stdout and stderr are not captured, and those attributes
        will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them.
    
        If check is True and the exit code was non-zero, it raises a
        CalledProcessError. The CalledProcessError object will have the return code
        in the returncode attribute, and output & stderr attributes if those streams
        were captured.
    
        If timeout is given, and the process takes too long, a TimeoutExpired
        exception will be raised.
    
        There is an optional argument "input", allowing you to
        pass bytes or a string to the subprocess's stdin.  If you use this argument
        you may not also use the Popen constructor's "stdin" argument, as
        it will be used internally.
    
        By default, all communication is in bytes, and therefore any "input" should
        be bytes, and the stdout and stderr will be bytes. If in text mode, any
        "input" should be a string, and stdout and stderr will be strings decoded
        according to locale encoding, or by "encoding" if set. Text mode is
        triggered by setting any of text, encoding, errors or universal_newlines.
    
        The other arguments are the same as for the Popen constructor.
        """
        if input is not None:
            if kwargs.get('stdin') is not None:
                raise ValueError('stdin and input arguments may not both be used.')
            kwargs['stdin'] = PIPE
    
        if capture_output:
            if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
                raise ValueError('stdout and stderr arguments may not be used '
                                 'with capture_output.')
            kwargs['stdout'] = PIPE
            kwargs['stderr'] = PIPE
    
        with Popen(*popenargs, **kwargs) as process:
            try:
                stdout, stderr = process.communicate(input, timeout=timeout)
            except TimeoutExpired as exc:
                process.kill()
                if _mswindows:
                    # Windows accumulates the output in a single blocking
                    # read() call run on child threads, with the timeout
                    # being done in a join() on those threads.  communicate()
                    # _after_ kill() is required to collect that and add it
                    # to the exception.
                    exc.stdout, exc.stderr = process.communicate()
                else:
                    # POSIX _communicate already populated the output so
                    # far into the TimeoutExpired exception.
                    process.wait()
                raise
            except:  # Including KeyboardInterrupt, communicate handled that.
                process.kill()
                # We don't call process.wait() as .__exit__ does that for us.
                raise
            retcode = process.poll()
            if check and retcode:
>               raise CalledProcessError(retcode, process.args,
                                         output=stdout, stderr=stderr)
E               subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9>', '-m', 'build', '--sdist', '--outdir', '/tmp/tmpkf_rvzzo', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset>']' returned non-zero exit status 1.

/usr/lib/python3.9/subprocess.py:528: CalledProcessError

During handling of the above exception, another exception occurred:

self = <apache_beam.examples.complete.juliaset.juliaset.juliaset_test_it.JuliaSetTestIT testMethod=test_run_example_with_setup_file>

    def test_run_example_with_setup_file(self):
      pipeline = TestPipeline(is_integration_test=True)
      coordinate_output = FileSystems.join(
          pipeline.get_option('output'),
          'juliaset-{}'.format(str(uuid.uuid4())),
          'coordinates.txt')
      extra_args = {
          'coordinate_output': coordinate_output,
          'grid_size': self.GRID_SIZE,
          'setup_file': os.path.normpath(
              os.path.join(os.path.dirname(__file__), '..', 'setup.py')),
          'on_success_matcher': all_of(PipelineStateMatcher(PipelineState.DONE)),
      }
      args = pipeline.get_full_options_as_args(**extra_args)
    
>     juliaset.run(args)

apache_beam/examples/complete/juliaset/juliaset/juliaset_test_it.py:56: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
apache_beam/examples/complete/juliaset/juliaset/juliaset.py:116: in run
    (
apache_beam/pipeline.py:607: in __exit__
    self.result = self.run()
apache_beam/pipeline.py:557: in run
    return Pipeline.from_runner_api(
apache_beam/pipeline.py:584: in run
    return self.runner.run_pipeline(self, self._options)
apache_beam/runners/dataflow/test_dataflow_runner.py:53: in run_pipeline
    self.result = super().run_pipeline(pipeline, options)
apache_beam/runners/dataflow/dataflow_runner.py:393: in run_pipeline
    artifacts = environments.python_sdk_dependencies(options)
apache_beam/transforms/environments.py:846: in python_sdk_dependencies
    return stager.Stager.create_job_resources(
apache_beam/runners/portability/stager.py:281: in create_job_resources
    tarball_file = Stager._build_setup_package(
apache_beam/runners/portability/stager.py:796: in _build_setup_package
    processes.check_output(build_setup_args)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

args = (['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9>', '-m', 'build', '--sdist', '--outdir', '/tmp/tmpkf_rvzzo', ...],)
kwargs = {}

    def check_output(*args, **kwargs):
      if force_shell:
        kwargs['shell'] = True
      try:
        out = subprocess.check_output(*args, **kwargs)
      except OSError:
        raise RuntimeError("Executable {} not found".format(args[0]))
      except subprocess.CalledProcessError as error:
        if isinstance(args, tuple) and (args[0][2] == "pip"):
          raise RuntimeError( \
            "Full traceback: {} \n Pip install failed for package: {} \
            \n Output from execution of subprocess: {}" \
            .format(traceback.format_exc(), args[0][6], error.output))
        else:
>         raise RuntimeError("Full trace: {}, \
             output of the failed child process {} "\
            .format(traceback.format_exc(), error.output))
E         RuntimeError: Full trace: Traceback (most recent call last):
E           File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/utils/processes.py",> line 89, in check_output
E             out = subprocess.check_output(*args, **kwargs)
E           File "/usr/lib/python3.9/subprocess.py", line 424, in check_output
E             return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
E           File "/usr/lib/python3.9/subprocess.py", line 528, in run
E             raise CalledProcessError(retcode, process.args,
E         subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpkf_rvzzo', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']'> returned non-zero exit status 1.
E         ,            output of the failed child process b''

apache_beam/utils/processes.py:99: RuntimeError
------------------------------ Captured log call -------------------------------
INFO     apache_beam.runners.portability.stager:stager.py:762 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmpn0g1_z6q/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', '--implementation', 'cp', '--abi', 'cp39', '--platform', 'manylinux2014_x86_64']
INFO     apache_beam.runners.portability.stager:stager.py:795 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpkf_rvzzo', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']>
=============================== warnings summary ===============================
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_it
apache_beam/examples/complete/game/game_stats_it_test.py::GameStatsIT::test_game_stats_it
apache_beam/examples/complete/game/leader_board_it_test.py::LeaderBoardIT::test_leader_board_it
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_file_loads
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_streaming_inserts
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    dataset_ref = client.dataset(unique_dataset_name, project=project)

apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:47: FutureWarning: The default value of numeric_only in DataFrame.mean is deprecated. In a future version, it will default to False. In addition, specifying 'numeric_only=None' is deprecated. Select only valid columns or specify the value of numeric_only to silence this warning.
    return airline_df[at_top_airports].mean()

apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_native_source
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:170: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:706: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/pytest_postCommitIT-df-py39.xml> -
=========================== short test summary info ============================
FAILED apache_beam/examples/complete/juliaset/juliaset/juliaset_test_it.py::JuliaSetTestIT::test_run_example_with_setup_file - RuntimeError: Full trace: Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/utils/processes.py",> line 89, in check_output
    out = subprocess.check_output(*args, **kwargs)
  File "/usr/lib/python3.9/subprocess.py", line 424, in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
  File "/usr/lib/python3.9/subprocess.py", line 528, in run
    raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpkf_rvzzo', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']'> returned non-zero exit status 1.
,            output of the failed child process b''
====== 1 failed, 87 passed, 50 skipped, 9 warnings in 7894.73s (2:11:34) =======
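
The failing step above is the sdist build of the juliaset example package, and the
exception carries an empty output because only stdout of the child process was piped.
A minimal sketch for reproducing that step locally, assuming a Beam checkout with the
'build' package installed (the interpreter and source paths are placeholders, not the
Jenkins workspace paths from the log):

    import subprocess
    import sys
    import tempfile

    # Assumed location of the example inside a local checkout; the log above builds
    # the same directory from the Jenkins workspace.
    juliaset_dir = "sdks/python/apache_beam/examples/complete/juliaset"

    with tempfile.TemporaryDirectory() as outdir:
        result = subprocess.run(
            [sys.executable, "-m", "build", "--sdist", "--outdir", outdir, juliaset_dir],
            stdout=subprocess.PIPE,
            stderr=subprocess.STDOUT,  # merge stderr so the build backend's error is visible
            text=True,
        )
        print("exit status:", result.returncode)
        print(result.stdout)

Running this against the same source directory should surface the build backend error
that the CI exception hides behind b''.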

> Task :sdks:python:test-suites:dataflow:py39:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 139

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py39:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.

Deprecated Gradle features were used in this build, making it incompatible with Gradle 9.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

For more on this, please refer to https://docs.gradle.org/8.3/userguide/command_line_interface.html#sec:command_line_warnings in the Gradle documentation.

BUILD FAILED in 2h 18m 54s
219 actionable tasks: 159 executed, 56 from cache, 4 up-to-date

Publishing build scan...
https://ge.apache.org/s/lvmosllqebkm2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python39 #2437

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python39/2437/display/redirect>

Changes:


------------------------------------------
[...truncated 12.07 MB...]
[gw3] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_iobase_source 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source 
[gw3] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries 
[gw3] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadInteractiveRunnerTests::test_read_in_interactive_runner 
[gw3] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadInteractiveRunnerTests::test_read_in_interactive_runner 

=================================== FAILURES ===================================
_______________ JuliaSetTestIT.test_run_example_with_setup_file ________________
[gw3] linux -- Python 3.9.10 <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9>

args = (['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpz8_pabuv', ...],)
kwargs = {}

    def check_output(*args, **kwargs):
      if force_shell:
        kwargs['shell'] = True
      try:
>       out = subprocess.check_output(*args, **kwargs)

apache_beam/utils/processes.py:89: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.9/subprocess.py:424: in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

input = None, capture_output = False, timeout = None, check = True
popenargs = (['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpz8_pabuv', ...],)
kwargs = {'stdout': -1}
process = <Popen: returncode: 1 args: ['/home/jenkins/jenkins-slave/workspace/beam_Pos...>
stdout = b'', stderr = None, retcode = 1

    def run(*popenargs,
            input=None, capture_output=False, timeout=None, check=False, **kwargs):
        """Run command with arguments and return a CompletedProcess instance.
    
        The returned instance will have attributes args, returncode, stdout and
        stderr. By default, stdout and stderr are not captured, and those attributes
        will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them.
    
        If check is True and the exit code was non-zero, it raises a
        CalledProcessError. The CalledProcessError object will have the return code
        in the returncode attribute, and output & stderr attributes if those streams
        were captured.
    
        If timeout is given, and the process takes too long, a TimeoutExpired
        exception will be raised.
    
        There is an optional argument "input", allowing you to
        pass bytes or a string to the subprocess's stdin.  If you use this argument
        you may not also use the Popen constructor's "stdin" argument, as
        it will be used internally.
    
        By default, all communication is in bytes, and therefore any "input" should
        be bytes, and the stdout and stderr will be bytes. If in text mode, any
        "input" should be a string, and stdout and stderr will be strings decoded
        according to locale encoding, or by "encoding" if set. Text mode is
        triggered by setting any of text, encoding, errors or universal_newlines.
    
        The other arguments are the same as for the Popen constructor.
        """
        if input is not None:
            if kwargs.get('stdin') is not None:
                raise ValueError('stdin and input arguments may not both be used.')
            kwargs['stdin'] = PIPE
    
        if capture_output:
            if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
                raise ValueError('stdout and stderr arguments may not be used '
                                 'with capture_output.')
            kwargs['stdout'] = PIPE
            kwargs['stderr'] = PIPE
    
        with Popen(*popenargs, **kwargs) as process:
            try:
                stdout, stderr = process.communicate(input, timeout=timeout)
            except TimeoutExpired as exc:
                process.kill()
                if _mswindows:
                    # Windows accumulates the output in a single blocking
                    # read() call run on child threads, with the timeout
                    # being done in a join() on those threads.  communicate()
                    # _after_ kill() is required to collect that and add it
                    # to the exception.
                    exc.stdout, exc.stderr = process.communicate()
                else:
                    # POSIX _communicate already populated the output so
                    # far into the TimeoutExpired exception.
                    process.wait()
                raise
            except:  # Including KeyboardInterrupt, communicate handled that.
                process.kill()
                # We don't call process.wait() as .__exit__ does that for us.
                raise
            retcode = process.poll()
            if check and retcode:
>               raise CalledProcessError(retcode, process.args,
                                         output=stdout, stderr=stderr)
E               subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpz8_pabuv', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']'> returned non-zero exit status 1.

/usr/lib/python3.9/subprocess.py:528: CalledProcessError

During handling of the above exception, another exception occurred:

self = <apache_beam.examples.complete.juliaset.juliaset.juliaset_test_it.JuliaSetTestIT testMethod=test_run_example_with_setup_file>

    def test_run_example_with_setup_file(self):
      pipeline = TestPipeline(is_integration_test=True)
      coordinate_output = FileSystems.join(
          pipeline.get_option('output'),
          'juliaset-{}'.format(str(uuid.uuid4())),
          'coordinates.txt')
      extra_args = {
          'coordinate_output': coordinate_output,
          'grid_size': self.GRID_SIZE,
          'setup_file': os.path.normpath(
              os.path.join(os.path.dirname(__file__), '..', 'setup.py')),
          'on_success_matcher': all_of(PipelineStateMatcher(PipelineState.DONE)),
      }
      args = pipeline.get_full_options_as_args(**extra_args)
    
>     juliaset.run(args)

apache_beam/examples/complete/juliaset/juliaset/juliaset_test_it.py:56: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
apache_beam/examples/complete/juliaset/juliaset/juliaset.py:116: in run
    (
apache_beam/pipeline.py:607: in __exit__
    self.result = self.run()
apache_beam/pipeline.py:557: in run
    return Pipeline.from_runner_api(
apache_beam/pipeline.py:584: in run
    return self.runner.run_pipeline(self, self._options)
apache_beam/runners/dataflow/test_dataflow_runner.py:53: in run_pipeline
    self.result = super().run_pipeline(pipeline, options)
apache_beam/runners/dataflow/dataflow_runner.py:393: in run_pipeline
    artifacts = environments.python_sdk_dependencies(options)
apache_beam/transforms/environments.py:846: in python_sdk_dependencies
    return stager.Stager.create_job_resources(
apache_beam/runners/portability/stager.py:281: in create_job_resources
    tarball_file = Stager._build_setup_package(
apache_beam/runners/portability/stager.py:796: in _build_setup_package
    processes.check_output(build_setup_args)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

args = (['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpz8_pabuv', ...],)
kwargs = {}

    def check_output(*args, **kwargs):
      if force_shell:
        kwargs['shell'] = True
      try:
        out = subprocess.check_output(*args, **kwargs)
      except OSError:
        raise RuntimeError("Executable {} not found".format(args[0]))
      except subprocess.CalledProcessError as error:
        if isinstance(args, tuple) and (args[0][2] == "pip"):
          raise RuntimeError( \
            "Full traceback: {} \n Pip install failed for package: {} \
            \n Output from execution of subprocess: {}" \
            .format(traceback.format_exc(), args[0][6], error.output))
        else:
>         raise RuntimeError("Full trace: {}, \
             output of the failed child process {} "\
            .format(traceback.format_exc(), error.output))
E         RuntimeError: Full trace: Traceback (most recent call last):
E           File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/utils/processes.py",> line 89, in check_output
E             out = subprocess.check_output(*args, **kwargs)
E           File "/usr/lib/python3.9/subprocess.py", line 424, in check_output
E             return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
E           File "/usr/lib/python3.9/subprocess.py", line 528, in run
E             raise CalledProcessError(retcode, process.args,
E         subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpz8_pabuv', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']'> returned non-zero exit status 1.
E         ,            output of the failed child process b''

apache_beam/utils/processes.py:99: RuntimeError
------------------------------ Captured log call -------------------------------
INFO     apache_beam.runners.portability.stager:stager.py:762 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmpn6ierrxt/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', '--implementation', 'cp', '--abi', 'cp39', '--platform', 'manylinux2014_x86_64']
INFO     apache_beam.runners.portability.stager:stager.py:795 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpz8_pabuv', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']>
=============================== warnings summary ===============================
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_it
apache_beam/examples/complete/game/game_stats_it_test.py::GameStatsIT::test_game_stats_it
apache_beam/examples/complete/game/leader_board_it_test.py::LeaderBoardIT::test_leader_board_it
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_file_loads
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_streaming_inserts
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    dataset_ref = client.dataset(unique_dataset_name, project=project)

apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:47: FutureWarning: The default value of numeric_only in DataFrame.mean is deprecated. In a future version, it will default to False. In addition, specifying 'numeric_only=None' is deprecated. Select only valid columns or specify the value of numeric_only to silence this warning.
    return airline_df[at_top_airports].mean()

apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_native_source
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:170: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:706: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/pytest_postCommitIT-df-py39.xml> -
=========================== short test summary info ============================
FAILED apache_beam/examples/complete/juliaset/juliaset/juliaset_test_it.py::JuliaSetTestIT::test_run_example_with_setup_file - RuntimeError: Full trace: Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/utils/processes.py",> line 89, in check_output
    out = subprocess.check_output(*args, **kwargs)
  File "/usr/lib/python3.9/subprocess.py", line 424, in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
  File "/usr/lib/python3.9/subprocess.py", line 528, in run
    raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpz8_pabuv', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']'> returned non-zero exit status 1.
,            output of the failed child process b''
====== 1 failed, 87 passed, 50 skipped, 9 warnings in 7609.58s (2:06:49) =======
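
The b'' output reported here has the same cause: the check_output wrapper shown in the
traceback pipes only stdout, so whatever the failing build wrote to stderr never reaches
the exception. A hedged sketch of a wrapper that also folds stderr into the captured
output (illustrative only, not the actual apache_beam.utils.processes implementation):

    import subprocess
    import traceback

    def check_output_with_stderr(cmd, **kwargs):
        # Capture stderr together with stdout so a failing child process still
        # leaves a readable message in the raised error.
        kwargs.setdefault("stderr", subprocess.STDOUT)
        try:
            return subprocess.check_output(cmd, **kwargs)
        except subprocess.CalledProcessError as error:
            raise RuntimeError(
                "Full trace: {}\nOutput of the failed child process: {}".format(
                    traceback.format_exc(), error.output)) from error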

> Task :sdks:python:test-suites:dataflow:py39:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 139

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py39:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.

Deprecated Gradle features were used in this build, making it incompatible with Gradle 9.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

For more on this, please refer to https://docs.gradle.org/8.3/userguide/command_line_interface.html#sec:command_line_warnings in the Gradle documentation.

BUILD FAILED in 2h 13m 4s
219 actionable tasks: 155 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://ge.apache.org/s/w3czffujkh6y4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python39 #2436

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python39/2436/display/redirect>

Changes:


------------------------------------------
[...truncated 12.12 MB...]
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_iobase_source 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source 
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries 
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadInteractiveRunnerTests::test_read_in_interactive_runner 
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadInteractiveRunnerTests::test_read_in_interactive_runner 

=================================== FAILURES ===================================
_______________ JuliaSetTestIT.test_run_example_with_setup_file ________________
[gw4] linux -- Python 3.9.10 <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9>

args = (['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpdgb86l7u', ...],)
kwargs = {}

    def check_output(*args, **kwargs):
      if force_shell:
        kwargs['shell'] = True
      try:
>       out = subprocess.check_output(*args, **kwargs)

apache_beam/utils/processes.py:89: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.9/subprocess.py:424: in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

input = None, capture_output = False, timeout = None, check = True
popenargs = (['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpdgb86l7u', ...],)
kwargs = {'stdout': -1}
process = <Popen: returncode: 1 args: ['/home/jenkins/jenkins-slave/workspace/beam_Pos...>
stdout = b'', stderr = None, retcode = 1

    def run(*popenargs,
            input=None, capture_output=False, timeout=None, check=False, **kwargs):
        """Run command with arguments and return a CompletedProcess instance.
    
        The returned instance will have attributes args, returncode, stdout and
        stderr. By default, stdout and stderr are not captured, and those attributes
        will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them.
    
        If check is True and the exit code was non-zero, it raises a
        CalledProcessError. The CalledProcessError object will have the return code
        in the returncode attribute, and output & stderr attributes if those streams
        were captured.
    
        If timeout is given, and the process takes too long, a TimeoutExpired
        exception will be raised.
    
        There is an optional argument "input", allowing you to
        pass bytes or a string to the subprocess's stdin.  If you use this argument
        you may not also use the Popen constructor's "stdin" argument, as
        it will be used internally.
    
        By default, all communication is in bytes, and therefore any "input" should
        be bytes, and the stdout and stderr will be bytes. If in text mode, any
        "input" should be a string, and stdout and stderr will be strings decoded
        according to locale encoding, or by "encoding" if set. Text mode is
        triggered by setting any of text, encoding, errors or universal_newlines.
    
        The other arguments are the same as for the Popen constructor.
        """
        if input is not None:
            if kwargs.get('stdin') is not None:
                raise ValueError('stdin and input arguments may not both be used.')
            kwargs['stdin'] = PIPE
    
        if capture_output:
            if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
                raise ValueError('stdout and stderr arguments may not be used '
                                 'with capture_output.')
            kwargs['stdout'] = PIPE
            kwargs['stderr'] = PIPE
    
        with Popen(*popenargs, **kwargs) as process:
            try:
                stdout, stderr = process.communicate(input, timeout=timeout)
            except TimeoutExpired as exc:
                process.kill()
                if _mswindows:
                    # Windows accumulates the output in a single blocking
                    # read() call run on child threads, with the timeout
                    # being done in a join() on those threads.  communicate()
                    # _after_ kill() is required to collect that and add it
                    # to the exception.
                    exc.stdout, exc.stderr = process.communicate()
                else:
                    # POSIX _communicate already populated the output so
                    # far into the TimeoutExpired exception.
                    process.wait()
                raise
            except:  # Including KeyboardInterrupt, communicate handled that.
                process.kill()
                # We don't call process.wait() as .__exit__ does that for us.
                raise
            retcode = process.poll()
            if check and retcode:
>               raise CalledProcessError(retcode, process.args,
                                         output=stdout, stderr=stderr)
E               subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpdgb86l7u', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']'> returned non-zero exit status 1.

/usr/lib/python3.9/subprocess.py:528: CalledProcessError

During handling of the above exception, another exception occurred:

self = <apache_beam.examples.complete.juliaset.juliaset.juliaset_test_it.JuliaSetTestIT testMethod=test_run_example_with_setup_file>

    def test_run_example_with_setup_file(self):
      pipeline = TestPipeline(is_integration_test=True)
      coordinate_output = FileSystems.join(
          pipeline.get_option('output'),
          'juliaset-{}'.format(str(uuid.uuid4())),
          'coordinates.txt')
      extra_args = {
          'coordinate_output': coordinate_output,
          'grid_size': self.GRID_SIZE,
          'setup_file': os.path.normpath(
              os.path.join(os.path.dirname(__file__), '..', 'setup.py')),
          'on_success_matcher': all_of(PipelineStateMatcher(PipelineState.DONE)),
      }
      args = pipeline.get_full_options_as_args(**extra_args)
    
>     juliaset.run(args)

apache_beam/examples/complete/juliaset/juliaset/juliaset_test_it.py:56: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
apache_beam/examples/complete/juliaset/juliaset/juliaset.py:116: in run
    (
apache_beam/pipeline.py:607: in __exit__
    self.result = self.run()
apache_beam/pipeline.py:557: in run
    return Pipeline.from_runner_api(
apache_beam/pipeline.py:584: in run
    return self.runner.run_pipeline(self, self._options)
apache_beam/runners/dataflow/test_dataflow_runner.py:53: in run_pipeline
    self.result = super().run_pipeline(pipeline, options)
apache_beam/runners/dataflow/dataflow_runner.py:393: in run_pipeline
    artifacts = environments.python_sdk_dependencies(options)
apache_beam/transforms/environments.py:846: in python_sdk_dependencies
    return stager.Stager.create_job_resources(
apache_beam/runners/portability/stager.py:281: in create_job_resources
    tarball_file = Stager._build_setup_package(
apache_beam/runners/portability/stager.py:796: in _build_setup_package
    processes.check_output(build_setup_args)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

args = (['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpdgb86l7u', ...],)
kwargs = {}

    def check_output(*args, **kwargs):
      if force_shell:
        kwargs['shell'] = True
      try:
        out = subprocess.check_output(*args, **kwargs)
      except OSError:
        raise RuntimeError("Executable {} not found".format(args[0]))
      except subprocess.CalledProcessError as error:
        if isinstance(args, tuple) and (args[0][2] == "pip"):
          raise RuntimeError( \
            "Full traceback: {} \n Pip install failed for package: {} \
            \n Output from execution of subprocess: {}" \
            .format(traceback.format_exc(), args[0][6], error.output))
        else:
>         raise RuntimeError("Full trace: {}, \
             output of the failed child process {} "\
            .format(traceback.format_exc(), error.output))
E         RuntimeError: Full trace: Traceback (most recent call last):
E           File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/utils/processes.py",> line 89, in check_output
E             out = subprocess.check_output(*args, **kwargs)
E           File "/usr/lib/python3.9/subprocess.py", line 424, in check_output
E             return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
E           File "/usr/lib/python3.9/subprocess.py", line 528, in run
E             raise CalledProcessError(retcode, process.args,
E         subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpdgb86l7u', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']'> returned non-zero exit status 1.
E         ,            output of the failed child process b''

apache_beam/utils/processes.py:99: RuntimeError
------------------------------ Captured log call -------------------------------
INFO     apache_beam.runners.portability.stager:stager.py:762 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmppbe856em/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', '--implementation', 'cp', '--abi', 'cp39', '--platform', 'manylinux2014_x86_64']
INFO     apache_beam.runners.portability.stager:stager.py:795 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpdgb86l7u', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']>
=============================== warnings summary ===============================
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_it
apache_beam/examples/complete/game/game_stats_it_test.py::GameStatsIT::test_game_stats_it
apache_beam/examples/complete/game/leader_board_it_test.py::LeaderBoardIT::test_leader_board_it
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_file_loads
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_streaming_inserts
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    dataset_ref = client.dataset(unique_dataset_name, project=project)

apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:47: FutureWarning: The default value of numeric_only in DataFrame.mean is deprecated. In a future version, it will default to False. In addition, specifying 'numeric_only=None' is deprecated. Select only valid columns or specify the value of numeric_only to silence this warning.
    return airline_df[at_top_airports].mean()

apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_native_source
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:170: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:706: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/pytest_postCommitIT-df-py39.xml> -
=========================== short test summary info ============================
FAILED apache_beam/examples/complete/juliaset/juliaset/juliaset_test_it.py::JuliaSetTestIT::test_run_example_with_setup_file - RuntimeError: Full trace: Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/utils/processes.py",> line 89, in check_output
    out = subprocess.check_output(*args, **kwargs)
  File "/usr/lib/python3.9/subprocess.py", line 424, in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
  File "/usr/lib/python3.9/subprocess.py", line 528, in run
    raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpdgb86l7u', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']'> returned non-zero exit status 1.
,            output of the failed child process b''
====== 1 failed, 87 passed, 50 skipped, 9 warnings in 7023.64s (1:57:03) =======
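
As the quoted subprocess.run source explains, check=True turns a non-zero exit status
into a CalledProcessError, and the exception only carries the streams that were actually
piped. A small standalone demonstration of that behaviour (plain standard library,
unrelated to Beam's code):

    import subprocess
    import sys

    try:
        # The child writes its error to stderr and exits 1, like the failing sdist build.
        subprocess.run(
            [sys.executable, "-c", "import sys; sys.stderr.write('boom\\n'); sys.exit(1)"],
            stdout=subprocess.PIPE,  # only stdout is piped, matching the log above
            check=True,
        )
    except subprocess.CalledProcessError as exc:
        print(exc.returncode)  # 1
        print(exc.output)      # b'' -- stderr was not redirected, so nothing was captured
        print(exc.stderr)      # None

    # Passing stderr=subprocess.PIPE (or capture_output=True) would make exc.stderr
    # contain b'boom\n' instead of None.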

> Task :sdks:python:test-suites:dataflow:py39:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 139

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py39:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.

Deprecated Gradle features were used in this build, making it incompatible with Gradle 9.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

For more on this, please refer to https://docs.gradle.org/8.3/userguide/command_line_interface.html#sec:command_line_warnings in the Gradle documentation.

BUILD FAILED in 2h 3m 1s
219 actionable tasks: 155 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://ge.apache.org/s/eirco74gayf5m

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python39 #2435

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python39/2435/display/redirect>

Changes:


------------------------------------------
[...truncated 1.03 MB...]
    def check_output(*args, **kwargs):
      if force_shell:
        kwargs['shell'] = True
      try:
>       out = subprocess.check_output(*args, **kwargs)

apache_beam/utils/processes.py:89: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.9/subprocess.py:424: in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

input = None, capture_output = False, timeout = None, check = True
popenargs = (['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpqczm4o6o', ...],)
kwargs = {'stdout': -1}
process = <Popen: returncode: 1 args: ['/home/jenkins/jenkins-slave/workspace/beam_Pos...>
stdout = b'', stderr = None, retcode = 1

    def run(*popenargs,
            input=None, capture_output=False, timeout=None, check=False, **kwargs):
        """Run command with arguments and return a CompletedProcess instance.
    
        The returned instance will have attributes args, returncode, stdout and
        stderr. By default, stdout and stderr are not captured, and those attributes
        will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them.
    
        If check is True and the exit code was non-zero, it raises a
        CalledProcessError. The CalledProcessError object will have the return code
        in the returncode attribute, and output & stderr attributes if those streams
        were captured.
    
        If timeout is given, and the process takes too long, a TimeoutExpired
        exception will be raised.
    
        There is an optional argument "input", allowing you to
        pass bytes or a string to the subprocess's stdin.  If you use this argument
        you may not also use the Popen constructor's "stdin" argument, as
        it will be used internally.
    
        By default, all communication is in bytes, and therefore any "input" should
        be bytes, and the stdout and stderr will be bytes. If in text mode, any
        "input" should be a string, and stdout and stderr will be strings decoded
        according to locale encoding, or by "encoding" if set. Text mode is
        triggered by setting any of text, encoding, errors or universal_newlines.
    
        The other arguments are the same as for the Popen constructor.
        """
        if input is not None:
            if kwargs.get('stdin') is not None:
                raise ValueError('stdin and input arguments may not both be used.')
            kwargs['stdin'] = PIPE
    
        if capture_output:
            if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
                raise ValueError('stdout and stderr arguments may not be used '
                                 'with capture_output.')
            kwargs['stdout'] = PIPE
            kwargs['stderr'] = PIPE
    
        with Popen(*popenargs, **kwargs) as process:
            try:
                stdout, stderr = process.communicate(input, timeout=timeout)
            except TimeoutExpired as exc:
                process.kill()
                if _mswindows:
                    # Windows accumulates the output in a single blocking
                    # read() call run on child threads, with the timeout
                    # being done in a join() on those threads.  communicate()
                    # _after_ kill() is required to collect that and add it
                    # to the exception.
                    exc.stdout, exc.stderr = process.communicate()
                else:
                    # POSIX _communicate already populated the output so
                    # far into the TimeoutExpired exception.
                    process.wait()
                raise
            except:  # Including KeyboardInterrupt, communicate handled that.
                process.kill()
                # We don't call process.wait() as .__exit__ does that for us.
                raise
            retcode = process.poll()
            if check and retcode:
>               raise CalledProcessError(retcode, process.args,
                                         output=stdout, stderr=stderr)
E               subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpqczm4o6o', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']'> returned non-zero exit status 1.

/usr/lib/python3.9/subprocess.py:528: CalledProcessError

During handling of the above exception, another exception occurred:

self = <apache_beam.examples.complete.juliaset.juliaset.juliaset_test_it.JuliaSetTestIT testMethod=test_run_example_with_setup_file>

    def test_run_example_with_setup_file(self):
      pipeline = TestPipeline(is_integration_test=True)
      coordinate_output = FileSystems.join(
          pipeline.get_option('output'),
          'juliaset-{}'.format(str(uuid.uuid4())),
          'coordinates.txt')
      extra_args = {
          'coordinate_output': coordinate_output,
          'grid_size': self.GRID_SIZE,
          'setup_file': os.path.normpath(
              os.path.join(os.path.dirname(__file__), '..', 'setup.py')),
          'on_success_matcher': all_of(PipelineStateMatcher(PipelineState.DONE)),
      }
      args = pipeline.get_full_options_as_args(**extra_args)
    
>     juliaset.run(args)

apache_beam/examples/complete/juliaset/juliaset/juliaset_test_it.py:56: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
apache_beam/examples/complete/juliaset/juliaset/juliaset.py:116: in run
    (
apache_beam/pipeline.py:607: in __exit__
    self.result = self.run()
apache_beam/pipeline.py:557: in run
    return Pipeline.from_runner_api(
apache_beam/pipeline.py:584: in run
    return self.runner.run_pipeline(self, self._options)
apache_beam/runners/dataflow/test_dataflow_runner.py:53: in run_pipeline
    self.result = super().run_pipeline(pipeline, options)
apache_beam/runners/dataflow/dataflow_runner.py:393: in run_pipeline
    artifacts = environments.python_sdk_dependencies(options)
apache_beam/transforms/environments.py:846: in python_sdk_dependencies
    return stager.Stager.create_job_resources(
apache_beam/runners/portability/stager.py:281: in create_job_resources
    tarball_file = Stager._build_setup_package(
apache_beam/runners/portability/stager.py:796: in _build_setup_package
    processes.check_output(build_setup_args)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

args = (['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpqczm4o6o', ...],)
kwargs = {}

    def check_output(*args, **kwargs):
      if force_shell:
        kwargs['shell'] = True
      try:
        out = subprocess.check_output(*args, **kwargs)
      except OSError:
        raise RuntimeError("Executable {} not found".format(args[0]))
      except subprocess.CalledProcessError as error:
        if isinstance(args, tuple) and (args[0][2] == "pip"):
          raise RuntimeError( \
            "Full traceback: {} \n Pip install failed for package: {} \
            \n Output from execution of subprocess: {}" \
            .format(traceback.format_exc(), args[0][6], error.output))
        else:
>         raise RuntimeError("Full trace: {}, \
             output of the failed child process {} "\
            .format(traceback.format_exc(), error.output))
E         RuntimeError: Full trace: Traceback (most recent call last):
E           File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/utils/processes.py",> line 89, in check_output
E             out = subprocess.check_output(*args, **kwargs)
E           File "/usr/lib/python3.9/subprocess.py", line 424, in check_output
E             return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
E           File "/usr/lib/python3.9/subprocess.py", line 528, in run
E             raise CalledProcessError(retcode, process.args,
E         subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpqczm4o6o', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']'> returned non-zero exit status 1.
E         ,            output of the failed child process b''

apache_beam/utils/processes.py:99: RuntimeError
------------------------------ Captured log call -------------------------------
INFO     apache_beam.runners.portability.stager:stager.py:762 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmprtu3kw54/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', '--implementation', 'cp', '--abi', 'cp39', '--platform', 'manylinux2014_x86_64']
INFO     apache_beam.runners.portability.stager:stager.py:795 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpqczm4o6o', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']>
=============================== warnings summary ===============================
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_it
apache_beam/examples/complete/game/game_stats_it_test.py::GameStatsIT::test_game_stats_it
apache_beam/examples/complete/game/leader_board_it_test.py::LeaderBoardIT::test_leader_board_it
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_file_loads
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_streaming_inserts
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    dataset_ref = client.dataset(unique_dataset_name, project=project)

apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:47: FutureWarning: The default value of numeric_only in DataFrame.mean is deprecated. In a future version, it will default to False. In addition, specifying 'numeric_only=None' is deprecated. Select only valid columns or specify the value of numeric_only to silence this warning.
    return airline_df[at_top_airports].mean()

apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_native_source
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:170: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:706: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/pytest_postCommitIT-df-py39.xml> -
=========================== short test summary info ============================
FAILED apache_beam/examples/complete/juliaset/juliaset/juliaset_test_it.py::JuliaSetTestIT::test_run_example_with_setup_file - RuntimeError: Full trace: Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/utils/processes.py",> line 89, in check_output
    out = subprocess.check_output(*args, **kwargs)
  File "/usr/lib/python3.9/subprocess.py", line 424, in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
  File "/usr/lib/python3.9/subprocess.py", line 528, in run
    raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpqczm4o6o', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']'> returned non-zero exit status 1.
,            output of the failed child process b''
====== 1 failed, 87 passed, 50 skipped, 9 warnings in 8585.42s (2:23:05) =======

> Task :sdks:python:test-suites:dataflow:py39:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py39:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 139

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py39:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.
==============================================================================

Deprecated Gradle features were used in this build, making it incompatible with Gradle 9.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

For more on this, please refer to https://docs.gradle.org/8.3/userguide/command_line_interface.html#sec:command_line_warnings in the Gradle documentation.

BUILD FAILED in 2h 29m 15s
217 actionable tasks: 153 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://ge.apache.org/s/jl3wplyv7aq2k

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python39 #2434

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python39/2434/display/redirect>

Changes:


------------------------------------------
[...truncated 1.05 MB...]
configfile: pytest.ini
plugins: timeout-2.2.0, xdist-3.3.1, requests-mock-1.11.0, hypothesis-6.87.4
timeout: 4500.0s
timeout method: signal
timeout func_only: False
created: 8/8 workers
8 workers [9 items]

scheduling tests via LoadScheduling

apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_numpy_multi_batch 
apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_numpy_single_batch 
apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_pandas_multi_batch 
apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_pandas_single_batch 
apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_scipy_multi_batch 
apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_datatable_single_batch 
apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_numpy_single_batch_large_model 
apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_datatable_multi_batch 
[gw4] PASSED apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_numpy_single_batch_large_model 
[gw3] PASSED apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_numpy_single_batch 
[gw0] PASSED apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_datatable_single_batch 
[gw6] PASSED apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_pandas_single_batch 
[gw2] PASSED apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_numpy_multi_batch 
[gw7] PASSED apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_scipy_multi_batch 
[gw1] PASSED apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_datatable_multi_batch 
apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_scipy_single_batch 
[gw5] PASSED apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_pandas_multi_batch 
[gw1] PASSED apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_scipy_single_batch Exception ignored in: <function Booster.__del__ at 0x7fe74d4c55e0>
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/1398941893/lib/python3.9/site-packages/xgboost/core.py",> line 1752, in __del__
AttributeError: 'NoneType' object has no attribute 'XGBoosterFree'
Exception ignored in: <function Booster.__del__ at 0x7f8182ed65e0>
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/1398941893/lib/python3.9/site-packages/xgboost/core.py",> line 1752, in __del__
AttributeError: 'NoneType' object has no attribute 'XGBoosterFree'
Exception ignored in: <function Booster.__del__ at 0x7f8045c2d5e0>
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/1398941893/lib/python3.9/site-packages/xgboost/core.py",> line 1752, in __del__
AttributeError: 'NoneType' object has no attribute 'XGBoosterFree'
Exception ignored in: <function Booster.__del__ at 0x7f257ac555e0>
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/1398941893/lib/python3.9/site-packages/xgboost/core.py",> line 1752, in __del__
AttributeError: 'NoneType' object has no attribute 'XGBoosterFree'
Exception ignored in: <function Booster.__del__ at 0x7f80e869a5e0>
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/1398941893/lib/python3.9/site-packages/xgboost/core.py",> line 1752, in __del__
AttributeError: 'NoneType' object has no attribute 'XGBoosterFree'
Exception ignored in: <function Booster.__del__ at 0x7faf0582f5e0>
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/1398941893/lib/python3.9/site-packages/xgboost/core.py",> line 1752, in __del__
AttributeError: 'NoneType' object has no attribute 'XGBoosterFree'
Exception ignored in: <function Booster.__del__ at 0x7f9f0603a5e0>
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/1398941893/lib/python3.9/site-packages/xgboost/core.py",> line 1752, in __del__
AttributeError: 'NoneType' object has no attribute 'XGBoosterFree'
Exception ignored in: <function Booster.__del__ at 0x7f70509fe5e0>
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/1398941893/lib/python3.9/site-packages/xgboost/core.py",> line 1752, in __del__
AttributeError: 'NoneType' object has no attribute 'XGBoosterFree'


=============================== warnings summary ===============================
../../build/gradleenv/1398941893/lib/python3.9/site-packages/tensorflow/python/framework/dtypes.py:35
../../build/gradleenv/1398941893/lib/python3.9/site-packages/tensorflow/python/framework/dtypes.py:35
../../build/gradleenv/1398941893/lib/python3.9/site-packages/tensorflow/python/framework/dtypes.py:35
../../build/gradleenv/1398941893/lib/python3.9/site-packages/tensorflow/python/framework/dtypes.py:35
../../build/gradleenv/1398941893/lib/python3.9/site-packages/tensorflow/python/framework/dtypes.py:35
../../build/gradleenv/1398941893/lib/python3.9/site-packages/tensorflow/python/framework/dtypes.py:35
../../build/gradleenv/1398941893/lib/python3.9/site-packages/tensorflow/python/framework/dtypes.py:35
../../build/gradleenv/1398941893/lib/python3.9/site-packages/tensorflow/python/framework/dtypes.py:35
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/1398941893/lib/python3.9/site-packages/tensorflow/python/framework/dtypes.py>:35: DeprecationWarning: ml_dtypes.float8_e4m3b11 is deprecated. Use ml_dtypes.float8_e4m3b11fnuz
    from tensorflow.tsl.python.lib.core import pywrap_ml_dtypes

apache_beam/ml/inference/vertex_ai_inference_it_test.py:68
apache_beam/ml/inference/vertex_ai_inference_it_test.py:68
apache_beam/ml/inference/vertex_ai_inference_it_test.py:68
apache_beam/ml/inference/vertex_ai_inference_it_test.py:68
apache_beam/ml/inference/vertex_ai_inference_it_test.py:68
apache_beam/ml/inference/vertex_ai_inference_it_test.py:68
apache_beam/ml/inference/vertex_ai_inference_it_test.py:68
apache_beam/ml/inference/vertex_ai_inference_it_test.py:68
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/ml/inference/vertex_ai_inference_it_test.py>:68: PytestUnknownMarkWarning: Unknown pytest.mark.uses_vertex_ai - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    @pytest.mark.uses_vertex_ai

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
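
The PytestUnknownMarkWarning above means the uses_vertex_ai marker is not registered with pytest. Marker registration normally lives in the project's pytest.ini; as a self-contained illustration, the same effect can be had from a conftest.py hook, sketched below (the marker description text is made up for the example).

    # conftest.py -- registers the custom mark so pytest stops warning about it.
    # Sketch only; a real project would more likely list markers in pytest.ini.
    def pytest_configure(config):
        config.addinivalue_line(
            'markers',
            'uses_vertex_ai: integration tests that call Vertex AI for inference')
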
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/pytest_postCommitIT-direct-py39.xml> -
================= 9 passed, 10 skipped, 16 warnings in 21.43s ==================

> Task :sdks:python:test-suites:direct:py39:inferencePostCommitIT

> Task :sdks:python:test-suites:direct:py39:postCommitIT
>>> RUNNING integration tests with pipeline options: --runner=TestDirectRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --numprocesses=8 --timeout=4500 --color=yes --log-cli-level=INFO apache_beam/examples/wordcount_it_test.py::WordCountIT::test_wordcount_it apache_beam/io/gcp/pubsub_integration_test.py::PubSubIntegrationTest apache_beam/io/gcp/big_query_query_to_table_it_test.py::BigQueryQueryToTableIT apache_beam/io/gcp/bigquery_io_read_it_test.py apache_beam/io/gcp/bigquery_read_it_test.py apache_beam/io/gcp/bigquery_write_it_test.py apache_beam/io/gcp/datastore/v1new/datastore_write_it_test.py
>>>   collect markers: 
============================= test session starts ==============================
platform linux -- Python 3.9.10, pytest-7.4.2, pluggy-1.3.0
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python>
configfile: pytest.ini
plugins: timeout-2.2.0, xdist-3.3.1, requests-mock-1.11.0, hypothesis-6.87.4
timeout: 4500.0s
timeout method: signal
timeout func_only: False
created: 8/8 workers
8 workers [38 items]

scheduling tests via LoadScheduling

apache_beam/examples/wordcount_it_test.py::WordCountIT::test_wordcount_it 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source 
apache_beam/io/gcp/pubsub_integration_test.py::PubSubIntegrationTest::test_streaming_with_attributes 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_table_schema_retrieve_specifying_only_table 
apache_beam/io/gcp/bigquery_io_read_it_test.py::BigqueryIOReadIT::test_bigquery_read_custom_1M_python 
apache_beam/io/gcp/big_query_query_to_table_it_test.py::BigQueryQueryToTableIT::test_big_query_new_types 
apache_beam/io/gcp/big_query_query_to_table_it_test.py::BigQueryQueryToTableIT::test_big_query_standard_sql 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_native_source 
[gw4] PASSED apache_beam/io/gcp/bigquery_io_read_it_test.py::BigqueryIOReadIT::test_bigquery_read_custom_1M_python 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_iobase_source 
[gw5] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_table_schema_retrieve_specifying_only_table 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_table_schema_retrieve_with_direct_read 
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_column_selection 
[gw5] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_table_schema_retrieve_with_direct_read 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_column_selection_and_row_restriction_rows 
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_column_selection 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_native_datetime 
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_native_datetime 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_query_and_filters 
[gw6] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_native_source 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_table_schema_retrieve 
[gw5] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_column_selection_and_row_restriction_rows 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_query 
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_query_and_filters 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_row_restriction 
[gw4] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_iobase_source 
[gw6] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_table_schema_retrieve 
[gw0] PASSED apache_beam/examples/wordcount_it_test.py::WordCountIT::test_wordcount_it 
apache_beam/io/gcp/pubsub_integration_test.py::PubSubIntegrationTest::test_streaming_data_only 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_column_selection_and_row_restriction 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_very_selective_filters 
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_row_restriction 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source 
[gw3] PASSED apache_beam/io/gcp/big_query_query_to_table_it_test.py::BigQueryQueryToTableIT::test_big_query_standard_sql 
apache_beam/io/gcp/bigquery_io_read_it_test.py::BigqueryIOReadIT::test_bigquery_read_1M_python 
[gw2] PASSED apache_beam/io/gcp/big_query_query_to_table_it_test.py::BigQueryQueryToTableIT::test_big_query_new_types 
apache_beam/io/gcp/big_query_query_to_table_it_test.py::BigQueryQueryToTableIT::test_big_query_new_types_avro 
[gw5] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_query 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_iobase_source 
[gw4] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_column_selection_and_row_restriction 
[gw6] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_very_selective_filters 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadInteractiveRunnerTests::test_read_in_interactive_runner 
apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write 
[gw3] PASSED apache_beam/io/gcp/bigquery_io_read_it_test.py::BigqueryIOReadIT::test_bigquery_read_1M_python 
apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write_insert_non_transient_api_call_error 
[gw3] PASSED apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write_insert_non_transient_api_call_error 
apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write_temp_table_append_schema_update_1 
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source 
apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write_insert_errors_reporting 
[gw7] PASSED apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write_insert_errors_reporting 
apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write_without_schema 
[gw5] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_iobase_source 
apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write_schema_autodetect 
[gw6] PASSED apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write 
apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write_temp_table_append_schema_update_0 
[gw4] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadInteractiveRunnerTests::test_read_in_interactive_runner 
apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write_temp_table_append_schema_update 
[gw4] PASSED apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write_temp_table_append_schema_update 
[gw2] PASSED apache_beam/io/gcp/big_query_query_to_table_it_test.py::BigQueryQueryToTableIT::test_big_query_new_types_avro 
apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write_new_types 
[gw7] PASSED apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write_without_schema 
apache_beam/io/gcp/datastore/v1new/datastore_write_it_test.py::DatastoreWriteIT::test_datastore_write_limit 
[gw5] PASSED apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write_schema_autodetect 
[gw1] PASSED apache_beam/io/gcp/pubsub_integration_test.py::PubSubIntegrationTest::test_streaming_with_attributes 
apache_beam/io/gcp/big_query_query_to_table_it_test.py::BigQueryQueryToTableIT::test_big_query_legacy_sql 
[gw2] PASSED apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write_new_types 
[gw7] PASSED apache_beam/io/gcp/datastore/v1new/datastore_write_it_test.py::DatastoreWriteIT::test_datastore_write_limit 
[gw0] PASSED apache_beam/io/gcp/pubsub_integration_test.py::PubSubIntegrationTest::test_streaming_data_only 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries 
[gw3] PASSED apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write_temp_table_append_schema_update_1 
apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write_temp_table_append_schema_update_2 
[gw1] PASSED apache_beam/io/gcp/big_query_query_to_table_it_test.py::BigQueryQueryToTableIT::test_big_query_legacy_sql 
[gw6] PASSED apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write_temp_table_append_schema_update_0 
[gw0] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries 
[gw3] PASSED apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write_temp_table_append_schema_update_2 

=============================== warnings summary ===============================
apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_native_source
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:170: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:706: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/datastore/v1new/datastore_write_it_test.py::DatastoreWriteIT::test_datastore_write_limit
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1new/datastoreio.py>:250: UserWarning: Detected filter using positional arguments. Prefer using the 'filter' keyword argument instead.
    query.add_filter('kind_name', '=', kind_name)

apache_beam/io/gcp/datastore/v1new/datastore_write_it_test.py::DatastoreWriteIT::test_datastore_write_limit
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1new/datastoreio.py>:251: UserWarning: Detected filter using positional arguments. Prefer using the 'filter' keyword argument instead.
    query.add_filter('timestamp', '=', latest_timestamp)

apache_beam/io/gcp/datastore/v1new/datastore_write_it_test.py::DatastoreWriteIT::test_datastore_write_limit
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/1398941893/lib/python3.9/site-packages/google/cloud/datastore/query.py>:234: UserWarning: Detected filter using positional arguments. Prefer using the 'filter' keyword argument instead.
    self.add_filter(property_name, operator, value)

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
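
The three UserWarnings above are raised by google-cloud-datastore when add_filter is called with positional arguments. A minimal sketch of the keyword form the warning asks for is below; it assumes a recent google-cloud-datastore release that exposes PropertyFilter, and the project and kind names are placeholders.

    # Sketch of the keyword-style filter recommended by the UserWarning.
    # Assumes a google-cloud-datastore release that provides PropertyFilter.
    from google.cloud import datastore
    from google.cloud.datastore.query import PropertyFilter

    client = datastore.Client(project='my-gcp-project')  # hypothetical project
    query = client.query(kind='written_entities')        # hypothetical kind

    # Positional form (warns):  query.add_filter('kind_name', '=', 'some_kind')
    # Keyword form (preferred):
    query.add_filter(filter=PropertyFilter('kind_name', '=', 'some_kind'))
    results = list(query.fetch(limit=10))
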
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/pytest_postCommitIT-direct-py39.xml> -
================== 38 passed, 5 warnings in 143.44s (0:02:23) ==================

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py39:setupVirtualenv'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py39:setupVirtualenv'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.
==============================================================================

Deprecated Gradle features were used in this build, making it incompatible with Gradle 9.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

For more on this, please refer to https://docs.gradle.org/8.3/userguide/command_line_interface.html#sec:command_line_warnings in the Gradle documentation.

BUILD FAILED in 23m 24s
211 actionable tasks: 147 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://ge.apache.org/s/47dmjq4njbl2w

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python39 #2433

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python39/2433/display/redirect>

Changes:


------------------------------------------
[...truncated 9.39 MB...]
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/sdk_worker.py:275"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1697373651
  nanos: 424649953
}
message: "Closing all cached grpc data channels."
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/data_plane.py:803"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1697373651
  nanos: 424743652
}
message: "Closing all cached gRPC state handlers."
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/sdk_worker.py:904"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1697373651
  nanos: 425013065
}
message: "Done consuming work."
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/sdk_worker.py:287"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1697373651
  nanos: 425153255
}
message: "Python sdk harness exiting."
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/sdk_worker_main.py:211"
thread: "MainThread"

a6da070e4b7d3379a3bc32b67a1e4c146d3a0b367795911258c68b2094f95e45
INFO:apache_beam.runners.portability.local_job_service:Completed job in 20.181049823760986 seconds with state DONE.
INFO:root:Completed job in 20.181049823760986 seconds with state DONE.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py39:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:32789
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7f0195fe49d0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7f0195fe4a60> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7f0195fe51f0> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/runners/spark/3/job-server/build/libs/beam-runners-spark-3-job-server-2.52.0-SNAPSHOT.jar'> '--spark-master-url' 'local[4]' '--artifacts-dir' '/tmp/beam-tempm53dhpv4/artifactsyocb7sg_' '--job-port' '48547' '--artifact-port' '0' '--expansion-port' '0']
INFO:apache_beam.utils.subprocess_server:23/10/15 12:40:57 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:23/10/15 12:40:58 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:44029
INFO:apache_beam.utils.subprocess_server:23/10/15 12:40:58 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:34111
INFO:apache_beam.utils.subprocess_server:23/10/15 12:40:58 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on localhost:48547
INFO:apache_beam.utils.subprocess_server:23/10/15 12:40:58 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Job server now running, terminate with Ctrl+C
INFO:apache_beam.utils.subprocess_server:23/10/15 12:40:58 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for job_26490f36-690c-4a73-b8b8-2246e20d614f.
INFO:apache_beam.utils.subprocess_server:23/10/15 12:40:58 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for job_26490f36-690c-4a73-b8b8-2246e20d614f.ref_Environment_default_environment_1.
INFO:apache_beam.utils.subprocess_server:23/10/15 12:40:58 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 1 artifacts for job_26490f36-690c-4a73-b8b8-2246e20d614f.null.
INFO:apache_beam.utils.subprocess_server:23/10/15 12:40:58 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for job_26490f36-690c-4a73-b8b8-2246e20d614f.
INFO:apache_beam.utils.subprocess_server:23/10/15 12:40:58 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1015124058-ed1c1e5a_60a89cc6-b75b-4496-b78f-1c52335d3f90
INFO:apache_beam.utils.subprocess_server:23/10/15 12:40:59 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1015124058-ed1c1e5a_60a89cc6-b75b-4496-b78f-1c52335d3f90
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for the execution. Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
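
The LOOPBACK note above is the portable runner reminding callers to keep the pipeline inside a with-block so the worker it started is shut down when the pipeline finishes. A minimal, self-contained sketch of that pattern (placeholder data, default options) looks like this:

    # Sketch of the context-manager pattern the message refers to; exiting the
    # with-block waits for the pipeline to finish before the program exits.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions()  # e.g. --runner=PortableRunner --environment_type=LOOPBACK
    with beam.Pipeline(options=options) as p:
        (p
         | beam.Create(['to be or not to be'])
         | beam.FlatMap(str.split)
         | beam.combiners.Count.PerElement()
         | beam.Map(print))
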
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:apache_beam.utils.subprocess_server:23/10/15 12:40:59 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
INFO:apache_beam.utils.subprocess_server:23/10/15 12:40:59 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.util.log: Logging initialized @5024ms to org.sparkproject.jetty.util.log.Slf4jLog
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.Server: jetty-9.4.44.v20210927; built: 2021-09-27T23:02:44.612Z; git: 8da83308eeca865e495e53ef315a249d63ba9332; jvm 1.8.0_382-8u382-ga-1~20.04.1-b05
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.Server: Started @5128ms
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.AbstractConnector: Started ServerConnector@67368d56{HTTP/1.1, (http/1.1)}{127.0.0.1:4040}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7e728999{/jobs,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@78e909e2{/jobs/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@70fe89ce{/jobs/job,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2c7e8d42{/jobs/job/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6d7e1195{/stages,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@75324861{/stages/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@16e64871{/stages/stage,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4d2fe779{/stages/stage/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1b2b6f4a{/stages/pool,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@34c1e13f{/stages/pool/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@ac55d4a{/storage,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@93ccbaa{/storage/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@425f61f7{/storage/rdd,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@67328ffb{/storage/rdd/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3e942a39{/environment,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@26db2b3a{/environment/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5309903b{/executors,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5034b133{/executors/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1e5225ea{/executors/threadDump,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@a5a09c6{/executors/threadDump/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2bee815d{/static,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@286183a{/,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@571d0664{/api,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@660e75cb{/jobs/job/kill,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2893ae7d{/stages/stage/kill,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:01 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@9b1a25{/metrics/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:01 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:01 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:01 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1015124058-ed1c1e5a_60a89cc6-b75b-4496-b78f-1c52335d3f90 on Spark master local[4]
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 104857600
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:37851.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-2
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:37111.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:36325
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:02 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-3
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-4
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-5
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-7
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-6
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-9
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-8
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:03 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1015124058-ed1c1e5a_60a89cc6-b75b-4496-b78f-1c52335d3f90: Pipeline translated successfully. Computing outputs
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.01 seconds.
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:04 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1015124058-ed1c1e5a_60a89cc6-b75b-4496-b78f-1c52335d3f90 finished.
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:04 INFO org.sparkproject.jetty.server.AbstractConnector: Stopped Spark@67368d56{HTTP/1.1, (http/1.1)}{127.0.0.1:4040}
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.9/threading.py", line 973, in _bootstrap_inner
Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.9/threading.py", line 973, in _bootstrap_inner
    self.run()
    self.run()
  File "/usr/lib/python3.9/threading.py", line 910, in run
  File "/usr/lib/python3.9/threading.py", line 910, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 1035, in pull_responses
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 262, in run
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 652, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 541, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 967, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B::1%5D:36325 {created_time:"2023-10-15T12:41:04.625958996+00:00", grpc_status:14, grpc_message:"Socket closed"}"
>
    for response in responses:
    for work_request in self._control_stub.Control(get_responses()):
Exception in thread read_grpc_client_inputs:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 541, in __next__
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 541, in __next__
  File "/usr/lib/python3.9/threading.py", line 973, in _bootstrap_inner
    return self._next()
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 967, in _next
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 967, in _next
    self.run()
  File "/usr/lib/python3.9/threading.py", line 910, in run
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "recvmsg:Connection reset by peer"
	debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B::1%5D:37111 {grpc_message:"recvmsg:Connection reset by peer", grpc_status:14, created_time:"2023-10-15T12:41:04.625974273+00:00"}"
>
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 669, in <lambda>
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "recvmsg:Connection reset by peer"
	debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B::1%5D:37851 {created_time:"2023-10-15T12:41:04.6259779+00:00", grpc_status:14, grpc_message:"recvmsg:Connection reset by peer"}"
>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 652, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 541, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 967, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B::1%5D:36325 {created_time:"2023-10-15T12:41:04.625958996+00:00", grpc_status:14, grpc_message:"Socket closed"}"
>

> Task :sdks:python:test-suites:portable:py39:postCommitPy39
> Task :sdks:python:test-suites:dataflow:py39:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/test-suites/direct/common.gradle'> line: 52

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py39:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 139

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py39:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.
==============================================================================

Deprecated Gradle features were used in this build, making it incompatible with Gradle 9.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

For more on this, please refer to https://docs.gradle.org/8.3/userguide/command_line_interface.html#sec:command_line_warnings in the Gradle documentation.

BUILD FAILED in 2h 3m 8s
219 actionable tasks: 155 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://ge.apache.org/s/cihsejzvzwfga

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python39 #2432

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python39/2432/display/redirect>

Changes:


------------------------------------------
[...truncated 12.23 MB...]
[gw4] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_iobase_source 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source 
[gw4] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries 
[gw4] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadInteractiveRunnerTests::test_read_in_interactive_runner 
[gw4] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadInteractiveRunnerTests::test_read_in_interactive_runner 

=================================== FAILURES ===================================
_______________ JuliaSetTestIT.test_run_example_with_setup_file ________________
[gw4] linux -- Python 3.9.10 <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9>

args = (['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpq01hq2mj', ...],)
kwargs = {}

    def check_output(*args, **kwargs):
      if force_shell:
        kwargs['shell'] = True
      try:
>       out = subprocess.check_output(*args, **kwargs)

apache_beam/utils/processes.py:89: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.9/subprocess.py:424: in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

input = None, capture_output = False, timeout = None, check = True
popenargs = (['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpq01hq2mj', ...],)
kwargs = {'stdout': -1}
process = <Popen: returncode: 1 args: ['/home/jenkins/jenkins-slave/workspace/beam_Pos...>
stdout = b'', stderr = None, retcode = 1

    def run(*popenargs,
            input=None, capture_output=False, timeout=None, check=False, **kwargs):
        """Run command with arguments and return a CompletedProcess instance.
    
        The returned instance will have attributes args, returncode, stdout and
        stderr. By default, stdout and stderr are not captured, and those attributes
        will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them.
    
        If check is True and the exit code was non-zero, it raises a
        CalledProcessError. The CalledProcessError object will have the return code
        in the returncode attribute, and output & stderr attributes if those streams
        were captured.
    
        If timeout is given, and the process takes too long, a TimeoutExpired
        exception will be raised.
    
        There is an optional argument "input", allowing you to
        pass bytes or a string to the subprocess's stdin.  If you use this argument
        you may not also use the Popen constructor's "stdin" argument, as
        it will be used internally.
    
        By default, all communication is in bytes, and therefore any "input" should
        be bytes, and the stdout and stderr will be bytes. If in text mode, any
        "input" should be a string, and stdout and stderr will be strings decoded
        according to locale encoding, or by "encoding" if set. Text mode is
        triggered by setting any of text, encoding, errors or universal_newlines.
    
        The other arguments are the same as for the Popen constructor.
        """
        if input is not None:
            if kwargs.get('stdin') is not None:
                raise ValueError('stdin and input arguments may not both be used.')
            kwargs['stdin'] = PIPE
    
        if capture_output:
            if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
                raise ValueError('stdout and stderr arguments may not be used '
                                 'with capture_output.')
            kwargs['stdout'] = PIPE
            kwargs['stderr'] = PIPE
    
        with Popen(*popenargs, **kwargs) as process:
            try:
                stdout, stderr = process.communicate(input, timeout=timeout)
            except TimeoutExpired as exc:
                process.kill()
                if _mswindows:
                    # Windows accumulates the output in a single blocking
                    # read() call run on child threads, with the timeout
                    # being done in a join() on those threads.  communicate()
                    # _after_ kill() is required to collect that and add it
                    # to the exception.
                    exc.stdout, exc.stderr = process.communicate()
                else:
                    # POSIX _communicate already populated the output so
                    # far into the TimeoutExpired exception.
                    process.wait()
                raise
            except:  # Including KeyboardInterrupt, communicate handled that.
                process.kill()
                # We don't call process.wait() as .__exit__ does that for us.
                raise
            retcode = process.poll()
            if check and retcode:
>               raise CalledProcessError(retcode, process.args,
                                         output=stdout, stderr=stderr)
E               subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpq01hq2mj', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']'> returned non-zero exit status 1.

/usr/lib/python3.9/subprocess.py:528: CalledProcessError

During handling of the above exception, another exception occurred:

self = <apache_beam.examples.complete.juliaset.juliaset.juliaset_test_it.JuliaSetTestIT testMethod=test_run_example_with_setup_file>

    def test_run_example_with_setup_file(self):
      pipeline = TestPipeline(is_integration_test=True)
      coordinate_output = FileSystems.join(
          pipeline.get_option('output'),
          'juliaset-{}'.format(str(uuid.uuid4())),
          'coordinates.txt')
      extra_args = {
          'coordinate_output': coordinate_output,
          'grid_size': self.GRID_SIZE,
          'setup_file': os.path.normpath(
              os.path.join(os.path.dirname(__file__), '..', 'setup.py')),
          'on_success_matcher': all_of(PipelineStateMatcher(PipelineState.DONE)),
      }
      args = pipeline.get_full_options_as_args(**extra_args)
    
>     juliaset.run(args)

apache_beam/examples/complete/juliaset/juliaset/juliaset_test_it.py:56: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
apache_beam/examples/complete/juliaset/juliaset/juliaset.py:116: in run
    (
apache_beam/pipeline.py:607: in __exit__
    self.result = self.run()
apache_beam/pipeline.py:557: in run
    return Pipeline.from_runner_api(
apache_beam/pipeline.py:584: in run
    return self.runner.run_pipeline(self, self._options)
apache_beam/runners/dataflow/test_dataflow_runner.py:53: in run_pipeline
    self.result = super().run_pipeline(pipeline, options)
apache_beam/runners/dataflow/dataflow_runner.py:393: in run_pipeline
    artifacts = environments.python_sdk_dependencies(options)
apache_beam/transforms/environments.py:846: in python_sdk_dependencies
    return stager.Stager.create_job_resources(
apache_beam/runners/portability/stager.py:281: in create_job_resources
    tarball_file = Stager._build_setup_package(
apache_beam/runners/portability/stager.py:796: in _build_setup_package
    processes.check_output(build_setup_args)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

args = (['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpq01hq2mj', ...],)
kwargs = {}

    def check_output(*args, **kwargs):
      if force_shell:
        kwargs['shell'] = True
      try:
        out = subprocess.check_output(*args, **kwargs)
      except OSError:
        raise RuntimeError("Executable {} not found".format(args[0]))
      except subprocess.CalledProcessError as error:
        if isinstance(args, tuple) and (args[0][2] == "pip"):
          raise RuntimeError( \
            "Full traceback: {} \n Pip install failed for package: {} \
            \n Output from execution of subprocess: {}" \
            .format(traceback.format_exc(), args[0][6], error.output))
        else:
>         raise RuntimeError("Full trace: {}, \
             output of the failed child process {} "\
            .format(traceback.format_exc(), error.output))
E         RuntimeError: Full trace: Traceback (most recent call last):
E           File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/utils/processes.py",> line 89, in check_output
E             out = subprocess.check_output(*args, **kwargs)
E           File "/usr/lib/python3.9/subprocess.py", line 424, in check_output
E             return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
E           File "/usr/lib/python3.9/subprocess.py", line 528, in run
E             raise CalledProcessError(retcode, process.args,
E         subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpq01hq2mj', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']'> returned non-zero exit status 1.
E         ,            output of the failed child process b''

apache_beam/utils/processes.py:99: RuntimeError
------------------------------ Captured log call -------------------------------
INFO     apache_beam.runners.portability.stager:stager.py:762 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmpnaiukuqg/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', '--implementation', 'cp', '--abi', 'cp39', '--platform', 'manylinux2014_x86_64']
INFO     apache_beam.runners.portability.stager:stager.py:795 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpq01hq2mj', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']>
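
The captured log shows the two commands the stager runs: a pip download of the cached requirements, then "python -m build --sdist" against the example's package directory, which is the step that exits with status 1. Because the wrapper only pipes the child's stdout, the RuntimeError above carries an empty output (b''). A minimal sketch for reproducing the sdist step by hand with stderr captured, assuming the 'build' package is installed in the active virtualenv and using a placeholder package path:

    import subprocess
    import sys

    cmd = [
        sys.executable, '-m', 'build', '--sdist',
        '--outdir', '/tmp/juliaset-sdist',
        'sdks/python/apache_beam/examples/complete/juliaset',  # placeholder path
    ]
    # capture_output=True keeps stderr, which the CI log above lost.
    result = subprocess.run(cmd, capture_output=True, text=True)
    print(result.returncode)
    print(result.stdout)
    print(result.stderr)
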
=============================== warnings summary ===============================
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_it
apache_beam/examples/complete/game/game_stats_it_test.py::GameStatsIT::test_game_stats_it
apache_beam/examples/complete/game/leader_board_it_test.py::LeaderBoardIT::test_leader_board_it
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_file_loads
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_streaming_inserts
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    dataset_ref = client.dataset(unique_dataset_name, project=project)

apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)
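
The PendingDeprecationWarning above already names the replacement for Client.dataset(): either a fully qualified "project.dataset" string or a DatasetReference object. A minimal sketch of both forms, assuming google-cloud-bigquery is installed and using placeholder project, dataset, and table names:

    from google.cloud import bigquery

    client = bigquery.Client(project='my-project')
    dataset_ref = bigquery.DatasetReference('my-project', 'my_dataset')
    table_ref = dataset_ref.table('my_table')              # replaces client.dataset(...).table(...)
    dataset = client.get_dataset('my-project.my_dataset')  # string form is also accepted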

apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:47: FutureWarning: The default value of numeric_only in DataFrame.mean is deprecated. In a future version, it will default to False. In addition, specifying 'numeric_only=None' is deprecated. Select only valid columns or specify the value of numeric_only to silence this warning.
    return airline_df[at_top_airports].mean()
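
The FutureWarning concerns the implicit numeric_only default in DataFrame.mean; passing it explicitly keeps the current behaviour and silences the warning. A self-contained pandas sketch with made-up columns:

    import pandas as pd

    df = pd.DataFrame({
        'airline': ['AA', 'UA'],          # non-numeric column
        'departure_delay': [5.0, 12.0],
        'arrival_delay': [3.0, 9.0],
    })
    # Average only the numeric columns, as the warning recommends.
    print(df.mean(numeric_only=True))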

apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_native_source
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:170: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:706: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
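
The BeamDeprecationWarning points at ReadFromBigQuery as the replacement for BigQuerySource. A minimal sketch of the replacement transform, assuming a GCP project and a temporary GCS location are configured in the pipeline options; the query is a placeholder:

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (
            p
            | 'ReadFromBQ' >> beam.io.ReadFromBigQuery(
                query='SELECT 1 AS x',  # placeholder query
                use_standard_sql=True)
            | 'Print' >> beam.Map(print))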

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/pytest_postCommitIT-df-py39.xml> -
=========================== short test summary info ============================
FAILED apache_beam/examples/complete/juliaset/juliaset/juliaset_test_it.py::JuliaSetTestIT::test_run_example_with_setup_file - RuntimeError: Full trace: Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/utils/processes.py",> line 89, in check_output
    out = subprocess.check_output(*args, **kwargs)
  File "/usr/lib/python3.9/subprocess.py", line 424, in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
  File "/usr/lib/python3.9/subprocess.py", line 528, in run
    raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpq01hq2mj', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']'> returned non-zero exit status 1.
,            output of the failed child process b''
====== 1 failed, 87 passed, 50 skipped, 9 warnings in 6650.89s (1:50:50) =======

> Task :sdks:python:test-suites:dataflow:py39:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 139

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py39:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.

Deprecated Gradle features were used in this build, making it incompatible with Gradle 9.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

For more on this, please refer to https://docs.gradle.org/8.3/userguide/command_line_interface.html#sec:command_line_warnings in the Gradle documentation.

BUILD FAILED in 1h 57m 15s
219 actionable tasks: 155 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://ge.apache.org/s/lh4fpzfgyvb3m

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python39 #2431

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python39/2431/display/redirect>

Changes:


------------------------------------------
[...truncated 12.27 MB...]
[gw1] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_iobase_source 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source 
[gw1] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries 
[gw1] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadInteractiveRunnerTests::test_read_in_interactive_runner 
[gw1] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadInteractiveRunnerTests::test_read_in_interactive_runner 

=================================== FAILURES ===================================
_______________ JuliaSetTestIT.test_run_example_with_setup_file ________________
[gw1] linux -- Python 3.9.10 <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9>

args = (['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmp0yelmv2f', ...],)
kwargs = {}

    def check_output(*args, **kwargs):
      if force_shell:
        kwargs['shell'] = True
      try:
>       out = subprocess.check_output(*args, **kwargs)

apache_beam/utils/processes.py:89: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.9/subprocess.py:424: in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

input = None, capture_output = False, timeout = None, check = True
popenargs = (['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmp0yelmv2f', ...],)
kwargs = {'stdout': -1}
process = <Popen: returncode: 1 args: ['/home/jenkins/jenkins-slave/workspace/beam_Pos...>
stdout = b'', stderr = None, retcode = 1

    def run(*popenargs,
            input=None, capture_output=False, timeout=None, check=False, **kwargs):
        """Run command with arguments and return a CompletedProcess instance.
    
        The returned instance will have attributes args, returncode, stdout and
        stderr. By default, stdout and stderr are not captured, and those attributes
        will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them.
    
        If check is True and the exit code was non-zero, it raises a
        CalledProcessError. The CalledProcessError object will have the return code
        in the returncode attribute, and output & stderr attributes if those streams
        were captured.
    
        If timeout is given, and the process takes too long, a TimeoutExpired
        exception will be raised.
    
        There is an optional argument "input", allowing you to
        pass bytes or a string to the subprocess's stdin.  If you use this argument
        you may not also use the Popen constructor's "stdin" argument, as
        it will be used internally.
    
        By default, all communication is in bytes, and therefore any "input" should
        be bytes, and the stdout and stderr will be bytes. If in text mode, any
        "input" should be a string, and stdout and stderr will be strings decoded
        according to locale encoding, or by "encoding" if set. Text mode is
        triggered by setting any of text, encoding, errors or universal_newlines.
    
        The other arguments are the same as for the Popen constructor.
        """
        if input is not None:
            if kwargs.get('stdin') is not None:
                raise ValueError('stdin and input arguments may not both be used.')
            kwargs['stdin'] = PIPE
    
        if capture_output:
            if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
                raise ValueError('stdout and stderr arguments may not be used '
                                 'with capture_output.')
            kwargs['stdout'] = PIPE
            kwargs['stderr'] = PIPE
    
        with Popen(*popenargs, **kwargs) as process:
            try:
                stdout, stderr = process.communicate(input, timeout=timeout)
            except TimeoutExpired as exc:
                process.kill()
                if _mswindows:
                    # Windows accumulates the output in a single blocking
                    # read() call run on child threads, with the timeout
                    # being done in a join() on those threads.  communicate()
                    # _after_ kill() is required to collect that and add it
                    # to the exception.
                    exc.stdout, exc.stderr = process.communicate()
                else:
                    # POSIX _communicate already populated the output so
                    # far into the TimeoutExpired exception.
                    process.wait()
                raise
            except:  # Including KeyboardInterrupt, communicate handled that.
                process.kill()
                # We don't call process.wait() as .__exit__ does that for us.
                raise
            retcode = process.poll()
            if check and retcode:
>               raise CalledProcessError(retcode, process.args,
                                         output=stdout, stderr=stderr)
E               subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmp0yelmv2f', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']'> returned non-zero exit status 1.

/usr/lib/python3.9/subprocess.py:528: CalledProcessError
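
As the subprocess.run docstring above notes, check=True converts a non-zero exit status into CalledProcessError, and stdout/stderr are attached to the exception only if those streams were captured. A minimal stdlib-only illustration (POSIX, since it relies on the 'false' command):

    import subprocess

    try:
        # capture_output=True attaches stdout and stderr to the exception,
        # unlike the stdout=PIPE-only call shown in the log above.
        subprocess.run(['false'], check=True, capture_output=True, text=True)
    except subprocess.CalledProcessError as err:
        print(err.returncode)    # 1
        print(repr(err.stdout))  # ''
        print(repr(err.stderr))  # ''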

During handling of the above exception, another exception occurred:

self = <apache_beam.examples.complete.juliaset.juliaset.juliaset_test_it.JuliaSetTestIT testMethod=test_run_example_with_setup_file>

    def test_run_example_with_setup_file(self):
      pipeline = TestPipeline(is_integration_test=True)
      coordinate_output = FileSystems.join(
          pipeline.get_option('output'),
          'juliaset-{}'.format(str(uuid.uuid4())),
          'coordinates.txt')
      extra_args = {
          'coordinate_output': coordinate_output,
          'grid_size': self.GRID_SIZE,
          'setup_file': os.path.normpath(
              os.path.join(os.path.dirname(__file__), '..', 'setup.py')),
          'on_success_matcher': all_of(PipelineStateMatcher(PipelineState.DONE)),
      }
      args = pipeline.get_full_options_as_args(**extra_args)
    
>     juliaset.run(args)

apache_beam/examples/complete/juliaset/juliaset/juliaset_test_it.py:56: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
apache_beam/examples/complete/juliaset/juliaset/juliaset.py:116: in run
    (
apache_beam/pipeline.py:607: in __exit__
    self.result = self.run()
apache_beam/pipeline.py:557: in run
    return Pipeline.from_runner_api(
apache_beam/pipeline.py:584: in run
    return self.runner.run_pipeline(self, self._options)
apache_beam/runners/dataflow/test_dataflow_runner.py:53: in run_pipeline
    self.result = super().run_pipeline(pipeline, options)
apache_beam/runners/dataflow/dataflow_runner.py:393: in run_pipeline
    artifacts = environments.python_sdk_dependencies(options)
apache_beam/transforms/environments.py:846: in python_sdk_dependencies
    return stager.Stager.create_job_resources(
apache_beam/runners/portability/stager.py:281: in create_job_resources
    tarball_file = Stager._build_setup_package(
apache_beam/runners/portability/stager.py:796: in _build_setup_package
    processes.check_output(build_setup_args)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

args = (['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmp0yelmv2f', ...],)
kwargs = {}

    def check_output(*args, **kwargs):
      if force_shell:
        kwargs['shell'] = True
      try:
        out = subprocess.check_output(*args, **kwargs)
      except OSError:
        raise RuntimeError("Executable {} not found".format(args[0]))
      except subprocess.CalledProcessError as error:
        if isinstance(args, tuple) and (args[0][2] == "pip"):
          raise RuntimeError( \
            "Full traceback: {} \n Pip install failed for package: {} \
            \n Output from execution of subprocess: {}" \
            .format(traceback.format_exc(), args[0][6], error.output))
        else:
>         raise RuntimeError("Full trace: {}, \
             output of the failed child process {} "\
            .format(traceback.format_exc(), error.output))
E         RuntimeError: Full trace: Traceback (most recent call last):
E           File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/utils/processes.py",> line 89, in check_output
E             out = subprocess.check_output(*args, **kwargs)
E           File "/usr/lib/python3.9/subprocess.py", line 424, in check_output
E             return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
E           File "/usr/lib/python3.9/subprocess.py", line 528, in run
E             raise CalledProcessError(retcode, process.args,
E         subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmp0yelmv2f', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']'> returned non-zero exit status 1.
E         ,            output of the failed child process b''

apache_beam/utils/processes.py:99: RuntimeError
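
The RuntimeError embeds error.output, which is b'' here because only the child's stdout was piped; anything the build tool wrote to stderr was not captured. One way a caller could surface that output through subprocess.check_output is to merge stderr into stdout; a sketch with a placeholder command:

    import subprocess
    import sys

    cmd = [sys.executable, '-m', 'build', '--sdist', '--outdir', '/tmp/out', '.']  # placeholder
    try:
        out = subprocess.check_output(cmd, stderr=subprocess.STDOUT)
    except subprocess.CalledProcessError as error:
        # error.output now includes the build tool's error messages as well.
        print(error.returncode, error.output.decode(errors='replace'))
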
------------------------------ Captured log call -------------------------------
INFO     apache_beam.runners.portability.stager:stager.py:762 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmpedffdd1s/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', '--implementation', 'cp', '--abi', 'cp39', '--platform', 'manylinux2014_x86_64']
INFO     apache_beam.runners.portability.stager:stager.py:795 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmp0yelmv2f', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']>
=============================== warnings summary ===============================
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_it
apache_beam/examples/complete/game/game_stats_it_test.py::GameStatsIT::test_game_stats_it
apache_beam/examples/complete/game/leader_board_it_test.py::LeaderBoardIT::test_leader_board_it
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_file_loads
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_streaming_inserts
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    dataset_ref = client.dataset(unique_dataset_name, project=project)

apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:47: FutureWarning: The default value of numeric_only in DataFrame.mean is deprecated. In a future version, it will default to False. In addition, specifying 'numeric_only=None' is deprecated. Select only valid columns or specify the value of numeric_only to silence this warning.
    return airline_df[at_top_airports].mean()

apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_native_source
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:170: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:706: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/pytest_postCommitIT-df-py39.xml> -
=========================== short test summary info ============================
FAILED apache_beam/examples/complete/juliaset/juliaset/juliaset_test_it.py::JuliaSetTestIT::test_run_example_with_setup_file - RuntimeError: Full trace: Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/utils/processes.py",> line 89, in check_output
    out = subprocess.check_output(*args, **kwargs)
  File "/usr/lib/python3.9/subprocess.py", line 424, in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
  File "/usr/lib/python3.9/subprocess.py", line 528, in run
    raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmp0yelmv2f', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']'> returned non-zero exit status 1.
,            output of the failed child process b''
====== 1 failed, 87 passed, 50 skipped, 9 warnings in 6652.36s (1:50:52) =======

> Task :sdks:python:test-suites:dataflow:py39:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 139

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py39:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.

Deprecated Gradle features were used in this build, making it incompatible with Gradle 9.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

For more on this, please refer to https://docs.gradle.org/8.3/userguide/command_line_interface.html#sec:command_line_warnings in the Gradle documentation.

BUILD FAILED in 1h 57m 16s
219 actionable tasks: 155 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://ge.apache.org/s/3ilegjmwud3og

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python39 #2430

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python39/2430/display/redirect>

Changes:


------------------------------------------
[...truncated 11.94 MB...]
[gw5] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_iobase_source 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source 
[gw5] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries 
[gw5] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadInteractiveRunnerTests::test_read_in_interactive_runner 
[gw5] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadInteractiveRunnerTests::test_read_in_interactive_runner 

=================================== FAILURES ===================================
_______________ JuliaSetTestIT.test_run_example_with_setup_file ________________
[gw5] linux -- Python 3.9.10 <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9>

args = (['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpna7kx62a', ...],)
kwargs = {}

    def check_output(*args, **kwargs):
      if force_shell:
        kwargs['shell'] = True
      try:
>       out = subprocess.check_output(*args, **kwargs)

apache_beam/utils/processes.py:89: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.9/subprocess.py:424: in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

input = None, capture_output = False, timeout = None, check = True
popenargs = (['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpna7kx62a', ...],)
kwargs = {'stdout': -1}
process = <Popen: returncode: 1 args: ['/home/jenkins/jenkins-slave/workspace/beam_Pos...>
stdout = b'', stderr = None, retcode = 1

    def run(*popenargs,
            input=None, capture_output=False, timeout=None, check=False, **kwargs):
        """Run command with arguments and return a CompletedProcess instance.
    
        The returned instance will have attributes args, returncode, stdout and
        stderr. By default, stdout and stderr are not captured, and those attributes
        will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them.
    
        If check is True and the exit code was non-zero, it raises a
        CalledProcessError. The CalledProcessError object will have the return code
        in the returncode attribute, and output & stderr attributes if those streams
        were captured.
    
        If timeout is given, and the process takes too long, a TimeoutExpired
        exception will be raised.
    
        There is an optional argument "input", allowing you to
        pass bytes or a string to the subprocess's stdin.  If you use this argument
        you may not also use the Popen constructor's "stdin" argument, as
        it will be used internally.
    
        By default, all communication is in bytes, and therefore any "input" should
        be bytes, and the stdout and stderr will be bytes. If in text mode, any
        "input" should be a string, and stdout and stderr will be strings decoded
        according to locale encoding, or by "encoding" if set. Text mode is
        triggered by setting any of text, encoding, errors or universal_newlines.
    
        The other arguments are the same as for the Popen constructor.
        """
        if input is not None:
            if kwargs.get('stdin') is not None:
                raise ValueError('stdin and input arguments may not both be used.')
            kwargs['stdin'] = PIPE
    
        if capture_output:
            if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
                raise ValueError('stdout and stderr arguments may not be used '
                                 'with capture_output.')
            kwargs['stdout'] = PIPE
            kwargs['stderr'] = PIPE
    
        with Popen(*popenargs, **kwargs) as process:
            try:
                stdout, stderr = process.communicate(input, timeout=timeout)
            except TimeoutExpired as exc:
                process.kill()
                if _mswindows:
                    # Windows accumulates the output in a single blocking
                    # read() call run on child threads, with the timeout
                    # being done in a join() on those threads.  communicate()
                    # _after_ kill() is required to collect that and add it
                    # to the exception.
                    exc.stdout, exc.stderr = process.communicate()
                else:
                    # POSIX _communicate already populated the output so
                    # far into the TimeoutExpired exception.
                    process.wait()
                raise
            except:  # Including KeyboardInterrupt, communicate handled that.
                process.kill()
                # We don't call process.wait() as .__exit__ does that for us.
                raise
            retcode = process.poll()
            if check and retcode:
>               raise CalledProcessError(retcode, process.args,
                                         output=stdout, stderr=stderr)
E               subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpna7kx62a', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']'> returned non-zero exit status 1.

/usr/lib/python3.9/subprocess.py:528: CalledProcessError

During handling of the above exception, another exception occurred:

self = <apache_beam.examples.complete.juliaset.juliaset.juliaset_test_it.JuliaSetTestIT testMethod=test_run_example_with_setup_file>

    def test_run_example_with_setup_file(self):
      pipeline = TestPipeline(is_integration_test=True)
      coordinate_output = FileSystems.join(
          pipeline.get_option('output'),
          'juliaset-{}'.format(str(uuid.uuid4())),
          'coordinates.txt')
      extra_args = {
          'coordinate_output': coordinate_output,
          'grid_size': self.GRID_SIZE,
          'setup_file': os.path.normpath(
              os.path.join(os.path.dirname(__file__), '..', 'setup.py')),
          'on_success_matcher': all_of(PipelineStateMatcher(PipelineState.DONE)),
      }
      args = pipeline.get_full_options_as_args(**extra_args)
    
>     juliaset.run(args)

apache_beam/examples/complete/juliaset/juliaset/juliaset_test_it.py:56: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
apache_beam/examples/complete/juliaset/juliaset/juliaset.py:116: in run
    (
apache_beam/pipeline.py:607: in __exit__
    self.result = self.run()
apache_beam/pipeline.py:557: in run
    return Pipeline.from_runner_api(
apache_beam/pipeline.py:584: in run
    return self.runner.run_pipeline(self, self._options)
apache_beam/runners/dataflow/test_dataflow_runner.py:53: in run_pipeline
    self.result = super().run_pipeline(pipeline, options)
apache_beam/runners/dataflow/dataflow_runner.py:393: in run_pipeline
    artifacts = environments.python_sdk_dependencies(options)
apache_beam/transforms/environments.py:846: in python_sdk_dependencies
    return stager.Stager.create_job_resources(
apache_beam/runners/portability/stager.py:281: in create_job_resources
    tarball_file = Stager._build_setup_package(
apache_beam/runners/portability/stager.py:796: in _build_setup_package
    processes.check_output(build_setup_args)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

args = (['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpna7kx62a', ...],)
kwargs = {}

    def check_output(*args, **kwargs):
      if force_shell:
        kwargs['shell'] = True
      try:
        out = subprocess.check_output(*args, **kwargs)
      except OSError:
        raise RuntimeError("Executable {} not found".format(args[0]))
      except subprocess.CalledProcessError as error:
        if isinstance(args, tuple) and (args[0][2] == "pip"):
          raise RuntimeError( \
            "Full traceback: {} \n Pip install failed for package: {} \
            \n Output from execution of subprocess: {}" \
            .format(traceback.format_exc(), args[0][6], error.output))
        else:
>         raise RuntimeError("Full trace: {}, \
             output of the failed child process {} "\
            .format(traceback.format_exc(), error.output))
E         RuntimeError: Full trace: Traceback (most recent call last):
E           File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/utils/processes.py",> line 89, in check_output
E             out = subprocess.check_output(*args, **kwargs)
E           File "/usr/lib/python3.9/subprocess.py", line 424, in check_output
E             return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
E           File "/usr/lib/python3.9/subprocess.py", line 528, in run
E             raise CalledProcessError(retcode, process.args,
E         subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpna7kx62a', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']'> returned non-zero exit status 1.
E         ,            output of the failed child process b''

apache_beam/utils/processes.py:99: RuntimeError
------------------------------ Captured log call -------------------------------
INFO     apache_beam.runners.portability.stager:stager.py:762 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmpngwrip9k/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', '--implementation', 'cp', '--abi', 'cp39', '--platform', 'manylinux2014_x86_64']
INFO     apache_beam.runners.portability.stager:stager.py:795 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpna7kx62a', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']>
=============================== warnings summary ===============================
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_it
apache_beam/examples/complete/game/game_stats_it_test.py::GameStatsIT::test_game_stats_it
apache_beam/examples/complete/game/leader_board_it_test.py::LeaderBoardIT::test_leader_board_it
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_file_loads
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_streaming_inserts
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    dataset_ref = client.dataset(unique_dataset_name, project=project)

apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:47: FutureWarning: The default value of numeric_only in DataFrame.mean is deprecated. In a future version, it will default to False. In addition, specifying 'numeric_only=None' is deprecated. Select only valid columns or specify the value of numeric_only to silence this warning.
    return airline_df[at_top_airports].mean()

apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_native_source
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:170: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:706: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/pytest_postCommitIT-df-py39.xml> -
=========================== short test summary info ============================
FAILED apache_beam/examples/complete/juliaset/juliaset/juliaset_test_it.py::JuliaSetTestIT::test_run_example_with_setup_file - RuntimeError: Full trace: Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/utils/processes.py",> line 89, in check_output
    out = subprocess.check_output(*args, **kwargs)
  File "/usr/lib/python3.9/subprocess.py", line 424, in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
  File "/usr/lib/python3.9/subprocess.py", line 528, in run
    raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpna7kx62a', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']'> returned non-zero exit status 1.
,            output of the failed child process b''
====== 1 failed, 87 passed, 50 skipped, 9 warnings in 6697.86s (1:51:37) =======

> Task :sdks:python:test-suites:dataflow:py39:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 139

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py39:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.

Deprecated Gradle features were used in this build, making it incompatible with Gradle 9.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

For more on this, please refer to https://docs.gradle.org/8.3/userguide/command_line_interface.html#sec:command_line_warnings in the Gradle documentation.

BUILD FAILED in 1h 57m 40s
219 actionable tasks: 155 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://ge.apache.org/s/z2cav62qdq3ts

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python39 #2429

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python39/2429/display/redirect>

Changes:


------------------------------------------
[...truncated 11.25 MB...]
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_iobase_source 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source 
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries 
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadInteractiveRunnerTests::test_read_in_interactive_runner 
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadInteractiveRunnerTests::test_read_in_interactive_runner 

=================================== FAILURES ===================================
_______________ JuliaSetTestIT.test_run_example_with_setup_file ________________
[gw2] linux -- Python 3.9.10 <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9>

args = (['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpgblsme1w', ...],)
kwargs = {}

    def check_output(*args, **kwargs):
      if force_shell:
        kwargs['shell'] = True
      try:
>       out = subprocess.check_output(*args, **kwargs)

apache_beam/utils/processes.py:89: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.9/subprocess.py:424: in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

input = None, capture_output = False, timeout = None, check = True
popenargs = (['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpgblsme1w', ...],)
kwargs = {'stdout': -1}
process = <Popen: returncode: 1 args: ['/home/jenkins/jenkins-slave/workspace/beam_Pos...>
stdout = b'', stderr = None, retcode = 1

    def run(*popenargs,
            input=None, capture_output=False, timeout=None, check=False, **kwargs):
        """Run command with arguments and return a CompletedProcess instance.
    
        The returned instance will have attributes args, returncode, stdout and
        stderr. By default, stdout and stderr are not captured, and those attributes
        will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them.
    
        If check is True and the exit code was non-zero, it raises a
        CalledProcessError. The CalledProcessError object will have the return code
        in the returncode attribute, and output & stderr attributes if those streams
        were captured.
    
        If timeout is given, and the process takes too long, a TimeoutExpired
        exception will be raised.
    
        There is an optional argument "input", allowing you to
        pass bytes or a string to the subprocess's stdin.  If you use this argument
        you may not also use the Popen constructor's "stdin" argument, as
        it will be used internally.
    
        By default, all communication is in bytes, and therefore any "input" should
        be bytes, and the stdout and stderr will be bytes. If in text mode, any
        "input" should be a string, and stdout and stderr will be strings decoded
        according to locale encoding, or by "encoding" if set. Text mode is
        triggered by setting any of text, encoding, errors or universal_newlines.
    
        The other arguments are the same as for the Popen constructor.
        """
        if input is not None:
            if kwargs.get('stdin') is not None:
                raise ValueError('stdin and input arguments may not both be used.')
            kwargs['stdin'] = PIPE
    
        if capture_output:
            if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
                raise ValueError('stdout and stderr arguments may not be used '
                                 'with capture_output.')
            kwargs['stdout'] = PIPE
            kwargs['stderr'] = PIPE
    
        with Popen(*popenargs, **kwargs) as process:
            try:
                stdout, stderr = process.communicate(input, timeout=timeout)
            except TimeoutExpired as exc:
                process.kill()
                if _mswindows:
                    # Windows accumulates the output in a single blocking
                    # read() call run on child threads, with the timeout
                    # being done in a join() on those threads.  communicate()
                    # _after_ kill() is required to collect that and add it
                    # to the exception.
                    exc.stdout, exc.stderr = process.communicate()
                else:
                    # POSIX _communicate already populated the output so
                    # far into the TimeoutExpired exception.
                    process.wait()
                raise
            except:  # Including KeyboardInterrupt, communicate handled that.
                process.kill()
                # We don't call process.wait() as .__exit__ does that for us.
                raise
            retcode = process.poll()
            if check and retcode:
>               raise CalledProcessError(retcode, process.args,
                                         output=stdout, stderr=stderr)
E               subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpgblsme1w', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']'> returned non-zero exit status 1.

/usr/lib/python3.9/subprocess.py:528: CalledProcessError

During handling of the above exception, another exception occurred:

self = <apache_beam.examples.complete.juliaset.juliaset.juliaset_test_it.JuliaSetTestIT testMethod=test_run_example_with_setup_file>

    def test_run_example_with_setup_file(self):
      pipeline = TestPipeline(is_integration_test=True)
      coordinate_output = FileSystems.join(
          pipeline.get_option('output'),
          'juliaset-{}'.format(str(uuid.uuid4())),
          'coordinates.txt')
      extra_args = {
          'coordinate_output': coordinate_output,
          'grid_size': self.GRID_SIZE,
          'setup_file': os.path.normpath(
              os.path.join(os.path.dirname(__file__), '..', 'setup.py')),
          'on_success_matcher': all_of(PipelineStateMatcher(PipelineState.DONE)),
      }
      args = pipeline.get_full_options_as_args(**extra_args)
    
>     juliaset.run(args)

apache_beam/examples/complete/juliaset/juliaset/juliaset_test_it.py:56: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
apache_beam/examples/complete/juliaset/juliaset/juliaset.py:116: in run
    (
apache_beam/pipeline.py:607: in __exit__
    self.result = self.run()
apache_beam/pipeline.py:557: in run
    return Pipeline.from_runner_api(
apache_beam/pipeline.py:584: in run
    return self.runner.run_pipeline(self, self._options)
apache_beam/runners/dataflow/test_dataflow_runner.py:53: in run_pipeline
    self.result = super().run_pipeline(pipeline, options)
apache_beam/runners/dataflow/dataflow_runner.py:393: in run_pipeline
    artifacts = environments.python_sdk_dependencies(options)
apache_beam/transforms/environments.py:846: in python_sdk_dependencies
    return stager.Stager.create_job_resources(
apache_beam/runners/portability/stager.py:281: in create_job_resources
    tarball_file = Stager._build_setup_package(
apache_beam/runners/portability/stager.py:796: in _build_setup_package
    processes.check_output(build_setup_args)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

args = (['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpgblsme1w', ...],)
kwargs = {}

    def check_output(*args, **kwargs):
      if force_shell:
        kwargs['shell'] = True
      try:
        out = subprocess.check_output(*args, **kwargs)
      except OSError:
        raise RuntimeError("Executable {} not found".format(args[0]))
      except subprocess.CalledProcessError as error:
        if isinstance(args, tuple) and (args[0][2] == "pip"):
          raise RuntimeError( \
            "Full traceback: {} \n Pip install failed for package: {} \
            \n Output from execution of subprocess: {}" \
            .format(traceback.format_exc(), args[0][6], error.output))
        else:
>         raise RuntimeError("Full trace: {}, \
             output of the failed child process {} "\
            .format(traceback.format_exc(), error.output))
E         RuntimeError: Full trace: Traceback (most recent call last):
E           File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/utils/processes.py",> line 89, in check_output
E             out = subprocess.check_output(*args, **kwargs)
E           File "/usr/lib/python3.9/subprocess.py", line 424, in check_output
E             return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
E           File "/usr/lib/python3.9/subprocess.py", line 528, in run
E             raise CalledProcessError(retcode, process.args,
E         subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpgblsme1w', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']'> returned non-zero exit status 1.
E         ,            output of the failed child process b''

apache_beam/utils/processes.py:99: RuntimeError
------------------------------ Captured log call -------------------------------
INFO     apache_beam.runners.portability.stager:stager.py:762 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmpb_mfm_ch/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', '--implementation', 'cp', '--abi', 'cp39', '--platform', 'manylinux2014_x86_64']
INFO     apache_beam.runners.portability.stager:stager.py:795 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpgblsme1w', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']>
=============================== warnings summary ===============================
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_it
apache_beam/examples/complete/game/game_stats_it_test.py::GameStatsIT::test_game_stats_it
apache_beam/examples/complete/game/leader_board_it_test.py::LeaderBoardIT::test_leader_board_it
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_file_loads
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_streaming_inserts
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    dataset_ref = client.dataset(unique_dataset_name, project=project)

apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:47: FutureWarning: The default value of numeric_only in DataFrame.mean is deprecated. In a future version, it will default to False. In addition, specifying 'numeric_only=None' is deprecated. Select only valid columns or specify the value of numeric_only to silence this warning.
    return airline_df[at_top_airports].mean()

apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_native_source
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:170: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:706: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/pytest_postCommitIT-df-py39.xml> -
=========================== short test summary info ============================
FAILED apache_beam/examples/complete/juliaset/juliaset/juliaset_test_it.py::JuliaSetTestIT::test_run_example_with_setup_file - RuntimeError: Full trace: Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/utils/processes.py",> line 89, in check_output
    out = subprocess.check_output(*args, **kwargs)
  File "/usr/lib/python3.9/subprocess.py", line 424, in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
  File "/usr/lib/python3.9/subprocess.py", line 528, in run
    raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpgblsme1w', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']'> returned non-zero exit status 1.
,            output of the failed child process b''
====== 1 failed, 87 passed, 50 skipped, 9 warnings in 7141.48s (1:59:01) =======

> Task :sdks:python:test-suites:dataflow:py39:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 139

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py39:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.

Deprecated Gradle features were used in this build, making it incompatible with Gradle 9.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

For more on this, please refer to https://docs.gradle.org/8.3/userguide/command_line_interface.html#sec:command_line_warnings in the Gradle documentation.

BUILD FAILED in 2h 5m 38s
219 actionable tasks: 155 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://ge.apache.org/s/n5aoe6tptqmdc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python39 #2428

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python39/2428/display/redirect?page=changes>

Changes:

[noreply] [RRIO] Stub the RequestResponseIO transform (#28950)


------------------------------------------
[...truncated 12.06 MB...]
      try:
>       out = subprocess.check_output(*args, **kwargs)

apache_beam/utils/processes.py:89: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.9/subprocess.py:424: in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

input = None, capture_output = False, timeout = None, check = True
popenargs = (['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpprp8na5y', ...],)
kwargs = {'stdout': -1}
process = <Popen: returncode: 1 args: ['/home/jenkins/jenkins-slave/workspace/beam_Pos...>
stdout = b'', stderr = None, retcode = 1

    def run(*popenargs,
            input=None, capture_output=False, timeout=None, check=False, **kwargs):
        """Run command with arguments and return a CompletedProcess instance.
    
        The returned instance will have attributes args, returncode, stdout and
        stderr. By default, stdout and stderr are not captured, and those attributes
        will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them.
    
        If check is True and the exit code was non-zero, it raises a
        CalledProcessError. The CalledProcessError object will have the return code
        in the returncode attribute, and output & stderr attributes if those streams
        were captured.
    
        If timeout is given, and the process takes too long, a TimeoutExpired
        exception will be raised.
    
        There is an optional argument "input", allowing you to
        pass bytes or a string to the subprocess's stdin.  If you use this argument
        you may not also use the Popen constructor's "stdin" argument, as
        it will be used internally.
    
        By default, all communication is in bytes, and therefore any "input" should
        be bytes, and the stdout and stderr will be bytes. If in text mode, any
        "input" should be a string, and stdout and stderr will be strings decoded
        according to locale encoding, or by "encoding" if set. Text mode is
        triggered by setting any of text, encoding, errors or universal_newlines.
    
        The other arguments are the same as for the Popen constructor.
        """
        if input is not None:
            if kwargs.get('stdin') is not None:
                raise ValueError('stdin and input arguments may not both be used.')
            kwargs['stdin'] = PIPE
    
        if capture_output:
            if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
                raise ValueError('stdout and stderr arguments may not be used '
                                 'with capture_output.')
            kwargs['stdout'] = PIPE
            kwargs['stderr'] = PIPE
    
        with Popen(*popenargs, **kwargs) as process:
            try:
                stdout, stderr = process.communicate(input, timeout=timeout)
            except TimeoutExpired as exc:
                process.kill()
                if _mswindows:
                    # Windows accumulates the output in a single blocking
                    # read() call run on child threads, with the timeout
                    # being done in a join() on those threads.  communicate()
                    # _after_ kill() is required to collect that and add it
                    # to the exception.
                    exc.stdout, exc.stderr = process.communicate()
                else:
                    # POSIX _communicate already populated the output so
                    # far into the TimeoutExpired exception.
                    process.wait()
                raise
            except:  # Including KeyboardInterrupt, communicate handled that.
                process.kill()
                # We don't call process.wait() as .__exit__ does that for us.
                raise
            retcode = process.poll()
            if check and retcode:
>               raise CalledProcessError(retcode, process.args,
                                         output=stdout, stderr=stderr)
E               subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpprp8na5y', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']'> returned non-zero exit status 1.

/usr/lib/python3.9/subprocess.py:528: CalledProcessError

During handling of the above exception, another exception occurred:

self = <apache_beam.examples.complete.juliaset.juliaset.juliaset_test_it.JuliaSetTestIT testMethod=test_run_example_with_setup_file>

    def test_run_example_with_setup_file(self):
      pipeline = TestPipeline(is_integration_test=True)
      coordinate_output = FileSystems.join(
          pipeline.get_option('output'),
          'juliaset-{}'.format(str(uuid.uuid4())),
          'coordinates.txt')
      extra_args = {
          'coordinate_output': coordinate_output,
          'grid_size': self.GRID_SIZE,
          'setup_file': os.path.normpath(
              os.path.join(os.path.dirname(__file__), '..', 'setup.py')),
          'on_success_matcher': all_of(PipelineStateMatcher(PipelineState.DONE)),
      }
      args = pipeline.get_full_options_as_args(**extra_args)
    
>     juliaset.run(args)

apache_beam/examples/complete/juliaset/juliaset/juliaset_test_it.py:56: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
apache_beam/examples/complete/juliaset/juliaset/juliaset.py:116: in run
    (
apache_beam/pipeline.py:607: in __exit__
    self.result = self.run()
apache_beam/pipeline.py:557: in run
    return Pipeline.from_runner_api(
apache_beam/pipeline.py:584: in run
    return self.runner.run_pipeline(self, self._options)
apache_beam/runners/dataflow/test_dataflow_runner.py:53: in run_pipeline
    self.result = super().run_pipeline(pipeline, options)
apache_beam/runners/dataflow/dataflow_runner.py:393: in run_pipeline
    artifacts = environments.python_sdk_dependencies(options)
apache_beam/transforms/environments.py:846: in python_sdk_dependencies
    return stager.Stager.create_job_resources(
apache_beam/runners/portability/stager.py:281: in create_job_resources
    tarball_file = Stager._build_setup_package(
apache_beam/runners/portability/stager.py:796: in _build_setup_package
    processes.check_output(build_setup_args)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

args = (['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpprp8na5y', ...],)
kwargs = {}

    def check_output(*args, **kwargs):
      if force_shell:
        kwargs['shell'] = True
      try:
        out = subprocess.check_output(*args, **kwargs)
      except OSError:
        raise RuntimeError("Executable {} not found".format(args[0]))
      except subprocess.CalledProcessError as error:
        if isinstance(args, tuple) and (args[0][2] == "pip"):
          raise RuntimeError( \
            "Full traceback: {} \n Pip install failed for package: {} \
            \n Output from execution of subprocess: {}" \
            .format(traceback.format_exc(), args[0][6], error.output))
        else:
>         raise RuntimeError("Full trace: {}, \
             output of the failed child process {} "\
            .format(traceback.format_exc(), error.output))
E         RuntimeError: Full trace: Traceback (most recent call last):
E           File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/utils/processes.py",> line 89, in check_output
E             out = subprocess.check_output(*args, **kwargs)
E           File "/usr/lib/python3.9/subprocess.py", line 424, in check_output
E             return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
E           File "/usr/lib/python3.9/subprocess.py", line 528, in run
E             raise CalledProcessError(retcode, process.args,
E         subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpprp8na5y', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']'> returned non-zero exit status 1.
E         ,            output of the failed child process b''

apache_beam/utils/processes.py:99: RuntimeError
------------------------------ Captured log call -------------------------------
INFO     apache_beam.runners.portability.stager:stager.py:762 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmpikcjh9y6/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', '--implementation', 'cp', '--abi', 'cp39', '--platform', 'manylinux2014_x86_64']
INFO     apache_beam.runners.portability.stager:stager.py:795 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpprp8na5y', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']>
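
The RuntimeError above ends with "output of the failed child process b''" because processes.check_output pipes only stdout (the captured locals show stdout=PIPE, stderr=None), so whatever 'python -m build' wrote to stderr was discarded. A minimal sketch, not the Beam implementation, of a wrapper that merges stderr into the captured output so the build backend's error message survives:

    import subprocess

    def check_output_with_stderr(cmd):
        # Merging stderr means CalledProcessError.output carries the real failure text.
        try:
            return subprocess.check_output(cmd, stderr=subprocess.STDOUT)
        except subprocess.CalledProcessError as error:
            raise RuntimeError(
                "Command {} failed with exit status {}:\n{}".format(
                    cmd, error.returncode, error.output.decode(errors="replace")))
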
=============================== warnings summary ===============================
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_it
apache_beam/examples/complete/game/game_stats_it_test.py::GameStatsIT::test_game_stats_it
apache_beam/examples/complete/game/leader_board_it_test.py::LeaderBoardIT::test_leader_board_it
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_file_loads
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_streaming_inserts
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    dataset_ref = client.dataset(unique_dataset_name, project=project)

apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:47: FutureWarning: The default value of numeric_only in DataFrame.mean is deprecated. In a future version, it will default to False. In addition, specifying 'numeric_only=None' is deprecated. Select only valid columns or specify the value of numeric_only to silence this warning.
    return airline_df[at_top_airports].mean()

apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_native_source
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:170: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:706: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/pytest_postCommitIT-df-py39.xml> -
=========================== short test summary info ============================
FAILED apache_beam/examples/complete/juliaset/juliaset/juliaset_test_it.py::JuliaSetTestIT::test_run_example_with_setup_file - RuntimeError: Full trace: Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/utils/processes.py",> line 89, in check_output
    out = subprocess.check_output(*args, **kwargs)
  File "/usr/lib/python3.9/subprocess.py", line 424, in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
  File "/usr/lib/python3.9/subprocess.py", line 528, in run
    raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpprp8na5y', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']'> returned non-zero exit status 1.
,            output of the failed child process b''
====== 1 failed, 87 passed, 50 skipped, 9 warnings in 6897.64s (1:54:57) =======

> Task :sdks:python:test-suites:dataflow:py39:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/test-suites/direct/common.gradle'> line: 52

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py39:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 139

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py39:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.
==============================================================================

Deprecated Gradle features were used in this build, making it incompatible with Gradle 9.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

For more on this, please refer to https://docs.gradle.org/8.3/userguide/command_line_interface.html#sec:command_line_warnings in the Gradle documentation.

BUILD FAILED in 2h 1m 4s
219 actionable tasks: 155 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://ge.apache.org/s/bw4zm6bkbm7y4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python39 #2427

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python39/2427/display/redirect>

Changes:


------------------------------------------
[...truncated 9.37 MB...]
thread: "Thread-13"

INFO:root:severity: INFO
timestamp {
  seconds: 1697244015
  nanos: 219029664
}
message: "No more requests from control plane"
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/sdk_worker.py:274"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1697244015
  nanos: 219252824
}
message: "SDK Harness waiting for in-flight requests to complete"
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/sdk_worker.py:275"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1697244015
  nanos: 219461441
}
message: "Closing all cached grpc data channels."
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/data_plane.py:803"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1697244015
  nanos: 219605445
}
message: "Closing all cached gRPC state handlers."
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/sdk_worker.py:904"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1697244015
  nanos: 219964742
}
message: "Done consuming work."
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/sdk_worker.py:287"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1697244015
  nanos: 220164060
}
message: "Python sdk harness exiting."
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/sdk_worker_main.py:211"
thread: "MainThread"

b6420caf57d331b376d391c827888e752fa6308422b3e4330611dfd121db6911
INFO:apache_beam.runners.portability.local_job_service:Completed job in 20.98348069190979 seconds with state DONE.
INFO:root:Completed job in 20.98348069190979 seconds with state DONE.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py39:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:45617
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7f60fb14f9d0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7f60fb14fa60> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7f60fb1501f0> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/runners/spark/3/job-server/build/libs/beam-runners-spark-3-job-server-2.52.0-SNAPSHOT.jar'> '--spark-master-url' 'local[4]' '--artifacts-dir' '/tmp/beam-temp30dhuuqe/artifactsho635gdr' '--job-port' '36711' '--artifact-port' '0' '--expansion-port' '0']
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:22 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:23 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:35507
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:23 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:35195
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:23 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on localhost:36711
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:23 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Job server now running, terminate with Ctrl+C
WARNING:root:Waiting for grpc channel to be ready at localhost:36711.
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:25 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for job_6ce6c200-3a10-4c62-89a2-2f594e904dfc.
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:25 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for job_6ce6c200-3a10-4c62-89a2-2f594e904dfc.ref_Environment_default_environment_1.
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:25 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 1 artifacts for job_6ce6c200-3a10-4c62-89a2-2f594e904dfc.null.
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:25 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for job_6ce6c200-3a10-4c62-89a2-2f594e904dfc.
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:25 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1014004025-8221c285_76444f6f-8770-41ff-9447-a4ba3123e0cb
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:25 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1014004025-8221c285_76444f6f-8770-41ff-9447-a4ba3123e0cb
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for the execution. Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
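
The LOOPBACK message above recommends the context-manager form so the program blocks until the pipeline finishes and the loopback worker is not torn down early. A minimal sketch of that pattern, with illustrative transforms that are not part of this job:

    import apache_beam as beam

    # Exiting the 'with' block waits for the pipeline to finish,
    # keeping the LOOPBACK environment's worker alive for the whole run.
    with beam.Pipeline() as p:
        (
            p
            | beam.Create(["hello", "world"])
            | beam.Map(print)
        )
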
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:25 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:26 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:27 INFO org.sparkproject.jetty.util.log: Logging initialized @8264ms to org.sparkproject.jetty.util.log.Slf4jLog
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:27 INFO org.sparkproject.jetty.server.Server: jetty-9.4.44.v20210927; built: 2021-09-27T23:02:44.612Z; git: 8da83308eeca865e495e53ef315a249d63ba9332; jvm 1.8.0_382-8u382-ga-1~20.04.1-b05
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:28 INFO org.sparkproject.jetty.server.Server: Started @8395ms
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:28 INFO org.sparkproject.jetty.server.AbstractConnector: Started ServerConnector@23791354{HTTP/1.1, (http/1.1)}{127.0.0.1:4040}
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5c16105a{/jobs,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@278c2987{/jobs/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@725a3f4c{/jobs/job,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4f1a91d0{/jobs/job/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@56aaa1b0{/stages,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@70c290f{/stages/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@23affc29{/stages/stage,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@63e2f2ab{/stages/stage/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@153e7a89{/stages/pool,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5e0c7169{/stages/pool/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@18921271{/storage,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@72b9819b{/storage/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@720e3073{/storage/rdd,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1a618a38{/storage/rdd/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5e096223{/environment,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3f21358d{/environment/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4daeb260{/executors,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@68d70325{/executors/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@16ee4dae{/executors/threadDump,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7ee59a82{/executors/threadDump/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6a24b37b{/static,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2f55692f{/,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@170cc13d{/api,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6ca77fc{/jobs/job/kill,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1bf1de1b{/stages/stage/kill,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:28 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@799005e6{/metrics/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:28 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:28 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:28 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1014004025-8221c285_76444f6f-8770-41ff-9447-a4ba3123e0cb on Spark master local[4]
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 104857600
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:44001.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-2
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:39885.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:35049
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:30 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-3
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-4
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-5
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-6
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-9
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-7
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-8
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:31 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1014004025-8221c285_76444f6f-8770-41ff-9447-a4ba3123e0cb: Pipeline translated successfully. Computing outputs
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.01 seconds.
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:31 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1014004025-8221c285_76444f6f-8770-41ff-9447-a4ba3123e0cb finished.
INFO:apache_beam.utils.subprocess_server:23/10/14 00:40:31 INFO org.sparkproject.jetty.server.AbstractConnector: Stopped Spark@23791354{HTTP/1.1, (http/1.1)}{127.0.0.1:4040}
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.9/threading.py", line 973, in _bootstrap_inner
Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.9/threading.py", line 973, in _bootstrap_inner
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 652, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 541, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 967, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "recvmsg:Connection reset by peer"
	debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B::1%5D:35049 {created_time:"2023-10-14T00:40:32.03490841+00:00", grpc_status:14, grpc_message:"recvmsg:Connection reset by peer"}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.9/threading.py", line 973, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.9/threading.py", line 910, in run
    self.run()
    self.run()
  File "/usr/lib/python3.9/threading.py", line 910, in run
  File "/usr/lib/python3.9/threading.py", line 910, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 669, in <lambda>
    self._target(*self._args, **self._kwargs)
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 1035, in pull_responses
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 262, in run
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 652, in _read_inputs
    for response in responses:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 541, in __next__
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 541, in __next__
    for work_request in self._control_stub.Control(get_responses()):
    return self._next()
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 541, in __next__
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 967, in _next
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 967, in _next
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 967, in _next
    raise self
    raise self
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "recvmsg:Connection reset by peer"
	debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B::1%5D:35049 {created_time:"2023-10-14T00:40:32.03490841+00:00", grpc_status:14, grpc_message:"recvmsg:Connection reset by peer"}"
>
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B::1%5D:39885 {created_time:"2023-10-14T00:40:32.034908476+00:00", grpc_status:14, grpc_message:"Socket closed"}"
>grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "recvmsg:Connection reset by peer"
	debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B::1%5D:44001 {grpc_message:"recvmsg:Connection reset by peer", grpc_status:14, created_time:"2023-10-14T00:40:32.03493178+00:00"}"
>


> Task :sdks:python:test-suites:portable:py39:postCommitPy39
> Task :sdks:python:test-suites:dataflow:py39:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 139

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py39:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.

Deprecated Gradle features were used in this build, making it incompatible with Gradle 9.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

For more on this, please refer to https://docs.gradle.org/8.3/userguide/command_line_interface.html#sec:command_line_warnings in the Gradle documentation.

BUILD FAILED in 2h 10m 29s
219 actionable tasks: 156 executed, 59 from cache, 4 up-to-date

Publishing build scan...
https://ge.apache.org/s/n5zzs66iya2gm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python39 #2426

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python39/2426/display/redirect?page=changes>

Changes:

[noreply] Bump org.checkerframework:checkerframework-gradle-plugin (#28979)


------------------------------------------
[...truncated 12.21 MB...]
[gw3] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_iobase_source 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source 
[gw3] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries 
[gw3] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadInteractiveRunnerTests::test_read_in_interactive_runner 
[gw3] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadInteractiveRunnerTests::test_read_in_interactive_runner 

=================================== FAILURES ===================================
_______________ JuliaSetTestIT.test_run_example_with_setup_file ________________
[gw3] linux -- Python 3.9.10 <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9>

args = (['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmppfhi_z7m', ...],)
kwargs = {}

    def check_output(*args, **kwargs):
      if force_shell:
        kwargs['shell'] = True
      try:
>       out = subprocess.check_output(*args, **kwargs)

apache_beam/utils/processes.py:89: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.9/subprocess.py:424: in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

input = None, capture_output = False, timeout = None, check = True
popenargs = (['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmppfhi_z7m', ...],)
kwargs = {'stdout': -1}
process = <Popen: returncode: 1 args: ['/home/jenkins/jenkins-slave/workspace/beam_Pos...>
stdout = b'', stderr = None, retcode = 1

    def run(*popenargs,
            input=None, capture_output=False, timeout=None, check=False, **kwargs):
        """Run command with arguments and return a CompletedProcess instance.
    
        The returned instance will have attributes args, returncode, stdout and
        stderr. By default, stdout and stderr are not captured, and those attributes
        will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them.
    
        If check is True and the exit code was non-zero, it raises a
        CalledProcessError. The CalledProcessError object will have the return code
        in the returncode attribute, and output & stderr attributes if those streams
        were captured.
    
        If timeout is given, and the process takes too long, a TimeoutExpired
        exception will be raised.
    
        There is an optional argument "input", allowing you to
        pass bytes or a string to the subprocess's stdin.  If you use this argument
        you may not also use the Popen constructor's "stdin" argument, as
        it will be used internally.
    
        By default, all communication is in bytes, and therefore any "input" should
        be bytes, and the stdout and stderr will be bytes. If in text mode, any
        "input" should be a string, and stdout and stderr will be strings decoded
        according to locale encoding, or by "encoding" if set. Text mode is
        triggered by setting any of text, encoding, errors or universal_newlines.
    
        The other arguments are the same as for the Popen constructor.
        """
        if input is not None:
            if kwargs.get('stdin') is not None:
                raise ValueError('stdin and input arguments may not both be used.')
            kwargs['stdin'] = PIPE
    
        if capture_output:
            if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
                raise ValueError('stdout and stderr arguments may not be used '
                                 'with capture_output.')
            kwargs['stdout'] = PIPE
            kwargs['stderr'] = PIPE
    
        with Popen(*popenargs, **kwargs) as process:
            try:
                stdout, stderr = process.communicate(input, timeout=timeout)
            except TimeoutExpired as exc:
                process.kill()
                if _mswindows:
                    # Windows accumulates the output in a single blocking
                    # read() call run on child threads, with the timeout
                    # being done in a join() on those threads.  communicate()
                    # _after_ kill() is required to collect that and add it
                    # to the exception.
                    exc.stdout, exc.stderr = process.communicate()
                else:
                    # POSIX _communicate already populated the output so
                    # far into the TimeoutExpired exception.
                    process.wait()
                raise
            except:  # Including KeyboardInterrupt, communicate handled that.
                process.kill()
                # We don't call process.wait() as .__exit__ does that for us.
                raise
            retcode = process.poll()
            if check and retcode:
>               raise CalledProcessError(retcode, process.args,
                                         output=stdout, stderr=stderr)
E               subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmppfhi_z7m', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']'> returned non-zero exit status 1.

/usr/lib/python3.9/subprocess.py:528: CalledProcessError

During handling of the above exception, another exception occurred:

self = <apache_beam.examples.complete.juliaset.juliaset.juliaset_test_it.JuliaSetTestIT testMethod=test_run_example_with_setup_file>

    def test_run_example_with_setup_file(self):
      pipeline = TestPipeline(is_integration_test=True)
      coordinate_output = FileSystems.join(
          pipeline.get_option('output'),
          'juliaset-{}'.format(str(uuid.uuid4())),
          'coordinates.txt')
      extra_args = {
          'coordinate_output': coordinate_output,
          'grid_size': self.GRID_SIZE,
          'setup_file': os.path.normpath(
              os.path.join(os.path.dirname(__file__), '..', 'setup.py')),
          'on_success_matcher': all_of(PipelineStateMatcher(PipelineState.DONE)),
      }
      args = pipeline.get_full_options_as_args(**extra_args)
    
>     juliaset.run(args)

apache_beam/examples/complete/juliaset/juliaset/juliaset_test_it.py:56: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
apache_beam/examples/complete/juliaset/juliaset/juliaset.py:116: in run
    (
apache_beam/pipeline.py:607: in __exit__
    self.result = self.run()
apache_beam/pipeline.py:557: in run
    return Pipeline.from_runner_api(
apache_beam/pipeline.py:584: in run
    return self.runner.run_pipeline(self, self._options)
apache_beam/runners/dataflow/test_dataflow_runner.py:53: in run_pipeline
    self.result = super().run_pipeline(pipeline, options)
apache_beam/runners/dataflow/dataflow_runner.py:393: in run_pipeline
    artifacts = environments.python_sdk_dependencies(options)
apache_beam/transforms/environments.py:846: in python_sdk_dependencies
    return stager.Stager.create_job_resources(
apache_beam/runners/portability/stager.py:281: in create_job_resources
    tarball_file = Stager._build_setup_package(
apache_beam/runners/portability/stager.py:796: in _build_setup_package
    processes.check_output(build_setup_args)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

args = (['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmppfhi_z7m', ...],)
kwargs = {}

    def check_output(*args, **kwargs):
      if force_shell:
        kwargs['shell'] = True
      try:
        out = subprocess.check_output(*args, **kwargs)
      except OSError:
        raise RuntimeError("Executable {} not found".format(args[0]))
      except subprocess.CalledProcessError as error:
        if isinstance(args, tuple) and (args[0][2] == "pip"):
          raise RuntimeError( \
            "Full traceback: {} \n Pip install failed for package: {} \
            \n Output from execution of subprocess: {}" \
            .format(traceback.format_exc(), args[0][6], error.output))
        else:
>         raise RuntimeError("Full trace: {}, \
             output of the failed child process {} "\
            .format(traceback.format_exc(), error.output))
E         RuntimeError: Full trace: Traceback (most recent call last):
E           File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/utils/processes.py",> line 89, in check_output
E             out = subprocess.check_output(*args, **kwargs)
E           File "/usr/lib/python3.9/subprocess.py", line 424, in check_output
E             return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
E           File "/usr/lib/python3.9/subprocess.py", line 528, in run
E             raise CalledProcessError(retcode, process.args,
E         subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmppfhi_z7m', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']'> returned non-zero exit status 1.
E         ,            output of the failed child process b''

apache_beam/utils/processes.py:99: RuntimeError
------------------------------ Captured log call -------------------------------
INFO     apache_beam.runners.portability.stager:stager.py:762 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmpd7geehht/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', '--implementation', 'cp', '--abi', 'cp39', '--platform', 'manylinux2014_x86_64']
INFO     apache_beam.runners.portability.stager:stager.py:795 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmppfhi_z7m', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']>
=============================== warnings summary ===============================
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_it
apache_beam/examples/complete/game/game_stats_it_test.py::GameStatsIT::test_game_stats_it
apache_beam/examples/complete/game/leader_board_it_test.py::LeaderBoardIT::test_leader_board_it
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_file_loads
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_streaming_inserts
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    dataset_ref = client.dataset(unique_dataset_name, project=project)

apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:47: FutureWarning: The default value of numeric_only in DataFrame.mean is deprecated. In a future version, it will default to False. In addition, specifying 'numeric_only=None' is deprecated. Select only valid columns or specify the value of numeric_only to silence this warning.
    return airline_df[at_top_airports].mean()

apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_native_source
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:170: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:706: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/pytest_postCommitIT-df-py39.xml> -
=========================== short test summary info ============================
FAILED apache_beam/examples/complete/juliaset/juliaset/juliaset_test_it.py::JuliaSetTestIT::test_run_example_with_setup_file - RuntimeError: Full trace: Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/utils/processes.py",> line 89, in check_output
    out = subprocess.check_output(*args, **kwargs)
  File "/usr/lib/python3.9/subprocess.py", line 424, in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
  File "/usr/lib/python3.9/subprocess.py", line 528, in run
    raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmppfhi_z7m', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']'> returned non-zero exit status 1.
,            output of the failed child process b''
====== 1 failed, 87 passed, 50 skipped, 9 warnings in 9089.79s (2:31:29) =======

> Task :sdks:python:test-suites:dataflow:py39:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 139

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py39:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.

Deprecated Gradle features were used in this build, making it incompatible with Gradle 9.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

For more on this, please refer to https://docs.gradle.org/8.3/userguide/command_line_interface.html#sec:command_line_warnings in the Gradle documentation.

BUILD FAILED in 2h 38m 39s
219 actionable tasks: 172 executed, 43 from cache, 4 up-to-date

Publishing build scan...
https://ge.apache.org/s/dmoyoqgprdomo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python39 #2425

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python39/2425/display/redirect>

Changes:


------------------------------------------
[...truncated 12.30 MB...]
[gw5] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_iobase_source 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source 
[gw5] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries 
[gw5] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries 
apache_beam/io/gcp/bigquery_read_it_test.py::ReadInteractiveRunnerTests::test_read_in_interactive_runner 
[gw5] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadInteractiveRunnerTests::test_read_in_interactive_runner 

=================================== FAILURES ===================================
_______________ JuliaSetTestIT.test_run_example_with_setup_file ________________
[gw4] linux -- Python 3.9.10 <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9>

args = (['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpm180jxj1', ...],)
kwargs = {}

    def check_output(*args, **kwargs):
      if force_shell:
        kwargs['shell'] = True
      try:
>       out = subprocess.check_output(*args, **kwargs)

apache_beam/utils/processes.py:89: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.9/subprocess.py:424: in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

input = None, capture_output = False, timeout = None, check = True
popenargs = (['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpm180jxj1', ...],)
kwargs = {'stdout': -1}
process = <Popen: returncode: 1 args: ['/home/jenkins/jenkins-slave/workspace/beam_Pos...>
stdout = b'', stderr = None, retcode = 1

    def run(*popenargs,
            input=None, capture_output=False, timeout=None, check=False, **kwargs):
        """Run command with arguments and return a CompletedProcess instance.
    
        The returned instance will have attributes args, returncode, stdout and
        stderr. By default, stdout and stderr are not captured, and those attributes
        will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them.
    
        If check is True and the exit code was non-zero, it raises a
        CalledProcessError. The CalledProcessError object will have the return code
        in the returncode attribute, and output & stderr attributes if those streams
        were captured.
    
        If timeout is given, and the process takes too long, a TimeoutExpired
        exception will be raised.
    
        There is an optional argument "input", allowing you to
        pass bytes or a string to the subprocess's stdin.  If you use this argument
        you may not also use the Popen constructor's "stdin" argument, as
        it will be used internally.
    
        By default, all communication is in bytes, and therefore any "input" should
        be bytes, and the stdout and stderr will be bytes. If in text mode, any
        "input" should be a string, and stdout and stderr will be strings decoded
        according to locale encoding, or by "encoding" if set. Text mode is
        triggered by setting any of text, encoding, errors or universal_newlines.
    
        The other arguments are the same as for the Popen constructor.
        """
        if input is not None:
            if kwargs.get('stdin') is not None:
                raise ValueError('stdin and input arguments may not both be used.')
            kwargs['stdin'] = PIPE
    
        if capture_output:
            if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
                raise ValueError('stdout and stderr arguments may not be used '
                                 'with capture_output.')
            kwargs['stdout'] = PIPE
            kwargs['stderr'] = PIPE
    
        with Popen(*popenargs, **kwargs) as process:
            try:
                stdout, stderr = process.communicate(input, timeout=timeout)
            except TimeoutExpired as exc:
                process.kill()
                if _mswindows:
                    # Windows accumulates the output in a single blocking
                    # read() call run on child threads, with the timeout
                    # being done in a join() on those threads.  communicate()
                    # _after_ kill() is required to collect that and add it
                    # to the exception.
                    exc.stdout, exc.stderr = process.communicate()
                else:
                    # POSIX _communicate already populated the output so
                    # far into the TimeoutExpired exception.
                    process.wait()
                raise
            except:  # Including KeyboardInterrupt, communicate handled that.
                process.kill()
                # We don't call process.wait() as .__exit__ does that for us.
                raise
            retcode = process.poll()
            if check and retcode:
>               raise CalledProcessError(retcode, process.args,
                                         output=stdout, stderr=stderr)
E               subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpm180jxj1', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']'> returned non-zero exit status 1.

/usr/lib/python3.9/subprocess.py:528: CalledProcessError

During handling of the above exception, another exception occurred:

self = <apache_beam.examples.complete.juliaset.juliaset.juliaset_test_it.JuliaSetTestIT testMethod=test_run_example_with_setup_file>

    def test_run_example_with_setup_file(self):
      pipeline = TestPipeline(is_integration_test=True)
      coordinate_output = FileSystems.join(
          pipeline.get_option('output'),
          'juliaset-{}'.format(str(uuid.uuid4())),
          'coordinates.txt')
      extra_args = {
          'coordinate_output': coordinate_output,
          'grid_size': self.GRID_SIZE,
          'setup_file': os.path.normpath(
              os.path.join(os.path.dirname(__file__), '..', 'setup.py')),
          'on_success_matcher': all_of(PipelineStateMatcher(PipelineState.DONE)),
      }
      args = pipeline.get_full_options_as_args(**extra_args)
    
>     juliaset.run(args)

apache_beam/examples/complete/juliaset/juliaset/juliaset_test_it.py:56: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
apache_beam/examples/complete/juliaset/juliaset/juliaset.py:116: in run
    (
apache_beam/pipeline.py:607: in __exit__
    self.result = self.run()
apache_beam/pipeline.py:557: in run
    return Pipeline.from_runner_api(
apache_beam/pipeline.py:584: in run
    return self.runner.run_pipeline(self, self._options)
apache_beam/runners/dataflow/test_dataflow_runner.py:53: in run_pipeline
    self.result = super().run_pipeline(pipeline, options)
apache_beam/runners/dataflow/dataflow_runner.py:393: in run_pipeline
    artifacts = environments.python_sdk_dependencies(options)
apache_beam/transforms/environments.py:846: in python_sdk_dependencies
    return stager.Stager.create_job_resources(
apache_beam/runners/portability/stager.py:281: in create_job_resources
    tarball_file = Stager._build_setup_package(
apache_beam/runners/portability/stager.py:796: in _build_setup_package
    processes.check_output(build_setup_args)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

args = (['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpm180jxj1', ...],)
kwargs = {}

    def check_output(*args, **kwargs):
      if force_shell:
        kwargs['shell'] = True
      try:
        out = subprocess.check_output(*args, **kwargs)
      except OSError:
        raise RuntimeError("Executable {} not found".format(args[0]))
      except subprocess.CalledProcessError as error:
        if isinstance(args, tuple) and (args[0][2] == "pip"):
          raise RuntimeError( \
            "Full traceback: {} \n Pip install failed for package: {} \
            \n Output from execution of subprocess: {}" \
            .format(traceback.format_exc(), args[0][6], error.output))
        else:
>         raise RuntimeError("Full trace: {}, \
             output of the failed child process {} "\
            .format(traceback.format_exc(), error.output))
E         RuntimeError: Full trace: Traceback (most recent call last):
E           File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/utils/processes.py",> line 89, in check_output
E             out = subprocess.check_output(*args, **kwargs)
E           File "/usr/lib/python3.9/subprocess.py", line 424, in check_output
E             return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
E           File "/usr/lib/python3.9/subprocess.py", line 528, in run
E             raise CalledProcessError(retcode, process.args,
E         subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpm180jxj1', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']'> returned non-zero exit status 1.
E         ,            output of the failed child process b''

apache_beam/utils/processes.py:99: RuntimeError
------------------------------ Captured log call -------------------------------
INFO     apache_beam.runners.portability.stager:stager.py:762 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmpqzib19t4/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', '--implementation', 'cp', '--abi', 'cp39', '--platform', 'manylinux2014_x86_64']
INFO     apache_beam.runners.portability.stager:stager.py:795 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpm180jxj1', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']>
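
The two captured commands above are the dependency download and the sdist build that returned exit status 1 with empty output. A minimal sketch for reproducing the failing step locally, assuming the PyPA `build` package is installed and using a placeholder path for the juliaset example instead of the Jenkins workspace path:

    # Hedged local-reproduction sketch; JULIASET_DIR is a placeholder for a local
    # checkout of sdks/python/apache_beam/examples/complete/juliaset.
    import subprocess
    import sys
    import tempfile

    JULIASET_DIR = "sdks/python/apache_beam/examples/complete/juliaset"

    with tempfile.TemporaryDirectory() as outdir:
        try:
            out = subprocess.check_output(
                [sys.executable, "-m", "build", "--sdist", "--outdir", outdir, JULIASET_DIR],
                stderr=subprocess.STDOUT,
            )
            print(out.decode())
        except subprocess.CalledProcessError as err:
            # Unlike the CI wrapper, print the child output so the sdist error is visible.
            print(err.output.decode())
            raise
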
=============================== warnings summary ===============================
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_it
apache_beam/examples/complete/game/game_stats_it_test.py::GameStatsIT::test_game_stats_it
apache_beam/examples/complete/game/leader_board_it_test.py::LeaderBoardIT::test_leader_board_it
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_file_loads
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_streaming_inserts
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    dataset_ref = client.dataset(unique_dataset_name, project=project)

apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)
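
The warning spells out the replacement. A minimal sketch using either a plain string or a DatasetReference object, with placeholder project, dataset, and table names rather than the test fixtures:

    # Hedged sketch of non-deprecated BigQuery references; names are placeholders
    # and the get_table call assumes ambient GCP credentials.
    from google.cloud import bigquery

    client = bigquery.Client(project="my_project")
    dataset_ref = bigquery.DatasetReference("my_project", "my_dataset")
    table_ref = dataset_ref.table("my_table")
    table = client.get_table("my_project.my_dataset.my_table")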

apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:47: FutureWarning: The default value of numeric_only in DataFrame.mean is deprecated. In a future version, it will default to False. In addition, specifying 'numeric_only=None' is deprecated. Select only valid columns or specify the value of numeric_only to silence this warning.
    return airline_df[at_top_airports].mean()
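
A plain-pandas sketch of the change this FutureWarning asks for, with a small placeholder frame standing in for the deferred airline_df:

    # Hedged sketch: pass numeric_only explicitly so the mean no longer relies on
    # the deprecated default; the dataframe and mask are placeholders.
    import pandas as pd

    df = pd.DataFrame({"airline": ["AA", "UA", "AA"], "departure_delay": [3.0, 7.5, 1.0]})
    at_top_airports = df["airline"].isin(["AA", "UA"])
    print(df[at_top_airports].mean(numeric_only=True))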

apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_native_source
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:170: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:706: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/pytest_postCommitIT-df-py39.xml> -
=========================== short test summary info ============================
FAILED apache_beam/examples/complete/juliaset/juliaset/juliaset_test_it.py::JuliaSetTestIT::test_run_example_with_setup_file - RuntimeError: Full trace: Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/utils/processes.py",> line 89, in check_output
    out = subprocess.check_output(*args, **kwargs)
  File "/usr/lib/python3.9/subprocess.py", line 424, in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
  File "/usr/lib/python3.9/subprocess.py", line 528, in run
    raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpm180jxj1', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']'> returned non-zero exit status 1.
,            output of the failed child process b''
====== 1 failed, 87 passed, 50 skipped, 9 warnings in 8613.43s (2:23:33) =======

> Task :sdks:python:test-suites:dataflow:py39:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 139

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py39:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.

Deprecated Gradle features were used in this build, making it incompatible with Gradle 9.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

For more on this, please refer to https://docs.gradle.org/8.3/userguide/command_line_interface.html#sec:command_line_warnings in the Gradle documentation.

BUILD FAILED in 2h 30m 7s
219 actionable tasks: 155 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://ge.apache.org/s/nkusizpcxmdje

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python39 #2424

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python39/2424/display/redirect>

Changes:


------------------------------------------
[...truncated 8.89 MB...]
thread: "Thread-13"

INFO:root:severity: INFO
timestamp {
  seconds: 1697179183
  nanos: 754646778
}
message: "No more requests from control plane"
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/sdk_worker.py:274"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1697179183
  nanos: 754808425
}
message: "SDK Harness waiting for in-flight requests to complete"
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/sdk_worker.py:275"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1697179183
  nanos: 754885673
}
message: "Closing all cached grpc data channels."
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/data_plane.py:803"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1697179183
  nanos: 754952430
}
message: "Closing all cached gRPC state handlers."
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/sdk_worker.py:904"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1697179183
  nanos: 755913734
}
message: "Done consuming work."
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/sdk_worker.py:287"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1697179183
  nanos: 756033897
}
message: "Python sdk harness exiting."
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/sdk_worker_main.py:211"
thread: "MainThread"

fb6d4ae13a4fc6b27d5422d40a4ade82640be614e3b93c7bf9b8390ea437316e
INFO:apache_beam.runners.portability.local_job_service:Completed job in 19.794630527496338 seconds with state DONE.
INFO:root:Completed job in 19.794630527496338 seconds with state DONE.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py39:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:33725
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7f0ba49cb9d0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7f0ba49cba60> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7f0ba49cc1f0> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/runners/spark/3/job-server/build/libs/beam-runners-spark-3-job-server-2.52.0-SNAPSHOT.jar'> '--spark-master-url' 'local[4]' '--artifacts-dir' '/tmp/beam-temp8_xwy2__/artifacts2ih5o_vs' '--job-port' '56191' '--artifact-port' '0' '--expansion-port' '0']
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:51 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:52 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:44689
WARNING:root:Waiting for grpc channel to be ready at localhost:56191.
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:53 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:40327
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:53 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on localhost:56191
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:53 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Job server now running, terminate with Ctrl+C
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:53 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for job_2652be33-cd5d-4778-ae81-3ce774301168.
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:53 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for job_2652be33-cd5d-4778-ae81-3ce774301168.ref_Environment_default_environment_1.
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:53 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 1 artifacts for job_2652be33-cd5d-4778-ae81-3ce774301168.null.
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:53 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for job_2652be33-cd5d-4778-ae81-3ce774301168.
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:54 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1013063954-2c4e8ada_0f7b85e8-5a19-4361-b129-92f63ac9c3a8
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:54 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1013063954-2c4e8ada_0f7b85e8-5a19-4361-b129-92f63ac9c3a8
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for the execution. Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
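
A minimal sketch of the with-block pattern that message recommends, using placeholder transforms and the default runner rather than the wordcount job above:

    # Hedged sketch: leaving the with-block waits for the pipeline to finish,
    # which is what the LOOPBACK notice asks for; the transforms are placeholders.
    import apache_beam as beam

    with beam.Pipeline() as p:
        (
            p
            | beam.Create(["hello", "world"])
            | beam.Map(print)
        )
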
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:54 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:55 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:57 INFO org.sparkproject.jetty.util.log: Logging initialized @9132ms to org.sparkproject.jetty.util.log.Slf4jLog
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:57 INFO org.sparkproject.jetty.server.Server: jetty-9.4.44.v20210927; built: 2021-09-27T23:02:44.612Z; git: 8da83308eeca865e495e53ef315a249d63ba9332; jvm 1.8.0_382-8u382-ga-1~20.04.1-b05
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:57 INFO org.sparkproject.jetty.server.Server: Started @9278ms
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:57 INFO org.sparkproject.jetty.server.AbstractConnector: Started ServerConnector@5ba092d5{HTTP/1.1, (http/1.1)}{127.0.0.1:4040}
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:57 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3b2e340d{/jobs,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:57 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3a553447{/jobs/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:57 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@25443c41{/jobs/job,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:57 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@489b28b{/jobs/job/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:57 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1b80e06a{/stages,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:57 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7d55cbb1{/stages/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:57 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@383b8372{/stages/stage,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:57 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@33cec47e{/stages/stage/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:57 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3ac6db6f{/stages/pool,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:57 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7f0bea4a{/stages/pool/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:57 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5c575d6{/storage,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:57 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@69e0782{/storage/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:57 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@771738e{/storage/rdd,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:57 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@685e370a{/storage/rdd/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:57 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6bc1b943{/environment,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:57 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@38f4c0d8{/environment/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:57 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3b851438{/executors,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:57 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5e35abea{/executors/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:57 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@555996e7{/executors/threadDump,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:57 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2c9adf8d{/executors/threadDump/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:57 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@405fe9{/static,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:57 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@26a990d0{/,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:57 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3d0ad436{/api,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:57 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6589957c{/jobs/job/kill,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:57 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5a879c1f{/stages/stage/kill,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:57 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@752ac6da{/metrics/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:57 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:57 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:57 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1013063954-2c4e8ada_0f7b85e8-5a19-4361-b129-92f63ac9c3a8 on Spark master local[4]
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 104857600
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:39727.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:59 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:59 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-2
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:45405.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:42987
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:59 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:59 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-3
INFO:apache_beam.utils.subprocess_server:23/10/13 06:39:59 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-4
INFO:apache_beam.utils.subprocess_server:23/10/13 06:40:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-5
INFO:apache_beam.utils.subprocess_server:23/10/13 06:40:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-8
INFO:apache_beam.utils.subprocess_server:23/10/13 06:40:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-9
INFO:apache_beam.utils.subprocess_server:23/10/13 06:40:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-7
INFO:apache_beam.utils.subprocess_server:23/10/13 06:40:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-6
INFO:apache_beam.utils.subprocess_server:23/10/13 06:40:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11
INFO:apache_beam.utils.subprocess_server:23/10/13 06:40:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10
INFO:apache_beam.utils.subprocess_server:23/10/13 06:40:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12
INFO:apache_beam.utils.subprocess_server:23/10/13 06:40:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13
INFO:apache_beam.utils.subprocess_server:23/10/13 06:40:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14
INFO:apache_beam.utils.subprocess_server:23/10/13 06:40:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15
INFO:apache_beam.utils.subprocess_server:23/10/13 06:40:00 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1013063954-2c4e8ada_0f7b85e8-5a19-4361-b129-92f63ac9c3a8: Pipeline translated successfully. Computing outputs
INFO:apache_beam.utils.subprocess_server:23/10/13 06:40:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.01 seconds.
INFO:apache_beam.utils.subprocess_server:23/10/13 06:40:00 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1013063954-2c4e8ada_0f7b85e8-5a19-4361-b129-92f63ac9c3a8 finished.
INFO:apache_beam.utils.subprocess_server:23/10/13 06:40:00 INFO org.sparkproject.jetty.server.AbstractConnector: Stopped Spark@5ba092d5{HTTP/1.1, (http/1.1)}{127.0.0.1:4040}
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.9/threading.py", line 973, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.9/threading.py", line 910, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 1035, in pull_responses
    for response in responses:
Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 541, in __next__
  File "/usr/lib/python3.9/threading.py", line 973, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.9/threading.py", line 910, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 262, in run
    for work_request in self._control_stub.Control(get_responses()):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 541, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 967, in _next
    return self._next()
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 652, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 541, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 967, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "recvmsg:Connection reset by peer"
	debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B::1%5D:42987 {grpc_message:"recvmsg:Connection reset by peer", grpc_status:14, created_time:"2023-10-13T06:40:01.526756073+00:00"}"
>
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 967, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "recvmsg:Connection reset by peer"
	debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B::1%5D:39727 {created_time:"2023-10-13T06:40:01.526778434+00:00", grpc_status:14, grpc_message:"recvmsg:Connection reset by peer"}"
>
Exception in thread read_grpc_client_inputs:
    raise self
Traceback (most recent call last):
  File "/usr/lib/python3.9/threading.py", line 973, in _bootstrap_inner
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B::1%5D:45405 {grpc_message:"Socket closed", grpc_status:14, created_time:"2023-10-13T06:40:01.526778766+00:00"}"
>
    self.run()
  File "/usr/lib/python3.9/threading.py", line 910, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 669, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 652, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 541, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py",> line 967, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "recvmsg:Connection reset by peer"
	debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B::1%5D:42987 {grpc_message:"recvmsg:Connection reset by peer", grpc_status:14, created_time:"2023-10-13T06:40:01.526756073+00:00"}"
>

> Task :sdks:python:test-suites:portable:py39:postCommitPy39
> Task :sdks:python:test-suites:dataflow:py39:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 139

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py39:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.

Deprecated Gradle features were used in this build, making it incompatible with Gradle 9.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

For more on this, please refer to https://docs.gradle.org/8.3/userguide/command_line_interface.html#sec:command_line_warnings in the Gradle documentation.

BUILD FAILED in 2h 2m 10s
219 actionable tasks: 155 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://ge.apache.org/s/eqkbwjnyb3cfu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org