Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2023/10/15 14:14:40 UTC
Build failed in Jenkins: beam_PostCommit_Python39 #2433
See <https://ci-beam.apache.org/job/beam_PostCommit_Python39/2433/display/redirect>
Changes:
------------------------------------------
[...truncated 9.39 MB...]
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/sdk_worker.py:275"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1697373651
nanos: 424649953
}
message: "Closing all cached grpc data channels."
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/data_plane.py:803"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1697373651
nanos: 424743652
}
message: "Closing all cached gRPC state handlers."
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/sdk_worker.py:904"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1697373651
nanos: 425013065
}
message: "Done consuming work."
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/sdk_worker.py:287"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1697373651
nanos: 425153255
}
message: "Python sdk harness exiting."
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/sdk_worker_main.py:211"
thread: "MainThread"
a6da070e4b7d3379a3bc32b67a1e4c146d3a0b367795911258c68b2094f95e45
INFO:apache_beam.runners.portability.local_job_service:Completed job in 20.181049823760986 seconds with state DONE.
INFO:root:Completed job in 20.181049823760986 seconds with state DONE.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
> Task :sdks:python:test-suites:portable:py39:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:32789
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7f0195fe49d0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7f0195fe4a60> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7f0195fe51f0> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' 'https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/runners/spark/3/job-server/build/libs/beam-runners-spark-3-job-server-2.52.0-SNAPSHOT.jar' '--spark-master-url' 'local[4]' '--artifacts-dir' '/tmp/beam-tempm53dhpv4/artifactsyocb7sg_' '--job-port' '48547' '--artifact-port' '0' '--expansion-port' '0']
INFO:apache_beam.utils.subprocess_server:23/10/15 12:40:57 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:23/10/15 12:40:58 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:44029
INFO:apache_beam.utils.subprocess_server:23/10/15 12:40:58 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:34111
INFO:apache_beam.utils.subprocess_server:23/10/15 12:40:58 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on localhost:48547
INFO:apache_beam.utils.subprocess_server:23/10/15 12:40:58 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Job server now running, terminate with Ctrl+C
INFO:apache_beam.utils.subprocess_server:23/10/15 12:40:58 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for job_26490f36-690c-4a73-b8b8-2246e20d614f.
INFO:apache_beam.utils.subprocess_server:23/10/15 12:40:58 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for job_26490f36-690c-4a73-b8b8-2246e20d614f.ref_Environment_default_environment_1.
INFO:apache_beam.utils.subprocess_server:23/10/15 12:40:58 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 1 artifacts for job_26490f36-690c-4a73-b8b8-2246e20d614f.null.
INFO:apache_beam.utils.subprocess_server:23/10/15 12:40:58 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for job_26490f36-690c-4a73-b8b8-2246e20d614f.
INFO:apache_beam.utils.subprocess_server:23/10/15 12:40:58 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1015124058-ed1c1e5a_60a89cc6-b75b-4496-b78f-1c52335d3f90
INFO:apache_beam.utils.subprocess_server:23/10/15 12:40:59 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1015124058-ed1c1e5a_60a89cc6-b75b-4496-b78f-1c52335d3f90
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for the execution. Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:apache_beam.utils.subprocess_server:23/10/15 12:40:59 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
INFO:apache_beam.utils.subprocess_server:23/10/15 12:40:59 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.util.log: Logging initialized @5024ms to org.sparkproject.jetty.util.log.Slf4jLog
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.Server: jetty-9.4.44.v20210927; built: 2021-09-27T23:02:44.612Z; git: 8da83308eeca865e495e53ef315a249d63ba9332; jvm 1.8.0_382-8u382-ga-1~20.04.1-b05
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.Server: Started @5128ms
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.AbstractConnector: Started ServerConnector@67368d56{HTTP/1.1, (http/1.1)}{127.0.0.1:4040}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7e728999{/jobs,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@78e909e2{/jobs/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@70fe89ce{/jobs/job,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2c7e8d42{/jobs/job/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6d7e1195{/stages,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@75324861{/stages/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@16e64871{/stages/stage,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4d2fe779{/stages/stage/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1b2b6f4a{/stages/pool,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@34c1e13f{/stages/pool/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@ac55d4a{/storage,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@93ccbaa{/storage/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@425f61f7{/storage/rdd,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@67328ffb{/storage/rdd/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3e942a39{/environment,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@26db2b3a{/environment/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5309903b{/executors,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5034b133{/executors/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1e5225ea{/executors/threadDump,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@a5a09c6{/executors/threadDump/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2bee815d{/static,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@286183a{/,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@571d0664{/api,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@660e75cb{/jobs/job/kill,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:00 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2893ae7d{/stages/stage/kill,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:01 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@9b1a25{/metrics/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:01 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:01 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:01 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1015124058-ed1c1e5a_60a89cc6-b75b-4496-b78f-1c52335d3f90 on Spark master local[4]
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 104857600
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:37851.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-2
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:37111.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:36325
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:02 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-3
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-4
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-5
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-7
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-6
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-9
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-8
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:03 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1015124058-ed1c1e5a_60a89cc6-b75b-4496-b78f-1c52335d3f90: Pipeline translated successfully. Computing outputs
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.01 seconds.
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:04 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1015124058-ed1c1e5a_60a89cc6-b75b-4496-b78f-1c52335d3f90 finished.
INFO:apache_beam.utils.subprocess_server:23/10/15 12:41:04 INFO org.sparkproject.jetty.server.AbstractConnector: Stopped Spark@67368d56{HTTP/1.1, (http/1.1)}{127.0.0.1:4040}
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
Exception in thread read_state:
Traceback (most recent call last):
File "/usr/lib/python3.9/threading.py", line 973, in _bootstrap_inner
Exception in thread run_worker_1-1:
Traceback (most recent call last):
File "/usr/lib/python3.9/threading.py", line 973, in _bootstrap_inner
self.run()
self.run()
File "/usr/lib/python3.9/threading.py", line 910, in run
File "/usr/lib/python3.9/threading.py", line 910, in run
self._target(*self._args, **self._kwargs)
File "https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py", line 1035, in pull_responses
self._target(*self._args, **self._kwargs)
File "https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py", line 262, in run
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
File "https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py", line 652, in _read_inputs
for elements in elements_iterator:
File "https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py", line 541, in __next__
return self._next()
File "https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py", line 967, in _next
raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
status = StatusCode.UNAVAILABLE
details = "Socket closed"
debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B::1%5D:36325 {created_time:"2023-10-15T12:41:04.625958996+00:00", grpc_status:14, grpc_message:"Socket closed"}"
>
for response in responses:
for work_request in self._control_stub.Control(get_responses()):
Exception in thread read_grpc_client_inputs:
File "https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py", line 541, in __next__
Traceback (most recent call last):
File "https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py", line 541, in __next__
File "/usr/lib/python3.9/threading.py", line 973, in _bootstrap_inner
return self._next()
return self._next()
File "https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py", line 967, in _next
File "https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py", line 967, in _next
self.run()
File "/usr/lib/python3.9/threading.py", line 910, in run
raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
status = StatusCode.UNAVAILABLE
details = "recvmsg:Connection reset by peer"
debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B::1%5D:37111 {grpc_message:"recvmsg:Connection reset by peer", grpc_status:14, created_time:"2023-10-15T12:41:04.625974273+00:00"}"
>
self._target(*self._args, **self._kwargs)
File "https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py", line 669, in <lambda>
raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
status = StatusCode.UNAVAILABLE
details = "recvmsg:Connection reset by peer"
debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B::1%5D:37851 {created_time:"2023-10-15T12:41:04.6259779+00:00", grpc_status:14, grpc_message:"recvmsg:Connection reset by peer"}"
>
target=lambda: self._read_inputs(elements_iterator),
File "https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py", line 652, in _read_inputs
for elements in elements_iterator:
File "https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py", line 541, in __next__
return self._next()
File "https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py", line 967, in _next
raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
status = StatusCode.UNAVAILABLE
details = "Socket closed"
debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B::1%5D:36325 {created_time:"2023-10-15T12:41:04.625958996+00:00", grpc_status:14, grpc_message:"Socket closed"}"
>
> Task :sdks:python:test-suites:portable:py39:postCommitPy39
> Task :sdks:python:test-suites:dataflow:py39:postCommitIT FAILED
FAILURE: Build completed with 2 failures.
1: Task failed with an exception.
-----------
* Where:
Script 'https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/test-suites/direct/common.gradle' line: 52
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py39:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Script 'https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/test-suites/dataflow/common.gradle' line: 139
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py39:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.
==============================================================================
Deprecated Gradle features were used in this build, making it incompatible with Gradle 9.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
For more on this, please refer to https://docs.gradle.org/8.3/userguide/command_line_interface.html#sec:command_line_warnings in the Gradle documentation.
BUILD FAILED in 2h 3m 8s
219 actionable tasks: 155 executed, 60 from cache, 4 up-to-date
Publishing build scan...
https://ge.apache.org/s/cihsejzvzwfga
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Jenkins build is back to normal : beam_PostCommit_Python39 #2440
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python39/2440/display/redirect?page=changes>
Build failed in Jenkins: beam_PostCommit_Python39 #2439
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python39/2439/display/redirect>
Changes:
------------------------------------------
[...truncated 8.19 MB...]
nanos: 21591663
}
message: "No more requests from control plane"
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/sdk_worker.py:274"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1697503251
nanos: 21735429
}
message: "SDK Harness waiting for in-flight requests to complete"
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/sdk_worker.py:275"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1697503251
nanos: 21806240
}
message: "Closing all cached grpc data channels."
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/data_plane.py:803"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1697503251
nanos: 21868705
}
message: "Closing all cached gRPC state handlers."
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/sdk_worker.py:904"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1697503251
nanos: 22056818
}
message: "Done consuming work."
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/sdk_worker.py:287"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1697503251
nanos: 22117614
}
message: "Python sdk harness exiting."
log_location: "/usr/local/lib/python3.9/site-packages/apache_beam/runners/worker/sdk_worker_main.py:211"
thread: "MainThread"
> Task :sdks:python:test-suites:dataflow:py39:postCommitIT
[gw4] PASSED apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/io/fileio_test.py::MatchIntegrationTest::test_transform_on_gcs
> Task :sdks:python:test-suites:portable:py39:portableLocalRunnerTestWithRequirementsFile
16834cd387c1488afd9e0c3aa479cddabda9bd2d99124d37dc2b74f9ae548ff2
INFO:apache_beam.runners.portability.local_job_service:Completed job in 36.58824133872986 seconds with state DONE.
INFO:root:Completed job in 36.58824133872986 seconds with state DONE.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
> Task :sdks:python:test-suites:portable:py39:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:33623
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7f3cdefa59d0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7f3cdefa5a60> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7f3cdefa71f0> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' 'https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/runners/spark/3/job-server/build/libs/beam-runners-spark-3-job-server-2.52.0-SNAPSHOT.jar' '--spark-master-url' 'local[4]' '--artifacts-dir' '/tmp/beam-tempxr9f0wah/artifactsxlhf6ysu' '--job-port' '41823' '--artifact-port' '0' '--expansion-port' '0']
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:08 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:08 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:44465
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:08 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:43231
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:08 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on localhost:41823
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:08 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Job server now running, terminate with Ctrl+C
WARNING:root:Waiting for grpc channel to be ready at localhost:41823.
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:11 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for job_7b590d0f-ef95-440c-b8b7-0cc1faa1b24c.
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:11 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for job_7b590d0f-ef95-440c-b8b7-0cc1faa1b24c.ref_Environment_default_environment_1.
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:11 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 1 artifacts for job_7b590d0f-ef95-440c-b8b7-0cc1faa1b24c.null.
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:11 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for job_7b590d0f-ef95-440c-b8b7-0cc1faa1b24c.
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:11 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1017004111-760ff90d_e0c6c3c8-d970-4a2e-b62c-e2a783ec60cf
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:11 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1017004111-760ff90d_e0c6c3c8-d970-4a2e-b62c-e2a783ec60cf
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for the execution. Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:12 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:12 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.util.log: Logging initialized @7562ms to org.sparkproject.jetty.util.log.Slf4jLog
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.Server: jetty-9.4.44.v20210927; built: 2021-09-27T23:02:44.612Z; git: 8da83308eeca865e495e53ef315a249d63ba9332; jvm 1.8.0_382-8u382-ga-1~20.04.1-b05
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.Server: Started @7665ms
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.AbstractConnector: Started ServerConnector@3c4bd5a3{HTTP/1.1, (http/1.1)}{127.0.0.1:4040}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4ffecd4a{/jobs,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2bdbffaf{/jobs/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@18eb7627{/jobs/job,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3bfd4b09{/jobs/job/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4a84996b{/stages,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4831e9ed{/stages/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7ceef8a7{/stages/stage,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6ae34995{/stages/stage/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@62663df0{/stages/pool,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6e7037a9{/stages/pool/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@51666e5d{/storage,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5bde158a{/storage/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@59a617f5{/storage/rdd,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@235cf174{/storage/rdd/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7551a0a7{/environment,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@175c0565{/environment/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7bee8cfd{/executors,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@23c71438{/executors/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3d97ac31{/executors/threadDump,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@48e47770{/executors/threadDump/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@70048b0a{/static,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1ce95677{/,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5183b49c{/api,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@63e31a89{/jobs/job/kill,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4f34ad13{/stages/stage/kill,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1ba4d4e9{/metrics/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:13 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:14 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:14 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1017004111-760ff90d_e0c6c3c8-d970-4a2e-b62c-e2a783ec60cf on Spark master local[4]
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 104857600
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:45397.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:15 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:15 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-2
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:37777.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:37319
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:15 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:15 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-3
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-4
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-5
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-8
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-9
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-6
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-7
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:17 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1017004111-760ff90d_e0c6c3c8-d970-4a2e-b62c-e2a783ec60cf: Pipeline translated successfully. Computing outputs
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:17 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.01 seconds.
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:17 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1017004111-760ff90d_e0c6c3c8-d970-4a2e-b62c-e2a783ec60cf finished.
INFO:apache_beam.utils.subprocess_server:23/10/17 00:41:17 INFO org.sparkproject.jetty.server.AbstractConnector: Stopped Spark@3c4bd5a3{HTTP/1.1, (http/1.1)}{127.0.0.1:4040}
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.9/threading.py", line 973, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.9/threading.py", line 910, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py>", line 1035, in pull_responses
    for response in responses:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py>", line 541, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py>", line 967, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "recvmsg:Connection reset by peer"
	debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B::1%5D:37777 {created_time:"2023-10-17T00:41:17.671978731+00:00", grpc_status:14, grpc_message:"recvmsg:Connection reset by peer"}"
>
Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.9/threading.py", line 973, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.9/threading.py", line 910, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py>", line 262, in run
    for work_request in self._control_stub.Control(get_responses()):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py>", line 541, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py>", line 967, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B::1%5D:45397 {grpc_message:"Socket closed", grpc_status:14, created_time:"2023-10-17T00:41:17.672021915+00:00"}"
>
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py>", line 652, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py>", line 541, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py>", line 967, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "recvmsg:Connection reset by peer"
	debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B::1%5D:37319 {created_time:"2023-10-17T00:41:17.671956048+00:00", grpc_status:14, grpc_message:"recvmsg:Connection reset by peer"}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.9/threading.py", line 973, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.9/threading.py", line 910, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py>", line 669, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py>", line 652, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py>", line 541, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/grpc/_channel.py>", line 967, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "recvmsg:Connection reset by peer"
	debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B::1%5D:37319 {created_time:"2023-10-17T00:41:17.671956048+00:00", grpc_status:14, grpc_message:"recvmsg:Connection reset by peer"}"
>
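[Editor's note] The tracebacks above all end in gRPC status 14 (UNAVAILABLE) on the three localhost channels this worker opened earlier in the log (control 45397, state 37777, data 37319), raised as the runner tore its servers down after the job reached DONE. As an illustration only (this is not Beam or grpc library code), a stdlib-only helper can pull the status, peer port, and message out of a debug_error_string of the shape shown above:

```python
import re

# Hypothetical helper for reading grpc debug_error_string values like the
# ones in the tracebacks above; the sample below is copied from the log.
SAMPLE = ('UNKNOWN:Error received from peer ipv6:%5B::1%5D:37319 '
          '{created_time:"2023-10-17T00:41:17.671956048+00:00", '
          'grpc_status:14, grpc_message:"recvmsg:Connection reset by peer"}')

def parse_grpc_debug(s):
    """Return the peer port, grpc_status, and grpc_message from a
    debug_error_string, or None for any field that is absent."""
    port = re.search(r'peer ipv6:%5B::1%5D:(\d+)', s)
    status = re.search(r'grpc_status:(\d+)', s)
    message = re.search(r'grpc_message:"([^"]*)"', s)
    return {
        'port': int(port.group(1)) if port else None,
        'status': int(status.group(1)) if status else None,
        'message': message.group(1) if message else None,
    }
```

Applied to the three rendezvous blocks above, this reads status 14 on ports 37777, 45397, and 37319 — the same state, control, and data channels the worker logged when it connected, consistent with a teardown race after DONE rather than a mid-job failure.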
> Task :sdks:python:test-suites:portable:py39:postCommitPy39
FAILURE: Build failed with an exception.
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/test-suites/direct/common.gradle>' line: 52
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py39:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.
Deprecated Gradle features were used in this build, making it incompatible with Gradle 9.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
For more on this, please refer to https://docs.gradle.org/8.3/userguide/command_line_interface.html#sec:command_line_warnings in the Gradle documentation.
BUILD FAILED in 2h 57m 50s
219 actionable tasks: 158 executed, 57 from cache, 4 up-to-date
Publishing build scan...
https://ge.apache.org/s/hz3rovvvajcsy
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_PostCommit_Python39 #2438
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python39/2438/display/redirect?page=changes>
Changes:
[noreply] Merge pull request #28656: Update Google Cloud Java Libraries BOM from
------------------------------------------
[...truncated 12.17 MB...]
[gw6] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_iobase_source
apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
[gw6] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries
[gw6] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries
apache_beam/io/gcp/bigquery_read_it_test.py::ReadInteractiveRunnerTests::test_read_in_interactive_runner
[gw6] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadInteractiveRunnerTests::test_read_in_interactive_runner
=================================== FAILURES ===================================
_______________ JuliaSetTestIT.test_run_example_with_setup_file ________________
[gw2] linux -- Python 3.9.10 <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9>
args = (['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9>', '-m', 'build', '--sdist', '--outdir', '/tmp/tmpkf_rvzzo', ...],)
kwargs = {}
def check_output(*args, **kwargs):
if force_shell:
kwargs['shell'] = True
try:
> out = subprocess.check_output(*args, **kwargs)
apache_beam/utils/processes.py:89:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.9/subprocess.py:424: in check_output
return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
input = None, capture_output = False, timeout = None, check = True
popenargs = (['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9>', '-m', 'build', '--sdist', '--outdir', '/tmp/tmpkf_rvzzo', ...],)
kwargs = {'stdout': -1}
process = <Popen: returncode: 1 args: ['/home/jenkins/jenkins-slave/workspace/beam_Pos...>
stdout = b'', stderr = None, retcode = 1
def run(*popenargs,
input=None, capture_output=False, timeout=None, check=False, **kwargs):
"""Run command with arguments and return a CompletedProcess instance.
The returned instance will have attributes args, returncode, stdout and
stderr. By default, stdout and stderr are not captured, and those attributes
will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them.
If check is True and the exit code was non-zero, it raises a
CalledProcessError. The CalledProcessError object will have the return code
in the returncode attribute, and output & stderr attributes if those streams
were captured.
If timeout is given, and the process takes too long, a TimeoutExpired
exception will be raised.
There is an optional argument "input", allowing you to
pass bytes or a string to the subprocess's stdin. If you use this argument
you may not also use the Popen constructor's "stdin" argument, as
it will be used internally.
By default, all communication is in bytes, and therefore any "input" should
be bytes, and the stdout and stderr will be bytes. If in text mode, any
"input" should be a string, and stdout and stderr will be strings decoded
according to locale encoding, or by "encoding" if set. Text mode is
triggered by setting any of text, encoding, errors or universal_newlines.
The other arguments are the same as for the Popen constructor.
"""
if input is not None:
if kwargs.get('stdin') is not None:
raise ValueError('stdin and input arguments may not both be used.')
kwargs['stdin'] = PIPE
if capture_output:
if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
raise ValueError('stdout and stderr arguments may not be used '
'with capture_output.')
kwargs['stdout'] = PIPE
kwargs['stderr'] = PIPE
with Popen(*popenargs, **kwargs) as process:
try:
stdout, stderr = process.communicate(input, timeout=timeout)
except TimeoutExpired as exc:
process.kill()
if _mswindows:
# Windows accumulates the output in a single blocking
# read() call run on child threads, with the timeout
# being done in a join() on those threads. communicate()
# _after_ kill() is required to collect that and add it
# to the exception.
exc.stdout, exc.stderr = process.communicate()
else:
# POSIX _communicate already populated the output so
# far into the TimeoutExpired exception.
process.wait()
raise
except: # Including KeyboardInterrupt, communicate handled that.
process.kill()
# We don't call process.wait() as .__exit__ does that for us.
raise
retcode = process.poll()
if check and retcode:
> raise CalledProcessError(retcode, process.args,
output=stdout, stderr=stderr)
E subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9>', '-m', 'build', '--sdist', '--outdir', '/tmp/tmpkf_rvzzo', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset>']' returned non-zero exit status 1.
/usr/lib/python3.9/subprocess.py:528: CalledProcessError
During handling of the above exception, another exception occurred:
self = <apache_beam.examples.complete.juliaset.juliaset.juliaset_test_it.JuliaSetTestIT testMethod=test_run_example_with_setup_file>
def test_run_example_with_setup_file(self):
pipeline = TestPipeline(is_integration_test=True)
coordinate_output = FileSystems.join(
pipeline.get_option('output'),
'juliaset-{}'.format(str(uuid.uuid4())),
'coordinates.txt')
extra_args = {
'coordinate_output': coordinate_output,
'grid_size': self.GRID_SIZE,
'setup_file': os.path.normpath(
os.path.join(os.path.dirname(__file__), '..', 'setup.py')),
'on_success_matcher': all_of(PipelineStateMatcher(PipelineState.DONE)),
}
args = pipeline.get_full_options_as_args(**extra_args)
> juliaset.run(args)
apache_beam/examples/complete/juliaset/juliaset/juliaset_test_it.py:56:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
apache_beam/examples/complete/juliaset/juliaset/juliaset.py:116: in run
(
apache_beam/pipeline.py:607: in __exit__
self.result = self.run()
apache_beam/pipeline.py:557: in run
return Pipeline.from_runner_api(
apache_beam/pipeline.py:584: in run
return self.runner.run_pipeline(self, self._options)
apache_beam/runners/dataflow/test_dataflow_runner.py:53: in run_pipeline
self.result = super().run_pipeline(pipeline, options)
apache_beam/runners/dataflow/dataflow_runner.py:393: in run_pipeline
artifacts = environments.python_sdk_dependencies(options)
apache_beam/transforms/environments.py:846: in python_sdk_dependencies
return stager.Stager.create_job_resources(
apache_beam/runners/portability/stager.py:281: in create_job_resources
tarball_file = Stager._build_setup_package(
apache_beam/runners/portability/stager.py:796: in _build_setup_package
processes.check_output(build_setup_args)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
args = (['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9>', '-m', 'build', '--sdist', '--outdir', '/tmp/tmpkf_rvzzo', ...],)
kwargs = {}
def check_output(*args, **kwargs):
if force_shell:
kwargs['shell'] = True
try:
out = subprocess.check_output(*args, **kwargs)
except OSError:
raise RuntimeError("Executable {} not found".format(args[0]))
except subprocess.CalledProcessError as error:
if isinstance(args, tuple) and (args[0][2] == "pip"):
raise RuntimeError( \
"Full traceback: {} \n Pip install failed for package: {} \
\n Output from execution of subprocess: {}" \
.format(traceback.format_exc(), args[0][6], error.output))
else:
> raise RuntimeError("Full trace: {}, \
output of the failed child process {} "\
.format(traceback.format_exc(), error.output))
E RuntimeError: Full trace: Traceback (most recent call last):
E   File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/utils/processes.py>", line 89, in check_output
E     out = subprocess.check_output(*args, **kwargs)
E   File "/usr/lib/python3.9/subprocess.py", line 424, in check_output
E     return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
E   File "/usr/lib/python3.9/subprocess.py", line 528, in run
E     raise CalledProcessError(retcode, process.args,
E subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9>', '-m', 'build', '--sdist', '--outdir', '/tmp/tmpkf_rvzzo', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset>']' returned non-zero exit status 1.
E , output of the failed child process b''
apache_beam/utils/processes.py:99: RuntimeError
------------------------------ Captured log call -------------------------------
INFO     apache_beam.runners.portability.stager:stager.py:762 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9>', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmpn0g1_z6q/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', '--implementation', 'cp', '--abi', 'cp39', '--platform', 'manylinux2014_x86_64']
INFO     apache_beam.runners.portability.stager:stager.py:795 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9>', '-m', 'build', '--sdist', '--outdir', '/tmp/tmpkf_rvzzo', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset>']
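[Editor's note] The JuliaSetTestIT failure above reports `output of the failed child process b''` because the wrapper in apache_beam/utils/processes.py pipes only stdout, while `python -m build` writes its errors to stderr, so the actual sdist failure is lost from the log. A hedged local-reproduction sketch (the function name and paths are placeholders, not Beam's stager code or the CI workspace paths) that captures stderr so the real error is visible:

```python
import subprocess
import sys

def build_sdist(package_dir, outdir):
    """Run `python -m build --sdist` on package_dir, raising with the
    captured stderr text if the build fails."""
    result = subprocess.run(
        [sys.executable, '-m', 'build', '--sdist', '--outdir', outdir,
         package_dir],
        capture_output=True, text=True)
    if result.returncode != 0:
        # stderr usually holds the real build error that check_output lost.
        raise RuntimeError('sdist build failed:\n' + result.stderr)
    return result.stdout
```

Pointed at a local checkout's juliaset example directory, this would surface the build error that the empty b'' output in the log hides.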
=============================== warnings summary ===============================
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_it
apache_beam/examples/complete/game/game_stats_it_test.py::GameStatsIT::test_game_stats_it
apache_beam/examples/complete/game/leader_board_it_test.py::LeaderBoardIT::test_leader_board_it
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_file_loads
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_streaming_inserts
<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
dataset_ref = client.dataset(unique_dataset_name, project=project)
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
table_ref = client.dataset(dataset_id).table(table_id)
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:47: FutureWarning: The default value of numeric_only in DataFrame.mean is deprecated. In a future version, it will default to False. In addition, specifying 'numeric_only=None' is deprecated. Select only valid columns or specify the value of numeric_only to silence this warning.
return airline_df[at_top_airports].mean()
apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_native_source
<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:170: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:706: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/pytest_postCommitIT-df-py39.xml> -
=========================== short test summary info ============================
FAILED apache_beam/examples/complete/juliaset/juliaset/juliaset_test_it.py::JuliaSetTestIT::test_run_example_with_setup_file - RuntimeError: Full trace: Traceback (most recent call last):
File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/utils/processes.py>", line 89, in check_output
out = subprocess.check_output(*args, **kwargs)
File "/usr/lib/python3.9/subprocess.py", line 424, in check_output
return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
File "/usr/lib/python3.9/subprocess.py", line 528, in run
raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9>', '-m', 'build', '--sdist', '--outdir', '/tmp/tmpkf_rvzzo', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset>']' returned non-zero exit status 1.
, output of the failed child process b''
====== 1 failed, 87 passed, 50 skipped, 9 warnings in 7894.73s (2:11:34) =======
> Task :sdks:python:test-suites:dataflow:py39:postCommitIT FAILED
FAILURE: Build failed with an exception.
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 139
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py39:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.
Deprecated Gradle features were used in this build, making it incompatible with Gradle 9.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
For more on this, please refer to https://docs.gradle.org/8.3/userguide/command_line_interface.html#sec:command_line_warnings in the Gradle documentation.
BUILD FAILED in 2h 18m 54s
219 actionable tasks: 159 executed, 56 from cache, 4 up-to-date
Publishing build scan...
https://ge.apache.org/s/lvmosllqebkm2
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_PostCommit_Python39 #2437
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python39/2437/display/redirect>
Changes:
------------------------------------------
[...truncated 12.07 MB...]
[gw3] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_iobase_source
apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
[gw3] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries
[gw3] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries
apache_beam/io/gcp/bigquery_read_it_test.py::ReadInteractiveRunnerTests::test_read_in_interactive_runner
[gw3] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadInteractiveRunnerTests::test_read_in_interactive_runner
=================================== FAILURES ===================================
_______________ JuliaSetTestIT.test_run_example_with_setup_file ________________
[gw3] linux -- Python 3.9.10 <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9>
args = (['https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9', '-m', 'build', '--sdist', '--outdir', '/tmp/tmpz8_pabuv', ...],)
kwargs = {}

    def check_output(*args, **kwargs):
        if force_shell:
            kwargs['shell'] = True
        try:
>           out = subprocess.check_output(*args, **kwargs)

apache_beam/utils/processes.py:89:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.9/subprocess.py:424: in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
input = None, capture_output = False, timeout = None, check = True
popenargs = (['https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9', '-m', 'build', '--sdist', '--outdir', '/tmp/tmpz8_pabuv', ...],)
kwargs = {'stdout': -1}
process = <Popen: returncode: 1 args: ['/home/jenkins/jenkins-slave/workspace/beam_Pos...>
stdout = b'', stderr = None, retcode = 1
    def run(*popenargs,
            input=None, capture_output=False, timeout=None, check=False, **kwargs):
        """Run command with arguments and return a CompletedProcess instance.

        The returned instance will have attributes args, returncode, stdout and
        stderr. By default, stdout and stderr are not captured, and those attributes
        will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them.

        If check is True and the exit code was non-zero, it raises a
        CalledProcessError. The CalledProcessError object will have the return code
        in the returncode attribute, and output & stderr attributes if those streams
        were captured.

        If timeout is given, and the process takes too long, a TimeoutExpired
        exception will be raised.

        There is an optional argument "input", allowing you to
        pass bytes or a string to the subprocess's stdin. If you use this argument
        you may not also use the Popen constructor's "stdin" argument, as
        it will be used internally.

        By default, all communication is in bytes, and therefore any "input" should
        be bytes, and the stdout and stderr will be bytes. If in text mode, any
        "input" should be a string, and stdout and stderr will be strings decoded
        according to locale encoding, or by "encoding" if set. Text mode is
        triggered by setting any of text, encoding, errors or universal_newlines.

        The other arguments are the same as for the Popen constructor.
        """
        if input is not None:
            if kwargs.get('stdin') is not None:
                raise ValueError('stdin and input arguments may not both be used.')
            kwargs['stdin'] = PIPE

        if capture_output:
            if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
                raise ValueError('stdout and stderr arguments may not be used '
                                 'with capture_output.')
            kwargs['stdout'] = PIPE
            kwargs['stderr'] = PIPE

        with Popen(*popenargs, **kwargs) as process:
            try:
                stdout, stderr = process.communicate(input, timeout=timeout)
            except TimeoutExpired as exc:
                process.kill()
                if _mswindows:
                    # Windows accumulates the output in a single blocking
                    # read() call run on child threads, with the timeout
                    # being done in a join() on those threads. communicate()
                    # _after_ kill() is required to collect that and add it
                    # to the exception.
                    exc.stdout, exc.stderr = process.communicate()
                else:
                    # POSIX _communicate already populated the output so
                    # far into the TimeoutExpired exception.
                    process.wait()
                raise
            except:  # Including KeyboardInterrupt, communicate handled that.
                process.kill()
                # We don't call process.wait() as .__exit__ does that for us.
                raise
            retcode = process.poll()
            if check and retcode:
>               raise CalledProcessError(retcode, process.args,
                                         output=stdout, stderr=stderr)
E               subprocess.CalledProcessError: Command '['https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9', '-m', 'build', '--sdist', '--outdir', '/tmp/tmpz8_pabuv', 'https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']' returned non-zero exit status 1.

/usr/lib/python3.9/subprocess.py:528: CalledProcessError
During handling of the above exception, another exception occurred:

self = <apache_beam.examples.complete.juliaset.juliaset.juliaset_test_it.JuliaSetTestIT testMethod=test_run_example_with_setup_file>

    def test_run_example_with_setup_file(self):
        pipeline = TestPipeline(is_integration_test=True)
        coordinate_output = FileSystems.join(
            pipeline.get_option('output'),
            'juliaset-{}'.format(str(uuid.uuid4())),
            'coordinates.txt')
        extra_args = {
            'coordinate_output': coordinate_output,
            'grid_size': self.GRID_SIZE,
            'setup_file': os.path.normpath(
                os.path.join(os.path.dirname(__file__), '..', 'setup.py')),
            'on_success_matcher': all_of(PipelineStateMatcher(PipelineState.DONE)),
        }
        args = pipeline.get_full_options_as_args(**extra_args)
>       juliaset.run(args)

apache_beam/examples/complete/juliaset/juliaset/juliaset_test_it.py:56:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
apache_beam/examples/complete/juliaset/juliaset/juliaset.py:116: in run
    (
apache_beam/pipeline.py:607: in __exit__
    self.result = self.run()
apache_beam/pipeline.py:557: in run
    return Pipeline.from_runner_api(
apache_beam/pipeline.py:584: in run
    return self.runner.run_pipeline(self, self._options)
apache_beam/runners/dataflow/test_dataflow_runner.py:53: in run_pipeline
    self.result = super().run_pipeline(pipeline, options)
apache_beam/runners/dataflow/dataflow_runner.py:393: in run_pipeline
    artifacts = environments.python_sdk_dependencies(options)
apache_beam/transforms/environments.py:846: in python_sdk_dependencies
    return stager.Stager.create_job_resources(
apache_beam/runners/portability/stager.py:281: in create_job_resources
    tarball_file = Stager._build_setup_package(
apache_beam/runners/portability/stager.py:796: in _build_setup_package
    processes.check_output(build_setup_args)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
args = (['https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9', '-m', 'build', '--sdist', '--outdir', '/tmp/tmpz8_pabuv', ...],)
kwargs = {}

    def check_output(*args, **kwargs):
        if force_shell:
            kwargs['shell'] = True
        try:
            out = subprocess.check_output(*args, **kwargs)
        except OSError:
            raise RuntimeError("Executable {} not found".format(args[0]))
        except subprocess.CalledProcessError as error:
            if isinstance(args, tuple) and (args[0][2] == "pip"):
                raise RuntimeError( \
                    "Full traceback: {} \n Pip install failed for package: {} \
                    \n Output from execution of subprocess: {}" \
                    .format(traceback.format_exc(), args[0][6], error.output))
            else:
>               raise RuntimeError("Full trace: {}, \
                    output of the failed child process {} "\
                    .format(traceback.format_exc(), error.output))
E               RuntimeError: Full trace: Traceback (most recent call last):
E                 File "https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/utils/processes.py", line 89, in check_output
E                   out = subprocess.check_output(*args, **kwargs)
E                 File "/usr/lib/python3.9/subprocess.py", line 424, in check_output
E                   return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
E                 File "/usr/lib/python3.9/subprocess.py", line 528, in run
E                   raise CalledProcessError(retcode, process.args,
E               subprocess.CalledProcessError: Command '['https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9', '-m', 'build', '--sdist', '--outdir', '/tmp/tmpz8_pabuv', 'https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']' returned non-zero exit status 1.
E               , output of the failed child process b''

apache_beam/utils/processes.py:99: RuntimeError
------------------------------ Captured log call -------------------------------
INFO     apache_beam.runners.portability.stager:stager.py:762 Executing command: ['https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmpn6ierrxt/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', '--implementation', 'cp', '--abi', 'cp39', '--platform', 'manylinux2014_x86_64']
INFO     apache_beam.runners.portability.stager:stager.py:795 Executing command: ['https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9', '-m', 'build', '--sdist', '--outdir', '/tmp/tmpz8_pabuv', 'https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']
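A note on why the failure output above is empty: the frame locals show the sdist build was run with only stdout piped (`kwargs = {'stdout': -1}`, `stderr = None`), so the child's stderr (where `python -m build` reports its errors) was never captured, and `error.output` is `b''`. A minimal sketch of preserving those diagnostics by merging stderr into stdout; the helper name `check_output_merged` is hypothetical, not Beam's actual API:

```python
import subprocess
import sys


def check_output_merged(cmd):
    """Run cmd, merging stderr into stdout so that when the child fails,
    CalledProcessError.output carries its error text instead of b''.
    Illustrative helper only, not the stager's real implementation."""
    return subprocess.check_output(cmd, stderr=subprocess.STDOUT)


# A child that fails after writing its reason to stderr, standing in
# for the failing "python -m build --sdist" invocation.
try:
    check_output_merged(
        [sys.executable, "-c",
         "import sys; print('boom', file=sys.stderr); sys.exit(1)"])
except subprocess.CalledProcessError as exc:
    # exc.output now includes the child's stderr text.
    print(exc.output)
```

With plain `check_output(cmd)` (no `stderr=` argument), the same failure would surface with an empty `output`, exactly as seen in the traceback above.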
=============================== warnings summary ===============================
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_it
apache_beam/examples/complete/game/game_stats_it_test.py::GameStatsIT::test_game_stats_it
apache_beam/examples/complete/game/leader_board_it_test.py::LeaderBoardIT::test_leader_board_it
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_file_loads
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_streaming_inserts
  https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py:63: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    dataset_ref = client.dataset(unique_dataset_name, project=project)

apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py:47: FutureWarning: The default value of numeric_only in DataFrame.mean is deprecated. In a future version, it will default to False. In addition, specifying 'numeric_only=None' is deprecated. Select only valid columns or specify the value of numeric_only to silence this warning.
    return airline_df[at_top_airports].mean()

apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
  https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_native_source
  https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py:170: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
  https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py:706: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/pytest_postCommitIT-df-py39.xml -
=========================== short test summary info ============================
FAILED apache_beam/examples/complete/juliaset/juliaset/juliaset_test_it.py::JuliaSetTestIT::test_run_example_with_setup_file - RuntimeError: Full trace: Traceback (most recent call last):
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/utils/processes.py", line 89, in check_output
    out = subprocess.check_output(*args, **kwargs)
  File "/usr/lib/python3.9/subprocess.py", line 424, in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
  File "/usr/lib/python3.9/subprocess.py", line 528, in run
    raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9', '-m', 'build', '--sdist', '--outdir', '/tmp/tmpz8_pabuv', 'https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']' returned non-zero exit status 1.
, output of the failed child process b''
====== 1 failed, 87 passed, 50 skipped, 9 warnings in 7609.58s (2:06:49) =======
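For readers unfamiliar with the exception chain in the report above: `subprocess.run` with `check=True` raises `CalledProcessError` on any non-zero exit, and the exception's `.output` holds only the streams that were piped. A minimal, self-contained reproduction; the failing child command here is illustrative, not Beam's actual sdist build step:

```python
import subprocess
import sys

# check=True turns a non-zero exit status into CalledProcessError.
# Only stdout is piped, mirroring the stager's invocation, so the
# exception's .output is whatever the child wrote to stdout -- and
# stderr (where build tools usually report errors) is not captured.
try:
    subprocess.run(
        [sys.executable, "-c", "import sys; sys.exit(1)"],  # illustrative failing child
        stdout=subprocess.PIPE,
        check=True,
    )
except subprocess.CalledProcessError as exc:
    print(exc.returncode)  # 1
    print(exc.output)      # b''
```

This matches the frame locals in the traceback (`kwargs = {'stdout': -1}`, `stdout = b''`, `stderr = None`): the build's real error text never made it into the log.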
> Task :sdks:python:test-suites:dataflow:py39:postCommitIT FAILED
FAILURE: Build failed with an exception.
* Where:
Script 'https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/test-suites/dataflow/common.gradle' line: 139
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py39:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.
Deprecated Gradle features were used in this build, making it incompatible with Gradle 9.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
For more on this, please refer to https://docs.gradle.org/8.3/userguide/command_line_interface.html#sec:command_line_warnings in the Gradle documentation.
BUILD FAILED in 2h 13m 4s
219 actionable tasks: 155 executed, 60 from cache, 4 up-to-date
Publishing build scan...
https://ge.apache.org/s/w3czffujkh6y4
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_PostCommit_Python39 #2436
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python39/2436/display/redirect>
Changes:
------------------------------------------
[...truncated 12.12 MB...]
====== 1 failed, 87 passed, 50 skipped, 9 warnings in 7023.64s (1:57:03) =======
> Task :sdks:python:test-suites:dataflow:py39:postCommitIT FAILED
FAILURE: Build failed with an exception.
* Where:
Script 'https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/test-suites/dataflow/common.gradle' line: 139
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py39:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.
Deprecated Gradle features were used in this build, making it incompatible with Gradle 9.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
For more on this, please refer to https://docs.gradle.org/8.3/userguide/command_line_interface.html#sec:command_line_warnings in the Gradle documentation.
BUILD FAILED in 2h 3m 1s
219 actionable tasks: 155 executed, 60 from cache, 4 up-to-date
Publishing build scan...
https://ge.apache.org/s/eirco74gayf5m
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_PostCommit_Python39 #2435
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python39/2435/display/redirect>
Changes:
------------------------------------------
[...truncated 1.03 MB...]
def check_output(*args, **kwargs):
if force_shell:
kwargs['shell'] = True
try:
> out = subprocess.check_output(*args, **kwargs)
[1m[31mapache_beam/utils/processes.py[0m:89:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[1m[31m/usr/lib/python3.9/subprocess.py[0m:424: in check_output
return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
input = None, capture_output = False, timeout = None, check = True
popenargs = (['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpqczm4o6o', ...],)
kwargs = {'stdout': -1}
process = <Popen: returncode: 1 args: ['/home/jenkins/jenkins-slave/workspace/beam_Pos...>
stdout = b'', stderr = None, retcode = 1
def run(*popenargs,
        input=None, capture_output=False, timeout=None, check=False, **kwargs):
    """Run command with arguments and return a CompletedProcess instance.

    The returned instance will have attributes args, returncode, stdout and
    stderr. By default, stdout and stderr are not captured, and those attributes
    will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them.

    If check is True and the exit code was non-zero, it raises a
    CalledProcessError. The CalledProcessError object will have the return code
    in the returncode attribute, and output & stderr attributes if those streams
    were captured.

    If timeout is given, and the process takes too long, a TimeoutExpired
    exception will be raised.

    There is an optional argument "input", allowing you to
    pass bytes or a string to the subprocess's stdin. If you use this argument
    you may not also use the Popen constructor's "stdin" argument, as
    it will be used internally.

    By default, all communication is in bytes, and therefore any "input" should
    be bytes, and the stdout and stderr will be bytes. If in text mode, any
    "input" should be a string, and stdout and stderr will be strings decoded
    according to locale encoding, or by "encoding" if set. Text mode is
    triggered by setting any of text, encoding, errors or universal_newlines.

    The other arguments are the same as for the Popen constructor.
    """
    if input is not None:
        if kwargs.get('stdin') is not None:
            raise ValueError('stdin and input arguments may not both be used.')
        kwargs['stdin'] = PIPE

    if capture_output:
        if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
            raise ValueError('stdout and stderr arguments may not be used '
                             'with capture_output.')
        kwargs['stdout'] = PIPE
        kwargs['stderr'] = PIPE

    with Popen(*popenargs, **kwargs) as process:
        try:
            stdout, stderr = process.communicate(input, timeout=timeout)
        except TimeoutExpired as exc:
            process.kill()
            if _mswindows:
                # Windows accumulates the output in a single blocking
                # read() call run on child threads, with the timeout
                # being done in a join() on those threads. communicate()
                # _after_ kill() is required to collect that and add it
                # to the exception.
                exc.stdout, exc.stderr = process.communicate()
            else:
                # POSIX _communicate already populated the output so
                # far into the TimeoutExpired exception.
                process.wait()
            raise
        except:  # Including KeyboardInterrupt, communicate handled that.
            process.kill()
            # We don't call process.wait() as .__exit__ does that for us.
            raise
        retcode = process.poll()
        if check and retcode:
>           raise CalledProcessError(retcode, process.args,
                output=stdout, stderr=stderr)
[1m[31mE subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpqczm4o6o', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']'> returned non-zero exit status 1.[0m
[1m[31m/usr/lib/python3.9/subprocess.py[0m:528: CalledProcessError
[33mDuring handling of the above exception, another exception occurred:[0m
self = <apache_beam.examples.complete.juliaset.juliaset.juliaset_test_it.JuliaSetTestIT testMethod=test_run_example_with_setup_file>
    def test_run_example_with_setup_file(self):
      pipeline = TestPipeline(is_integration_test=True)
      coordinate_output = FileSystems.join(
          pipeline.get_option('output'),
          'juliaset-{}'.format(str(uuid.uuid4())),
          'coordinates.txt')
      extra_args = {
          'coordinate_output': coordinate_output,
          'grid_size': self.GRID_SIZE,
          'setup_file': os.path.normpath(
              os.path.join(os.path.dirname(__file__), '..', 'setup.py')),
          'on_success_matcher': all_of(PipelineStateMatcher(PipelineState.DONE)),
      }
      args = pipeline.get_full_options_as_args(**extra_args)
>     juliaset.run(args)
apache_beam/examples/complete/juliaset/juliaset/juliaset_test_it.py:56:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
apache_beam/examples/complete/juliaset/juliaset/juliaset.py:116: in run
    (
apache_beam/pipeline.py:607: in __exit__
    self.result = self.run()
apache_beam/pipeline.py:557: in run
    return Pipeline.from_runner_api(
apache_beam/pipeline.py:584: in run
    return self.runner.run_pipeline(self, self._options)
apache_beam/runners/dataflow/test_dataflow_runner.py:53: in run_pipeline
    self.result = super().run_pipeline(pipeline, options)
apache_beam/runners/dataflow/dataflow_runner.py:393: in run_pipeline
    artifacts = environments.python_sdk_dependencies(options)
apache_beam/transforms/environments.py:846: in python_sdk_dependencies
    return stager.Stager.create_job_resources(
apache_beam/runners/portability/stager.py:281: in create_job_resources
    tarball_file = Stager._build_setup_package(
apache_beam/runners/portability/stager.py:796: in _build_setup_package
    processes.check_output(build_setup_args)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
args = (['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpqczm4o6o', ...],)
kwargs = {}
def check_output(*args, **kwargs):
  if force_shell:
    kwargs['shell'] = True
  try:
    out = subprocess.check_output(*args, **kwargs)
  except OSError:
    raise RuntimeError("Executable {} not found".format(args[0]))
  except subprocess.CalledProcessError as error:
    if isinstance(args, tuple) and (args[0][2] == "pip"):
      raise RuntimeError( \
          "Full traceback: {} \n Pip install failed for package: {} \
          \n Output from execution of subprocess: {}" \
          .format(traceback.format_exc(), args[0][6], error.output))
    else:
>     raise RuntimeError("Full trace: {}, \
        output of the failed child process {} "\
        .format(traceback.format_exc(), error.output))
E   RuntimeError: Full trace: Traceback (most recent call last):
E     File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/utils/processes.py",> line 89, in check_output
E       out = subprocess.check_output(*args, **kwargs)
E     File "/usr/lib/python3.9/subprocess.py", line 424, in check_output
E       return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
E     File "/usr/lib/python3.9/subprocess.py", line 528, in run
E       raise CalledProcessError(retcode, process.args,
E   subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpqczm4o6o', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']'> returned non-zero exit status 1.
E   , output of the failed child process b''
apache_beam/utils/processes.py:99: RuntimeError
------------------------------ Captured log call -------------------------------
INFO     apache_beam.runners.portability.stager:stager.py:762 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmprtu3kw54/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', '--implementation', 'cp', '--abi', 'cp39', '--platform', 'manylinux2014_x86_64']
INFO     apache_beam.runners.portability.stager:stager.py:795 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpqczm4o6o', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']>
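The failure above follows a simple pattern: `subprocess.check_output` (which calls `run` with `check=True`) raises `CalledProcessError` on a non-zero exit, and Beam's wrapper re-raises it as a `RuntimeError` that embeds the formatted traceback and the child's captured output. A minimal standalone sketch of that pattern — `check_output_wrapped` is a hypothetical stand-in, not Beam's actual `processes.py`:

```python
import subprocess
import sys
import traceback


def check_output_wrapped(cmd):
    # Run cmd; on a non-zero exit, re-raise the CalledProcessError as a
    # RuntimeError carrying the formatted traceback and the captured
    # output, mirroring the wrapping visible in the log above.
    try:
        return subprocess.check_output(cmd)
    except subprocess.CalledProcessError as error:
        raise RuntimeError(
            "Full trace: {}, output of the failed child process {}".format(
                traceback.format_exc(), error.output))


try:
    check_output_wrapped([sys.executable, "-c", "import sys; sys.exit(1)"])
except RuntimeError as exc:
    # The embedded traceback names the original CalledProcessError, and the
    # child produced no stdout, so the message also contains b''.
    assert "CalledProcessError" in str(exc)
```

This is why the log shows `output of the failed child process b''`: the sdist build wrote its diagnostics to stderr, which `check_output` does not capture by default.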
=============================== warnings summary ===============================
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_it
apache_beam/examples/complete/game/game_stats_it_test.py::GameStatsIT::test_game_stats_it
apache_beam/examples/complete/game/leader_board_it_test.py::LeaderBoardIT::test_leader_board_it
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_file_loads
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_streaming_inserts
<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
dataset_ref = client.dataset(unique_dataset_name, project=project)
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
table_ref = client.dataset(dataset_id).table(table_id)
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:47: FutureWarning: The default value of numeric_only in DataFrame.mean is deprecated. In a future version, it will default to False. In addition, specifying 'numeric_only=None' is deprecated. Select only valid columns or specify the value of numeric_only to silence this warning.
return airline_df[at_top_airports].mean()
apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_native_source
<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:170: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:706: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/pytest_postCommitIT-df-py39.xml> -
=========================== short test summary info ============================
FAILED apache_beam/examples/complete/juliaset/juliaset/juliaset_test_it.py::JuliaSetTestIT::test_run_example_with_setup_file - RuntimeError: Full trace: Traceback (most recent call last):
File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/utils/processes.py",> line 89, in check_output
out = subprocess.check_output(*args, **kwargs)
File "/usr/lib/python3.9/subprocess.py", line 424, in check_output
return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
File "/usr/lib/python3.9/subprocess.py", line 528, in run
raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9',> '-m', 'build', '--sdist', '--outdir', '/tmp/tmpqczm4o6o', '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/complete/juliaset']'> returned non-zero exit status 1.
, output of the failed child process b''
====== 1 failed, 87 passed, 50 skipped, 9 warnings in 8585.42s (2:23:05) =======
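The failing test's traceback is a chained one: the "During handling of the above exception, another exception occurred" header is Python's implicit exception chaining, which kicks in whenever a new exception is raised inside an `except` block and preserves the original as `__context__`. A standalone illustration of the mechanism (the function name is illustrative only, not Beam code):

```python
def build_sdist():
    # Stand-in for the failing `python -m build --sdist` subprocess call.
    raise ChildProcessError("non-zero exit status 1")


try:
    try:
        build_sdist()
    except ChildProcessError:
        # Raising a different exception here chains the original one, which
        # tracebacks render as "During handling of the above exception,
        # another exception occurred".
        raise RuntimeError("sdist build failed")
except RuntimeError as exc:
    assert isinstance(exc.__context__, ChildProcessError)
```

Using `raise ... from error` instead would set `__cause__` and change the header to "The above exception was the direct cause of the following exception".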
> Task :sdks:python:test-suites:dataflow:py39:postCommitIT FAILED
FAILURE: Build completed with 2 failures.
1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py39:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 139
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py39:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.
==============================================================================
Deprecated Gradle features were used in this build, making it incompatible with Gradle 9.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
For more on this, please refer to https://docs.gradle.org/8.3/userguide/command_line_interface.html#sec:command_line_warnings in the Gradle documentation.
BUILD FAILED in 2h 29m 15s
217 actionable tasks: 153 executed, 60 from cache, 4 up-to-date
Publishing build scan...
https://ge.apache.org/s/jl3wplyv7aq2k
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_PostCommit_Python39 #2434
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python39/2434/display/redirect>
Changes:
------------------------------------------
[...truncated 1.05 MB...]
configfile: pytest.ini
plugins: timeout-2.2.0, xdist-3.3.1, requests-mock-1.11.0, hypothesis-6.87.4
timeout: 4500.0s
timeout method: signal
timeout func_only: False
created: 8/8 workers
8 workers [9 items]
scheduling tests via LoadScheduling
apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_numpy_multi_batch
apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_numpy_single_batch
apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_pandas_multi_batch
apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_pandas_single_batch
apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_scipy_multi_batch
apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_datatable_single_batch
apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_numpy_single_batch_large_model
apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_datatable_multi_batch
[gw4] PASSED apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_numpy_single_batch_large_model
[gw3] PASSED apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_numpy_single_batch
[gw0] PASSED apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_datatable_single_batch
[gw6] PASSED apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_pandas_single_batch
[gw2] PASSED apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_numpy_multi_batch
[gw7] PASSED apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_scipy_multi_batch
[gw1] PASSED apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_datatable_multi_batch
apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_scipy_single_batch
[gw5] PASSED apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_pandas_multi_batch
[gw1] PASSED apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_scipy_single_batch Exception ignored in: <function Booster.__del__ at 0x7fe74d4c55e0>
Traceback (most recent call last):
File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/1398941893/lib/python3.9/site-packages/xgboost/core.py",> line 1752, in __del__
AttributeError: 'NoneType' object has no attribute 'XGBoosterFree'
Exception ignored in: <function Booster.__del__ at 0x7f8182ed65e0>
Traceback (most recent call last):
File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/1398941893/lib/python3.9/site-packages/xgboost/core.py",> line 1752, in __del__
AttributeError: 'NoneType' object has no attribute 'XGBoosterFree'
Exception ignored in: <function Booster.__del__ at 0x7f8045c2d5e0>
Traceback (most recent call last):
File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/1398941893/lib/python3.9/site-packages/xgboost/core.py",> line 1752, in __del__
AttributeError: 'NoneType' object has no attribute 'XGBoosterFree'
Exception ignored in: <function Booster.__del__ at 0x7f257ac555e0>
Traceback (most recent call last):
File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/1398941893/lib/python3.9/site-packages/xgboost/core.py",> line 1752, in __del__
AttributeError: 'NoneType' object has no attribute 'XGBoosterFree'
Exception ignored in: <function Booster.__del__ at 0x7f80e869a5e0>
Traceback (most recent call last):
File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/1398941893/lib/python3.9/site-packages/xgboost/core.py",> line 1752, in __del__
AttributeError: 'NoneType' object has no attribute 'XGBoosterFree'
Exception ignored in: <function Booster.__del__ at 0x7faf0582f5e0>
Traceback (most recent call last):
File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/1398941893/lib/python3.9/site-packages/xgboost/core.py",> line 1752, in __del__
AttributeError: 'NoneType' object has no attribute 'XGBoosterFree'
Exception ignored in: <function Booster.__del__ at 0x7f9f0603a5e0>
Traceback (most recent call last):
File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/1398941893/lib/python3.9/site-packages/xgboost/core.py",> line 1752, in __del__
AttributeError: 'NoneType' object has no attribute 'XGBoosterFree'
Exception ignored in: <function Booster.__del__ at 0x7f70509fe5e0>
Traceback (most recent call last):
File "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/1398941893/lib/python3.9/site-packages/xgboost/core.py",> line 1752, in __del__
AttributeError: 'NoneType' object has no attribute 'XGBoosterFree'
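The repeated "Exception ignored in: <function Booster.__del__ ...>" messages are harmless teardown noise: during interpreter shutdown, module globals that a finalizer depends on (here xgboost's loaded C library handle) may already have been cleared before `__del__` runs, so the attribute access fails with `'NoneType' object has no attribute ...`, and Python prints the ignored exception rather than propagating it. A hedged sketch of the failure mode and the usual guard — `Handle` and `_lib` are stand-ins, not xgboost's actual internals:

```python
class Handle:
    # Stand-in for a module-level ctypes library reference that can be
    # reset to None by interpreter shutdown before __del__ runs.
    _lib = None
    deleted_safely = False

    def __del__(self):
        # Unguarded code would call Handle._lib.free(self) here and fail
        # with AttributeError: 'NoneType' object has no attribute 'free'.
        # The conventional fix is to check the global before using it:
        if Handle._lib is not None:
            Handle._lib.free(self)
        Handle.deleted_safely = True
```

Because the exception occurs in a finalizer, Python only reports it ("Exception ignored in: ..."); it never fails the test run, which is consistent with all of the XGBoost tests above passing.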
=============================== warnings summary ===============================
../../build/gradleenv/1398941893/lib/python3.9/site-packages/tensorflow/python/framework/dtypes.py:35
../../build/gradleenv/1398941893/lib/python3.9/site-packages/tensorflow/python/framework/dtypes.py:35
../../build/gradleenv/1398941893/lib/python3.9/site-packages/tensorflow/python/framework/dtypes.py:35
../../build/gradleenv/1398941893/lib/python3.9/site-packages/tensorflow/python/framework/dtypes.py:35
../../build/gradleenv/1398941893/lib/python3.9/site-packages/tensorflow/python/framework/dtypes.py:35
../../build/gradleenv/1398941893/lib/python3.9/site-packages/tensorflow/python/framework/dtypes.py:35
../../build/gradleenv/1398941893/lib/python3.9/site-packages/tensorflow/python/framework/dtypes.py:35
../../build/gradleenv/1398941893/lib/python3.9/site-packages/tensorflow/python/framework/dtypes.py:35
<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/1398941893/lib/python3.9/site-packages/tensorflow/python/framework/dtypes.py>:35: DeprecationWarning: ml_dtypes.float8_e4m3b11 is deprecated. Use ml_dtypes.float8_e4m3b11fnuz
from tensorflow.tsl.python.lib.core import pywrap_ml_dtypes
apache_beam/ml/inference/vertex_ai_inference_it_test.py:68
apache_beam/ml/inference/vertex_ai_inference_it_test.py:68
apache_beam/ml/inference/vertex_ai_inference_it_test.py:68
apache_beam/ml/inference/vertex_ai_inference_it_test.py:68
apache_beam/ml/inference/vertex_ai_inference_it_test.py:68
apache_beam/ml/inference/vertex_ai_inference_it_test.py:68
apache_beam/ml/inference/vertex_ai_inference_it_test.py:68
apache_beam/ml/inference/vertex_ai_inference_it_test.py:68
<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/ml/inference/vertex_ai_inference_it_test.py>:68: PytestUnknownMarkWarning: Unknown pytest.mark.uses_vertex_ai - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
@pytest.mark.uses_vertex_ai
-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/pytest_postCommitIT-direct-py39.xml> -
================= 9 passed, 10 skipped, 16 warnings in 21.43s ==================
> Task :sdks:python:test-suites:direct:py39:inferencePostCommitIT
> Task :sdks:python:test-suites:direct:py39:postCommitIT
>>> RUNNING integration tests with pipeline options: --runner=TestDirectRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>> pytest options: --capture=no --numprocesses=8 --timeout=4500 --color=yes --log-cli-level=INFO apache_beam/examples/wordcount_it_test.py::WordCountIT::test_wordcount_it apache_beam/io/gcp/pubsub_integration_test.py::PubSubIntegrationTest apache_beam/io/gcp/big_query_query_to_table_it_test.py::BigQueryQueryToTableIT apache_beam/io/gcp/bigquery_io_read_it_test.py apache_beam/io/gcp/bigquery_read_it_test.py apache_beam/io/gcp/bigquery_write_it_test.py apache_beam/io/gcp/datastore/v1new/datastore_write_it_test.py
>>> collect markers:
============================= test session starts ==============================
platform linux -- Python 3.9.10, pytest-7.4.2, pluggy-1.3.0
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python>
configfile: pytest.ini
plugins: timeout-2.2.0, xdist-3.3.1, requests-mock-1.11.0, hypothesis-6.87.4
timeout: 4500.0s
timeout method: signal
timeout func_only: False
created: 8/8 workers
8 workers [38 items]
scheduling tests via LoadScheduling
apache_beam/examples/wordcount_it_test.py::WordCountIT::test_wordcount_it
apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source
apache_beam/io/gcp/pubsub_integration_test.py::PubSubIntegrationTest::test_streaming_with_attributes
apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_table_schema_retrieve_specifying_only_table
apache_beam/io/gcp/bigquery_io_read_it_test.py::BigqueryIOReadIT::test_bigquery_read_custom_1M_python
apache_beam/io/gcp/big_query_query_to_table_it_test.py::BigQueryQueryToTableIT::test_big_query_new_types
apache_beam/io/gcp/big_query_query_to_table_it_test.py::BigQueryQueryToTableIT::test_big_query_standard_sql
apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_native_source
[gw4] PASSED apache_beam/io/gcp/bigquery_io_read_it_test.py::BigqueryIOReadIT::test_bigquery_read_custom_1M_python
apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_iobase_source
[gw5] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_table_schema_retrieve_specifying_only_table
apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_table_schema_retrieve_with_direct_read
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source
apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_column_selection
[gw5] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_table_schema_retrieve_with_direct_read
apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_column_selection_and_row_restriction_rows
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_column_selection
apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_native_datetime
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_native_datetime
apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_query_and_filters
[gw6] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_native_source
apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_table_schema_retrieve
[gw5] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_column_selection_and_row_restriction_rows
apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_query
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_query_and_filters
apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_row_restriction
[gw4] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_iobase_source
[gw6] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_table_schema_retrieve
[gw0] PASSED apache_beam/examples/wordcount_it_test.py::WordCountIT::test_wordcount_it
apache_beam/io/gcp/pubsub_integration_test.py::PubSubIntegrationTest::test_streaming_data_only
apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_column_selection_and_row_restriction
apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_very_selective_filters
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_row_restriction
apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
[gw3] PASSED apache_beam/io/gcp/big_query_query_to_table_it_test.py::BigQueryQueryToTableIT::test_big_query_standard_sql
apache_beam/io/gcp/bigquery_io_read_it_test.py::BigqueryIOReadIT::test_bigquery_read_1M_python
[gw2] PASSED apache_beam/io/gcp/big_query_query_to_table_it_test.py::BigQueryQueryToTableIT::test_big_query_new_types
apache_beam/io/gcp/big_query_query_to_table_it_test.py::BigQueryQueryToTableIT::test_big_query_new_types_avro
[gw5] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_query
apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_iobase_source
[gw4] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_column_selection_and_row_restriction
[gw6] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_very_selective_filters
apache_beam/io/gcp/bigquery_read_it_test.py::ReadInteractiveRunnerTests::test_read_in_interactive_runner
apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write
[gw3] PASSED apache_beam/io/gcp/bigquery_io_read_it_test.py::BigqueryIOReadIT::test_bigquery_read_1M_python
apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write_insert_non_transient_api_call_error
[gw3] PASSED apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write_insert_non_transient_api_call_error
apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write_temp_table_append_schema_update_1
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write_insert_errors_reporting
[gw7] PASSED apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write_insert_errors_reporting
apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write_without_schema
[gw5] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_iobase_source
apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write_schema_autodetect
[gw6] PASSED apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write
apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write_temp_table_append_schema_update_0
[gw4] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadInteractiveRunnerTests::test_read_in_interactive_runner
apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write_temp_table_append_schema_update
[gw4] PASSED apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write_temp_table_append_schema_update
[gw2] PASSED apache_beam/io/gcp/big_query_query_to_table_it_test.py::BigQueryQueryToTableIT::test_big_query_new_types_avro
apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write_new_types
[gw7] PASSED apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write_without_schema
apache_beam/io/gcp/datastore/v1new/datastore_write_it_test.py::DatastoreWriteIT::test_datastore_write_limit
[gw5] PASSED apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write_schema_autodetect
[gw1] PASSED apache_beam/io/gcp/pubsub_integration_test.py::PubSubIntegrationTest::test_streaming_with_attributes
apache_beam/io/gcp/big_query_query_to_table_it_test.py::BigQueryQueryToTableIT::test_big_query_legacy_sql
[gw2] PASSED apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write_new_types
[gw7] PASSED apache_beam/io/gcp/datastore/v1new/datastore_write_it_test.py::DatastoreWriteIT::test_datastore_write_limit
[gw0] PASSED apache_beam/io/gcp/pubsub_integration_test.py::PubSubIntegrationTest::test_streaming_data_only
apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries
[gw3] PASSED apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write_temp_table_append_schema_update_1
apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write_temp_table_append_schema_update_2
[gw1] PASSED apache_beam/io/gcp/big_query_query_to_table_it_test.py::BigQueryQueryToTableIT::test_big_query_legacy_sql
[gw6] PASSED apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write_temp_table_append_schema_update_0
[gw0] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries
[gw3] PASSED apache_beam/io/gcp/bigquery_write_it_test.py::BigQueryWriteIntegrationTests::test_big_query_write_temp_table_append_schema_update_2
=============================== warnings summary ===============================
apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_native_source
<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:170: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:706: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
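The two BeamDeprecationWarning entries above come from constructing the deprecated BigQuerySource. A minimal, dependency-free sketch of how such a deprecated entry point emits that warning (the warning class and factory function below are illustrative stand-ins, not Beam's real implementation; the message text is taken from the log):

```python
import warnings


class BeamDeprecationWarning(DeprecationWarning):
    """Illustrative stand-in for Beam's BeamDeprecationWarning class."""


def BigQuerySource(query, use_standard_sql=False):
    # Mimics the warning seen in the log; the real source lives in
    # apache_beam.io and should be replaced with beam.io.ReadFromBigQuery.
    warnings.warn(
        "BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.",
        BeamDeprecationWarning,
        stacklevel=2)
    return {"query": query, "use_standard_sql": use_standard_sql}


with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    src = BigQuerySource(query="SELECT 1", use_standard_sql=True)
```

The silent fix in the test suite is to build the read with `beam.io.ReadFromBigQuery(query=..., use_standard_sql=True)`, as the warning message itself suggests.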
apache_beam/io/gcp/datastore/v1new/datastore_write_it_test.py::DatastoreWriteIT::test_datastore_write_limit
<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1new/datastoreio.py>:250: UserWarning: Detected filter using positional arguments. Prefer using the 'filter' keyword argument instead.
query.add_filter('kind_name', '=', kind_name)
apache_beam/io/gcp/datastore/v1new/datastore_write_it_test.py::DatastoreWriteIT::test_datastore_write_limit
<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1new/datastoreio.py>:251: UserWarning: Detected filter using positional arguments. Prefer using the 'filter' keyword argument instead.
query.add_filter('timestamp', '=', latest_timestamp)
apache_beam/io/gcp/datastore/v1new/datastore_write_it_test.py::DatastoreWriteIT::test_datastore_write_limit
<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/1398941893/lib/python3.9/site-packages/google/cloud/datastore/query.py>:234: UserWarning: Detected filter using positional arguments. Prefer using the 'filter' keyword argument instead.
self.add_filter(property_name, operator, value)
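The three UserWarning entries come from calling `Query.add_filter` with positional arguments. A stdlib mock of that behavior shows why the keyword form is quiet (the `Query` class below is a hypothetical stand-in for `google.cloud.datastore.Query`, and the plain tuple passed to `filter=` simplifies the client library's real filter object):

```python
import warnings


class Query:
    """Hypothetical stand-in for google.cloud.datastore.Query (illustrative)."""

    def __init__(self):
        self.filters = []

    def add_filter(self, *args, filter=None):
        if args:
            # The positional form still works, but it triggers the warning
            # seen three times in the log above.
            warnings.warn(
                "Detected filter using positional arguments. "
                "Prefer using the 'filter' keyword argument instead.",
                UserWarning,
                stacklevel=2)
            self.filters.append(tuple(args))
        else:
            self.filters.append(filter)
        return self


query = Query()
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    query.add_filter("kind_name", "=", "example-kind")  # warns
    query.add_filter(filter=("timestamp", "=", 0))      # silent
```

In recent versions of the real google-cloud-datastore client, the keyword form takes a filter object, e.g. `query.add_filter(filter=PropertyFilter('kind_name', '=', kind_name))`; the tuple here only keeps the sketch dependency-free.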
-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/pytest_postCommitIT-direct-py39.xml> -
================== 38 passed, 5 warnings in 143.44s (0:02:23) ==================
FAILURE: Build completed with 2 failures.
1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py39:setupVirtualenv'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.
==============================================================================
2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py39:setupVirtualenv'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.
==============================================================================
Deprecated Gradle features were used in this build, making it incompatible with Gradle 9.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
For more on this, please refer to https://docs.gradle.org/8.3/userguide/command_line_interface.html#sec:command_line_warnings in the Gradle documentation.
BUILD FAILED in 23m 24s
211 actionable tasks: 147 executed, 60 from cache, 4 up-to-date
Publishing build scan...
https://ge.apache.org/s/47dmjq4njbl2w
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org